Friday, August 16, 2019

"5G or 4G" Will Not Matter for Most Users, in Practice

It already is tough to determine what “5G speed” actually means, as early 5G often relies as much on 4G as it does on 5G.

And the problem of assessing 5G speed is going to get worse. We will have to compare 5G built primarily on low-band assets, which will have good coverage but constrained speeds; 5G using millimeter wave, which will have extraordinary speed but limited coverage; 5G using mid-band, which offers a mix of coverage and capacity but might rely on 4G for nearly half of total performance; and 5G using a mix of frequencies and spectrum aggregation.

Beyond all that, 5G devices might be connecting to two or more radio sites at once, further complicating our understanding of which network (5G, 4G, unlicensed or licensed) is being used at any given moment.

Soon, it will only be clear that a particular 5G device, on a particular network, at a specific location, works well or does not work so well. The actual mix of networks (5G, 4G, licensed and unlicensed; cell sites used simultaneously) might vary quite a lot.

Speed and cost measurements on a cross-country basis--both fixed and mobile--always have been contingent. Choices have to be made about what and how to measure (which plans, across all countries, at what speeds, price points and uptake volumes, including promotions and other buying behavior).

Then adjustments might have to be made based on household sizes (to get per-user metrics); geography (relatively more urban or rural; large or small country) or pricing power differentials between countries. 

All of that will become more complicated in the 5G era, when virtually any spectrum can be used to support 5G services, with clear and distinctive coverage and capacity profiles, depending on which frequencies are used, and in what mix. 

5G can be used in a legacy-free manner, though perhaps rarely, using only “new” millimeter wave and mid-band frequencies. It might use a combination of new and legacy frequencies (high-band, mid-band and low-band assets).

5G might use spectrum within the low bands (new and legacy), or combine low and mid-band assets. Perhaps the most-common approach will be a mix of spectrum bands. 

Both 4G and 5G spectrum also can be used to support a 5G device, further complicating matters. 

That perhaps already is clear in South Korea, where 5G uses mid-band frequencies, but where, in many cases, a combination of mid-band and 4G spectrum actually supports usage, although 28 GHz also is authorized and will be used at some point.

Some recent tests have used devices able to access 1.5 Gbps of 5G bandwidth using SK Telecom’s 3.5 GHz spectrum, plus 1.15 Gbps of 4G bandwidth at 1.8 GHz, 2.1 GHz and 2.8 GHz frequencies.
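Back of the envelope, those figures imply an aggregate peak of about 2.65 Gbps, with 4G supplying roughly 43 percent of the total--consistent with the earlier point that 4G can represent nearly half of “5G” performance. A minimal Python sketch, using only the figures reported above:

```python
# Rough aggregate throughput when a device bonds 5G NR and LTE carriers,
# using the figures reported for the SK Telecom tests above.
nr_gbps = 1.5    # 5G NR at 3.5 GHz
lte_gbps = 1.15  # LTE across the 1.8 GHz, 2.1 GHz and 2.8 GHz carriers

total = nr_gbps + lte_gbps
print(f"Aggregate peak: {total:.2f} Gbps")           # 2.65 Gbps
print(f"4G share of total: {lte_gbps / total:.0%}")  # about 43%
```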

The point is that 5G access is going to be quite heterogeneous. There will be many ways of supplying 5G access, and performance will vary based on how the access is supplied. Even when 4G spectrum is not used (setting aside dynamic spectrum sharing and spectrum aggregation), 5G capacity will vary based on which bands of spectrum are used, especially when millimeter wave or mid-band spectrum is available.

Low-band 5G will be faster than 4G, but less so than when mid-band and high-band assets are used. 

But many early 5G deployments will aggregate 4G with 5G. In other cases, 5G might be aggregated with unlicensed spectrum. In still other cases, access on 5G handsets might default entirely to 4G, as in rural areas.

And 4G will keep getting faster, closing the gap with 5G on the coverage frequencies (low-band and mid-band). So even when a 5G device defaults to 4G, the speed experience might not differ much from 5G.

The point is that interpreting 5G speeds is going to become highly contingent. Stand-alone 5G is going to perform differently than non-stand-alone 5G (which relies on 4G). The 5G experience will hinge on which frequency bands are used, and on what types of spectrum aggregation are possible at specific locations.

Australia NBN Seems to be Breaking Business Models

Telecom service provider wholesale policies can make or break business models. Back in the early years of the 21st century, U.S. regulators briefly relied on robust wholesale discounts--below actual cost, in many instances--for network service customers, in hopes of stimulating more competition in telecom services markets. 

The policies allowed competitors to buy and use complete services--provisioned lines--to support competitive voice services. The framework allowed firms such as the then-independent AT&T and MCI to grow their retail phone services businesses. 

It all collapsed when the Federal Communications Commission changed its rules and allowed network services suppliers to negotiate market-rate prices. 

Australia seems to be suffering from a related problem with its National Broadband Network, as there is but one supplier of wholesale network services.

Telstra, the country’s largest service provider (with 41.5 percent of the market), saw its profits plummet 40 percent in the 2018-2019 financial year, largely because of NBN wholesale tariffs.

Telstra’s largest rival, Optus, similarly saw profits fall by 32 percent in the same period.

That is one clear danger for all telecom regimes relying on a single wholesale network services supplier. The brief U.S. policy reliance on wholesale service supply--seen as a stepping stone to facilities investment--rather quickly was replaced by the alternative of facilities-based competition, which essentially worked because cable TV operators were able to use their own networks to compete in voice and internet access services.

The 1996 Telecom Act's focus had two goals: to open the local exchange market to competition (by stimulating facilities-based investment) and to promote expanded competition within the long-distance marketplace.

In retrospect, the focus on voice services--at a time when the internet was emerging--seems misplaced. A fundamental change in policy aiming to change voice market share was unveiled precisely at the point that voice services began a long-term decline. 

The primary goal was to provide residential customers with choice and innovation in their local voice telephone service. After nearly seven years, though choice had increased for urban customers, investment by both incumbent and competitive carriers was virtually nonexistent.

The problem is compounded by the decline of every legacy revenue stream the wholesale infrastructure is supposed to enable, with declining average revenue per user now a global trend. 

Under such conditions, wholesale prices “need” to be reduced: as retail value falls, retail prices need to decline. Whether that is possible, and to what extent, is the issue for the NBN.

Thursday, August 15, 2019

Perhaps 4% of U.S. Households Buy Gigabit Internet Access

A new study of U.S. consumer internet access behavior might shed some light on typical speed tiers purchased by customers. The monthly weighted average data consumed by subscribers in the second quarter of 2019 was 271 gigabytes, according to OpenVault.

The median monthly weighted average usage in the second quarter of 2019 (half use more, half use less) was 144.5 GB, up nearly 32 percent from 109.6 GB in the second quarter of 2018. 

About 43 percent of customers using less than 250 GB per month use services operating between 100 Mbps and 150 Mbps, according to OpenVault. 

About 15 percent buy services running between 200 Mbps and 300 Mbps. Another 19 percent purchase services running between 50 Mbps and 75 Mbps. 

Altogether, some 77 percent of U.S. households buy internet access running between 50 Mbps and 300 Mbps. 

About four percent buy gigabit services. 
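The tier shares are easy to tally. A quick Python sketch using the OpenVault percentages cited above (shares of households, as reported):

```python
# Tally of the OpenVault speed-tier shares cited above (percent of households).
tier_shares = {
    "50-75 Mbps": 19,
    "100-150 Mbps": 43,
    "200-300 Mbps": 15,
    "1 Gbps": 4,
}

mid_tiers = ("50-75 Mbps", "100-150 Mbps", "200-300 Mbps")
print(sum(tier_shares[t] for t in mid_tiers), "percent buy 50 Mbps to 300 Mbps")  # 77
print(tier_shares["1 Gbps"], "percent buy gigabit service")                       # 4
```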

Wednesday, August 14, 2019

Value Too Low, Price Too High for Linear Video Subscriptions

Almost by definition, consumers who choose not to buy a popular product such as linear subscription video have objections based on value, price, or both. “Nearly half of American cord cutters/nevers said price was the main reason they cut the cord or never connected, the highest rate among all seven countries we reviewed,” say analysts at S&P Global.

On the other hand, over-the-air television is seen as a viable way to satisfy video entertainment needs by just 10 percent of poll respondents who choose not to buy a linear subscription service. 

Is U.S. Consumer Internet Access Market Near Saturation?

If roughly 100 million locations buy fixed network broadband service, and if six percent of those accounts are purchased by business customers, then consumer broadband might be bought by about 94 million U.S. households, out of a base of perhaps 122 million to 135 million total homes. Using the 122 million figure, that implies an adoption rate of about 77 percent. 

Mobile-only households might represent as much as 20 percent of U.S. homes, or perhaps 24.4 million households. Adding those to the 94 million U.S. homes buying fixed network broadband, perhaps 118 million U.S. homes buy either mobile or fixed broadband, excluding satellite broadband customers, which might represent an additional two million accounts.

If so, then about 120 million U.S. households buy internet access from telcos, cable companies or satellite internet firms. But one also must add customers of wireless ISPs, serving perhaps four million customers. Adding those, one reaches a subscriber base of roughly 124 million homes.

In other words, we are very nearly at the point where every household that wants to buy internet access already does so.
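The arithmetic is simple enough to check in a few lines. A minimal Python sketch using the rough estimates above (every input is an estimate, so treat the output as an order-of-magnitude figure):

```python
# Order-of-magnitude check of U.S. internet access adoption, using the
# rough estimates above; none of these inputs is a precise count.
fixed_locations = 100e6   # fixed network broadband accounts
business_share = 0.06     # share of those accounts bought by businesses
total_households = 122e6  # low end of the household estimate

consumer_fixed = fixed_locations * (1 - business_share)  # ~94 million homes
adoption = consumer_fixed / total_households             # ~77 percent

mobile_only = 0.20 * total_households  # ~24.4 million mobile-only homes
satellite = 2e6                        # satellite broadband accounts
wisp = 4e6                             # wireless ISP customers

subscribers = consumer_fixed + mobile_only + satellite + wisp
print(f"Fixed-network adoption: {adoption:.0%}")                        # 77%
print(f"Total subscriber base: {subscribers / 1e6:.0f} million homes")  # ~124 million
```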

Telco Execs Say AI Already Used to Support Service, Reduce Costs

Artificial intelligence is a bit like broader elements of computing, in that all networks and business processes now rely on such capabilities, even when few executives or workers actually think much about it. 

The highest current use of AI among service providers worldwide is in service quality management (17 percent) and operational cost savings (16 percent), according to Ericsson. 

Some 53 percent of service provider executives surveyed expect that their networks will be using AI before the end of 2020.

About 55 percent of respondents believe the benefits will be evident within a year or two. The majority of service providers are at the stage of testing AI, with 48 percent focusing on AI to reduce capital expenditure. A further 41 percent are focusing on using AI for optimizing network performance, and 35 percent for new revenue streams.

The responses are based on data from 165 senior executives from 132 mobile communications service providers globally.

Are Communications Services Commodities?

Is communications a commodity? The answer matters, as commodity products tend to be priced at levels just barely above the marginal cost to provide the product. 

One can see this clearly in voice pricing, text messaging and even internet access (easier to explain in terms of cost per bit, but even absolute pricing levels have declined).

In fact, telecom product prices have a tendency to drop towards zero.

That, in turn, poses key questions for business models, especially relating to the cost of producing the products sold. The core problem with pricing at marginal cost (the cost to produce the next unit), or close to it, is that recovery of sunk costs does not happen.

As the point of depreciation is to recover the cost of replacing an asset, communications service providers must recover the cost of the underlying networks. In essence, that means pricing, longer term, at levels that allow recovery of the cost of rebuilding the network.

Strictly speaking, pricing at marginal cost does not allow that recovery of sunk network investments.  
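A stylized example makes the gap concrete. The sketch below uses entirely invented numbers (network cost, asset life, subscriber count, marginal cost), purely to illustrate the shortfall:

```python
# Hypothetical illustration: marginal-cost pricing versus the price
# needed to recover sunk network investment. All numbers are invented.
network_build_cost = 1_000_000_000  # sunk capital to build the network ($)
asset_life_years = 10               # depreciation horizon
subscribers = 1_000_000

marginal_cost = 2.00  # cost to serve one more subscriber, per month ($)

# Monthly charge per subscriber needed just to recover the build cost
# over the asset's life, before any operating costs or profit:
capital_recovery = network_build_cost / asset_life_years / 12 / subscribers

print(f"Capital recovery alone: ${capital_recovery:.2f} per month")  # $8.33
print(f"Marginal-cost price:    ${marginal_cost:.2f} per month")
# Pricing at marginal cost leaves the $8.33 of sunk cost unrecovered.
```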

In fact, one of the biggest long-term trends in the communications business is the tendency for connectivity services to constantly drop towards “zero” levels. 

That is arguably most true in the capacity (bandwidth) portions of the business, as well as for the cost of discrete computing operations, storage and many applications.

In large part, marginal cost pricing is at work. Products that are "services," and perishable, are particularly important settings for such pricing. Airline seats and hotel room stays provide clear examples.

Seats or rooms not sold are highly “perishable.” Once a flight leaves or a day passes, they can never be sold. So it can be rational to monetize those assets at almost any positive price.
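A hypothetical numeric example shows why (the fare and cost figures below are invented):

```python
# Hypothetical perishable-inventory example: any price above marginal
# cost beats letting the seat expire unsold.
marginal_cost = 5.0   # cost of serving one more passenger ($)
discount_fare = 40.0  # last-minute fare, well below average cost ($)

profit_if_empty = 0.0  # an unsold seat earns nothing after departure
profit_if_sold = discount_fare - marginal_cost

print(f"Seat left empty: ${profit_if_empty:.2f}")
print(f"Seat sold late:  ${profit_if_sold:.2f}")  # any fare above $5 adds profit
```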

Whether marginal cost pricing is “good” for traditional telecom services suppliers is an open question, as the marginal cost of supplying one more megabyte of internet access, voice or text messaging might well be very close to zero.

Such “near zero pricing” is pretty much what we see with major VoIP services such as Skype. Whether the traditional telecom business can survive such pricing is a big question.

That is hard to square with the capital intensity of building any big network, which mandates a cost quite a lot higher than “zero.”

Surplus is part of the reason for pricing at marginal cost, though. That is why it once made sense for mobile service providers to offer reduced-cost, and eventually unlimited, calling “off peak,” when the network was largely unused.

Surplus capacity caused by declining demand also applies to text messaging, where people are using alternatives. If there is plenty of capacity, offering lower prices to “fill up the pipe” makes sense. And even if most consumers do not actually use those resources, they are presented with value propositions of higher perceived value.

Video entertainment and internet access are the next products to watch. Video is more complicated, as it is an “up the stack” application, not a connectivity service. Retail pricing has to include the cost of content rights, which historically have varied based on supply issues, not demand.

Linear video already has passed its peak, while streaming alternatives are in the growth phase.

Internet access, meanwhile, is approaching saturation. That suggests more price pressure on linear video and internet access, as less demand means stranded supply, and therefore incentives to cut prices to boost sales volume.

Marketing practices also play a big part, as the economics of usage on a digital network can be quite different than on an analog network. And some competitors might have assets they can leverage in new ways.

In 1998, AT&T revolutionized the industry with its “Digital One Rate” plan, which eliminated roaming and long-distance charges, erasing the difference between “extra cost” long distance and flat-fee local calling.

Digital One Rate did not offer unlimited calling at first, but that came soon afterwards. In the near term, lots of people figured out they could use their mobiles to make all “long distance” calls, using their local lines for inbound and local calling only.

With unlimited calling, it became possible to consider abandoning landline service entirely.

At least in part, the growth of mobile subscriptions from 44 million in 1996 to 182 million by the end of 2004 is a result of the higher value of mobile services, based in part on “all distance” calling.

Mobile revenue increased by more than 750 percent, from $10.2 billion in 1993 to more than $88 billion in 2003.

During the same period, long distance revenue fell by 67 percent, from $13.0 billion to $4.3 billion.

The point is that connectivity prices and some application (voice, messaging) prices have had a tendency to drop closer to zero over time. Moore’s Law plays a part. Open source also allows lower costs, and therefore more-competitive prices.

That is why the question of whether communications products are commodities does matter. Commodity prices drop, over time, to just above marginal cost. And that implies pressure on business models.

Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...