Tuesday, November 19, 2019

Why Revenue per Bit Falls Toward Zero

The fateful decision to build all global telecom networks on the internet protocol, creating multipurpose networks, means every network now must be dimensioned (in capacity terms) to carry the most bandwidth-intensive, lowest revenue-per-bit service: entertainment video. That is almost always a service the access provider does not own, and therefore derives no direct revenue from supplying. And such video now dominates data volume on global networks.

That is but one reason why capacity prices tend, over time, to fall towards zero. Essentially, consumer service business models require low prices. The salient example is internet access service, where the internet service provider does not own the video services actually being watched.

In the U.S. market, for example, consumers might use 300 Gbytes a month, with monthly revenue being perhaps $50, implying revenue of about 17 cents per gigabyte, and less if consumption is higher.
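
As a rough sketch of the arithmetic (the usage and revenue figures are the illustrative ones above, not market data):

```python
# Illustrative revenue per gigabyte for consumer internet access,
# using the hypothetical figures above (not measured market data).
monthly_revenue_usd = 50.0
monthly_usage_gb = 300.0

print(f"${monthly_revenue_usd / monthly_usage_gb:.2f} per GB")  # ~$0.17

# On a flat-rate plan, revenue per gigabyte falls as consumption rises.
for usage_gb in (300, 400, 500):
    print(f"{usage_gb} GB/month -> ${monthly_revenue_usd / usage_gb:.3f}/GB")
```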

Even when the access provider owns a subscription video service, that service has the lowest revenue per bit, and therefore the lowest potential profit per bit, of any owned service or traffic type.

Some might argue that an owned subscription video service has revenue per bit two orders of magnitude (100 times) less than voice, for example, in part because voice and text messaging use so little bandwidth compared to video.

Text messaging has in the past had the highest revenue per bit, followed by voice services. More recently, as text messaging prices have collapsed, voice probably has the highest revenue per bit.

Subscription video always has had low revenue per bit, in large part because, as a media type, it requires so much bandwidth, while revenue is capped by consumer willingness to pay. Assume the average TV viewer has the screen turned on for five hours a day.

That works out to 150 hours a month. Assume standard definition video streaming (or broadcasting, in the analog world) consumes about one gigabyte per hour. That represents, for one person, consumption of perhaps 150 Gbytes. Assume overall household consumption of 200 Gbytes, and a monthly data cost of $50.

Bump quality levels up to high definition and bandwidth consumption easily doubles, to perhaps 300 GB.

That suggests a “cost” to watch 150 hours of video of about 33 cents per gigabyte, while retail prices are in the dollars-per-gigabyte range.
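
A minimal sketch of that video arithmetic, using the assumptions above:

```python
# Implied "cost" per gigabyte of subscription video, using the
# illustrative assumptions above.
hours_per_day = 5
days_per_month = 30
gb_per_hour_sd = 1.0          # standard definition: ~1 GB per hour
gb_per_hour_hd = 2.0          # high definition roughly doubles that
monthly_cost_usd = 50.0

hours = hours_per_day * days_per_month        # 150 hours a month
sd_gb = hours * gb_per_hour_sd                # ~150 GB
hd_gb = hours * gb_per_hour_hd                # ~300 GB

print(f"SD: ${monthly_cost_usd / sd_gb:.2f} per GB")   # ~$0.33
print(f"HD: ${monthly_cost_usd / hd_gb:.2f} per GB")   # ~$0.17
```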

Voice is something else. Assume a mobile or fixed line account represents about 350 minutes a month of usage. Assume the monthly recurring cost of having voice features on a mobile phone is about $20.

Assume data consumption for 350 minutes (5.8 hours a month) is about 21 MB per hour, or roughly 122 MB per month. That implies revenue of about $164 per consumed gigabyte. 
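
The voice arithmetic, sketched the same way (the 21 MB per hour of consumption is the assumption above):

```python
# Implied voice revenue per gigabyte, using the assumptions above.
minutes_per_month = 350
mb_per_hour = 21.0                     # assumed voice data consumption
monthly_revenue_usd = 20.0

hours = minutes_per_month / 60              # ~5.8 hours a month
gb_per_month = hours * mb_per_hour / 1000   # ~0.12 GB

print(f"Usage: {hours * mb_per_hour:.0f} MB/month")
print(f"${monthly_revenue_usd / gb_per_month:.0f} per GB")  # ~$163, the ~$164 cited above
```

Two to three orders of magnitude separate voice and video revenue per gigabyte, which is the whole point.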

The point is that implied revenue per bit varies tremendously, even if networks now are sized to handle video, not the other media types. 

Voice and messaging arguably have the highest revenue per bit profiles (perhaps as high as hundreds of dollars per gigabyte). Both are applications sold by the mobile access provider and both consume very little bandwidth.

Entertainment video subscriptions sold by access providers do generate app revenue for providers, but relatively little, on a revenue per bit basis, compared to the narrowband voice and messaging services. Lots of bandwidth is required, but revenue is flat rate, so the actual revenue per bit hinges on usage. 

Then there are the “bandwidth” products supporting internet access, where the access provider generally earns zero money from app use, and rings the register only on gigabytes purchased. 

Entertainment video arguably generates cents per gigabyte in subscription revenue. That is a business model problem, as retail prices are in the dollars per gigabyte range. 

Mobile bandwidth generally costs $5 to $8 per gigabyte (at retail prices) for popular plans, down from $9 to $10 in 2016 or so, and less for plans with higher usage allowances. Higher-usage plans might feature costs per gigabyte closer to $3.
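
Taking midpoints of those price ranges (the midpoints are assumptions), the implied annual rate of price decline is roughly:

```python
# Rough compound annual decline in mobile bandwidth retail prices,
# using assumed midpoints of the price ranges cited above.
price_2016 = 9.5    # midpoint of $9 to $10 per GB
price_2019 = 6.5    # midpoint of $5 to $8 per GB
years = 2019 - 2016

cagr = (price_2019 / price_2016) ** (1 / years) - 1
print(f"Implied annual price change: {cagr:.1%}")  # roughly -12% a year
```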


The big point is that there is a reason why bandwidth prices tend to fall towards zero: consumer willingness and ability to pay for services and apps using bandwidth is relatively low, and does not change much over time.


China, U.S. Lead Smart Speaker Shipments

This analysis of smart speaker shipments in the third quarter of 2019 illustrates the concern some have in Europe that the region is “behind” China and the United States in technology or app innovation. 

Monday, November 18, 2019

FCC Says it Will Auction C-Band Spectrum Itself

It looks like C-band spectrum will be auctioned by the Federal Communications Commission, not by the group of satellite firms that wanted to auction the spectrum themselves. “I’ve outlined four principles that the FCC must advance regarding C-band spectrum: we must (1) free up significant spectrum for 5G, (2) do so quickly, (3) generate revenue for the federal government, and (4) ensure continued delivery of services currently using the C-band,” FCC Chairman Ajit Pai said.

Of those principles, the tipping point was “generate revenue for the federal government.” The C-Band Alliance, composed of Intelsat, SES and Telesat, had proposed a private auction that would allow the satellite firms to keep the proceeds. Their argument has been that a private sale would free up new spectrum faster than a government auction, in large part by avoiding court challenges from the firms themselves.

The counterargument has been that the spectrum belongs to the public, and the public treasury should benefit from its sale.

So it appears a public auction of 280 megahertz of the C-band is coming, though court challenges are probably inevitable on the part of the C-Band Alliance. As always, there are corresponding private interests whenever any public policy is proposed. 

Digital Realty Platform Aims for Zero Gravity

Digital Realty’s new PlatformDIGITAL architecture essentially tries to ride two opposing trends: data gravity, which centralizes processing and storage, and new requirements for local processing at the edge, which counteract data gravity.

The objective is a computing infrastructure matched to business needs irrespective of data center size, scale, location, configuration or ecosystem interconnections, Digital Realty says. 

In essence, the new platform attempts to manage data gravity by creating a compute fabric that uses centralized and decentralized computing as required. 

Gravity and dark energy might now be analogies for information technology forces: concentrating and attracting on one hand, repelling and distributing on the other.

Data gravity is the notion that data becomes harder to move as its scale and volume increase. The implication is that processing and apps move to where the data resides.

As Newton’s law of gravity states that the attraction between objects is proportional to the product of their masses, so big data is said also to tend to attract applications and services.
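
For reference, Newton’s formulation, where attraction scales with the product of the masses and falls with the square of the distance:

$$ F = G \, \frac{m_1 m_2}{r^2} $$

In the analogy, the masses are data stores and applications, and the force is the pull that draws them together.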

But we might be overstating the data gravity argument, as perhaps it is the availability of affordable processing or storage at scale that attracts the data, not vice versa. And just as gravity concentrates matter in the known universe, dark energy is seen as pushing the universe to expand.

At the same time, scale and performance requirements seem also to be migrating closer to the places where apps are used, at the edge, whether for reasons of end user experience (performance), the cost of moving data, security and governance, the cost of processing, or scope effects (the ability to wring more value from the same data).

Some might call this a form of “anti-gravity” or “dark energy” or “zero gravity” at work, where processing happens not only at remote big data centers but also importantly locally, on the device, on the premises, in the metro area, distributing data stores and processing. 

"Gartner predicts that by 2022, 60 percent of enterprise IT infrastructures will focus on centers of data, rather than traditional data centers,” Digital Reality says. 

It remains to be seen how computing architecture evolves. In principle, either data gravity or zero gravity could come to dominate. In fact, some of us might argue data gravity is counterbalanced by the likely emergence of edge computing as a key trend.


Zero gravity might be said to be a scenario where processing happens so efficiently wherever it is needed that gravitational pull, no matter what causes it, is nearly zero. In other words, processing and storage grow everywhere, at the same time, at the edge and center. 

A better way of imagining the architecture might be local versus remote, as, in principle, even a hyperscale data center sits at the network edge. We seem to be heading back towards a balance of remote and local, centralized and decentralized. Big data arguably pushes toward centralization, while micro or localized functions tend to create demand for edge and local processing.

Thursday, November 14, 2019

Will 5G Really Change Video Experience?

It is not hard to find arguments that 5G will create a different and new platform for consumption of entertainment video. Precisely how remains to be seen, but there are some common hopes. 


“Greater realism” has been the main objective in the video entertainment business for some time, rivaled only by the “more choice” positioning. The whole point of high-definition television, 4K and 8K is greater realism. TV in 3D formats supposedly also results in greater realism, and has been attempted for decades, with mixed success. 


That arguably is the driver behind augmented reality and virtual reality features as well, which seem well suited to 5G, allowing an untethered experience, indoors or outside.


On the other hand, as gaming and video entertainment fuse, the advantages of 3D, augmented or virtual reality, in an untethered setting, become more interesting. Early applications might emerge around e-sports, for example. 


Some believe venues might be able to locally show many different camera angles to attendees, for example. Of course, that particular feature has been talked about for many decades, and has not yet really taken off. 


Yet others imagine that social elements might well develop, such as shared viewing by people at remote locations, with direct social interactions part of the experience. That is sort of an extrapolation of what people already do when chatting during a show. 


At a more prosaic level, 5G should support mobile video experiences even better than 4G did, though changing mostly the “screen” used and “location where content is viewed” parts of the experience. It is not so clear that the basic business model changes. 


The issue, as always, is which of these develops as a significant commercial reality, if at all. Many of the ideas are not new. But few have really gained commercial traction.

Tuesday, November 12, 2019

Network Effects Mean Telecom Always Has a "Chicken or Egg" Strategy Issue

We sometimes rhetorically ask “which came first, the chicken or the egg?” when attempting to explain how some innovation requires multiple other changes before it yields a useful new product. So it is with many potential innovations 5G might help enable, from precision farming to parking.

Nor is it an unusual problem in the connectivity or application businesses. “Chicken and egg” strategy problems occur whenever the value proposition for each of two separate groups depends on adoption and use by the other.

In other words, where network effects exist--such as for communications networks--there always is a chicken-and-egg problem. Investments precede actual use; the network and capabilities have to exist before customers can buy and use them.  So chickens come before eggs. 
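
One stylized way to quantify that network effect (Metcalfe’s law, offered here only as an illustration) is that a network’s value grows roughly as the square of its connected users, because that is how the number of possible connections grows:

$$ V \propto \frac{n(n-1)}{2} \approx \frac{n^2}{2} $$

With few users the value is negligible, which is why investment must precede adoption.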

In the applications business, this is particularly important if the new business has a winner-take-all character, as so many platform businesses seem to have. That roughly explains the strategy of investing to gain share fast, instead of investing more slowly to produce operating profits. 

Amazon, eBay, YouTube, Facebook and even the Microsoft operating system required or benefitted from network effects or scale. 

The general problem is that a critical mass of potential customers and users is required before investments make sense, but availability of the infrastructure is required before customers can use the product.

Consider connected parking, where vehicles with microservices and connectivity can pay for city parking without the use of dedicated parking meters. The whole system does not work autonomously until every car is equipped with connectivity, sensors and software, which likely means a 20-year transition period, until every non-equipped vehicle is retired. 
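
A minimal sketch of that transition arithmetic, assuming vehicles last about 20 years, the oldest vehicles retire first, and every new vehicle ships equipped (all assumptions for illustration):

```python
# Illustrative fleet-turnover arithmetic for connected parking:
# if vehicles last ~20 years and retire oldest-first, about 5% of
# the fleet is replaced by equipped vehicles each year.
fleet_life_years = 20
annual_turnover = 1 / fleet_life_years      # 5% of the fleet per year

for year in (5, 10, 15, 20):
    equipped_share = min(1.0, annual_turnover * year)
    print(f"Year {year}: {equipped_share:.0%} of the fleet equipped")
# Full coverage arrives only around year 20, when the last
# pre-connectivity vehicles retire.
```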


There are some partial workarounds, such as using smartphones to conduct the transactions, but that does not dispense with the need to support the existing infrastructure as well as the new. 

Chicken-and-egg strategies seem generally to require some period of investment before critical mass is achieved; never an easy sell to investors. Printers cannot be sold without ample access to ink refills, or consumer goods without access to shelf space and good placement. So payments to retailers to stock the products, or assure prominent shelf placement, might be necessary. 

BlackBerry, made by Research in Motion, never seemed to solve that problem of creating a robust base of app developers for its hardware. In addition, BlackBerry’s operating system never worked as well as newer operating systems optimized for mobile web interactions.

In other cases it might be possible to create scale by identifying an existing mass of creators, suppliers or products, and then building on them. Many note that Microsoft DOS profited from its compatibility with the word processing app WordStar and the spreadsheets VisiCalc and Lotus 1-2-3.

Some might note that Airbnb gained by allowing its listings to be shown on Craigslist, while Google allowed Android to be used by many handset vendors.

The point is that telecom services represent classic chicken-and-egg problems, and 5G will not be different. The network has to be in place before innovators and developers can start to figure out how to take advantage of the platform.

Monday, November 11, 2019

The Existential Threat to Telecom

Next to demand, illustrated by revenue growth, pricing arguably is the existential threat to any connectivity provider. That probably will not change in the 5G era. Though there is not yet enough evidence to discern what might happen to pricing levels as 5G is deployed, few expect too much upside in the form of higher prices per line or account, long term. 

In the near term, in some markets with the lowest use of mobile internet access, mobile internet access revenue can continue to grow. Still, the long-term challenge is how to sustain a modest amount of revenue growth over time as legacy sources atrophy.

For a business that has been driven by “connecting more people” in emerging markets, growth prospects will shift to “higher average revenue per account,” as the number of unconnected people reaches low levels. In other words, mobile service providers will have to sell greater quantities of existing products (more gigabytes of data, principally), or higher-value versions of existing products (faster speeds, bigger usage allowances, higher quality or higher value).

As revenue per unit sold continues to drop, and as new account growth stalls, service providers will have to wring more revenue out of existing accounts.


One fundamental rule I use when analyzing telecom service provider business models is to assume that half of current revenue has to be replaced every decade. One example is the change in composition of Verizon revenue between 1999 and 2013. In 1999, 82 percent of revenue was earned from the fixed network. 

By 2013, 68 percent of revenue was earned by the mobile network. The same sort of change happened with cash flow (“earnings”). In 1999, the fixed network produced 82 percent of cash flow. By 2013, mobility was producing 89 percent of cash flow. The fixed network was creating only 11 percent of cash flow. 
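
As a quick sketch, the “half every decade” rule implies a steady annual replacement rate of about seven percent:

```python
# Annual revenue-replacement rate implied by the rule of thumb that
# half of current revenue must be replaced every decade.
half_life_years = 10
annual_replacement = 1 - 0.5 ** (1 / half_life_years)
print(f"Implied annual replacement: {annual_replacement:.1%}")  # ~6.7% a year
```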


The picture at AT&T was similar. In 2000, AT&T earned 81 percent of revenue from fixed network services. By 2013, AT&T was earning 54 percent of total revenue from mobility services. 


Also, consider CenturyLink. In 2017, following its acquisition of Level 3 Communications, CenturyLink earned at least 76 percent of revenue from business customers. In the past, CenturyLink, like other rural carriers, earned most of its money from consumer accounts.

The point is that CenturyLink now is unusually positioned with respect to business revenue, earning a far greater percentage of total revenue from enterprise, small or mid-sized businesses and wholesale services, compared to other major providers. 

After the combination with Level 3, CenturyLink earns no more than 24 percent of total revenue from all consumer sources, and that contribution is likely to keep shrinking.


Cable operators have replaced legacy revenue as well. Where once video entertainment was 100 percent of revenue, Comcast now generates 63 percent of total revenue from other sources. You can see the same process at work in the mobile business. Where essentially 100 percent of revenue once came from voice, today about 80 percent of total U.S. mobile operator revenue comes from data services, according to Chetan Sharma.

That trend has been building for some time. Early on, text messaging was the driver; now mobile internet access drives growth. But saturation is coming. In the first quarter of 2017, mobile data revenues declined for the first time ever.

The big strategic issue is how revenue drivers will change over the next decade. As impossible as it seems, today’s mobility services are not likely to produce half of total revenues in a decade.

And that means the search for big new revenue sources is imperative. 

Net AI Sustainability Footprint Might be Lower, Even if Data Center Footprint is Higher

Nobody knows yet whether higher energy consumption to support artificial intelligence compute operations will ultimately be offset by lower ...