Tuesday, November 19, 2019

U.S. Streaming Video Customers Consume 520 GBytes Per Month

OpenVault reports that U.S. consumers of streaming video now consume more than half a terabyte of data per month. Average consumption by cord-cutter subscribers was 520.8 GB, an increase of seven percent in the third quarter of 2019 alone.

Average U.S. subscriber usage on fixed networks was about 275.1 GB in the third quarter, a year-over-year increase of 21 percent.

During the same period, the median monthly weighted average usage increased nearly 25 percent from 118.2 GB to 147.4 GB, indicating that consumption is increasing across the market as a whole.

Some believe increased usage is a business opportunity for retail internet access providers. Others are not so sure. Usage once mattered greatly for telecom service provider revenues, as most revenue (and most of the profit) was generated by products billed “by the unit.” 

Capital investment was fairly easy to model, and profits from incremental investment likewise were easy to predict.

All that has changed: as all observers will acknowledge, usage (data consumption) of communications networks is not related in a linear way to revenue or profit. And that fact has huge implications for business models, as virtually all communication networks are recast as video transport and delivery networks, whatever other media types are carried.

Video will represent something on the order of 75 percent of total mobile network traffic in 2021, Cisco predicts. Globally, IP video traffic will be 82 percent of all consumer internet traffic by 2021, up from 73 percent in 2016, Cisco also says.

The basic problem is that entertainment video generates the absolute lowest revenue per bit, and entertainment video will dominate usage on all networks. Conversely, while narrowband services generate the highest revenue per bit, the “value” of narrowband services, expressed as retail price per bit, keeps falling, and usage is declining in mature markets.

Some even argue that cost per bit exceeds revenue per bit, a long-term recipe for failure. That has been cited as a key problem for emerging market mobile providers, where retail prices must be quite low, requiring costs of perhaps 0.01 cents per megabyte.

Of course, we have to avoid thinking in a linear way. Better technology, new architectures, huge new increases in mobile spectrum, shared spectrum, dynamic use of licensed spectrum and unlicensed spectrum all will change revenue per bit metrics.

Yet others argue that revenue per application now is what counts, not revenue per bit or cost per bit. In other words, as with products sold in a grocery store, each particular product might have a different profit margin, and what really matters is overall sales and overall profit, not the specific margin on each product sold.

So the basic business problem for network service providers is that their networks now must be built to support low revenue per bit services. That has key implications for the amount of capital that can be spent on networks to support the expected number of customers, average revenue per account and the amount of stranded assets.


Why Revenue per Bit Falls Toward Zero

The fateful decision to build all global telecom networks on the internet protocol, creating multipurpose networks, essentially means every network now has to be dimensioned (in terms of capacity) to carry the most bandwidth-intensive, lowest revenue-per-bit service, namely entertainment video, almost always a service the access provider does not own and therefore derives no direct revenue from supplying. And such video now dominates data volume on global networks.

That is but one reason why capacity prices tend, over time, to fall towards zero. Essentially, consumer service business models require low prices. The salient example is internet access service, where the internet service provider does not own the actual video services being watched.

In the U.S. market, for example, consumers might use 300 GBytes a month, with monthly revenue being perhaps $50, implying revenue of roughly 17 cents per gigabyte, and less if consumption is higher.
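The arithmetic is simple enough to sketch; the figures below are the round assumptions above, not measured data:

```python
# Back-of-the-envelope revenue per gigabyte for U.S. fixed internet access,
# using the illustrative round numbers above (assumptions, not measurements).

monthly_revenue_usd = 50.0   # assumed monthly internet access bill
monthly_usage_gb = 300.0     # assumed monthly household consumption

print(f"${monthly_revenue_usd / monthly_usage_gb:.2f} per GB")  # ~$0.17

# At a flat rate, higher consumption only dilutes revenue per gigabyte:
for usage_gb in (300, 400, 520):
    print(f"{usage_gb} GB/month -> ${monthly_revenue_usd / usage_gb:.2f}/GB")
```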

Even if the access provider owns a subscription video service, that service has the absolute lowest revenue per bit, and therefore the lowest potential profit per bit, of any traffic type.

Some might argue that an owned subscription video service has revenue per bit two orders of magnitude (100 times) less than voice, for example, in part because voice and text messaging use so little bandwidth, compared to video.

Text messaging has in the past had the highest revenue per bit, followed by voice services. More recently, as text messaging prices have collapsed, voice probably has the highest revenue per bit.

Subscription video always has had low revenue per bit, in large part because, as a media type, it requires so much bandwidth, while revenue is capped by consumer willingness to pay. Assume the average TV viewer has the screen turned on for five hours a day.

That works out to 150 hours a month. Assume an hour of standard definition video streaming (or broadcasting, in the analog world) consumes about one gigabyte. That represents, for one person, consumption of perhaps 150 GBytes. Assume overall household consumption of 200 GBytes, and a monthly data cost of $50.

Bump quality levels up to high definition and one easily can double the bandwidth consumption, up to perhaps 300 GB.  

That suggests a “cost” to watch 150 hours of video of about 33 cents per gigabyte, while retail prices are in the dollars per gigabyte range.
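Worked out in code, with the same assumed round numbers (illustrative figures, not measurements), the per-gigabyte arithmetic looks like this:

```python
# Implied cost per gigabyte of subscription video viewing, using the
# assumptions above (illustrative round numbers, not measurements).

hours_per_day = 5
days_per_month = 30
gb_per_hour_sd = 1.0   # standard definition, ~1 GB per streamed hour
gb_per_hour_hd = 2.0   # high definition roughly doubles consumption

viewing_hours = hours_per_day * days_per_month   # 150 hours per month
monthly_data_cost_usd = 50.0

for label, gb_per_hour in (("SD", gb_per_hour_sd), ("HD", gb_per_hour_hd)):
    usage_gb = viewing_hours * gb_per_hour
    print(f"{label}: {usage_gb:.0f} GB -> ${monthly_data_cost_usd / usage_gb:.2f}/GB")
# SD: 150 GB -> $0.33/GB; HD: 300 GB -> $0.17/GB
```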

Voice is something else. Assume a mobile or fixed line account represents about 350 minutes a month of usage. Assume the monthly recurring cost of having voice features on a mobile phone is about $20.

Assume data consumption for 350 minutes (5.8 hours a month) is about 21 MB per hour, or roughly 122 MB per month. That implies revenue of about $164 per consumed gigabyte. 
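The same arithmetic for voice, again using the assumed figures above:

```python
# Implied revenue per gigabyte for voice, using the assumptions above.

minutes_per_month = 350
mb_per_hour = 21.0                 # assumed voice bandwidth consumption
monthly_voice_revenue_usd = 20.0

usage_gb = (minutes_per_month / 60) * mb_per_hour / 1000   # ~0.12 GB
print(f"Usage: {usage_gb * 1000:.0f} MB/month")            # ~122 MB
print(f"Revenue: ${monthly_voice_revenue_usd / usage_gb:.0f} per GB")
# ~$163 per GB, consistent with the rough $164 figure above
```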

The point is that implied revenue per bit varies tremendously, even if networks now are sized to handle video, not the other media types. 

Voice and messaging arguably have the highest revenue per bit profiles (perhaps as high as hundreds of dollars per gigabyte). Both are applications sold by the mobile access provider, and both consume very little bandwidth.

Entertainment video subscriptions sold by access providers do generate app revenue for providers, but relatively little, on a revenue per bit basis, compared to the narrowband voice and messaging services. Lots of bandwidth is required, but revenue is flat rate, so the actual revenue per bit hinges on usage. 

Then there are the “bandwidth” products supporting internet access, where the access provider generally earns zero money from app use, and rings the register only on gigabytes purchased. 

Entertainment video arguably generates cents per gigabyte in subscription revenue. That is a business model problem, as retail prices are in the dollars per gigabyte range. 

Mobile bandwidth generally costs $5 to $8 per gigabyte (retail prices), lower than the $9 to $10 per gigabyte it cost in 2016 or so for popular plans, and less than that for plans containing higher usage allowances. Higher-usage plans might feature costs per gigabyte closer to $3.
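Pulling the section’s examples together makes the spread vivid. All figures below are the illustrative assumptions used above; the mobile plan is a hypothetical $40-for-8-GB offer, consistent with the $5 to $8 per gigabyte range:

```python
# Implied revenue per gigabyte across service types, using the
# illustrative figures from this section (not market data).

services = {
    "voice": (20.0, 0.122),                    # $/month, GB/month
    "subscription video": (50.0, 150.0),
    "fixed internet access": (50.0, 300.0),
    "mobile data (hypothetical plan)": (40.0, 8.0),
}

for name, (revenue, gb) in sorted(services.items(),
                                  key=lambda kv: kv[1][0] / kv[1][1],
                                  reverse=True):
    print(f"{name:32s} ${revenue / gb:8.2f} per GB")
```

The spread covers roughly two to three orders of magnitude, which is the heart of the revenue-per-bit problem.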


The big point is that there is a reason why bandwidth prices tend to fall towards zero: consumer willingness and ability to pay for services and apps using bandwidth is relatively low, and does not change much over time.


China, U.S. Lead Smart Speaker Shipments

This analysis of smart speaker shipments in the third quarter of 2019 illustrates the concern some have in Europe that the region is “behind” China and the United States in technology or app innovation. 





Monday, November 18, 2019

FCC Says it Will Auction C-Band Spectrum Itself

It looks like C-band spectrum will be auctioned by the Federal Communications Commission, not by the group of satellite firms that wants to auction the spectrum itself. “I’ve outlined four principles that the FCC must advance regarding C-band spectrum: we must (1) free up significant spectrum for 5G, (2) do so quickly, (3) generate revenue for the federal government, and (4) ensure continued delivery of services currently using the C-band,” FCC Chairman Ajit Pai said.

Of those clauses, the tipping point was “generate revenue for the federal government.” The C-Band Alliance, composed of Intelsat, SES and Telesat, has proposed a private auction that would allow its members to keep the proceeds. Its argument has been that a private sale would free up new spectrum faster than a government auction, in large part because it would avoid court challenges by the firms themselves.

The counter argument has been that the spectrum belongs to the public, and the public treasury should benefit from its sale. 

So it appears a public auction of 280 megahertz of the C-band is coming, though court challenges by the C-Band Alliance firms are probably inevitable. As always, there are corresponding private interests whenever any public policy is proposed.

Digital Realty Platform Aims for Zero Gravity

Digital Realty’s new PlatformDigital architecture essentially tries to ride two opposing trends: data gravity, which centralizes processing and storage, and new requirements for local processing at the edge, which counteract data gravity.

The objective is a computing infrastructure matched to business needs irrespective of data center size, scale, location, configuration or ecosystem interconnections, Digital Realty says. 

In essence, the new platform attempts to manage data gravity by creating a compute fabric that uses centralized and decentralized computing as required. 

Gravity and dark energy might now be analogies for information technology forces: concentrating and attracting on one hand, repelling and distributing on the other.

Data gravity is the notion that data becomes harder to move as its scale and volume increases. The implication is that processing and apps move to where the data resides. 

As Newton’s law of gravitation states that the attraction between objects is directly proportional to the product of their masses, so big data is said also to tend to attract applications and services.
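For reference, the physical law the analogy borrows from, in which attraction grows with the product of the masses and falls off with the square of the distance:

```latex
% Newton's law of universal gravitation (the source of the analogy)
F = G \, \frac{m_1 m_2}{r^{2}}
```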

But we might be overstating the data gravity argument, as perhaps it is the availability of affordable processing or storage at scale that attracts the data, not vice versa. And while gravity concentrates matter in the known universe, dark energy is seen as pushing the universe to expand.

At the same time, scale and performance requirements seem also to be migrating closer to the places where apps are used, at the edge, whether for end user experience (performance), the cost of moving data, security and governance, the cost of processing, or scope effects (the ability to wring more value from the same data).

Some might call this a form of “anti-gravity” or “dark energy” or “zero gravity” at work, where processing happens not only at remote big data centers but also importantly locally, on the device, on the premises, in the metro area, distributing data stores and processing. 

"Gartner predicts that by 2022, 60 percent of enterprise IT infrastructures will focus on centers of data, rather than traditional data centers,” Digital Reality says. 

It remains to be seen how computing architecture evolves. In principle, either data gravity or zero gravity could come to dominate. In fact, some of us might argue data gravity will be counterbalanced by the likely emergence of edge computing.


Zero gravity might be said to be a scenario where processing happens so efficiently wherever it is needed that gravitational pull, no matter what causes it, is nearly zero. In other words, processing and storage grow everywhere, at the same time, at the edge and center. 

A better way of imagining the architecture might be local versus remote, as, in principle, even a hyperscale data center sits at the network edge. We seem to be heading back towards a balance of remote and local, centralized and decentralized. Big data arguably pulls toward centralization, while micro or localized functions tend to create demand for edge and local processing.

Thursday, November 14, 2019

Will 5G Really Change Video Experience?

It is not hard to find arguments that 5G will create a different and new platform for consumption of entertainment video. Precisely how remains to be seen, but there are some common hopes. 


“Greater realism” has been the main objective in the video entertainment business for some time, rivaled only by the “more choice” positioning. The whole point of high-definition television, 4K and 8K is greater realism. TV in 3D formats supposedly also results in greater realism, and has been attempted for decades, with mixed success. 


That arguably is the driver behind augmented reality and virtual reality features as well, both well suited to 5G, which allows an untethered experience, indoors or out.


On the other hand, as gaming and video entertainment fuse, the advantages of 3D, augmented or virtual reality, in an untethered setting, become more interesting. Early applications might emerge around e-sports, for example. 


Some believe venues might be able to locally show many different camera angles to attendees, for example. Of course, that particular feature has been talked about for many decades, and has not yet really taken off. 


Yet others imagine that social elements might well develop, such as shared viewing by people at remote locations, with direct social interactions part of the experience. That is sort of an extrapolation of what people already do when chatting during a show. 


At a more prosaic level, 5G should support mobile video experiences even better than 4G did, though changing mostly the “screen” used and “location where content is viewed” parts of the experience. It is not so clear that the basic business model changes. 


The issue, as always, is which of these develops as a significant commercial reality, if at all. Many of the ideas are not new. But few have really gained commercial traction.

Tuesday, November 12, 2019

Network Effects Mean Telecom Always Has a "Chicken or Egg" Strategy Issue

We sometimes rhetorically ask “which came first, the chicken or the egg?” when attempting to explain how some innovation requiring multiple complementary changes becomes a useful new product. So it is with many potential innovations 5G might help enable, from precision farming to parking.

Nor is this an unusual or uncommon problem in the connectivity or application businesses. “Chicken and egg” strategy problems occur whenever the value proposition for two separate groups depends on adoption and use by the other.

In other words, where network effects exist--such as for communications networks--there always is a chicken-and-egg problem. Investments precede actual use; the network and capabilities have to exist before customers can buy and use them.  So chickens come before eggs. 

In the applications business, this is particularly important if the new business has a winner-take-all character, as so many platform businesses seem to have. That roughly explains the strategy of investing to gain share fast, instead of investing more slowly to produce operating profits. 

Amazon, eBay, YouTube, Facebook and even the Microsoft operating system required or benefitted from network effects or scale. 

The general problem is that a critical mass of potential customers and users is required before investments make sense, but availability of the infrastructure is required before customers can use the product.
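A toy model makes that gap concrete. This is an illustrative sketch, not anything from the article: assume each subscriber’s monthly value grows with the number of other subscribers reachable, with an assumed value per reachable peer and an assumed monthly price.

```python
# Toy network-effects model: each subscriber's monthly value grows with
# the number of other subscribers (all values here are assumptions).

def user_value(subscribers: int, value_per_peer: float = 0.10) -> float:
    """Monthly value one user derives from reaching the other subscribers."""
    return value_per_peer * (subscribers - 1)

monthly_price = 20.0

for n in (50, 100, 201, 500, 1000):
    value = user_value(n)
    verdict = "worth it" if value >= monthly_price else "not worth it"
    print(f"{n:5d} subscribers -> ${value:6.2f}/month of value ({verdict})")

# Below ~201 subscribers, value < price: the chicken-and-egg gap the
# network builder must fund before adoption becomes self-sustaining.
```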

Consider connected parking, where vehicles equipped with microservices and connectivity can pay for city parking without the use of dedicated parking meters. The whole system does not work autonomously until every car is equipped with connectivity, sensors and software, which likely means a 20-year transition period, until every non-equipped vehicle is retired.
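The 20-year figure is essentially fleet-replacement arithmetic. A minimal sketch, assuming every new vehicle sold is equipped, a 20-year service life and a uniform fleet age mix (all assumed figures, not from the article):

```python
# If every new vehicle sold from now on is connectivity-equipped, the
# unequipped share shrinks only as old vehicles age out of the fleet.
# Assumes a 20-year service life and a uniform fleet age distribution.

vehicle_lifetime_years = 20

for year in range(0, 25, 5):
    # after `year` years, the oldest cohorts have retired; the surviving
    # unequipped share is (lifetime - year) / lifetime
    unequipped = max(0, (vehicle_lifetime_years - year) / vehicle_lifetime_years)
    print(f"year {year:2d}: {(1 - unequipped) * 100:3.0f}% of fleet equipped")
# year 20: 100% equipped -- hence a roughly 20-year transition period
```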


There are some partial workarounds, such as using smartphones to conduct the transactions, but that does not dispense with the need to support the existing infrastructure as well as the new. 

Chicken-and-egg strategies seem generally to require some period of investment before critical mass is achieved; never an easy sell to investors. Printers cannot be sold without ample access to ink refills, or consumer goods without access to shelf space and good placement. So payments to retailers to stock the products, or assure prominent shelf placement, might be necessary. 

Research in Motion’s Blackberry never seemed to solve that problem of creating a robust base of app developers for its hardware. In addition, Blackberry’s operating system never worked as well as newer operating systems optimized for mobile web interactions.

In other cases it might be possible to build scale by identifying an existing mass of creators, suppliers or products, and then building on them. Many note that Microsoft DOS profited from its compatibility with the word processing app WordStar and the spreadsheets VisiCalc and Lotus 1-2-3.

Some might note that Airbnb gained by allowing its listings to be shown on Craigslist, while Google allowed Android to be used by many handset vendors.

The point is that telecom services represent classic chicken-and-egg problems, and 5G will not be different. The network has to be in place before innovators and developers can start to figure out how to take advantage of the platform.

DIY and Licensed GenAI Patterns Will Continue

As always with software, firms are going to opt for a mix of "do it yourself" owned technology and licensed third party offerings....