Tuesday, November 19, 2019

China, U.S. Lead Smart Speaker Shipments

This analysis of smart speaker shipments in the third quarter of 2019 illustrates the concern some have in Europe that the region is “behind” China and the United States in technology or app innovation. 





Monday, November 18, 2019

FCC Says it Will Auction C-Band Spectrum Itself

It looks like C-band spectrum will be auctioned by the Federal Communications Commission, not by the group of satellite firms that wanted to auction the spectrum themselves. “I've outlined four principles that the FCC must advance regarding C-band spectrum: we must (1) free up significant spectrum for 5G, (2) do so quickly, (3) generate revenue for the federal government, and (4) ensure continued delivery of services currently using the C-band,” said FCC Chairman Ajit Pai.

Of those clauses, the tipping point was “generate revenue for the federal government.” The C-Band Alliance, composed of Intelsat, SES and Telesat, has proposed a private auction that allows them to keep the proceeds. Their argument has been that this would free up new spectrum faster than a government auction, in large part because a government auction would likely face court challenges from the satellite firms themselves. 

The counter argument has been that the spectrum belongs to the public, and the public treasury should benefit from its sale. 

So it appears a public auction of 280 megahertz of the C-band is coming, though court challenges are probably inevitable on the part of the C-Band Alliance. As always, there are corresponding private interests whenever any public policy is proposed. 

Digital Realty Platform Aims for Zero Gravity

Digital Realty’s new PlatformDigital architecture essentially tries to ride two opposing trends: data gravity, which centralizes processing and storage, and new requirements for local processing at the edge, which counteract data gravity.  

The objective is a computing infrastructure matched to business needs irrespective of data center size, scale, location, configuration or ecosystem interconnections, Digital Realty says. 

In essence, the new platform attempts to manage data gravity by creating a compute fabric that uses centralized and decentralized computing as required. 

Gravity and dark matter (dark energy) might now be analogies for information technology forces, concentrating and attracting on one hand, repelling and distributing on the other. 

Data gravity is the notion that data becomes harder to move as its scale and volume increases. The implication is that processing and apps move to where the data resides. 
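The intuition can be made concrete with a back-of-the-envelope calculation: transfer time grows linearly with data volume, so at petabyte scale it is often faster to move the application to the data than the reverse. The link speed below is an illustrative assumption, not a measurement.

```python
# Back-of-the-envelope illustration of "data gravity": the time needed to
# move a data set grows linearly with its size, so at petabyte scale it is
# usually more practical to move the application to the data.

def transfer_days(data_terabytes: float, link_gbps: float) -> float:
    """Days to move `data_terabytes` over a link running at `link_gbps` gigabits/s."""
    bits = data_terabytes * 1e12 * 8          # terabytes -> bits
    seconds = bits / (link_gbps * 1e9)        # bits / (bits per second)
    return seconds / 86_400                   # seconds -> days

for size_tb in (1, 100, 10_000):              # 1 TB, 100 TB, 10 PB
    print(f"{size_tb:>6} TB over 10 Gbps: {transfer_days(size_tb, 10):8.2f} days")
```

At an assumed sustained 10 Gbps, one terabyte moves in minutes, but 10 petabytes takes roughly three months, which is why very large data stores tend to stay put and attract workloads instead.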

As the law of gravity states that the attraction between objects is proportional to the product of their masses, so big data is said also to attract applications and services. 

But we might be overstating the data gravity argument, as perhaps it is the availability of affordable processing or storage at scale that attracts the data, not vice versa. And just as gravity concentrates matter in the known universe, dark energy is seen as pushing the universe to expand. 

At the same time, scale and performance requirements seem also to be migrating closer to the places where apps are used, at the edge, either for reasons of end user experience (performance), the cost of moving data, security and governance reasons, the cost of processing or scope effects (ability to wring more value from the same data). 

Some might call this a form of “anti-gravity” or “dark energy” or “zero gravity” at work, where processing happens not only at remote big data centers but also importantly locally, on the device, on the premises, in the metro area, distributing data stores and processing. 

"Gartner predicts that by 2022, 60 percent of enterprise IT infrastructures will focus on centers of data, rather than traditional data centers,” Digital Reality says. 

It remains to be seen how computing architecture evolves. In principle, either data gravity or zero gravity could dominate. In fact, some of us might argue data gravity is counterbalanced by the likely emergence of edge computing as a key trend. 


Zero gravity might be said to be a scenario where processing happens so efficiently wherever it is needed that gravitational pull, no matter what causes it, is nearly zero. In other words, processing and storage grow everywhere, at the same time, at the edge and center. 

A better way of imagining the architecture might be local versus remote, as, in principle, even a hyperscale data center sits at the network edge. We seem to be heading back towards a balance of remote and local, centralized and decentralized. Big data arguably pushes to decentralization while micro or localized functions tend to create demand for edge and local processing. 

Thursday, November 14, 2019

Will 5G Really Change Video Experience?

It is not hard to find arguments that 5G will create a different and new platform for consumption of entertainment video. Precisely how remains to be seen, but there are some common hopes. 


“Greater realism” has been the main objective in the video entertainment business for some time, rivaled only by the “more choice” positioning. The whole point of high-definition television, 4K and 8K is greater realism. TV in 3D formats supposedly also results in greater realism, and has been attempted for decades, with mixed success. 


Greater realism arguably is the driver behind augmented reality and virtual reality features as well, and those features are well suited to 5G, which allows an untethered experience, indoors or outside. 


On the other hand, as gaming and video entertainment fuse, the advantages of 3D, augmented or virtual reality, in an untethered setting, become more interesting. Early applications might emerge around e-sports, for example. 


Some believe venues might be able to locally show many different camera angles to attendees, for example. Of course, that particular feature has been talked about for many decades, and has not yet really taken off. 


Yet others imagine that social elements might well develop, such as shared viewing by people at remote locations, with direct social interactions part of the experience. That is sort of an extrapolation of what people already do when chatting during a show. 


At a more prosaic level, 5G should support mobile video experiences even better than 4G did, though it changes mostly the “screen” used and the “location where content is viewed” parts of the experience. It is not so clear that the basic business model changes. 


The issue, as always, is which of these develops as a significant commercial reality, if at all. Many of the ideas are not new. But few have really gained commercial traction.

Tuesday, November 12, 2019

Network Effects Mean Telecom Always Has a "Chicken or Egg" Strategy Issue

We sometimes rhetorically ask “which came first, the chicken or the egg?” when attempting to explain how some innovation develops when multiple changes are required to create a useful new product. So it is with many potential innovations 5G might help enable, from precision farming to parking.

Nor is it an unusual problem in the connectivity or application businesses. “Chicken and egg” strategy problems occur whenever the value proposition for two separate groups is dependent on adoption and use by the other. 

In other words, where network effects exist--such as for communications networks--there always is a chicken-and-egg problem. Investments precede actual use; the network and capabilities have to exist before customers can buy and use them.  So chickens come before eggs. 

In the applications business, this is particularly important if the new business has a winner-take-all character, as so many platform businesses seem to have. That roughly explains the strategy of investing to gain share fast, instead of investing more slowly to produce operating profits. 

Amazon, eBay, YouTube, Facebook and even the Microsoft operating system required or benefitted from network effects or scale. 
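The winner-take-all logic can be sketched with the common Metcalfe's-law approximation, in which a network's value scales with the number of possible pairwise connections among users. This is a stylized model, not a claim about any particular company's economics.

```python
# Metcalfe's-law sketch of why share matters in network businesses: value
# is modeled as proportional to the number of distinct pairwise connections,
# n*(n-1)/2, so value grows roughly with the square of the user base while
# cost tends to grow roughly linearly.

def pairwise_connections(n_users: int) -> int:
    """Number of distinct user-to-user connections in a network of n users."""
    return n_users * (n_users - 1) // 2

# Doubling the user base roughly quadruples the modeled network value:
small = pairwise_connections(1_000)
large = pairwise_connections(2_000)
print(large / small)
```

Under that assumption, spending heavily to double the user base early can quadruple the modeled value of the network, which roughly explains the grow-share-first strategy described above.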

The general problem is that a critical mass of potential customers and users is required before investments make sense, but availability of the infrastructure is required before customers can use the product. 

Consider connected parking, where vehicles with microservices and connectivity can pay for city parking without the use of dedicated parking meters. The whole system does not work autonomously until every car is equipped with connectivity, sensors and software, which likely means a 20-year transition period, until every non-equipped vehicle is retired. 
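The 20-year transition claim can be illustrated with a toy fleet-turnover model. The model assumes a uniformly aged fleet, a fixed 20-year vehicle lifetime, and that only newly sold vehicles ship with the needed equipment; all of those are simplifying assumptions.

```python
# Toy fleet-turnover model for the connected-parking example: if only new
# vehicles are equipped, and every vehicle is retired at a fixed age, how
# much of the fleet is equipped after a given number of years?
# Assumes a uniformly aged fleet and a 20-year vehicle lifetime.

def equipped_share(years_elapsed: float, vehicle_lifetime: float = 20.0) -> float:
    """Share of the fleet equipped after `years_elapsed` years, given that
    retired vehicles are always replaced with equipped ones."""
    return min(years_elapsed / vehicle_lifetime, 1.0)

for year in (5, 10, 20):
    print(f"after {year} years: {equipped_share(year):.0%} of fleet equipped")
```

Halfway through the transition only half the fleet can participate, which is why a fully autonomous system has to wait for the last unequipped vehicle to be retired.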


There are some partial workarounds, such as using smartphones to conduct the transactions, but that does not dispense with the need to support the existing infrastructure as well as the new. 

Chicken-and-egg strategies seem generally to require some period of investment before critical mass is achieved; never an easy sell to investors. Printers cannot be sold without ample access to ink refills, or consumer goods without access to shelf space and good placement. So payments to retailers to stock the products, or assure prominent shelf placement, might be necessary. 

BlackBerry maker Research in Motion never seemed to solve that problem of creating a robust base of app developers for its hardware. In addition, the BlackBerry operating system never worked as well as newer operating systems optimized for mobile web interactions. 

In other cases it might be possible to create scale by identifying an existing mass of creators, suppliers or products, and then building on them. Many note that Microsoft DOS profited from its compatibility with the word processing app WordStar and the spreadsheets VisiCalc and Lotus 1-2-3. 

Some might note that Airbnb gained by allowing its listings to be shown on Craigslist, while Google allowed Android to be used by many handset vendors. 

The point is that telecom services represent classic chicken-and-egg problems, and 5G will not be different. The network has to be in place before innovators and developers can start to figure out how to take advantage of the platform.

Monday, November 11, 2019

The Existential Threat to Telecom

Next to demand, illustrated by revenue growth, pricing arguably is the existential threat to any connectivity provider. That probably will not change in the 5G era. Though there is not yet enough evidence to discern what might happen to pricing levels as 5G is deployed, few expect too much upside in the form of higher prices per line or account, long term. 

In the near term, in markets where mobile internet adoption is lowest, mobile internet access revenue can continue to grow. Still, the long-term challenge is how to sustain a modest amount of revenue growth over time as legacy sources atrophy. 

For a business that has been driven by “connecting more people” in emerging markets, growth prospects will shift to “higher average revenue per account,” as the number of unconnected people reaches low levels. In other words, mobile service providers will have to sell greater quantities of existing products (more gigabytes of data, principally), or higher-value versions of existing products (faster speeds, bigger usage allowances, higher quality or higher value).

As revenue per unit sold continues to drop, and as new account growth stalls, service providers will have to wring more revenue out of existing accounts.


One fundamental rule I use when analyzing telecom service provider business models is to assume that half of current revenue has to be replaced every decade. One example is the change in composition of Verizon revenue between 1999 and 2013. In 1999, 82 percent of revenue was earned from the fixed network. 

By 2013, 68 percent of revenue was earned by the mobile network. The same sort of change happened with cash flow (“earnings”). In 1999, the fixed network produced 82 percent of cash flow. By 2013, mobility was producing 89 percent of cash flow. The fixed network was creating only 11 percent of cash flow. 
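The "half of current revenue replaced every decade" rule implies a compound annual decay rate for legacy revenue, which the sketch below makes explicit. The flat-total-revenue assumption is mine, added only to show how much new revenue must be found each year.

```python
# If legacy revenue halves over a decade, the implied compound annual
# decline rate r satisfies (1 - r)**10 == 0.5.

legacy_decay = 1 - 0.5 ** (1 / 10)
print(f"implied annual legacy revenue decline: {legacy_decay:.1%}")  # about 6.7%

# Assuming flat total revenue, new sources must offset the loss each year.
# Start with 100 units of legacy revenue and track the replacement needed.
revenue_legacy, revenue_new = 100.0, 0.0
for year in range(1, 11):
    lost = revenue_legacy * legacy_decay
    revenue_legacy -= lost
    revenue_new += lost          # new products must replace what legacy loses
print(round(revenue_legacy), round(revenue_new))
```

A roughly 6.7 percent annual erosion does not sound dramatic in any single year, which is why the cumulative need to replace half of all revenue in a decade is easy to underestimate.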


The picture at AT&T was similar. In 2000, AT&T earned 81 percent of revenue from fixed network services. By 2013, AT&T was earning 54 percent of total revenue from mobility services. 


Also, consider CenturyLink. In 2017 (assuming the acquisition of Level 3 Communications is approved), CenturyLink will earn at least 76 percent of revenue from business customers. In the past, CenturyLink, like other rural carriers, earned most of its money from consumer accounts.

The point is that CenturyLink now is unusually positioned with respect to business revenue, earning a far greater percentage of total revenue from enterprise, small or mid-sized businesses and wholesale services, compared to other major providers. 

After the combination with Level 3, CenturyLink will earn no more than 24 percent of total revenue from all consumer sources, and that contribution is likely to keep shrinking. 


Cable operators have replaced revenue sources as well. Where once video entertainment was 100 percent of revenue, Comcast now generates 63 percent of total revenue from other sources. You can see the same process at work in the mobile business. Where 100 percent of revenues essentially came from voice, today about 80 percent of total U.S. mobile operator revenues come from data services, according to Chetan Sharma. 

That trend has been building for some time. Early on, text messaging was the driver; now mobile internet access drives growth. But saturation is coming. In the first quarter of 2017, mobile data revenues declined for the first time ever.  



The big strategic issue is how revenue drivers will change over the next decade. As impossible as it seems, today’s mobility services are not likely to produce half of total revenues in a decade.

And that means the search for big new revenue sources is imperative. 

Tuesday, November 5, 2019

Is Mass Market Virtual Reality a 10-Year or 20-Year Issue?

Virtual reality with a resolution equivalent to 4K TV is presently said to require data speeds of about 1 Gbps for smooth play or 2.5 Gbps for interactive sessions, both with a maximum round-trip latency of 10 milliseconds.
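Those cited bit rates translate into striking data volumes per session, as a quick calculation shows.

```python
# Data volume implied by the cited VR bit rates: at 1 Gbps (smooth play)
# or 2.5 Gbps (interactive), an hour-long session moves hundreds of
# gigabytes, illustrating why 5G by itself is not seen as sufficient.

def gigabytes_per_hour(gbps: float) -> float:
    """Gigabytes transferred in one hour at a sustained rate of `gbps` gigabits/s."""
    return gbps * 3600 / 8   # gigabits/s * seconds in an hour / 8 bits per byte

print(gigabytes_per_hour(1.0))   # 450.0 GB per hour for smooth play
print(gigabytes_per_hour(2.5))   # 1125.0 GB per hour for interactive sessions
```

Sustaining hundreds of gigabytes per hour per user, with tight latency bounds, is what pushes the architecture toward edge computing rather than distant data centers.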

That is why many believe 5G--by itself--will not enable such new use cases. Edge computing and artificial intelligence are required as well. 


The need for synchronizing all those elements, plus advances on the device side, might mean that much of the actual commercial upside does not happen within the next decade, but takes longer. 

As frustrating as that might be, 3G and 4G have tended to show that important and novel use cases take more than 10 years to develop. 

So 5G, edge computing and artificial intelligence, plus advances in devices, applications and monetization models, are intrinsically related when it comes to enabling widespread VR and AR use cases.

Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...