Tuesday, November 12, 2019

Network Effects Mean Telecom Always Has a "Chicken or Egg" Strategy Issue

We sometimes rhetorically ask “which came first, the chicken or the egg?” when attempting to explain how some change requires multiple other changes before a useful new product emerges. So it is with many potential innovations 5G might help enable, from precision farming to parking.

Nor does this seem an unusual problem in the connectivity or application businesses. “Chicken and egg” strategy problems occur whenever the value proposition for two separate groups is dependent on adoption and use by the other.

In other words, where network effects exist--such as for communications networks--there always is a chicken-and-egg problem. Investments precede actual use; the network and capabilities have to exist before customers can buy and use them.  So chickens come before eggs. 

In the applications business, this is particularly important if the new business has a winner-take-all character, as so many platform businesses seem to have. That roughly explains the strategy of investing to gain share fast, instead of investing more slowly to produce operating profits. 

Amazon, eBay, YouTube, Facebook and even the Microsoft operating system required or benefitted from network effects or scale. 

The general problem is that a critical mass of potential customers and users is required before investments make sense, but availability of the infrastructure is required before customers can use the product.
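
To make the critical mass point concrete, consider a minimal sketch, assuming a Metcalfe-style rule in which network value tracks the number of possible user-to-user connections (the network sizes below are arbitrary, purely for illustration):

# Illustrative sketch only: assumes network value roughly tracks the number
# of possible user-to-user connections, so a small network offers little
# value per user until adoption approaches critical mass.

def potential_connections(users: int) -> int:
    """Unique user-to-user links possible in a network of a given size."""
    return users * (users - 1) // 2

for users in (10, 100, 1_000, 10_000):
    links = potential_connections(users)
    print(f"{users:>6} users -> {links:>12,} possible connections "
          f"({links / users:,.1f} per user)")

Value per user stays trivial at small scale, which is exactly why investment has to precede adoption.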

Consider connected parking, where vehicles with microservices and connectivity can pay for city parking without the use of dedicated parking meters. The whole system does not work autonomously until every car is equipped with connectivity, sensors and software, which likely means a 20-year transition period until every non-equipped vehicle is retired.
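
A rough fleet-turnover sketch suggests why the transition is measured in decades; the fleet size, annual sales figure and assumption that every new vehicle is fully equipped are hypothetical, not drawn from any actual registry data:

# Hypothetical figures: a 250-million-vehicle fleet, 15 million new vehicles
# sold per year, every new vehicle fully equipped, oldest vehicles retired first.
FLEET_SIZE = 250_000_000
ANNUAL_SALES = 15_000_000

equipped = 0
years = 0
while equipped < FLEET_SIZE:
    equipped = min(FLEET_SIZE, equipped + ANNUAL_SALES)
    years += 1

print(f"Roughly {years} years until the entire fleet is equipped")

Under those assumptions the answer is about 17 years, in the same range as the 20-year figure above.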


There are some partial workarounds, such as using smartphones to conduct the transactions, but that does not dispense with the need to support the existing infrastructure as well as the new. 

Chicken-and-egg strategies generally seem to require some period of investment before critical mass is achieved, which is never an easy sell to investors. Printers cannot be sold without ample access to ink refills, nor consumer goods without access to shelf space and good placement. So payments to retailers to stock the products, or to assure prominent shelf placement, might be necessary.

Research in Motion’s BlackBerry never seemed to solve the problem of creating a robust base of app developers for its hardware. In addition, BlackBerry’s operating system never worked as well as newer operating systems optimized for mobile web interactions.

In other cases it might be possible to create scale by identifying an existing mass of creators, suppliers or products, and then building on them. Many note that Microsoft DOS profited from its compatibility with the word processing app WordStar and the spreadsheets VisiCalc and Lotus 1-2-3.

Some might note that Airbnb gained by allowing its listings to be shown on Craigslist, while Google allowed Android to be used by many handset vendors.

The point is that telecom services represent classic chicken-and-egg problems, and 5G will not be different. The network has to be in place before innovators and developers can start to figure out how to take advantage of the platform.

Monday, November 11, 2019

The Existential Threat to Telecom

Next to demand, illustrated by revenue growth, pricing arguably is the existential threat to any connectivity provider. That probably will not change in the 5G era. Though there is not yet enough evidence to discern what might happen to pricing levels as 5G is deployed, few expect much long-term upside in the form of higher prices per line or account.

In the near term, in some markets with the lowest use of mobile internet access, mobile internet access revenue can continue to grow. Still, the long-term challenge is how to sustain a modest amount of revenue growth over time as legacy sources atrophy.

For a business that has been driven by “connecting more people” in emerging markets, growth prospects will shift to “higher average revenue per account,” as the number of unconnected people reaches low levels. In other words, mobile service providers will have to sell greater quantities of existing products (more gigabytes of data, principally), or higher-value versions of existing products (faster speeds, bigger usage allowances, higher quality or higher value).

As revenue per unit sold continues to drop, and as new account growth stalls, service providers will have to wring more revenue out of existing accounts.


One fundamental rule I use when analyzing telecom service provider business models is to assume that half of current revenue has to be replaced every decade. One example is the change in composition of Verizon revenue between 1999 and 2013. In 1999, 82 percent of revenue was earned from the fixed network. 

By 2013, 68 percent of revenue was earned by the mobile network. The same sort of change happened with cash flow (“earnings”). In 1999, the fixed network produced 82 percent of cash flow. By 2013, mobility was producing 89 percent of cash flow. The fixed network was creating only 11 percent of cash flow. 
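
As a minimal sketch of that rule of thumb (the starting revenue figure is arbitrary), if legacy sources shed half their revenue over ten years, new sources have to make up the difference just to hold the top line flat:

# Arbitrary starting point: $100 billion in annual revenue, with legacy
# sources assumed to lose half of it over the coming decade.
total_revenue = 100.0                       # $ billions per year
legacy_after_decade = total_revenue * 0.5   # the "half replaced" assumption

new_revenue_needed = total_revenue - legacy_after_decade
print(f"New sources must supply ${new_revenue_needed:.0f} billion per year "
      "within a decade just to keep total revenue flat")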


The picture at AT&T was similar. In 2000, AT&T earned 81 percent of revenue from fixed network services. By 2013, AT&T was earning 54 percent of total revenue from mobility services. 


Also, consider CenturyLink. In 2017 (assuming the acquisition of Level 3 Communications is approved), CenturyLink will earn at least 76 percent of revenue from business customers. In the past, CenturyLink, like other rural carriers, earned most of its money from consumer accounts.

The point is that CenturyLink now is unusually positioned with respect to business revenue, earning a far greater percentage of total revenue from enterprise, small or mid-sized businesses and wholesale services, compared to other major providers. 

After the combination with Level 3, CenturyLink will earn no more than 24 percent of total revenue from all consumer sources, and that contribution is likely to keep shrinking. 


Cable operators have shifted revenue sources as well. Where once video entertainment was 100 percent of revenue, Comcast now generates 63 percent of total revenue from other sources. You can see the same process at work in the mobile business. Where essentially 100 percent of revenue once came from voice, today about 80 percent of total U.S. mobile operator revenue comes from data services, according to Chetan Sharma.

That trend has been building for some time. Early on, text messaging was the driver; now mobile internet access drives growth. But saturation is coming: in the first quarter of 2017, mobile data revenues declined for the first time ever.



The big strategic issue is how revenue drivers will change over the next decade. As impossible as it seems, today’s mobility services are not likely to produce half of total revenues in a decade.

And that means the search for big new revenue sources is imperative. 

Tuesday, November 5, 2019

Is Mass Market Virtual Reality a 10-Year or 20-Year Issue?

Virtual reality with a resolution equivalent to 4K TV is presently said to require data speeds of about 1 Gbps for smooth play or 2.5 Gbps for interactive sessions, both with round-trip latency of no more than 10 milliseconds.
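
As a simple illustration of those thresholds, a connection can be checked against them; the requirement constants below restate the figures cited above, while the sample connection is hypothetical:

# Thresholds restate the figures cited above; the sample connection is hypothetical.
VR_4K_SMOOTH_MBPS = 1_000        # ~1 Gbps for smooth play
VR_4K_INTERACTIVE_MBPS = 2_500   # ~2.5 Gbps for interactive sessions
MAX_RTT_MS = 10                  # round-trip latency ceiling

def supports_vr(downlink_mbps: float, rtt_ms: float, interactive: bool) -> bool:
    """Check a connection against the cited 4K-equivalent VR thresholds."""
    needed = VR_4K_INTERACTIVE_MBPS if interactive else VR_4K_SMOOTH_MBPS
    return downlink_mbps >= needed and rtt_ms <= MAX_RTT_MS

# A hypothetical early 5G connection: 600 Mbps down, 25 ms round trip.
print(supports_vr(600, 25, interactive=False))   # False: fails on both counts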

That is why many believe 5G, by itself, will not enable such new use cases. Edge computing and artificial intelligence are required as well.


The need for synchronizing all those elements, plus advances on the device side, might mean that much of the actual commercial upside does not happen within the next decade, but takes longer. 

As frustrating as that might be, 3G and 4G have tended to show that important and novel use cases take more than 10 years to develop. 

So 5G, edge computing and artificial intelligence, plus advances in devices, applications and monetization models, are intrinsically related when it comes to enabling widespread VR and AR use cases.

New Space Emerges

The “space” industry now includes many private and commercial initiatives directly related to communications, such as constellations of low earth orbit satellites, but also many that are not, such as space tourism, private launch capability, small satellites and small-satellite launch, imagery, and asteroid or planetary mining.

One key difference between “new space” and “old space” is the emphasis on private and commercial ventures not directly run by, or for, governments. 

Monday, November 4, 2019

Will Businesses Abandon Use of Structured Wiring for Voice and UCC?

Some might argue that enterprises might stop wiring new buildings for phone service, just as many now use Wi-Fi instead of structured Ethernet systems for data connections. “With no more need to use desk phones, IT will stop wiring business buildings for phone systems to save on costs,” argues Jeff Ton, InterVision SVP.

There is some evidence for that belief. PwC in the United Kingdom has shifted all 18,000 staff across 24 offices to use mobile phones only.  

Of course, PwC operates differently than many large enterprises, as many of its staff frequently work out of the office, allowing PwC to use a temporary desk approach to office space.

According to Ofcom, in 2010 more than 10 million U.K. businesses were using fixed-line phones. By about 2018 that number had fallen to just 6.4 million.

By some estimates, about eight percent of U.S. businesses rely solely on mobile phones. 

These days, it is hard to separate use of business phone systems from the rest of the unified communications application business, though. And overall business spending on UCC still seems to be growing. 


Still, growth of UCC applications does not directly speak to use of structured telephone wiring to support the devices and software, which might be supplied using Wi-Fi, Ethernet wiring systems, telephone wiring or other wireless means. 


For the most part, it still seems reasonable to argue that most enterprises, and most businesses, will continue to use structured telephone wiring systems, even if a growing percentage of firms might abandon use of such fixed interior wiring.

Sunday, November 3, 2019

China Launches 5G

Chinese 5G prices are said to range from 128 yuan (about US $18) to 599 yuan (about $85). China Unicom's 129-yuan service plan reportedly comes with a 30-gigabyte data cap, 500 minutes of voice talk and a 500-Mbps speed cap.

China Unicom’s 599-yuan plan allows 300 gigabytes of data and 3,000 minutes of voice talk, with speed at 1 Gbps.

China’s gross domestic product, adjusted for purchasing power parity (a method of comparing the cost of the same product across countries), is higher than that of the United States, so PPP prices stated in U.S. currency are, using that method, roughly comparable to U.S. prices. 

In 2016, for example, the cost of mobile service was about 0.63 percent of gross national income per person, compared to the U.S. figure of 0.77 percent of GNI per capita, according to the International Telecommunications Union. 
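
The affordability metric behind those percentages is simply annual spending on the service as a share of gross national income per capita. A minimal sketch, with placeholder inputs rather than the ITU's actual basket prices:

def share_of_gni(monthly_price_usd: float, gni_per_capita_usd: float) -> float:
    """Annual cost of a plan as a percentage of GNI per capita."""
    return monthly_price_usd * 12 / gni_per_capita_usd * 100

# Placeholder inputs, purely for illustration.
print(f"{share_of_gni(18, 16_000):.2f}% of GNI per capita")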

As always, it matters which particular plans are compared, whether they are prepaid or postpaid, and whether one compares the lowest-priced, mid-priced or most-costly plans. Usage allowances also matter, as do buying preferences in each country, as well as the popularity of bundled service plans that can obscure or lower actual prices paid.

India’s mobile data, for example, is the cheapest globally.


Saturday, November 2, 2019

Broadband Cannot Help Job Creation that Much

Fully 77 percent of new jobs created in the United States between 2007 and 2017 happened in just 25 metro areas, according to the McKinsey Global Institute. That includes the 25 cities classified as megacities, high-growth hubs and urban peripheries of those areas. 

As praiseworthy as creating high-quality broadband access services in the other areas might be, it is unlikely to materially change the job creation engine. 



Using AI to Reduce Small Cell Cost as Much as 4X

Mobile service providers are finding lots of ways to keep 5G infrastructure costs down, including using artificial intelligence to improve siting of small cells, thereby reducing the amount of small cell investment. 

Small cell capex can be as much as four times higher when the small cells are not located within about 20 meters to 40 meters (65 feet to 131 feet) of demand hotspots, according to a new white paper by the Small Cell Forum. So operators are starting to use AI to improve cell siting. 

“Where small cell placement was off by as little as half a cell radius from a given hotspot, the result – according to one operator – was that four times as many small cells were required to carry the same traffic,” the Small Cell Forum says.

Location accuracy is even more challenging when higher frequency bands are used, reducing the practical serving radius of the cells.


“A well-placed small cell has its serving area covering locations with high demand,” the paper states. “A poorly located small cell is too far away from the hotspot; its serving area does not cover an area of high demand.” And that efficiency is directly related to the number of small cells that need to be deployed, and therefore capital investment. 

To maximize small cell return on investment, and minimize capex overall, small cell siting should be within 20 meters to 40 meters of the ideal theoretical placement (real estate considerations matter, in that regard).
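
As an illustrative sketch of the geometric intuition, not the Small Cell Forum's model, the fraction of a circular demand hotspot that falls within a cell's serving area drops quickly as the cell is offset from the hotspot center; all radii and offsets below are hypothetical:

import math

def overlap_fraction(cell_radius: float, hotspot_radius: float, offset: float) -> float:
    """Fraction of a circular hotspot covered by a circular cell whose center
    is 'offset' meters away. Pure geometry, purely illustrative; this is not
    the Small Cell Forum's model."""
    R, r, d = cell_radius, hotspot_radius, offset
    if d >= R + r:
        return 0.0                                   # no overlap at all
    if d <= abs(R - r):
        return 1.0 if R >= r else (R / r) ** 2       # one circle inside the other
    # Standard circle-circle intersection area
    part1 = R**2 * math.acos((d**2 + R**2 - r**2) / (2 * d * R))
    part2 = r**2 * math.acos((d**2 + r**2 - R**2) / (2 * d * r))
    part3 = 0.5 * math.sqrt((-d + r + R) * (d + r - R) * (d - r + R) * (d + r + R))
    return (part1 + part2 - part3) / (math.pi * r**2)

# A hypothetical 100-meter cell serving a 60-meter hotspot, at growing offsets.
for offset in (0, 40, 80, 120):
    print(f"offset {offset:>3} m -> {overlap_fraction(100, 60, offset):.0%} of hotspot covered")

The further off-center the cell, the more of the hotspot's demand spills onto neighboring cells, which is the capital-investment multiplier the Forum describes.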

Friday, November 1, 2019

No Quick Revenue Uplift from 5G?

Many observers are hoping for a relatively quick uptick in firm productivity and capabilities driven by 5G, edge computing, internet of things and artificial intelligence. Fewer likely believe 5G will positively affect gross revenues and profits from consumer mobility services. Patience is the watchword.

It often takes much longer to reap technology benefits than observers expect. Researchers call this the productivity paradox. Quite often, big new information technology projects or technologies fail to produce the expected gains, for longish periods of time, such as a decade to three decades. 

In fact, some argue that a productivity boost between 1995 and 2000 was not enabled by information technology. But it also is likely the case that better information technology allows some firms to take market share from other firms, though overall productivity might not shift much, if at all. 

Even though knowledge of electricity was widespread in the late 1800s, electric power technology did not significantly spread until around 1914 to 1917 in the United States. In fact, factories did not fully utilize electricity until the 1920s. 

It took decades before electricity was being used in a productive manner: about 40 years, as it turns out.

To note just one example, much of the current economic impact of “better computing and communications” is what many would have expected at the turn of the century, before the “dot com” meltdown. Amazon, cloud computing in general, Uber, Airbnb and the shift of internet activity to mobile use cases in general provide examples.

But that lag was more than 15 years in coming. Nor is that unusual. Many would note that similar lags occurred when enterprises invested in information technology in the 1980s and 1990s.

Investments do not always immediately translate into effective productivity results. This productivity paradox was apparent for much of the 1980s and 1990s, when one might have struggled to identify clear evidence of productivity gains from a rather massive investment in information technology.

In fact, diffusion of a new technology takes time precisely because people need time to learn how to use the technology, while organizations must retool the ways they work to incorporate the better technologies most productively. 

Computing power in the U.S. economy increased by more than two orders of magnitude between 1970 and 1990, for example, yet productivity, especially in the service sector, stagnated.

And though it seems counter-intuitive, some argue the Internet has not clearly affected economy-wide productivity. But part of the problem is that it is impossible to capture productivity gains using our normal measuring tools when products or services have zero price. And much of the consumer internet is based on zero pricing. 

Other traditional measures of growth also suffer when technology arguably improves efficiency and productivity (more produced by the same--or fewer--humans). Look only at the print content business, where revenues, profits and employment have plummeted in the internet era, even as the volume of content of all sorts has increased exponentially. 

Or consider the impact of software on information technology. The Bureau of Labor Statistics estimates that employment in information technology was lower in 2018 than it was in 1998, despite the obvious increase in software-intensive life and business.

Gartner, for example, recently said that enterprises will have to wait twice as long as they expect to reap incremental revenue from technology investments.

Through 2021, incremental revenue from digital transformation initiatives is largely unlikely, Gartner researchers predict. That will not come as good news for executives hoping for revenue growth from repositioning existing business practices for digital delivery and operation. 

On average, it will “take large traditional enterprises twice as long and cost twice as much as anticipated,” Gartner researchers predict. 

When, and How Will "Digital Transformation" Show Up in Statistics?

Despite a massive wave of capital investment now underway to “digitize” most aspects of business, it is a fair question to ask how long it will take before tangible financial benefits are reaped, beyond a shift of market share from some suppliers to others. That will be quite tangible, and will show up in gross revenue changes. 

There are several problems. First, there is almost always a long lag between major waves of technology investment and tangible changes in productivity. Also, digital transformation can cannibalize a firm’s revenue. 

“In a recent survey we conducted, companies with digital transformations under way said that 17 percent of their market share from core products or services was cannibalized by their own digital products or services,” according to researchers at McKinsey. 

The point is that heavy information technology spending to “digitalize the enterprise” might not show especially tangible benefits in productivity, incremental new revenues and new products for quite some time.

What will happen is that firms will slow the rate of market share shift from attackers, while some attackers will gain market share. While that might not alter long-term productivity or growth rates in an economy as a whole or within an industry, it will tangibly affect gross revenue, profit margins and market share within an industry, amongst competitors. 

The telecom analogy is the business impact of switching to fiber-to-the-home or other access network platforms. At least in competitive markets, where telcos were facing competition from cable TV operators in every core service, the decision to invest in FTTH was actually not driven so much by an expectation of overall increased revenue or market share, but fundamentally by the effort to retain overall revenues in the face of share loss.

Basically, and colloquially, the advantage of FTTH was strategic: “you get to keep your business.” The logic was that new video subscription revenues would offset lost voice market share, while FTTH would allow telcos to keep pace with cable TV operators in internet access speeds.

That might seem like an awfully expensive proposition. Investing heavily to “stay where one is” is not a traditionally sound investment principle. But lost market share really does matter as well, and many new “digital enterprise” investments arguably are of that nature: invest to limit attacker market share gains.

That is not to say there will be no non-quantifiable gains for legacy or established providers. Better customer satisfaction, lower operating costs, better marketing platforms and other effects hard to capture on financial reports are possible. But the impact visible on financial reports might, in the near and medium term (several years to 10 years), mostly only be captured in a negative sense (market share not lost; market share not lost as fast), rather than in a positive sense (market share gained). 

The impact for attackers might arguably be different: market share taken from existing competitors. That noted, eventually, we are going to see value in the traditional metrics of productivity growth. 

Non-traditional measures, though, likely are needed to capture the benefits of innovations and value with a zero price, or reflecting quality improvements impossible to capture with price metrics. 

Traditional metrics do not capture increases in well-being and consumer welfare provided by zero-price quality improvements or zero-price products that often serve as substitutes for positive-price products.

Yes, Follow the Data. Even if it Does Not Fit Your Agenda

When people argue we need to “follow the science” that should be true in all cases, not only in cases where the data fits one’s political pr...