Friday, January 31, 2020

What Will Telecom Look Like in 20 years?

Few, if any, executives or managers ever really look out two decades to shape today’s business decisions. It simply is not rational to do so. Futurists, often wrong as much as 80 percent of the time, must do so.

“The insurance, transportation, and retail industries will either not exist in 20 years or will have changed completely due to artificial intelligence, innovation, and other factors,” according to Dave Jordan, global head, consulting and services integration at Tata Consultancy Services.

With the caveat that the odds of being substantially correct are perhaps lower than 20 percent, what might today’s communications business look like in 20 more years? 

The TCS analysis works something like this: auto insurance will not be needed as much when autonomous vehicles sharply reduce accidents, when fewer people own their own personal vehicles, and when 3D printing allows quicker and cheaper repairs to vehicles.

3D printing will enable so much personalization and customization that “mass market retailing” is unnecessary, TCS suggests. 

Applying the same sort of logic to the telecommunications industry, at least directionally, is not so hard. Just as software, firmware or communications functions often are embedded into the use of a product, so larger parts of the “connectivity function” are likely to be subsumed into other products.

As tires are part of the value of a new car, and broadcast transmission was embedded in the consumption of over-the-air TV, so a growing part of the value of tomorrow’s products might include embedded communications, purchased as part of some other product.

As the cost of public Wi-Fi (and the cost of the WAN that connects it) is embedded in the cost of goods sold by retailers, and as the value of hotel room Wi-Fi (and its WAN access) is embedded in the room rate, so the cost of connectivity might increasingly be bundled with the cost of other products (safety, transportation, content, devices).

AT&T, Comcast and Verizon Collectively Generate about $212 Per Home Passed, Annually

It is not easy to run a big fixed network business these days. As Verizon CEO Hans Vestberg said on Verizon’s fourth quarter earnings call, Verizon faces a “secular decline in wireline business that is continuing.” 

Secular means a trend that is not seasonal, not cyclical, not short term in nature. For multi-product companies such as AT&T, Verizon and Comcast, it can be argued that "everything other than the core business is doing a lot worse than the core business, both at Comcast and at AT&T and at Verizon."

One supposes the “core business” for AT&T and Verizon is mobility, while the core business for Comcast is fixed network broadband. The conclusion analyst Craig Moffett of MoffettNathanson reaches is that AT&T, for example, will have to be broken up. 

The suggestion to focus on the “core business” often produces financial returns when conglomerates are broken up. 

What might not be so clear is how breaking up triple play assets, or separating mobile from fixed assets necessarily helps the surviving connectivity assets to generate greater revenue and profits. 

Is it logical to assume that the AT&T and Verizon businesses would all do better if the fixed network assets, mobile assets and media assets were separated? Would Comcast’s financial returns be better if the content assets were separated from the fixed network, or the video entertainment business separated from the network connectivity business?

Given the “secular decline” of the fixed network business, could a fixed services only approach (internet access, voice and perhaps video entertainment) actually work, at the scale the separated Comcast, AT&T or Verizon assets would represent?

The issue is not whether a small firm, with a light cost structure, might be able to sustain itself in some markets selling internet access alone, or internet plus voice. The issue is whether an independent AT&T fixed network or an independent Verizon fixed network business could sustain itself. 

The answers arguably are tougher than they were twenty years ago, when a telco and a cable company faced each other with a suite of services including internet access, voice and entertainment video. Basically, they traded market share, at best. Telcos ceded voice share, but cable lost some video share, and both competed for internet access accounts.

At a high level, the strategy was that both firms would trade share, but by selling three services on one network, instead of one service on each network, the numbers would still be workable.

But the math gets harder when every one of those three services faces sustained declining demand and falling prices. 

That being the case, it is hard to see how a sustainable business can be built on connectivity services alone, especially for either AT&T or Verizon. Perhaps Comcast could survive with a strong position in internet access and smaller contributions from voice and possibly video entertainment. 

In the fourth quarter of 2019, Comcast Cable generated $14.8 billion in revenue.  Total revenue that quarter was $28.4 billion. 

Verizon’s fixed network business, on the other hand, generated about $7 billion, out of total revenue of nearly $35 billion. 

AT&T had fourth quarter 2019 total revenue of nearly $47 billion. AT&T’s fixed network, plus satellite TV, generated about $18 billion in revenue, or about 38 percent of the total. Perhaps $8 billion or so of that revenue comes from the satellite operations, so the fixed network business might generate $10 billion in revenue.

Comcast Cable passes 58 million consumer and business locations. Comcast has 26.4 million residential high-speed internet customers, 20.3 million residential video customers and 9.9 million voice accounts, generating average cash flow (EBITDA) of $63 per unit. 

At a high level, the problem is that Verizon’s entire fixed network operation generates about 20 percent of total revenue. AT&T’s fixed network generates perhaps 21 percent of revenue. Comcast, which has a small mobile operation, generates close to $15 billion from the fixed network. 

And that, it seems to me, illustrates the problem. Comcast, AT&T and Verizon all put together generate about $32 billion in fixed network revenue, and revenue is likely to remain flat to negative. 

Verizon homes passed might number 27 million. Comcast has about 57 million homes passed (locations it can actually sell service to).

AT&T’s fixed network represents perhaps 62 million U.S. homes passed. 

CenturyLink never reports its homes-passed figures, but likely has 20 million or so consumer locations it can market services to.

Looking only at Comcast, AT&T and Verizon, $32 billion in annual fixed network revenue is generated by networks passing about 146 million U.S. homes. That works out to about $212 per home passed, per year. 
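That arithmetic can be sketched in a couple of lines, using the rounded figures above; because the inputs are rounded, the result lands near, rather than exactly on, the quoted per-home figure.

```python
# Back-of-the-envelope check of revenue per home passed, using the
# rounded figures cited in the text (approximations, not exact inputs).
fixed_network_revenue = 32e9   # combined annual fixed network revenue, dollars
homes_passed = 146e6           # combined U.S. homes passed

per_home = fixed_network_revenue / homes_passed
print(f"${per_home:.0f} per home passed, per year")  # roughly $219 with these inputs
```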

Whether that is sustainable is the clear challenge.

Thursday, January 30, 2020

Analysys Mason 2020 Predictions

Is Private 5G a Threat to Mobile Operator Revenue?

Some believe private 5G or private 4G networks are an elephant in the room, a big potential threat to public network revenue models. Others see an opportunity to supply enterprises with private networks. 

A couple of proven models might clarify the potential upside and downside. Generally speaking, private networks have not historically been a threat to service provider revenue models. The examples include both cabled local area networks and Wi-Fi. In each case, the public network terminates at the side of the building and the internal network is owned and operated by the occupants. 

The LAN business always has been separate from the telecommunications business, and has either stimulated or been neutral in terms of revenue. 

To be sure, an obvious potential business model might have a telco operating a services integration business, building, operating and maintaining either cabled LANs or Wi-Fi networks on behalf of enterprise clients. This has proven difficult, both for telcos and a few firms that have tried to build a business doing so. 

Boingo, arguably the largest third-party supplier of enterprise venue Wi-Fi and neutral host mobile access in the U.S. market, has total annual revenue in the range of $275 million. As significant as that might be for many firms, it indicates a total addressable market simply too small to support a tier-one telco effort. 

In fact, in recent years, Boingo revenue growth has shifted to supplying distributed antenna system access to venues for mobile service providers. Basically, Boingo supplied the indoor or premises radio network for mobile phone service. 

Another example is the private branch exchange (enterprise telephony) business. Telcos historically have preferred not to operate in this segment of the business, as gross revenue and profit margins are close to non-existent. Instead, ecosystem partners including system integrators and interconnect firms have occupied this niche in the market.

The enterprise PBX market has not been large enough, or profitable enough, for the typical telco to pursue. 

Network slicing provides a new wrinkle, however. In principle, a private 5G enterprise network could in turn use a network slice for WAN connectivity. That still leaves the issue of which entity owns and operates the premises network, however. In principle, a network slice is simply another way the enterprise buys a connectivity service, while maintaining its own private local network.

The big takeaway might be that private network markets are not large enough for most telcos to pursue. The costs of building and operating a Wi-Fi network, enterprise telephony or indoor mobile network are not prohibitive for enterprises. So the opportunity for managed services might not be so large for any would-be third party suppliers. 

It remains to be seen whether private 4G or private 5G networks could break from those prior models. But suppliers will have to explore the possibilities. So Ericsson and Capgemini have partnered to explore their opportunities in the private 4G and private 5G network area. 

Telia in Sweden is the first service provider to join Capgemini and Ericsson in looking for commercial projects in the Scandinavian market.

In terms of business models, Ericsson might hope to sell infrastructure and software. Capgemini might look to provide consulting, implementation and operation. Telia might seek mostly to garner the access revenues.

But note the possible roles: private 4G or private 5G can be undertaken directly by an enterprise, or might be outsourced to a third party. The issue is whether the third parties might include telcos operating in the system integrator role, or whether that function will, as past patterns suggest, mostly be an opportunity for third parties. 

Beyond that, there is the question of how big the market opportunity might be for third parties. History might suggest the opportunity for telcos is limited, while the upside for third party integrators ultimately also is somewhat limited. 

Most large enterprises ultimately find that the cost of using a managed service provider exceeds the cost of building and operating a private local network. At low volume, a managed service often is more affordable. Those advantages often disappear at volume, however. That is why many enterprises still find they save money by operating their own LANs and PBXes. 

The takeaway might be that private 4G and private 5G will ultimately not prove to be disruptive for mobile service providers, even if significant private network activity occurs.

Wednesday, January 29, 2020

NTT Global Data Centers Execs on What is Happening in Data Center Market

Joe Goldsmith, NTT Global Data Centers, Americas Chief Revenue Officer, and Steve Manos, NTT Global Data Centers, Americas Vice President of Global Accounts, talk about what they see happening in the data center market and what NTT is doing to meet customer requirements.

Tuesday, January 28, 2020

How Many Streaming Services Will Consumers Ultimately Buy?

Virtually nobody believes Netflix will be unaffected by new competition coming from Disney+, Peacock, Apple and HBO Max, as well as existing competition from Amazon Prime and Hulu. So far, Disney+ has gained about 20 million accounts in about a quarter.

Still, Netflix penetration remained steady throughout 2019, while the penetration rates for both Hulu and Amazon Prime grew about six percent, according to MoffettNathanson and HarrisX.

Some 90 percent of Disney+ customers appear to buy Netflix as well. At least so far, it appears that Disney+ is complementary to Netflix, not cannibalizing it.

It is of course possible that market share could start to shift more significantly when HBO Max launches in May 2020 and Peacock launches in July. The issue is how many different services customers will be willing to buy. 

At the moment, that number seems to hover between two and three subscriptions in some studies; between three and four subscriptions per household in others.

So the issue is whether the market saturates at five or more subscriptions per household. 

That might seem a stretch, but some households, looking at the landscape and at which services have “must-see” content, already estimate they might need to buy five different subscriptions.

The other issue is that Netflix is the first global service. The others arguably will compete primarily in the U.S. market, to start. 

What Will 5G Mean for the Rest of the Ecosystem?

What might 5G mean for all those in the ecosystem aside from mobile service providers? What will change, and what possibly could happen in the rest of the ecosystem, ranging from chips to apps; use cases to business models? How does 5G bring those changes? Where is the upside and the downside?

Dean Bubley, Director, Disruptive Analysis, United Kingdom 
John Ghirardelli, Director, U.S. Innovation, American Tower Corporation, USA 
Ramy Katrib, CEO & Founder, DigitalFilm Tree, USA 
Yang Yang, Co-Director, and Professor, School of Information Science and Technology, SHIFT, and ShanghaiTech University, People’s Republic of China 
Gary Kim, Consultant, IP Carrier, USA 

Are Wi-Fi Routers Dangerous to Your Health?

As long as I can remember, there have been periodic and generally low-level concerns about non-ionizing radiation--the type of energy radio signals represent. By non-ionizing, we mean that the signals are not capable of dislodging electrons from atoms or molecules, as x-rays or gamma rays are. Ionizing radiation, in high doses, is carcinogenic, though useful in controlled low doses.

Non-ionizing radiation can cause tissue heating, as you can experience with food in a microwave oven. The health concerns about non-ionizing radiation come from the potential long term exposure. As with any form of natural radiation (sunlight, for example), the key is exposure levels. 

The key thing about non-ionizing radiation is that it is found, in real-world communications cases, at very low power levels. Also, signal power falls off rapidly with distance.

An illustration: a Wi-Fi router’s power levels drop rapidly with distance. Power levels drop by more than half in the first two meters. Once people are about four meters from the router, signal levels have dropped from milliwatts to microwatts, about three orders of magnitude (1,000 times).

Some people are concerned about power emitted from mobile cell towers. Keep in mind that mobile radios on cell towers have power levels that decay just as Wi-Fi signals do. Some liken the power level of a mobile radio on a tower to that of a light bulb.

Radio signal power density weakens (attenuates) with the square of distance, so, measured in logarithmic terms (powers of 10), power levels decay quite rapidly.

Basically, doubling the distance of a receiver from a transmitter means that the power density at the new location is about one quarter of its previous value. Just three meters from the antenna, a cell tower radio’s power density has dropped by an order of magnitude (10 times).

At 10 meters--perhaps at the base of the tower--power density is down two orders of magnitude. At 500 meters, a distance at which a human actually might be using the signals, power density has dropped by more than five orders of magnitude.
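A minimal sketch of that decay, for an ideal point source in free space, expressed in decibels; the one-meter reference distance is an illustrative assumption, not a measured figure.

```python
import math

# Free-space inverse-square decay of radiated power density, relative
# to an assumed one-meter reference distance. A 10 dB drop corresponds
# to one order of magnitude in power.
def power_drop_db(distance_m, reference_m=1.0):
    return 20 * math.log10(distance_m / reference_m)

for d in (2, 3, 10, 500):
    drop = power_drop_db(d)
    print(f"{d:>4} m: down {drop:4.1f} dB ({drop / 10:.1f} orders of magnitude)")
```

On these assumptions, the drop is about one order of magnitude at three meters, two orders at ten meters, and more than five orders at 500 meters; real antennas add directivity and ground effects on top of this.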

Though there is no scientific evidence that such low levels of non-ionizing radiation actually have health effects, such as causing cancer, a prudent human will limit the amount of exposure, just as one takes the prudent step of wearing a seat belt in an automobile, minimizing time spent in the sun and so forth.

Would It Have Made a Difference If Telcos Had Stuck with ATM?

It is no longer a question, but there was a time not so long ago when global telcos argued for asynchronous transfer mode (broadband ISDN) as the next-generation protocol, rather than TCP/IP.  

A report issued by Ovum in 1999 argued that “telecommunications providers are expected to reinvent themselves as operators of packet-switched communications networks by 2005.”

“Growth in Internet Protocol (IP) services is expected to fuel the transition,” Ovum argued.

Of course, all that was before the internet basically recreated the business, making connectivity providers a part of the broader ecosystem of applications, services and devices requiring internet connectivity to function. 

In retrospect, it might seem obvious that the shift of all media types (voice, video, image, text) to digital formats would make TCP/IP a rational choice. But that was not the case in the telecommunications industry from the 1980s to the first decade of the 21st century. Telco engineers argued that ATM was the better choice to handle all media types. 

But the internet, cheap bandwidth and cheap computing all were key forces changing the economics and desirability of IP, compared to ATM. 

Once internet apps became mass market activities, network priorities shifted from “voice optimized” to “data optimized.”

Connection-oriented protocols historically were favored by wide area network managers, while connectionless protocols were favored by local area network managers. The relative cost of bandwidth drove much of the decision making.

WAN bandwidth was relatively expensive, LAN bandwidth was not. That meant the overhead associated with connectionless protocols such as TCP/IP did not matter. On WANs, packet overhead mattered more, so lower header overhead was an advantage. 
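The trade-off can be made concrete with standard header sizes: every 53-byte ATM cell carries a 5-byte header (the so-called cell tax), while a TCP/IPv4 packet carries about 40 bytes of headers regardless of payload size. A rough sketch (the payload sizes chosen here are illustrative, not from the text):

```python
# Per-unit header overhead: ATM's fixed "cell tax" versus TCP/IPv4
# headers amortized over different payload sizes.
def overhead(header_bytes, total_bytes):
    return header_bytes / total_bytes

atm_cell_tax = overhead(5, 53)          # every ATM cell, always
ip_bulk      = overhead(40, 1500)       # large data packet
ip_small     = overhead(40, 40 + 160)   # small (e.g. voice) payload

print(f"ATM cell tax:          {atm_cell_tax:.1%}")  # ~9.4%
print(f"IP, 1460-byte payload: {ip_bulk:.1%}")       # ~2.7%
print(f"IP, 160-byte payload:  {ip_small:.1%}")      # ~20%
```

The point is that neither approach wins outright: ATM pays a constant tax of roughly nine percent, while IP’s overhead depends on packet size, which mattered when WAN bandwidth was expensive.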

It is by no means clear that the choice of a connectionless transmission system, instead of the connection-oriented ATM, would have changed the strategic position of the connectivity provider part of the internet ecosystem. Indeed, one key argument for IP was simply cost: IP devices and network elements were much cheaper than ATM-capable devices. 

One might argue the global telecom industry simply had no choice but to go with IP, no matter what its historic preferences might have been.

Monday, January 27, 2020

Applications and Use Cases Are the Big 6G Challenge

The whole point of any access network is to send and receive user and device data as quickly as possible, as affordably as possible, to and from the core network and all the computing resources attached to it. The future 6G network, no less than the new 5G network, is likely to feature advancements of that type.

Bandwidth will be higher, network ability to support unlimited numbers of devices and sensors will be greater, latency will be even lower and the distance between edge devices and users and computing resources will shrink.

The biggest unknowns are use cases, applications and revenue models, as has been true since 3G. The best analogy is gigabit fixed network internet access. Users often can buy service running at speeds up to a gigabit per second, yet most customers presently have use cases requiring only far lower speeds.

So it is likely that 6G, as 5G does, often will feature capabilities that initially exceed consumer use cases.

NTT Docomo has released a white paper with an initial vision of what 6G will entail. Since every mobile generation since 2G has increased speeds and lowered latency, while connection density grew dramatically between 4G and 5G, we might look first to those metrics for change.

Docomo suggests peak data rates of 100 Gbps, latency under one millisecond and device connection density of 10 million devices in each square kilometer would be design goals. 

Along with continued progress on the coverage dimension, 6G standards might extend to space, sky and sea communications as well. Docomo also believes quality-of-service mechanisms exceeding “five nines” and device performance improvements (devices that need no charging, cheaper devices) would be parts of the standard.

Looking at commercial, economic or social impact, since the 3G era we have tended to see a lag of execution compared to expectations. In other words, many proposed 3G use cases did not emerge until 4G. Some might say a few key 4G use cases will not flourish until 5G is well underway.

For that reason, we might also discover that many proposed 5G innovations will not actually become typical until the 6G era. Autonomous vehicles are likely to provide an example. 

So Docomo focuses on 6G outcomes instead of network performance metrics alone. Docomo talks about “solving social problems” as much as about “every place on the ground, sky, and sea” having communications capability. Likewise, 6G might be expected to support the cyber-physical dimensions of experience.

Also, 5G is the first mobile platform to include key support for machine-to-machine communication, instead of primarily focusing on ways to improve communication between humans. Docomo believes 6G will deepen that trend. 

It is worth noting that the 5G spec for the air interface entails availability higher than the traditional telecom standard of “five nines” (availability of 99.999 percent). 5G networks are designed to run at “six nines.” So 6G might well run at up to “seven nines” (99.99999 percent availability). 

The legacy telecom standard of five nines meant outages or service unavailability of 5.26 minutes a year. The 5G standard equates to less than 32 seconds of network unavailability each year. A seven nines standard means 3.16 seconds of unavailability each year. 
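Those downtime figures follow directly from the availability percentages, assuming a 365.25-day year, which matches the numbers quoted above:

```python
# Annual unavailability implied by an availability target expressed in
# "nines", assuming a 365.25-day year (525,960 minutes).
SECONDS_PER_YEAR = 365.25 * 24 * 60 * 60

def downtime_seconds_per_year(nines):
    unavailability = 10 ** (-nines)   # e.g. five nines -> 1e-5
    return unavailability * SECONDS_PER_YEAR

for n in (5, 6, 7):
    print(f'"{n} nines": {downtime_seconds_per_year(n):7.2f} seconds per year')
# five nines  -> ~315.58 s (5.26 minutes)
# six nines   -> ~31.56 s
# seven nines -> ~3.16 s
```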

Some might say 4G was the first of the digital era platforms to design in support for internet of things (machines and computers talking to machines and computers) instead of the more traditional human mobile phone user. That trend is likely to be extended in the 6G era, with more design support for applications and use cases, with artificial intelligence support being a key design goal as well. 

In part, that shift to applications and use cases matters because squeezing further raw performance out of the network is becoming less important than finding new use cases that take advantage of the performance boosts.

As it already is the case that almost no consumers actually “need” gigabit speeds--or even speeds in the hundreds of megabits per second--so few human users or sensors will actually “need” the 6G levels of throughput and latency.

Architecturally, the evolution towards smaller cells will continue, in part to support millimeter wave frequencies, in part to assure better connectivity. Where traditional cell architectures have emphasized non-overlapping coverage, 6G networks might use deliberately overlapping cells to assure connectivity.

That almost certainly will require more development of low-cost beam forming and signal path control. Having cheap artificial intelligence is going to help, one might suggest.

The Downside of Multi-Purpose IP Networks

By now, virtually all observers agree that direct revenue generated by fixed networks will shift to supplying broadband access, while some o...