Thursday, November 19, 2020

Will 5G Prove to be Another Example of Innovation Despite our Efforts?

Sometimes big changes in communications demand happen almost despite our best efforts. Some might argue the 1996 Telecommunications Act succeeded despite itself, for example: innovation came not so much from telecom competition as from product substitutions based on mobility and the internet.


If you remember the major revision of U.S. telecommunications law called the Telecommunications Act of 1996, you will remember the practical consequences of deregulating the local telecom access business. 


Revising U.S. law, the Act enabled competition for local telecom services, lawful operation and ownership of Class 5 voice switches, the right to sell customers voice and other services and wholesale access to incumbent networks. 


All that happened just prior to voice communications reaching a historic peak around 2000, followed by a rapid decline. Most incumbent telcos lost 35 percent of their customers for that service in 10 years, and as much as 65 to 70 percent over two decades. 


Service providers also lost half their revenue from long distance calling over that same period. 


source: CDC, Statista 


At the same time, other big changes in end user demand were happening: substitution of mobile phone service for fixed service; use of mobiles instead of cameras or music players, GPS devices or video screens. 


source: Wikimedia

There also was increasing use of the internet as a substitute for a wide range of other activities and products. In 1996, for example, it is estimated there were 36 million global users of the internet, representing less than one percent of the world population. A decade later, that had grown to 17 percent. 


About that time, some 14 percent of the U.S. population was using the internet, on dial-up connections. A decade later, that had grown to about 66 percent. 


source: Pew Research 


The point is that disruptive changes in regulatory framework can produce outcomes we did not expect, especially when disruptive enabling technologies happen at the same time, allowing massive product substitution and behavioral changes. 


The same thing might happen with 5G. It arrives in tandem with other key technologies and platforms, including commercial artificial intelligence, edge computing and internet of things. It may, in the end, be hard to separate the various threads from each other. 


In part, that is because computing architectures shift over time, oscillating between centralized and decentralized approaches. Each swing puts computing resources at different places within the architecture. 


In the mainframe era, computing resources were centralized at far ends of the network. That shifted in the client-server era to more local processing on devices themselves or on local servers. In the internet era computing switched back to far end hyperscale data centers. 


source: GSMA 


But most observers believe we are now in a stage of shifting more workloads back locally, to take advantage of artificial intelligence, heavy local analysis of sensor data to support the internet of things and compute-intensive applications using virtual or augmented reality. 


“These days lots of companies want to turn bandwidth problems into compute problems because it’s often hard to add more bandwidth and easier to add more compute,” said Andrew Page, NVIDIA media group director of advanced products. 


So maybe 5G will ultimately not be the big story. Maybe other simultaneous changes will provide the most-consequential effects. Put another way, 5G might not be as transformational as edge computing, applied artificial intelligence or IoT.


By the 6G era, network slicing and heterogeneous access might turn out to be more consequential than mobile platform performance. Perhaps value will have migrated further in the direction of orchestration, and away from underlying facilities.


So 5G might be part of a consequential change that we are not deliberately planning.


Wednesday, November 18, 2020

Digital Redlining or Response to Demand?

Terms such as digital redlining imply that U.S. internet service providers upgrade neighborhoods able to pay for higher-speed internet access while underinvesting in poorer neighborhoods. At some level, it is hard to argue with that point of view, at least where it comes to gigabit internet access. 


Google itself pioneered the tactic of building where there is demonstrated demand, wiring Google Fiber first in neighborhoods (typically higher-income areas) where potential customers expressed the most interest. Other gigabit service providers have required deposits for the same reason. 


And regulatory officials at the local level seem to now agree that “universal service” (building a gigabit network past every home and business) is desirable in some cases, but not absolutely mandatory in all cases. The thinking is that allowing new internet service providers or facilities to be built wherever possible is a better outcome than requiring ubiquity, and getting nothing. 


Also, higher-speed facilities often are not found everywhere in a single market or city. CenturyLink does sell gigabit internet access in Denver, just not everywhere in the metro area. That is not necessarily “redlining,” but more likely reflects the capital available to invest; expectations about financial return; customer density or some other combination of business issues that discourages investment in new access facilities. 


The economics of communication networks also are clear. Density and cost per location are inversely related. Mobile networks typically have 10 percent of cell sites supporting 50 percent of usage. About 30 percent of sites carry about 80 percent of traffic. That has been true since at least the 3G era.  


In fixed networks, network cost and density also are inversely related. So population density has a direct bearing on network costs. In the U.S. market, network unavailability is concentrated on the last couple of percent of locations.  


With cable operators already holding at least 70 percent share of the internet access installed base of customers, any new investment in faster facilities faces a tough challenge. Any new fiber to home network, for example, essentially is playing catch-up to a cable operator, as roughly 80 percent of U.S. households already also are reached by gigabit speed cable networks. 


And cable share has grown, up from possibly 67 percent share in 2017. 


That noted, internet speeds do vary by geography: speeds in urban areas frequently are higher than in rural areas. But the argument that large numbers of U.S. households are underserved often is correct, depending on what standard one wishes to apply, and how one defines the supplier market.


Some claim 42 million U.S. residents are unable to buy broadband internet access, defined as minimum speeds of 25 Mbps in the downstream.  That actually is incorrect. 


Virtually every household in the continental United States is able to buy 25 Mbps or faster service from at least two different satellite providers. But those who claim “42 million” people cannot buy broadband simply ignore those choices, and focus only on the claimed availability of 25 Mbps service by fixed network providers. 


There are other estimates which also vary wildly. Roughly 10 percent of U.S. households are in rural areas, the places where it is most expensive to install fast fixed network internet access facilities, and where the greatest speed gaps--compared to urban areas--almost certainly continue to exist.


In its own work with TV white spaces, Microsoft has targeted perhaps two million people, or roughly a million households, that have no fixed network internet access. That assumes there are two people living in a typical household, which is below the U.S. average of roughly 2.3 to 2.5 per household.


Recall that the definition of broadband is 25 Mbps downstream. Microsoft has argued that 20 million people (about 10 million homes) or perhaps eight percent of the population (perhaps four percent of homes) cannot get such speeds from any fixed network service provider.


Microsoft also has cited figures suggesting 25 million people cannot buy broadband--presumably using the 25 Mbps minimum standard, most of those people living in rural areas. 


That conflicts with data from Openvault that suggests 95 percent of the U.S. population can buy internet access at a minimum of 25 Mbps, while 91 percent to 92 percent can buy service at a minimum of 100 Mbps. 


Using the average 2.5 persons per U.S. household average, that suggests a universe of about 10 million U.S. homes unable to purchase internet access at 25 Mbps from a fixed network supplier, in 2018. What is not so clear is the percentage of households or persons who can do so using a mobile network. 
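The household arithmetic the post relies on can be sketched in a few lines. This is a minimal illustration, assuming the roughly 2.5 persons-per-household average and the population figures cited above; none of the inputs are new data.

```python
# Convert population-based "cannot buy broadband" estimates into
# household counts, using the ~2.5 persons-per-household average.

PERSONS_PER_HOUSEHOLD = 2.5

def people_to_households(people_millions: float) -> float:
    """Convert a population count in millions to households in millions."""
    return people_millions / PERSONS_PER_HOUSEHOLD

# Microsoft's figure of ~25 million people unable to buy 25 Mbps service
print(f"{people_to_households(25):.0f} million households")  # prints "10 million households"
```

The same conversion applied to the 162.8 million figure discussed below yields about 65 million households, which is why that claim is hard to square with the availability data.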


None of that explains urban areas with slow speeds, though. There the issue is more likely high construction costs where underground construction is necessary, along with demand expectations that are lower than in suburban areas. That is true whether electrical lines or communications networks are being considered.   


But at least one Microsoft analysis suggests that about half of all U.S. households are not using 25 Mbps access. The claim is that 162.8 million people are “not using the internet at broadband speeds.” That seems to clearly contradict data gathered by firms such as Ookla and Opensignal suggesting that average U.S. speeds are in triple digits.


In 2018, the average U.S. broadband speed was 94 Mbps, according to the NCTA. That same year, Ookla reported the average U.S. speed was 96 Mbps. 


It is not quite clear how the Microsoft data was generated, though one blog post suggested it was based on an analysis of “anonymized data that we collect as part of our ongoing work to improve the performance and security of our software and services.” 


The claim of 162.8 million people “not using the internet at broadband speeds” (probably using 25 Mbps as the definition) equates to about 65 million households, using the 2.5 persons per household definition. That does not seem to match other data, including the statistics Microsoft itself cites. 


What remains difficult, but might explain the divergence, is whether the measured applications and services include apps run on smartphones as well as PCs and other devices connected to fixed networks. That would explain the number of users, while usage on mobile networks might account for large numbers of sessions where 25 Mbps downstream speeds were not observed. Or perhaps the upstream speed definition (a minimum of 3 Mbps) was the issue.  


Even then, average 4G downstream speeds in 2018 were in excess of 40 Mbps, so even that explanation is a bit difficult. 


Perhaps there are other ways to make sense of the data. There is a difference between users (people) and households. There is a difference between usage and availability; usage by device (mobile, PC, tablet, gaming device, sensor); application bandwidth and network bandwidth. 


Perhaps the issue is application performance on a wide range of devices including mobiles and untethered devices using Wi-Fi, which would reduce average experienced speeds, compared to “delivered access speed.” 


Methodology does matter. So do the costs and benefits of broadband capital investment under competitive conditions, in areas with high construction costs or low demand for advanced services, especially when newer platforms with better economics are being commercialized. 


Telecommunications is a business like any other. Investments are made in expectation of profits. Where a sustainable business case does not exist, subsidies for high-cost areas or universal service support fill the gap. 


The point is that every human activity has a business and revenue model: it can be product sales, advertising, memberships, subscriptions, tax support, fees, donations or inheritances. Children have a “parents support me” revenue model, supported in turn by any of the aforementioned revenue models. 


But every sustainable activity has a revenue model, direct or indirect. The whole global communications business now operates on very different principles than the pre-competitive monopoly business prior to the 1980s. We still have a “universal service” low end, but we increasingly rely on end user demand to drive the high end. 


Our notions of the low end change--and move higher--over time. We once defined “broadband” as any data rate of 1.544 Mbps or higher. These days we might use functional definitions of 25 Mbps or 30 Mbps. Recall that 30 Mbps--in 2020--was called “superfast” as a goal for U.K. fixed network broadband. 


Few of us would consider 30 Mbps “superfast” any longer. Some might say the new “superfast” is gigabit per second speeds. But that is the change in real-world communications over just a decade. What was a goal in 2010 now is far surpassed. 


What some call “redlining” is simply a response to huge changes in the internet access and communications business. “Maximum” is a moving target that responds to customer demand. “Minimums” tend to be set by government regulators in search of universal service. 


As independent internet service providers cherry pick service areas where they believe the greatest demand for gigabit per second internet access exists, so do incumbents. 


Similar choices are made by providers of metro business services; builders of subsea connectivity networks or suppliers of low earth orbit satellite constellations and fixed wireless networks. They build first--or pick customer segments--where they think the demand is greatest.


Saturday, November 14, 2020

Technology Displacement is Harder than it Seems

Technology displacement--new for older--is a tricky business. Sometimes a whole ecosystem has to be built before a key innovation can reach mass adoption.  


Not every feasible technology substitute actually displaces the solutions with which it potentially competes, even when the argument is made that the substitute is “better” on some relevant performance metric. 


Sometimes the failures are the result of business execution, as when a promising startup runs out of money, grows too fast or too slowly. 


And customer adoption is almost always related to potential customers’ underlying habits and preferences. Changing has costs, so the innovation must deliver value significantly in excess of the costs of changing. 


Customer experience, broadly defined, always is important. “Better” in some sense is offset by “hard to use,” “inconvenient” or “not worth the extra money.”


Politics and culture sometimes also play a key role. Is an existing way of doing things beneficial to important and powerful interests? Can they resist innovations that threaten those interests? 


Sometimes it is deemed too much hassle to displace an existing solution and ecosystem with a rival. Despite its inferiority to other keyboard layouts, we still use QWERTY, which originally was developed to slow down typing and prevent key jamming on mechanical typewriters.


Some call that path dependence, the idea that small, random events at critical moments can determine choices in technology that are extremely difficult and expensive to change.


“Innovation is more a human process than a technological one,” says Stacy Wood, North Carolina State professor. “Persuasion, environment, culture and context always matter.” 


If the primary end-use value of a smartphone is the expected ability to remain connected “anywhere,” on the go, then it makes sense that Wi-Fi--though a key part of the connectivity ecosystem and experience--is not a direct or convenient substitute.


For perhaps similar reasons, few of us use smartphones without cellular service, though some functionality is possible. 


In the mobile communications business, the service always is bundled: text messaging, voice and internet access being the foundations. It remains possible to purchase a basic bundle including only voice and messaging, but increasingly, the foundation package includes internet access. 


Decades ago, the emergence of Wi-Fi was touted as the potential foundation for mobile phone service, and so it has become, though not in the way some expected. Periodically, it has been suggested that Wi-Fi could be the sole connectivity mechanism for mobile phone service. 


Voice and text messaging still are required features of a 5G network, whether they directly generate lots of specific revenue or not. Customers might willingly buy a 5G-based home broadband service, without voice or texting capabilities. They might buy a 5G dongle for PC internet access. 


But it remains an open question whether smartphone service without voice and texting is viable, lawful or desirable. In principle, a smartphone can function without a “mobile” account enabling voice, using Wi-Fi and VoIP. 


There are some issues, such as inability to use a phone number or communicate easily with other users of the public telephone network. But think of a smartphone connected to Wi-Fi, with no subscriber identification module and mobile service, as a PC connected to the internet and using Zoom or any other messaging or VoIP service.  


It can be done, but the utility or value is not high, for most people, if the mobile service bundle of value also includes low-cost public network voice and messaging (for domestic communications, for example) as well as the ability to use Wi-Fi or the mobile network when roaming or conducting international communications.


Calling using VoIP over Wi-Fi is possible and useful. In a mobile device context the overall value of a mobile service might be high enough, and the cost low enough, that bothering with a Wi-Fi-only use of the phone is not worthwhile. 


Technology displacement often is quite a bit more complicated than it appears. 


Content Versus Distribution: Netflix Versus Disney: Where's the Value?

It has long been possible to get a reasonable debate on the respective values of content and distribution in the media business. Simply put, the issue is which part of the value chain is better positioned. Consider one illustration. The market value of Netflix is something on the order of $213 billion. Content creator Disney has a market cap of about $250 billion. 


Market cap is not everything, and both firms produce and distribute content. But it might be surprising that Netflix is so close to Disney, given its simpler business model, which includes original content production, but relies on distribution (subscriptions) for its revenue. 


Disney is far more complex, spanning movie and animation studios, broadcasting networks, content networks, theme parks, hotels, merchandise and video streaming operations.  


Think of Netflix as a content distributor, much as cable TV, satellite and telco video service providers; movie theaters; TV or radio broadcasters and increasingly, many online services and apps now act. True, Netflix invests heavily in original content, as do some other leading streaming video services. But its direct revenue comes solely from subscriptions. 


Notice something about Disney, though. Direct-to-consumer, which includes the Disney streaming service, with 73 million paid subscribers, generates significant revenue, but negligible operating income. Granted, that is partially because the Covid-19 restrictions closed the theme parks, while Disney’s streaming service is in start-up mode, so operating income might not be expected for a bit. 

source: Investopedia 


In a non-Covid environment, the contribution of theme parks (which include hotels and merchandise) to operating income would be vastly higher, between 20 percent and 33 percent of revenue. 

source: Nasdaq 


Operating income from theme parks, hotels and merchandise might range as high as 37 percent. 


source: Valuewalk


The point is that direct-to-consumer, led by the streaming networks, should ultimately produce significant cash flow and operating income for Disney. The issue for some is how that cash flow and income might affect Disney’s value, and whether a different legal status for direct-to-consumer might affect the value of that unit. 


To be sure, some are unsure Disney streaming businesses can approach the size of Netflix, which already has perhaps 201 million subscriptions. All Disney streaming properties collectively might reach about 100 million subscriptions. 


So the question is whether, someday, the value of Disney streaming is such that those assets would fetch a higher valuation if independent, as is Netflix.


Not Even FTTH Might Propel Non-Cable ISP Gigabit Share

One has to remain impressed with the commercial success of the hybrid fiber coax platform used by cable TV companies. While most of the rest of the telecommunications world has remained fixated on fiber to the home as the platform of the future, cable operators tweaked an inconsistently-available coaxial cable network into the leading supplier of home broadband connections in the United States and a few other countries. The economics arguably are better than FTTH for brownfield operations, with a more graceful approach to network upgrades.


In 2016, the cable industry passed about four percent of U.S. homes with networks offering 1 Gbps internet access. By 2018, 80 percent of U.S. homes were able to buy gigabit-per-second service. 

source: CableLabs 


By way of comparison, all FTTH passings number something more than 50 million. There are about 141 million U.S. homes in total. So FTTH passes roughly 35 percent of U.S. homes. Not all those connections are capable of supplying gigabit connections at the moment, though. 


Assume there are 21 million active FTTH connections in the United States. Assume there are a total 103 million total broadband accounts. According to Openvault, in the third quarter of 2020 about five percent of U.S. customers bought gigabit service. 


That implies a total of no more than 5.15 million U.S. gigabit accounts in service. Assume all internet service providers other than cable operators have 30 percent of those accounts, implying about 1.5 million 1 Gbps ISP accounts sold by all firms other than cable operators. 


That further implies that gigabit FTTH accounts in service represent about seven percent of active FTTH connections. Cable gigabit connections are likely to be closer to five percent of total broadband accounts. 
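The back-of-envelope chain above can be made explicit. Every input here is an assumption stated in the text (103 million total broadband accounts, a five percent gigabit take rate, a 30 percent non-cable share of gigabit accounts and 21 million active FTTH connections), not measured data.

```python
# Sketch of the gigabit-share arithmetic, using the post's assumptions.

total_broadband_m = 103        # total U.S. broadband accounts (millions)
gigabit_take_rate = 0.05       # share buying gigabit service (Openvault)
non_cable_share = 0.30         # assumed non-cable share of gigabit accounts
active_ftth_m = 21             # assumed active FTTH connections (millions)

gigabit_accounts_m = total_broadband_m * gigabit_take_rate   # ~5.15 million
non_cable_gigabit_m = gigabit_accounts_m * non_cable_share   # ~1.5 million
ftth_gigabit_share = non_cable_gigabit_m / active_ftth_m     # ~0.07

print(f"Total gigabit accounts: {gigabit_accounts_m:.2f} million")
print(f"Non-cable gigabit accounts: {non_cable_gigabit_m:.2f} million")
print(f"Gigabit share of FTTH connections: {ftth_gigabit_share:.0%}")
```

Changing any single assumption (the non-cable share in particular) moves the final percentage noticeably, which is why the text hedges the result as "about seven percent."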


source: Fiber Broadband Association, RVA 


The point is that we sometimes too casually equate physical media with speed, or physical media with specific speeds, such as gigabit per second connections. Neither FTTH nor HFC directly equate with gigabit service or availability. 


However, cable TV operators do claim about 70 percent of the broadband installed base, possibly representing 3.6 million active gigabit accounts. As early as 2009, at least 75 percent of the fastest U.S. broadband connections were supplied by cable TV operators.  


At the same time, cable HFC platforms arguably have proven more effective than FTTH at generating incremental new revenues for platform owners, while suffering less from demand changes as voice and linear video began to shrink, the former since 2000, the latter since 2012 or so. Telco voice lines have fallen as much as 70 percent since 2000. Cable linear video accounts are down less than 15 percent since about 2012.  


Friday, November 13, 2020

FTTH is the Platform of the Future, But Might "Always Be"

People often forget that, in the communications business, there is no platform that always is best for every use case. What matters are the particular advantages. In other words, it has proven largely pointless to argue whether mobile or Wi-Fi access is “better.” Both have their contributions to make. 


In the fixed networks business, the belief has been--for many decades--that fiber to the premises is the future of the next-generation fixed network. 


With the caveat that there always is a private interest corresponding to every public policy, one cable TV industry vice president decades ago quipped that “fiber is the technology of the future...and always will be.”


Keep in mind, that was said about 40 years ago. And while North American access platforms have a different pattern than in most parts of the world, 40 years later, hybrid fiber coax platforms have about 70 percent share of the U.S. installed base of broadband connections. 


That creates a huge stranded asset problem for brownfield fiber-to-home deployments. Assuming a new FTTH network is deployed at scale, it might find that up to 70 percent of the assets are not generating broadband revenue. 


To be sure, there is still some amount of voice, but the old copper access network works well enough for that application. Investing capital in FTTH does not necessarily improve user experience, value or features for voice customers. 


For decades, though, there was one clear assumed advantage to deploying FTTH: the ability to sell linear entertainment video. So the basic thinking was that FTTH would allow telcos to hold broadband share while losing voice share and gaining video account share. 


It sort of worked that way until about 2012 for video services.

source: Business Insider 


But the “voice” part of the model never worked well at all, as usage, lines sold and value began to drop in the U.S. market as early as 2000, the peak year for telco access lines and long distance revenue. 


But business cases matter, and in the U.S. market the business case for FTTH, always difficult, has become quite challenged with the dominance of cable operators, and their seeming ability to keep pushing commercial bandwidths ahead faster than FTTH platforms. Already, at least 80 percent of U.S. homes have the ability to buy 1 Gbps internet access service, provided by cable operators. Not even most telco FTTH networks can do that, yet. 


Nor does it seem likely the cable cost advantage can be overcome any time soon, as technologists already are working on ways to boost HFC bandwidths to 10 Gbps or beyond, symmetrically. Sure, FTTH can do that as well, but not at the cost per home that HFC can provide. 


And that leads to telco interest in fixed wireless access, using 4G and 5G. The issue is not whether “fixed wireless can match FTTH” in potential speeds. The issue is whether fixed wireless can create a positive business case in areas where a new FTTH build is not financially feasible. 


Globally, matters can be quite different, as it often seems as though only one nationwide broadband fixed network can be supported. Still, in many other countries a mix of platforms is called for, based on home density. And then there is the matter of how people use the internet. 


By 2020, mobile accounted for more than half of all of Internet access revenue in more than 75 percent of countries, researchers at PwC said early in the year. Some analysts noted mobile Internet access revenues already had surpassed fixed network broadband revenue as early as 2013 or 2014.


That trend is expected to continue. By 2024, consultants at PwC say, mobile revenue will account for 68 percent of global Internet access market revenues. In other words, more than two thirds of all internet access revenue globally will be generated by mobile internet access. 


source: PwC 


In many countries, FTTH perhaps will always remain “the next-generation platform.” In some countries, though, a mix of platforms is likely for decades. In a few countries, FTTH seems to be infeasible for consumer accounts, if deployed by telcos on a ubiquitous basis. Hence the interest in mobile access, or mobile network fixed access.


Thursday, November 12, 2020

U.S. Gigabit Take Rates Pass 5% for First Time

The percentage of U.S. fixed network broadband subscribers buying gigabit-speed connections surpassed five percent for the first time in the third quarter of 2020, reaching 5.6 percent, according to Openvault. That represents an increase of 124 percent from the third quarter of 2019, when take rates for gigabit services were 2.5 percent.
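The growth figure is straightforward to verify from the two take rates Openvault reports:

```python
# Year-over-year growth in gigabit take rates, per the Openvault figures.
q3_2019_take_rate = 2.5   # percent of subscribers on gigabit tiers, Q3 2019
q3_2020_take_rate = 5.6   # percent of subscribers on gigabit tiers, Q3 2020

growth = (q3_2020_take_rate - q3_2019_take_rate) / q3_2019_take_rate
print(f"{growth:.0%}")  # prints "124%"
```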


Gigabit service take rates reached 4.9 percent in the second quarter of 2020, says Openvault. Compare that to availability of gigabit services, which reach at least 80 percent of U.S. homes, counting only cable TV service provider facilities.


source: Openvault 


On the other hand, buyers of service at 10 Mbps and below also increased by 41 percent, from 4.1 percent to 5.8 percent, quarter over quarter, Openvault says. “Growth at the lower end tier may be pointing to subscribers looking to save money with lower end, lower cost broadband tiers,” Openvault says.


Openvault says the “average” U.S. household (no idea whether this is median or mean) uses 384 GB per month, with average speeds downstream of 170 Mbps, upstream of 13 Mbps. 

source: Openvault

Half of all customers buy services providing downstream speeds between 100 Mbps and 400 Mbps. Nearly 30 percent of U.S. customers buy internet access at speeds ranging from 20 Mbps to 75 Mbps.

So it remains fair to say that there is a big difference between speeds consumers can buy, and speeds they actually do buy. Supply is one matter; demand another.

Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...