Thursday, February 11, 2021

Watch for a Marked Acceleration of Gigabit Home Broadband Subscriptions in 2021

When will gigabit home broadband hit an inflection point? Probably in 2021. The inflection point matters because, historically, it marks the moment when slow or low adoption of a product accelerates and begins growing at a much faster clip. 


Gigabit home broadband might be nearing that inflection point. Important consumer technologies tend to hit an inflection point at about 10 percent adoption, and Openvault data suggests gigabit home broadband reached 8.5 percent adoption at the end of 2020. 


That suggests gigabit accounts will hit 10 percent of the home broadband installed base in 2021. And if the pattern holds, the adoption rate will shift into a higher gear, growing much more rapidly than it has over the last five to 10 years.


The reason 10 percent seems to be the trigger, one might argue, is that it marks the point where early adopters have become customers and users, setting the stage for the behavior to extend to the majority of consumers. 

source: Engineering.com 
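One way to see why 10 percent works as a rule of thumb is to sketch a simple logistic (S-curve) adoption model. The growth rate, starting share and saturation level below are illustrative assumptions, not figures from the Openvault data; the point is only that yearly gains in adoption keep getting larger well past the 10-percent mark.

```python
# Illustrative sketch (not from the post): a logistic S-curve adoption model
# showing why the 10-percent mark behaves like an inflection point.
# The growth rate, starting share and saturation level are assumed values.
import math

SATURATION = 1.0        # assume adoption eventually saturates at 100 percent of homes
GROWTH_RATE = 0.6       # assumed intrinsic annual growth rate of the S-curve
STARTING_SHARE = 0.02   # assumed adoption share in year zero

def logistic_share(year):
    """Adoption share after a given number of years on a logistic curve."""
    a = (SATURATION - STARTING_SHARE) / STARTING_SHARE
    return SATURATION / (1 + a * math.exp(-GROWTH_RATE * year))

previous = logistic_share(0)
for year in range(1, 11):
    share = logistic_share(year)
    gain = share - previous          # percentage-point gain this year
    print(f"year {year:2d}: share {share:6.1%}, gain {gain:5.1%} points")
    previous = share
# The yearly percentage-point gains keep growing well past the 10 percent mark
# (and peak only around 50 percent adoption), which is the "shift to a higher
# gear" described above.
```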


When will U.S. 5G hit a subscriber inflection point? Not this year.


In January 2021, 5G coverage reached 75 percent of potential U.S. users, albeit mostly using low-band spectrum, so the performance improvement is slight. Coverage refers to availability, not subscriptions or usage. By July 2021, 80 percent of the U.S. population is expected to have 5G coverage, says PwC. 


PwC forecasts that 12 percent of mobile devices in use by U.S. customers in July 2021 will be 5G enabled. That might or might not mean all those devices are used on the 5G network, however. 


Possibly for that reason, PwC suggests the 5G inflection point for adoption will happen later, in 2023.


One can see an example in cell phone adoption by U.S. households. Around 1994, household adoption reached 10 percent or so, after a longer period of slow adoption. An analogous pattern occurred with smartphone adoption as well. 

 

source: Our World in Data 


The adoption pattern perhaps is easier to visualize with a longer time frame. Here is a chart showing cell phone adoption in the United Kingdom.


source: Our World in Data


A wide range of physical products have shown the same pattern. Automobile adoption, for example, accelerated once the 10-percent threshold was hit. 


source: Our World in Data

Pareto Theorem, or 80/20 Rule, Applies to Telecom Attackers as Well

This is a good illustration of the Pareto theorem, which states that 80 percent of instances or outcomes in business or nature come from 20 percent of the cases or effort. The Pareto theorem is popularly known as the 80/20 rule.


Of the 83 challengers in 20 telecom markets analyzed by Bain & Company, only about one in five (22 percent) grew both their revenue and free cash flow and increased their share of profit from 2010 to 2017.


source: Bain and Company 


That is a nearly perfect example of the predicted Pareto pattern. 


source: IP Carrier 


Vilfredo Pareto, an Italian economist, was studying the distribution of wealth in 1906. What he found was a distribution most people would commonly understand as the "80/20 rule," in which a disproportionate share of results comes from 20 percent of actions. The Pareto distribution has been found widely in the physical and human worlds. It applies, for example, to the sizes of human settlements (few cities, many hamlets and villages). It fits the file sizes of internet traffic (many smaller files, few larger ones).
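For readers who want to see the arithmetic, here is a minimal sketch, assuming a classical Pareto distribution with a shape parameter of roughly 1.16, the value that yields an exact 80/20 split. The sample size and the simulation itself are purely illustrative.

```python
# Minimal sketch: a classical (Type I) Pareto distribution with shape
# alpha ~= 1.16 reproduces the familiar 80/20 split. The alpha value and
# sample size below are illustrative assumptions, not figures from the post.
import numpy as np

alpha = 1.161          # log(5)/log(4); the shape that yields an 80/20 split
top_fraction = 0.20

# Analytic result: the top fraction p of cases holds p**(1 - 1/alpha) of the total.
analytic_share = top_fraction ** (1 - 1 / alpha)

# Simulation check: draw samples, sort them, and measure the top-20% share.
rng = np.random.default_rng(0)
samples = rng.pareto(alpha, size=200_000) + 1.0   # shift to classical Pareto (minimum value 1)
samples.sort()
top = samples[int(len(samples) * (1 - top_fraction)):]
simulated_share = top.sum() / samples.sum()

print(f"analytic top-20% share:  {analytic_share:.1%}")   # about 80%
print(f"simulated top-20% share: {simulated_share:.1%}")  # noisy, but roughly 80%
```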


It describes the distribution of oil reserves (a few large fields, many small fields) and of jobs assigned to supercomputers (a few large jobs, many small ones). It describes the price returns on individual stocks. It likely holds for total returns from stock investments over a span of several years as well, as most observers point out that most of the gain, and most of the loss, in a typical portfolio comes from price changes on just a few days a year.


The Pareto distribution also is what one finds when examining the sizes of sand particles or meteorites, the number of species per genus, the areas burnt in forest fires and casualty losses (general liability, commercial auto and workers compensation).


The Pareto distribution also fits sales of music from online music stores and mass market retailer market share. The viewership of a single video over time fits the Pareto curve. Pareto describes the size distribution of social networking sites. It describes the readership of books and the lifecycle value of telecom customers.


Connectivity Provider Growth Outside the Core Races Against Loss of Core Value

Few executives in the connectivity business would quarrel with the argument that--if possible--moves into new lines of business beyond core or traditional connectivity are desirable, if not mandatory. The same sort of logic appeals to executives in software, content, infrastructure, devices or commerce industries as well. 


Rakuten was an e-commerce company before it began offering mobile service. Softbank was a software conglomerate before it launched Softbank Mobile. 


Reliance Industries was a huge Indian conglomerate before it launched Reliance Jio and became the leading mobile operator in just a few years. 


Jio Fiber, the fixed network business, launched in 2019. Some might expect similar disruption there as well. 


Application provider competition in the core communications service space likewise has become significant since 2005, according to McKinsey estimates. By 2018, over-the-top alternatives had cannibalized 40 percent of mobile messaging revenue, 25 percent of fixed network voice revenue and seven percent of mobile voice revenue. 

source: GSMA Intelligence 


In the internet era, the walls between industries are more porous than they used to be. It is easier for firms to launch attacks in adjacent parts of the content, information, infrastructure or applications ecosystem. 


So far, the evidence seems to suggest that it is easier for well-capitalized firms outside the traditional telecom industry to make inroads into communications than for connectivity providers to grab significant positions in adjacent ecosystem roles. 


The argument often will be made, with some justification, that the relative lack of telecom service provider success has to do with industry culture, innovation skill, bureaucratic decision-making processes or regulatory constraints. 


Others might say a key impediment is the shareholder base of public telecom companies, particularly those stakeholders who see telcos as dividend-producing value assets. Such expectations mean that significant cash flow must be diverted to dividend payments, and for that reason is not available to support growth initiatives.


All of the above likely operate to constrain the ability of retail connectivity providers selling to consumers and most businesses to deploy their capital for growth outside the core business. 


For such reasons it might always be easier for “outsiders” to take market share and create value in communications services than for telcos to become value creators and market share leaders outside the core communications function. 


But some telecom industry executives would be well pleased if it were possible to grow non-core revenues to as much as half of total revenues.


Wednesday, February 10, 2021

Gigabit Take Rates Might be at the Inflection Point

In the fourth quarter of 2020, about 8.5 percent of provisioned fixed network internet access connections operated at speeds of 1 Gbps or so, according to Openvault data. That is about a tripling of gigabit customers in a year’s time, showing an acceleration of buying. 


In late 2018, fewer than two percent of buyers were purchasing gigabit-speed services, and growth from 2018 to 2019 was about 55 percent. One might argue that take rates for gigabit internet access now have hit an inflection point and will start growing much faster, possibly as much as doubling every year for a few years. 


source: Openvault 
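To make the "doubling every year for a few years" scenario concrete, here is a back-of-envelope projection starting from the 8.5-percent figure. The doubling path is the post's hypothetical, not a forecast.

```python
# Back-of-envelope sketch of the "doubling every year for a few years" scenario,
# starting from Openvault's 8.5 percent year-end 2020 take rate. The doubling
# assumption and the 100 percent cap are the only inputs; this is an
# illustration of the claim, not a forecast.
share = 0.085
for year in range(2020, 2025):
    print(f"{year}: {share:5.1%}")
    share = min(share * 2, 1.0)
# 2020: 8.5%, 2021: 17.0%, 2022: 34.0%, 2023: 68.0%, 2024: capped at 100%.
# A strict doubling path would put gigabit tiers in most homes within about
# three to four years.
```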


The reason for such a change in growth is that successful mass market consumer products tend to hit an inflection point at about 10 percent take rates. And gigabit-per-second internet access sits right at that point. 


source: Openvault 


Sometimes "Following the Science" is Not Yet Possible, for Covid or Broadband

Most people would agree that understanding means and ends--the relationship between outcomes and the actions taken to achieve those outcomes--makes good sense. It does not make good sense to desire an outcome and then take actions which do not achieve the desired outcome. 


That is true everywhere in the connectivity business, but also true in the setting of public policy. Still, we often do not have clarity on the relationship between means and ends. We all believe that quality broadband is important for economic development, job growth, educational outcomes and social equity or inclusion. 


But our public policies to support those outcomes might not have a clear means-ends causation link. We can point to correlation between high use of quality broadband and other outcomes (jobs, economic growth, household income, household wealth, health, safety, educational outcomes, inclusion, educational attainment). 


But we cannot prove “causation” of those outcomes from the supply and uptake of quality broadband, and likely never will be able to do so, as those outcomes are the result of too many independent variables. 


So far, it also appears that our understanding of Covid science, and of the public policies we see as means of solving the problem of pandemic illness, is insufficient to support much confidence about the effectiveness of lockdowns, for example, as a way of slowing disease spread, as logical as that policy seems. 


Here is a chart, compiled by WalletHub, showing the relationship between Covid-19 death rates and the severity of restrictions on business operations, based on data gathered from the U.S. Census Bureau, the U.S. Bureau of Labor Statistics, the Kaiser Family Foundation, Ballotpedia, McGuireWoods, Editorial Projects in Education, The COVID Tracking Project, the National Restaurant Association, Littler Mendelson, JDSupra and Ogletree Deakins. 

source: WalletHub


The issue here is that there is no clear pattern. There are high death rates in states with few restrictions and in states with many restrictions. There are lower death rates in states with few restrictions and in states with many restrictions. It is plausible, perhaps even likely, that conditions other than business closures are at work. 


This lack of pattern also means we are setting public policy without clear scientific consensus on means and ends, practices and derivative outcomes. 
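One way the "no clear pattern" claim could be quantified, at least in principle, is a rank correlation between a state-level restrictions index and deaths per capita. The sketch below uses randomly generated placeholder data, not WalletHub's figures, simply to show what a weak relationship looks like statistically.

```python
# Hypothetical sketch of how the "no clear pattern" claim might be quantified:
# a rank correlation between a state restrictions index and deaths per capita.
# The data below are randomly generated placeholders, not WalletHub's numbers;
# they are independent by construction, so the correlation should be near zero.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
restriction_index = rng.uniform(0, 10, size=50)   # placeholder severity scores, one per state
deaths_per_100k = rng.uniform(80, 220, size=50)   # placeholder outcomes, unrelated by design

rho, p_value = spearmanr(restriction_index, deaths_per_100k)
print(f"Spearman rho = {rho:.2f}, p-value = {p_value:.2f}")
# A rho near zero with a large p-value is what "no clear pattern" looks like;
# real state-level data would have to be substituted to actually test the claim.
```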


There is a bit more possible clarity when looking at unemployment rates and business closure policies, as the following analysis shows. 


source: WalletHub 


One would expect higher unemployment in areas with more restrictions on business operations. This chart suggests a clearer relationship: “many restrictions” states tend to have higher unemployment, while “few restrictions” states tend to have lower unemployment, as one might expect. 


The problem, so far, is that our expectations about death rates and business closures--death rates “should” be lower where business closures are extensive--do not seem to be confirmed. That means we cannot be sure business closures actually affect death rates in a direct way. 


It is possible that infection rates might have a better means-ends relationship, though. Logically, greater exposure to people should result in higher rates of disease transmission. To the extent that business closures limit exposure, infection rates should therefore be lower. 


There is evidence that restaurants and gyms were “superspreader” venues, for example. There also is evidence that restaurant Covid spread rates were extremely low. The point is that the “illness transmission science” is far from settled. 


Yet other studies note that population density seems to matter. 


That noted, some suggest that limiting total restaurant seating capacity might be more effective than total bans on indoor restaurant operations. 


But infection rates along other paths, such as between household members, might well have increased because of other policies, such as stay-at-home orders, even under lockdown conditions. 


And studies relying on use of cell phones and mobility also have some methodological issues. People who traveled more had, by definition, more exposure risk. We might not be able to accurately track transmission venues, for that reason.


The point is that the “science” of Covid illness transmission--and the implications for public policy--is not yet clear, beyond the general observation that transmission between people is contingent on the number of people one comes into contact with. Population density and duration of exposure seem to matter, of course. 


But personal behavior also will matter. 


Humility about the correctness of our public policy and public health recommendations is called for. To a greater extent than some might be willing to admit, we are guessing about what might work, why and how well. 


Other direct consequences, such as job losses, firm bankruptcies, lower economic growth, lower tax revenues, suicides, mental illness and higher crime rates, also flow from shutdown policies. 


Choice is required and the science does not seem settled sufficiently to adequately inform our choices, well-intentioned though they may be. 


That is not a completely unusual context for any public policy, though. We often do not have full knowledge of causation mechanisms, so our policies are, to some extent, guesses.


Customers Often Do Not Like Dynamic Pricing

Many observers would argue that dynamic pricing--differential pricing of products based on criteria such as time of day, volume purchased or some other measure--benefits both suppliers and customers. If peak loads, for example, can be shifted to off-peak periods, there are capital investment advantages for suppliers.


That is why long distance calls once were priced dynamically: highest prices during weekday working hours; lower prices on evenings, nights and weekends. 


On the other hand, consumer behavior suggests buyers often do not choose to buy dynamically-priced products, but prefer fixed-price, flat-rate subscriptions. We are left to try and explain why that behavior persists. 


Disney’s latest quarterly report might suggest that customers prefer subscriptions to buying on a dynamic basis. Indeed, we might note the same behavior across a wide range of digital content and communications products.


Dynamic pricing, where items are purchased a la carte, by the piece and on demand, creates spending uncertainty. Prices vary by criteria such as volume purchased, time of day purchased, possibly how an item is purchased or any discounting methods a seller wishes to use. 


Customers also have notions about the value of specific content items, whole channels or networks. Any customer who has compared the value and price of a Netflix subscription to the cost of buying content dynamically will conclude that the subscription seems to cost less, under any scenario of moderate use. 


Where a linear video subscription--depending on the number of channels purchased--might cost $50 to $80 a month, a streaming service--depending on which services the customer wants to buy--can start at about $7 a month each but might cost $15 a month or so.


Buying only a few pay-per-view or on-demand items a month can exceed the cost of a Disney Plus, Netflix, Prime, Hulu, HBO Max or Peacock subscription, for example. Granted, no subscription offer from any single provider ever offers “most” content, so most customers likely buy more than one streaming service. 


The point is that dynamic pricing offers the most value when a customer’s consumption is low, but offers less perceived value when consumption is moderate (possibly watching more than three movie or TV episodes a month that are purchased dynamically). 
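A simple break-even calculation illustrates that threshold. The prices below, a $15-a-month subscription and $5 per on-demand title, are assumptions for illustration, not quoted prices from any provider.

```python
# Hypothetical break-even sketch for the trade-off described above. The
# $15-a-month subscription price and $5 per-title on-demand price are assumed
# for illustration, not quoted from any provider.
SUBSCRIPTION_PRICE = 15.0   # assumed flat monthly subscription fee
PER_TITLE_PRICE = 5.0       # assumed a la carte price per on-demand title

for titles_per_month in range(0, 7):
    a_la_carte_cost = titles_per_month * PER_TITLE_PRICE
    cheaper = "a la carte" if a_la_carte_cost < SUBSCRIPTION_PRICE else "subscription"
    print(f"{titles_per_month} titles: a la carte ${a_la_carte_cost:5.2f} -> {cheaper} is the better buy")
# At these assumed prices the crossover comes at three titles a month, roughly
# the "moderate consumption" threshold suggested above.
```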


That same trade-off exists with other products as well. 


For buyers of cloud computing services, dynamic pricing is a better buy for customers with lower demand, variable workloads, small information technology support staffs or infrastructure and relatively lower adoption of new computing use cases. 


Cloud computing is often not a better buy for customers with high compute-cycle demand, fixed or predictable workloads, large IT infrastructure, staffs and skills, or high rates of adopting new computing use cases. 


So there are good financial reasons for customers to see value in subscriptions that offer more content, at lower prices, compared to dynamic buys. 


Many also would argue that the reason customers prefer flat-rate pricing, even when they might pay less buying on a dynamic basis, is unambiguous cost. Any dynamic pricing mechanism--buying by the instance or item--introduces uncertainty of cost. 


Subscriptions offer a static price: the customer knows what the recurring cost is going to be. 


Dynamic electricity pricing is not as popular as one might think, for example, even if consumers could save money consuming more energy off peak. 


Virtually unlimited usage of domestic voice, text messaging and fixed or mobile internet access provides examples. Linear and over-the-top streaming video subscriptions provide other examples. 


Likewise, music streaming has replaced music purchases. Most content subscriptions--physical or digital--also are offered on a flat-rate basis. Changes in buyer demand and supplier preferences are at work, but perceptions of value almost certainly matter as well. 


Customers seem to prefer broad catalog access without ownership over owning a fraction of the full catalog, which is what the shift from content buying (song downloads) to music streaming services represents. 


In the connectivity business there are analogies. Though usage-based (dynamic) pricing is common for some products (international calls, in some cases), many other products are sold on a flat-rate basis (unlimited usage of internet access, domestic voice and texting). 


Other products are sold as buckets of usage (subscriptions with different usage allowances). Customers buy subscriptions, but with different usage volumes included, typically with additional charges for usage above the allowance threshold.  


That created the profit driver of “overage charges.” Historically, overage charges were a significant contributor to supplier profits, and avoidance of overage charges became a driver of consumer behavior. 


Customers preferred to buy data plans with more usage than they ever expected to need, simply to avoid overage charges that introduced uncertainty of cost. 
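The mechanics are easy to sketch. The plan fee, allowance and overage rate below are assumed values, not any carrier's actual tariff; the point is how quickly uncertainty about usage turns into uncertainty about the bill.

```python
# Sketch of the allowance-plus-overage mechanics described above, using assumed
# plan parameters (not any carrier's actual tariff): a flat plan fee, a monthly
# usage allowance and a per-GB overage rate.
def monthly_bill(usage_gb, plan_fee=50.0, allowance_gb=10.0, overage_per_gb=15.0):
    """Bill equals the flat fee plus overage charges for usage above the allowance."""
    overage_gb = max(0.0, usage_gb - allowance_gb)
    return plan_fee + overage_gb * overage_per_gb

for usage in (5, 10, 12, 15, 20):
    print(f"{usage:3d} GB -> ${monthly_bill(usage):6.2f}")
# A customer unsure whether monthly usage will be 10 GB or 20 GB faces a bill
# anywhere from $50 to $200. Buying a bigger, pricier allowance trades a higher
# known fee for a capped worst case, which is the behavior described above.
```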


During the era of video rentals, late fees played a similar role: a major driver of profit for video rental outlets, and a source of customer unhappiness as well. 


For all those reasons, customers often prefer flat-rate pricing to dynamic pricing. They often prefer subscriptions to dynamic purchasing (a la carte, by the piece or instance) as well.


Tuesday, February 9, 2021

Can KT Become a Platform? Can Any Telco Do So?

Korea Telecom wants to become a digital platform company, not a telco. That ambition arguably is shared fairly widely among tier-one connectivity service providers globally, and it has been recommended as a strategy, in some form, by most of the bigger consulting firms. 


Simply, becoming a platform company changes the business model from direct supplier of products to a role as an ecosystem organizer or marketplace. That arguably is an aspirational goal more than anything else. 


What that aspiration means in practice is that KT as a digico “will shift our focus from the telecommunications sector, where growth is stalled due to government regulations, to artificial intelligence (AI), big data, and cloud computing businesses to become the nation's number-one platform operator in the B2B market," said KT CEO Koo Hyun-mo.


So there are qualifications. KT, if successful, would become a platform in the business market, not the consumer market. It would explicitly aim to become the center and organizer of an ecosystem for artificial intelligence, big data analytics and cloud computing. 


Purists and researchers will likely argue about whether all of that actually adds up to KT becoming a platform, in the sense that Amazon, eBay, Alibaba, ridesharing or lodging apps might be considered platforms. 


A platform, definitionally, makes its money putting buyers and sellers and ecosystem participants together. In computing, a platform is any combination of hardware and software used as a foundation upon which applications, services, processes, or other technologies are built, hosted or run.


Operating systems are platforms, allowing software and applications to be run. Devices are platforms. Cloud computing might be said to be a platform, as systems are said to be platforms. 


Standards likely are thought of as platforms by some. 


In other cases components such as central processing units, physical or software interfaces (Ethernet, Wi-Fi, 5G, application programming interfaces) are referred to as platforms. Browsers might be termed platforms by some. Social media apps are seen as platforms as well. 


The platform business model requires creation of a marketplace or exchange that connects different participants: users with suppliers; sellers with buyers. A platform functions as a matchmaker, bringing buyers and sellers together, but classically not owning the products sold on the exchange. 


A platform orchestrates interactions and value. In fact, a platform’s value may derive in large part from the actions and features provided by a host of ecosystem participants. Facebook’s content is created by user members. Amazon’s customer reviews are a source of value for e-tailing buyers. 


Consumers and producers can swap roles on a platform. Users can ride with Uber today and drive for it tomorrow; travelers can stay with Airbnb one night and serve as hosts for other customers the next. Customers of pipe businesses--airlines, router or phone suppliers, grocery stores--cannot do so. 
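A toy illustration, using made-up products, markups and take rates rather than any actual company's economics, shows the difference in how the two models book revenue: a pipe owns and resells the product, while a platform collects only a commission on transactions it matches.

```python
# Toy illustration, not KT's model: the difference in how a "pipe" and a
# platform book revenue. A pipe owns and resells inventory; a platform matches
# buyers with third-party sellers and keeps only a commission (take rate).
# All products, prices, markups and take rates below are made up.
from dataclasses import dataclass

@dataclass
class Listing:
    seller: str
    price: float

def pipe_revenue(wholesale_prices, markup=0.30):
    """A pipe buys inventory and resells it; it books the full retail price."""
    return sum(price * (1 + markup) for price in wholesale_prices)

def platform_revenue(listings, take_rate=0.15):
    """A platform never owns the goods; it books only its commission."""
    return sum(listing.price * take_rate for listing in listings)

third_party_listings = [Listing("seller_a", 100.0), Listing("seller_b", 40.0)]
print("pipe revenue:    ", pipe_revenue([100.0, 40.0]))             # 182.0 in booked sales
print("platform revenue:", platform_revenue(third_party_listings))  # 21.0 in commissions
```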


So KT can increase the percentage of revenue it earns from supplying digital, computing, application or non-connectivity services without becoming a platform. As a practical matter, that is what most telco executives have in mind when talking about becoming platforms. 


For KT, even limiting its ambitions to generating more digital and non-connectivity revenue does not make it a platform. That would still be an important, valuable and value-sustaining move. But KT has a very long way to go, even toward its stated objective of becoming a B2B platform.


Total KT revenue is about 24 trillion won. All B2B revenues at the end of 2020 were about 2.78 trillion won (about 11.5 percent). Information technology services were about 1 trillion won, or about four percent of total revenues. AI and other digital services were about 0.5 trillion won, or about two percent of total revenues. 
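Those shares can be reproduced directly from the won figures cited above; a quick check:

```python
# Quick arithmetic check of the revenue shares cited above (figures in
# trillions of won, as given in the paragraph).
total_revenue = 24.0
segments = {"all B2B": 2.78, "IT services": 1.0, "AI and other digital": 0.5}

for name, revenue in segments.items():
    print(f"{name}: {revenue / total_revenue:.1%} of total revenue")
# all B2B: 11.6%, IT services: 4.2%, AI and other digital: 2.1% -- consistent
# with the roughly 11.5, four and two percent shares cited above.
```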


It might be a long time before non-connectivity revenues in the B2B part of its business amount to as much as half of total revenues. And those revenues might not represent a platform transformation of the business model.


KT could win significantly without ever becoming a platform. And some might argue few telcos can ever actually hope to become platforms in the classic sense. Perhaps the more important goal is simply to reduce reliance on traditional connectivity revenues.


Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...