Saturday, February 13, 2021

How Cloonan's Curve Suggests Cable Operators Can Extend the Life of HFC

Nielsen’s Law of Internet Bandwidth states that a high-end user’s connection speed grows by 50 percent each year, doubling roughly every 21 months. That suggests a top-end internet access connection in 2025 will offer 10 Gbps speeds in the downstream. 
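
The arithmetic behind that claim is simple compound growth. Here is a minimal sketch in Python; the 1 Gbps top-tier baseline in 2019 is an assumption used only to illustrate how roughly six years of 50 percent growth lands near 10 Gbps.

```python
import math

ANNUAL_GROWTH = 0.50  # Nielsen's Law: top-tier speed grows ~50% per year

# Doubling time implied by 50 percent annual growth (~21 months)
doubling_months = 12 * math.log(2) / math.log(1 + ANNUAL_GROWTH)
print(f"Doubling time: {doubling_months:.1f} months")

def project_speed(base_gbps: float, base_year: int, target_year: int) -> float:
    """Project the headline ('billboard') tier under Nielsen's Law."""
    return base_gbps * (1 + ANNUAL_GROWTH) ** (target_year - base_year)

# Hypothetical baseline: a ~1 Gbps top tier in 2019 reaches ~10 Gbps by 2025.
print(f"2025 headline speed: {project_speed(1.0, 2019, 2025):.1f} Gbps")
```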


But it is reasonable to assume Nielsen’s growth rates cannot continue forever, as 50 percent compounded growth without end runs into practical limits (time, physics, cost, demand, substitutes). At some point, as was true with personal computer processors, parallel processing becomes the method for boosting performance, while raw processing power loses relevance as a product differentiator. 


In the consumer internet access space, that suggests new ways of supplying bandwidth, less value produced by ever-increasing speed offers and a shift to other forms of value. 


Nielsen’s Law only predicts the top speed available for purchase, however, not the average or typical speed a consumer might buy. It has taken quite some time for customer uptake of gigabit internet access services to reach as much as eight percent share of total, for example. 


Keep in mind that the first U.S. gigabit services began commercialization in 2013. It has taken seven years for adoption to reach eight percent of the installed base, in part because that grade of service is not universally available in the U.S. market. 


Cloonan's Curve provides a way of estimating the bandwidth speeds purchased by cable modem customers, in relation to the headline speed (Nielsen rate). Most customers do not buy the fastest-available service, as that typically is the most-expensive tier. Instead, they tend to buy a mid-level service. 


The caveat is that Cloonan’s Curve obviously does not apply to service providers that sell only a single tier of service, at the advertised headline rate (“gigabit only,” for example). 

source: Commscope


This illustration of downstream bandwidth plans actually purchased by customers suggests that although both the Nielsen and Cloonan rates increase at about 50 percent per year, most customers buy services offering one-sixth to one-twentieth the speed of the fastest-available service tier. 
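
A rough sketch of that relationship follows. The one-sixth and one-twentieth divisors are taken from the 6x-to-20x range cited above, not from Commscope's actual model, and the 1 Gbps starting headline is an assumption.

```python
# Popular tiers sit roughly 1/6th to 1/20th of the headline (Nielsen) rate,
# and both grow at about 50 percent per year.
ANNUAL_GROWTH = 0.50

def tier_estimates(headline_mbps: float) -> dict:
    return {
        "billboard": headline_mbps,        # fastest advertised tier
        "popular":   headline_mbps / 6,    # mid-level tier most customers buy
        "economy":   headline_mbps / 20,   # light-usage tier
    }

headline = 1000.0  # assumed 1 Gbps headline tier today
for year in range(0, 6):
    tiers = tier_estimates(headline * (1 + ANNUAL_GROWTH) ** year)
    print(year, {name: round(mbps) for name, mbps in tiers.items()})
```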


Think of the fastest tier of service (1 Gbps, for example) as the “billboard tier” featured in service provider advertising as the “speeds as fast as X” rate. Then consider the “common or popular tiers” as those in the middle of the offered speed ranges. Finally, there is an “economy tier” for customers with light usage patterns, limited app requirements or lower willingness to pay. 


That has implications for network planning, bandwidth upgrades and marketing. Internet service providers can advertise the headline speed knowing that only a small percentage of customers are going to buy it. 

source: Commscope


Networks obviously must be designed to deliver the headline rate. But total bandwidth consumption, which affects the capabilities of the rest of the network, does not assume that every customer buys the headline rate service. Instead, the variable portions of the network can be designed on the assumption that most customers will, in fact, not buy the headline service. 


Since speed and data consumption tend to be correlated, that affects capacity planning for backhaul, for example. Simply put, the Cloonan Curve informs thinking about how much capacity must grow to support the actual mix of demand from the full set of customers, based on their actual buying patterns. 
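
As an illustration, consider capacity planning weighted by the actual tier mix rather than by the headline rate. The tier shares and per-subscriber busy-hour demand figures below are made-up assumptions, not Commscope or operator data.

```python
tiers = {
    # name: (share_of_subscribers, busy_hour_Mbps_per_subscriber)
    "billboard (1 Gbps)": (0.08, 6.0),
    "popular (200 Mbps)": (0.62, 2.5),
    "economy (50 Mbps)":  (0.30, 1.0),
}

subscribers = 500  # subscribers on one service group / node

# Aggregate busy-hour demand driven by what customers actually buy ...
mix_demand = subscribers * sum(share * mbps for share, mbps in tiers.values())

# ... versus a naive plan that assumes every customer behaves like a
# billboard-tier subscriber.
naive_demand = subscribers * tiers["billboard (1 Gbps)"][1]

print(f"Mix-weighted demand:  {mix_demand:,.0f} Mbps")
print(f"Naive (all-headline): {naive_demand:,.0f} Mbps")
```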


That is important for matching capital investment as closely as possible to the variable demands placed on the network by various customer groups. 


For a cable ISP, there are other implications. At some point, it will make sense to migrate the highest-usage customers--often the same customers who buy the headline service--off the hybrid fiber coax network and onto a parallel access network using fiber to the home instead. 


It is common to find that the top one percent of customers generate as much as 15 percent of total network usage, for example. So moving those customers off the HFC network frees up considerable capacity for the rest of the customers, 90 percent of whom might be supported on the legacy access network. 
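
A back-of-the-envelope version of that argument, using only the one percent and 15 percent figures cited above:

```python
def hfc_headroom(migrated_share_of_usage: float) -> float:
    """Fraction of current HFC load remaining after heavy users move to FTTH."""
    return 1.0 - migrated_share_of_usage

# From the text: the top 1% of customers generate ~15% of usage. A broader
# migration leaving ~90% of accounts on HFC would move an even larger usage
# share; the exact value depends on the operator's own usage distribution.
print(f"Move top 1% of users: {hfc_headroom(0.15):.0%} of today's load stays on HFC")
```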


That allows a longer useful life for the HFC network, as most customers will continue to buy the popular and economy tiers of service that still can be supported using HFC. 


Nielsen’s Law does not account for upstream bandwidth, however. Upstream capacity tends to grow at about half the rate of downstream bandwidth, or about 25 percent per year. 
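
A minimal sketch of what those two growth rates imply over time; the year horizon is arbitrary and chosen only to show the widening gap.

```python
# If downstream demand grows ~50%/year and upstream ~25%/year (the figures in
# the text), the downstream-to-upstream ratio widens over time -- relevant for
# any platform, like HFC, whose up/down split is fixed by the plant design.
DOWN_GROWTH, UP_GROWTH = 1.50, 1.25

for year in (0, 5, 10):
    ratio = DOWN_GROWTH ** year / UP_GROWTH ** year
    print(f"Year {year:2d}: downstream demand has grown {ratio:.1f}x relative to upstream")
```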


Customer behavior also varies. On cable networks, the heaviest one percent of users generate as much as 47 percent of upstream bandwidth. And it often is the case that 80 percent of total upstream capacity demand is generated by just 10 percent of users. 


ISPs using telecom platforms will confront the same general issues of bandwidth growth and differential demand for tiers of service. Fiber-to-home platforms keep increasing performance as well, and some suggest future performance gains will be achieved economically by using, in the local loop, components originally commercialized to support data center optics. 


That is why 25-Gbps passive optical networks, initially deployed for business-to-business applications in the local loop, will be powered by commercially available data center optical components, Nokia argues. Commercialization for B2B use cases should then be leveraged for B2C applications as well. 


Nielsen’s Law and Cloonan’s Curve also suggest the potential limits of HFC as a platform. If consumer usage patterns do not change; if ISP usage policies do not change; if app usage patterns do not change; if pricing patterns do not change, then there is a point in time where HFC fails to support cable operator business models. 


The point of overlaying FTTH for the heaviest users is that, all other things being equal, the useful life of HFC is extended, with a more-gradual shift of cable platforms to FTTH over time. 


The issue is to avoid the stranded-capital problem and the immediate higher capital investment implied by a jump cut to FTTH. That would be as difficult for cable operators as it has proven to be for telcos.


Friday, February 12, 2021

"Doing Good" Versus "Feeling Good" in Government Telecom Policy

It bears repeating that the test of public policy to help people hinges on whether our policies actually succeed. It is not enough to "feel good." Our policies must also "do good."

Subsidies for use of telecom services, including broadband and voice, are common in part for reasons of social equity and also because communications networks have a network effect: the network is more valuable the more people who can use the network. So programs to fund rural broadband or provide assistance to low-income households are common. 

Even setting aside the risk of waste or abuse, studies suggest it is difficult to determine how effective the programs are. How well the programs work is a recurring issue. 


Advanced communications networks also are viewed by governments as an economic development tool. For that reason, service providers also often get subsidies. 


Whether the programs should exist is rarely, if ever, an issue. Whether the programs work is more often the problem. One recurring issue is that the programs might not achieve their goals.


Aside from mismanagement losses (waste, fraud or abuse), there is evidence that many targeted recipients would have purchased service anyway. Since the major internet service providers also offer their own programs for low-income households, it is by no means certain that the government efforts add as much as we might think. 


AT&T lifeline service, depending on a potential customer’s location, offers 10 megabits per second for $10 per month; 5 megabits per second for $10 per month; 3 megabits per second for $5 per month; 1.5 megabits per second for $5 per month; or 768 kilobits per second for $5 per month.


Lifeline internet access also is sold by Verizon, CenturyLink, Comcast, Cox Communications, Charter Communications, Suddenlink, Frontier Communications and others. Generally speaking, those services sell for about $10 a month.


ISPs have reasons for connecting low-income households, irrespective of the existence of government subsidies. Customer relationships are important as they create the opportunity for lifelong buying habits, for example. Existing relationships also make it easier to sell additional products.   


The issue then is whether funds supporting demand might be better spent increasing supply, especially of broadband capabilities rather than voice. 


Thursday, February 11, 2021

Lumen's FTTH Strategy is Based on its Customer Base

Lumen Technologies (formerly CenturyLink) has unusual constraints on its deployment of optical fiber access facilities. It has the most-rural serving areas of any of the major telco fixed network providers. In a sense, Lumen is similar to Charter Communications, the second-largest cable company. Charter also has a largely-rural customer base. 


Lumen also is an amalgam of a global capacity business and a consumer business with a largely-rural customer base, serving only a relative handful of tier-two U.S. metro areas such as Seattle, Portland, Denver, Salt Lake City, Omaha and Phoenix. But most of Lumen’s revenue, and arguably most of its profit, comes from the global enterprise business. 

source: Denver Business Journal 


Lumen makes about 72 percent of total revenue from enterprise customers, including its wholesale capacity customers, and about 28 percent of total revenue from mass markets, which include consumers and small businesses. Small business accounts for about three percent of revenue. 


Born of the 2011 merger of Qwest, the least dense of all the former Regional Bell Operating Companies, with CenturyLink, the combined firm has a significant rural and low-density service territory spread across 37 U.S. states, alongside a very different global capacity business. 


That profile explains the Lumen Technologies strategy and capital investment profile. Simply put, Lumen’s consumer business consists of rural and small-town markets that are high cost, but produce relatively little revenue. 


Nor is Lumen exempt from the challenges other fixed network operators face from cable competition, which limit its ability to deploy optical access facilities and still make an adequate profit. 


All that means Lumen is rather more challenged than firms such as Verizon, with a mostly-dense territory. AT&T has a mix of rural and lower-density serving areas, plus some tier-one metro areas. 


Those realities shape the expected cost and return profiles those firms can expect from FTTH investments, especially when AT&T and Verizon also might use wireless access as an alternative. 


Lumen’s growth strategy relies on edge cloud infrastructure, fiber services for enterprises and gigabit-enabled consumer customers, according to Jeff Storey, Lumen Technologies CEO. That means micro-targeting areas where demand for gigabit services is strongest and where an adequate financial return can be earned as well. 


The corollary is that Lumen does not believe broad deployment across its consumer footprint is the best use of its access network capital; it targets a relatively small subset of total consumer access locations instead. As Lumen reports, of $1.4 billion in revenue in the fourth quarter of 2020, about eight percent came from federal subsidies for high-cost rural areas. 


Lumen in fact made a decision not to accept most such funds in 2022. “Our subsidy revenue will step down from about $500 million a year to roughly $26 million in 2022,” said Neel Dev, Lumen CFO. 


If Lumen spends roughly $300 million of its own capital a year, in addition to about $500 million a year of subsidy funds, that gives some idea of the cost of building rural broadband facilities. Over a three-year period, Lumen passed 1.1 million additional housing units. 


That suggests $800 million of annual capital adds roughly 330,000 newly passed homes each year, a cost of roughly $2,424 per location. Other subsidy recipients might spend as much as $5,000 per location. 
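
The arithmetic, with the spending and homes-passed figures treated as rough annual approximations drawn from the text:

```python
own_capex = 300_000_000      # Lumen's own capital (approx., per year)
subsidy   = 500_000_000      # subsidy funds (approx., per year)
homes_passed = 330_000       # newly passed locations in roughly a year

cost_per_location = (own_capex + subsidy) / homes_passed
print(f"Cost per passed location: ${cost_per_location:,.0f}")  # ~ $2,424
```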


Those subsidies help defray the cost of rural service, and also suggest the relatively-small expected returns from expanding rural broadband, in Lumen’s view. Lumen’s average revenue for a fiber-to-home connection is $56 a month, while the average take rate in any area where FTTH is available is just 28 percent. In other words, about 72 percent of the FTTH investment is stranded, generating no incremental revenue. 
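
The stranded-asset math is worth making explicit: revenue accrues only from homes that subscribe, while the build cost covers every home passed. The per-home build cost below is a hypothetical placeholder, not a Lumen figure; the ARPU and take rate are from the text.

```python
arpu_per_month = 56.0          # Lumen FTTH average revenue per connection
take_rate = 0.28               # share of passed homes that subscribe
cost_per_home_passed = 1000.0  # hypothetical FTTH build cost per passed home

revenue_per_home_passed = arpu_per_month * take_rate   # ~$15.70 per month
stranded_share = 1 - take_rate                         # ~72% of plant idle
simple_payback_months = cost_per_home_passed / revenue_per_home_passed

print(f"Monthly revenue per home passed: ${revenue_per_home_passed:.2f}")
print(f"Stranded share of passed homes:  {stranded_share:.0%}")
print(f"Simple payback (ignoring opex):  {simple_payback_months:.0f} months")
```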


FTTH-served homes are about 15 percent of the consumer footprint. 

source: Lumen Technologies 


The point is that the clear strategy is to harvest revenues from the consumer access business, while targeting FTTH upgrades for gigabit service in some neighborhoods. 


Most of the enterprise or global capacity assets were contributed by Qwest, while most of the then CenturyTel customers were rural consumer or smaller business accounts. 


The combined firm had at that point about five million broadband (more likely 12 million), 17 million voice and 1.4 million video accounts, as well as 0.85 million wireless units, according to one news report that appears to have undercounted broadband accounts. 


At the time of the merger, CenturyLink had about 6.9 million voice lines and 2.3 million broadband accounts in service. Qwest reported about 9.7 million voice lines and 2.85 million broadband lines in service.  


Today, Lumen makes the bulk of its revenue--about 72 percent of total--from enterprise and wholesale customers. 


source: Lumen Technologies 


About 25 percent of revenue comes from consumer services, while about three percent of the total comes from small business; together, these are now reported--as at most large telcos--as the mass markets segment. 

source: Lumen Technologies

25 Gbps PON for Access Networks


Though business cases sometimes lag the commercial availability of optical fiber access platforms, bandwidth demand always increases over time, eventually making upgrades of optical access platforms affordable. Deployment typically is commercially viable first for business or backhaul use cases in the local loop, and that should be the case for 25G PON as well.

Watch for a Marked Acceleration of Gigabit Home Broadband Subscriptions in 2021

When will gigabit home broadband hit an inflection point? Probably in 2021. The inflection point is important, as it has in the past been the point at which slow or low adoption of a product accelerates, growing at a faster clip. 


Gigabit home broadband might be nearing an inflection point. Important consumer technologies tend to hit an inflection point at about 10 percent adoption, and Openvault data suggests gigabit home broadband reached 8.5 percent adoption at the end of 2020. 


That suggests gigabit accounts will hit 10 percent of the home broadband installed base in 2021. And if the pattern holds, the adoption rate will shift to a higher gear, growing much more rapidly than it has over the last five to 10 years.


The reason 10 percent seems to be the trigger, one might argue, is that it is the point where early adopters have become customers and users, setting the stage for the behavior to extend to the majority of consumers. 
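
A stylized logistic S-curve makes the point numerically: once adoption clears roughly 10 percent, the absolute gains per year accelerate sharply. The midpoint and growth-rate parameters below are arbitrary, chosen only for shape.

```python
import math

def adoption(year: float, midpoint: float = 8.0, rate: float = 0.6) -> float:
    """Share of the market adopting by a given year on a logistic curve."""
    return 1.0 / (1.0 + math.exp(-rate * (year - midpoint)))

prev = adoption(0)
for year in range(1, 13):
    cur = adoption(year)
    print(f"Year {year:2d}: {cur:5.1%} adopted (+{cur - prev:.1%} pts)")
    prev = cur
```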

source: Engineering.com 


When will U.S. 5G hit a subscriber inflection point? Not this year.


In January 2021, 5G coverage reached 75 percent of potential U.S. users, albeit mostly using low-band spectrum, so performance improvement is slight. Coverage refers to availability, not subscriptions or usage. By July 2021, 80 percent of the U.S. population is expected to have 5G coverage, says PwC. 


PwC forecasts that 12 percent of mobile devices in use by U.S. customers in July 2021 will be 5G enabled. That might or might not mean all those devices are used on the 5G network, however. 


Possibly for that reason, PwC suggests the 5G inflection point for adoption will happen later, in 2023.


One can see an example in cell phone adoption by U.S. households. Around 1994, household adoption reached 10 percent or so, after a longer period of slow adoption. An analogous pattern appeared with smartphone adoption as well. 

 

source: Our World in Data 


The adoption pattern perhaps is easier to visualize with a longer time frame. Here is a chart showing cell phone adoption in the United Kingdom.


source: Our World in Data


A wide range of physical products have shown the same pattern. Automobile adoption, for example, accelerated once the 10-percent threshold was hit. 


source: Our World in Data

Pareto Theorem, or 80/20 Rule, Applies to Telecom Attackers as Well

This is a good illustration of the Pareto theorem, which states that 80 percent of instances or outcomes in business or nature come from 20 percent of the cases or effort. The Pareto theorem is popularly known as the 80/20 rule.


Of the 83 challengers in 20 telecom markets analyzed by Bain & Company, only 22 percent grew both their revenue and free cash flow and increased their share of profit from 2010 to 2017.


source: Bain and Company 


That is a nearly-perfect example of the predicted Pareto pattern. 


source: IP Carrier 


Vilfredo Pareto, an Italian economist, was studying the distribution of wealth in 1906. What he found was a distribution most people would commonly understand as the "80/20 rule," where a disproportionate share of results comes from 20 percent of actions. The Pareto distribution has been found widely in the physical and human worlds. It applies, for example, to the sizes of human settlements (few cities, many hamlets and villages). It fits the file sizes of internet traffic (many smaller files, few larger ones).


It describes the distribution of oil reserves (a few large fields, many small fields) and of jobs assigned to supercomputers (a few large ones, many small ones). It describes the price returns on individual stocks. It likely holds for total returns from stock investments over a span of several years as well, since most observers point out that most of the gain, and most of the loss, in a typical portfolio comes from price changes on just a few days a year.


The Pareto distribution is what one finds when examining the sizes of sand particles or meteorites, the numbers of species per genus, the areas burnt in forest fires and casualty losses in general liability, commercial auto and workers compensation insurance.


The Pareto distribution also fits sales of music from online music stores and mass market retailer market share. The viewership of a single video over time fits the Pareto curve. Pareto describes the distribution of social networking sites. It describes the readership of books and the lifecycle value of telecom customers.
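
A quick numerical check of the 80/20 intuition: draw outcomes from a Pareto distribution and measure what share of the total comes from the top 20 percent of draws. The shape parameter alpha of roughly 1.16 is the classic value that yields an approximately 80/20 split; it is chosen for illustration, not fitted to any of the examples above.

```python
import random

random.seed(42)
alpha = 1.16
samples = sorted((random.paretovariate(alpha) for _ in range(100_000)),
                 reverse=True)

top_20pct = samples[: len(samples) // 5]
share = sum(top_20pct) / sum(samples)
print(f"Share of total held by the top 20% of draws: {share:.0%}")
```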


Connectivity Provider Growth Outside the Core Races Loss of Core Value

Few executives in the connectivity business would quarrel with the argument that--if possible--moves into new lines of business beyond core or traditional connectivity are desirable, if not mandatory. The same sort of logic appeals to executives in software, content, infrastructure, devices or commerce industries as well. 


Rakuten was an e-commerce company before it began offering mobile service. Softbank was a software conglomerate before it launched Softbank Mobile. 


Reliance Industries was a huge Indian conglomerate before it launched Reliance Jio and became the leading mobile operator in just a few years. 


Jio Fiber, the fixed network business, launched in 2019. Some might expect similar disruption there as well. 


Application provider competition in the core communications service space likewise has become significant since 2005, according to McKinsey estimates. By 2018, over-the-top alternatives had cannibalized 40 percent of mobile messaging revenue, 25 percent of fixed network voice revenue and seven percent of mobile voice revenue. 

source: GSMA Intelligence 


In the internet era, the walls between industries are more porous than they used to be. It is easier for firms to launch attacks in adjacent parts of the content, information, infrastructure or applications ecosystem. 


So far, the evidence seems to suggest that it is easier for well-capitalized firms outside the traditional telecom industry to make inroads into communications than for connectivity providers to grab significant positions in adjacent ecosystem roles. 


The argument often is made, with some justification, that the relative lack of telecom service provider success has to do with industry culture, innovation skill, bureaucratic decision-making processes or regulatory constraints. 


Others might say a key impediment is the shareholder base of public telecom companies, particularly those stakeholders who see telcos as dividend-producing value assets. Such expectations mean that significant cash flow must be diverted to dividend payments, and for that reason is not available to support growth initiatives.


All of the above likely operate to constrain the ability of retail connectivity providers selling to consumers and most businesses to deploy their capital for growth outside the core business. 


For such reasons it might always be easier for “outsiders” to take market share and create value in communications services, than for telcos to become value creators and market share leaders outside the core communications function. 


But some telecom industry executives would be well pleased if it were possible to grow non-core revenues to as much as half of total revenues.


Wednesday, February 10, 2021

Gigabit Take Rates Might be at the Inflection Point

In the fourth quarter of 2020, about 8.5 percent of provisioned fixed network internet access connections operated at speeds of 1 Gbps or so, according to Openvault data. That is about a tripling of gigabit customers in a year’s time, showing an acceleration of buying. 


In late 2018, fewer than two percent of buyers were purchasing gigabit speed services. So growth from 2018 to 2019 was about 55 percent. One might argue that take rates for gigabit internet access now have hit an inflection point, and will start growing much faster, possibly as much as doubling every year for a few years. 
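
The simple projection behind that claim can be sketched as follows. The 8.5 percent end-of-2020 share is from the Openvault data cited here; the forward growth rate (roughly doubling each year) is an assumption drawn from the "possibly as much as doubling" language above.

```python
share = 0.085            # gigabit share of connections, end of 2020 (Openvault)
annual_growth = 2.0      # assume gigabit accounts roughly double per year

quarters = 0
while share < 0.10:
    share *= annual_growth ** 0.25   # apply the growth quarter by quarter
    quarters += 1

print(f"10% threshold crossed after about {quarters} quarter(s) of 2021")
```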


source: Openvault 


The reason for such a change in growth is that successful mass market consumer products tend to hit an inflection point at about 10 percent take rates. And gigabit per second internet access seems right at that point. 


source: Openvault 


Sometimes "Following the Science" is Not Yet Possible, for Covid or Broadband

Most people would agree that understanding means and ends--the relationship between outcomes and the actions intended to achieve those outcomes--makes good sense. It does not make good sense to desire an outcome and then take actions that do not achieve the desired outcome. 


That is true everywhere in the connectivity business, but also true in the setting of public policy. Still, we often do not have clarity on the relationship between means and ends. We all believe that quality broadband is important for economic development, job growth, educational outcomes and social equity or inclusion. 


But our public policies to support those outcomes might not have a clear means-ends causation link. We can point to correlation between high use of quality broadband and other outcomes (jobs, economic growth, household income, household wealth, health, safety, educational outcomes, inclusion, educational attainment). 


But we cannot prove “causation” of those outcomes from the supply and uptake of quality broadband, and likely never will be able to do so, as those outcomes are the result of too many independent variables. 


So far, it also appears that our understanding of Covid science, and of the public policies we see as means to solving the problem of pandemic illness, is insufficient to support much confidence in the effectiveness of lockdowns, for example, as a way of slowing disease spread, as logical as that policy seems. 


Here is a chart showing the relationship between Covid-19 death rates and the severity of restrictions on business operations, based on data gathered from the U.S. Census Bureau, the U.S. Bureau of Labor Statistics, the Kaiser Family Foundation, Ballotpedia, McGuireWoods, Editorial Projects in Education, The COVID Tracking Project, National Restaurant Association, Littler Mendelson, JDSupra and Ogletree Deakins by Wallethub. 

source: WalletHub


The issue here is that there is no clear pattern. There are states with high death rates among both those with few restrictions and those with many. There are states with low death rates among both the least-restrictive and the most-restrictive. It is plausible, perhaps even likely, that conditions other than business closures are at work. 


This lack of pattern also means we are setting public policy without clear scientific consensus on means and ends, practices and derivative outcomes. 


There is a bit more possible clarity when looking at the relationship between unemployment rates and business closure policies. 


source: WalletHub 


One would expect higher unemployment in areas with more restrictions on business operations. This chart suggests a clearer relationship: “many restriction” states tend to have higher unemployment, while “few restrictions” states tend to have lower unemployment, as one might expect. 


The problem, so far, is that our expectations about death rates and business closures--that death rates “should” be lower where business closures are extensive--do not seem to be confirmed. That means we cannot be sure business closures actually affect death rates in a direct way. 


It is possible that infection rates might have a better means-ends relationship, though. Logically, greater exposure to people should result in higher rates of disease transmission. To the extent that business closures limit exposure, infection rates should therefore be lower. 


There is evidence that restaurants and gyms were “superspreader” venues, for example. There also is evidence that restaurant Covid spread rates were extremely low. The point is that the “illness transmission science” is far from settled. 


Other studies note that population density also seems to matter. 


That noted, some suggest that limiting total restaurant seating capacity might be more effective than total bans on indoor restaurant operations. 


But infection rates along other paths, such as transmission between household members, might well have increased because of other policies, such as stay-at-home orders. 


And studies relying on cell phone and mobility data also have methodological issues. People who traveled more had, by definition, more exposure risk. We might not be able to accurately track transmission venues for that reason.


The point is that the “science” of Covid illness transmission--and its implications for public policy--is not yet clear, beyond the general observation that transmission is contingent on the number of people one comes into contact with. Population density and duration of contact seem to matter, of course. 


But personal behavior also will matter. 


Humility about the correctness of our public policy and public health recommendations is called for. To a greater extent than some might be willing to admit, we are guessing about what might work, why and how well. 


Other direct consequences, such as job loss, firm bankruptcies, lower economic growth, lower tax revenues, suicides, mental illness and crime, also occur because of shutdown policies. 


Choices are required, and the science does not seem sufficiently settled to adequately inform those choices, well-intentioned though they may be. 


That is not a completely unusual context for any public policy, though. We often do not have full knowledge of causation mechanisms, so our policies are, to some extent, guesses.


Customers Often Do Not Like Dynamic Pricing

Many observers would argue that dynamic pricing--differential pricing of products based on criteria such as time of day, volume or some other variable--benefits both suppliers and customers. If peak loads, for example, can be shifted to off-peak periods, there are capital investment advantages for suppliers.


That is why long distance calls once were priced dynamically: highest prices during weekday working hours; lower prices on weekday evenings, at night and on weekends. 


On the other hand, consumer behavior suggests buyers often do not choose to buy dynamically-priced products, but prefer fixed-price, flat-rate subscriptions. We are left to try and explain why that behavior persists. 


Disney’s latest quarterly report might suggest that customers prefer subscriptions to buying on a dynamic basis. Indeed, we might note the same behavior across a wide range of digital content and communications products.


Dynamic pricing, where items are purchased a la carte, by the piece and on demand, creates spending uncertainty. Prices vary by criteria such as volume purchased, time of day purchased, possibly how an item is purchased or any discounting methods a seller wishes to use. 


Customers also have notions about the value of specific content items, whole channels or networks. Any customer who has compared the value and price of a Netflix subscription to the cost of buying content dynamically will conclude that the subscription seems to cost less, under any scenario of moderate use. 


Where a linear video subscription--depending on the number of channels purchased--might cost $50 to $80 a month, a streaming service--depending on which services the customer wants to buy--can start at about $7 a month each but might cost $15 a month or so.


Buying only a few pay-per-view or on-demand items a month can exceed the cost of a Disney Plus, Netflix, Prime, Hulu, HBO Max or Peacock subscription, for example. Granted, no subscription offer from any single provider ever offers “most” content, so most customers likely buy more than one streaming service. 


The point is that dynamic pricing offers the most value when a customer’s consumption is low, but offers less perceived value when consumption is moderate (possibly watching more than three movie or TV episodes a month that are purchased dynamically). 
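
The breakeven intuition can be sketched numerically. The prices below are round illustrative figures, not any provider's actual rates; the roughly three-title crossover mirrors the "more than three movie or TV episodes a month" figure above.

```python
price_per_title = 4.99   # hypothetical on-demand rental price
subscription = 14.99     # hypothetical monthly streaming subscription

for titles_per_month in range(0, 7):
    a_la_carte = titles_per_month * price_per_title
    cheaper = "a la carte" if a_la_carte < subscription else "subscription"
    print(f"{titles_per_month} titles: a la carte ${a_la_carte:5.2f} -> {cheaper} wins")
```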


That same trade off exists with other products as well. 


For buyers of cloud computing services, dynamic pricing is a better buy for customers with lower demand, variable workloads, small information technology support staffs or infrastructure and relatively lower adoption of new computing use cases. 


Cloud computing is often not a better buy for customers with high computing workcycle demand, fixed or predictable workloads, large IT infrastructure, staffs and skills or high rates of adopting new computing use cases. 


So there are good financial reasons for customers to see value in subscriptions that offer more content, at lower prices, compared to dynamic buys. 


Many also would argue that the reason customers prefer flat-rate pricing, even when they might pay less buying on a dynamic basis, is cost certainty. Any dynamic pricing mechanism--buying by the instance or item--introduces uncertainty about cost. 


Subscriptions offer a static price: the customer knows what the recurring cost is going to be. 


Dynamic electricity pricing is not as popular as one might think, for example, even if consumers could save money consuming more energy off peak. 


Virtually unlimited usage of domestic voice, text messaging or fixed or mobile internet access provide examples. Linear and over-the-top streaming video subscriptions provide other examples. 


Likewise, music streaming has replaced music purchases. Most content subscriptions--physical or digital--also are offered on a flat-rate basis. Changes in buyer demand are at work, as are supplier preferences, but perceptions of value are almost certainly at work as well. 


Customers seem to prefer broad catalog access without ownership over owning a fraction of the full catalog, which is what the shift to music streaming services, as opposed to content buying (song downloads), represents. 


In the connectivity business there are analogies. Though usage-based (dynamic) pricing is common for some products (international calls, in some cases), many other products are sold on a flat-rate basis (unlimited usage of internet access, domestic voice and texting). 


Other products are sold as buckets of usage (subscriptions with different usage allowances). Customers buy subscriptions, but with different usage volumes included, typically with additional charges for usage above the allowance threshold. 


That created the profit driver of “overage charges.” Historically, overage charges were a significant contributor to supplier profits, and avoidance of overage charges became a driver of consumer behavior. 


Customers preferred to buy data plans with more usage than they ever expected, simply to avoid overage charges that introduced uncertainty of cost. 


During the period of video rentals, such overage charges were a major driver of profit for video rental outlets, and a source of customer unhappiness as well. 


For all those reasons, customers often prefer flat-rate pricing to dynamic pricing. They often prefer subscriptions to dynamic purchasing (a la carte, by the piece or instance) as well.


"Lean Back" and "Lean Forward" Differences Might Always Condition VR or Metaverse Adoption

By now, it is hard to argue against the idea that the commercial adoption of “metaverse” and “virtual reality” for consumer media was in...