Saturday, October 1, 2022

Genie's Coming Out of the Bottle: Net Neutrality is Dead

Communications policymakers are about to let the genie out of the bottle. Once they feared the possible impact of internet service providers charging some app providers for using their networks. That gave us network neutrality. 


Now, some appear more worried about a few hyperscale app providers, to the extent that they are willing to gut network neutrality entirely. 


It is hard to know just what else might also change. As with any tax, the firms affected by the tax will not “pay” it. Customers, shareholders or business partners will do so. The only question is which sets of stakeholders see higher costs. 


And, as with any tax, once the principle is established, other sets of payers might emerge later. And tax rates will tend to grow. 


But network neutrality will be an early casualty. 


One might well argue the whole net neutrality policy was wrong from the beginning. We may someday look back and argue the “sending party pays” policies for app providers were equally wrong. 


Some might argue that new taxes on a few hyperscalers shed more light on broken business models of some ISPs, in some regions. Almost always, such taxes bear the mark of attempted industrial policy as well: efforts to protect domestic suppliers from foreign suppliers. 


Ironically, many of the same people and policymakers who touted the value of network neutrality now appear poised to kill it. Net neutrality was supposed to protect app providers from internet service providers. 


The overturning of net neutrality will happen to protect ISPs from hyperscalers. Never was it more clear that every act of public policy has private consequences; winners and losers. 


The new policies--similar to taxes levied in South Korea--would charge fees to a few hyperscale app and content providers, and represent a shift in the funding of universal service and access networks in general.


Basically, customers and taxpayers have funded universal service and, by extension, the access networks. For the most part, customers have paid the costs of the access networks. The new policies proposed for EU countries would add taxation of a few large app providers to that mix. 


This will kill the logic behind network neutrality. Once governments accept the principle that the biggest providers of apps that ISP customers use must pay for use of the access networks, the door is open to charging other app providers. 


The door is open to effectively subsidize some apps and not others; to allow some apps expedited access or higher quality of service, as has been true for business grade services, even where network neutrality rules are applied. 


It is difficult to see all the potential ramifications. Once the notion of taxing traffic sources gains traction, how might other business practices change? And what traffic sources might be taxed next? Large data centers? Content delivery and edge network providers? 


Will peering relationships return to the older transit model, at least where traffic imbalances exist? 


The debate over how to fund access networks, as framed by some policymakers and connectivity providers, relies on how access customers use those networks. The argument is that a disproportionate share of traffic, and therefore demand for capacity investments, is driven by a handful of big content and app providers. 


But the list of “traffic sources” is larger than that. The South Korean and proposed EU regulations distinguish between traffic sources and traffic sinks (senders and receivers). In pre-internet days, that traffic was considered to be voice traffic between telcos. At the end of the year, firms would “true up” payments to cover any unequal traffic flows. 


The new principles essentially apply that same sort of logic to some traffic sources. But if “sources” are broadened, why would the category of sources not be later broadened further?


It is a novel argument, in the area of communications regulation. Business partners (other networks) have been revenue contributors when other networks terminate their voice traffic, for example. 


But some point to South Korea as an example of cost-sharing mechanisms applied to hyperscale app providers.


South Korean internet service providers levy fees on content providers that represent more than one percent of access network traffic or have one million or more users. Fees amount to roughly $20 per terabyte ($0.02 per GB).
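Under those stated terms, the fee structure can be sketched as a small calculation. The thresholds and the roughly $0.02-per-GB rate are the figures cited above; the sample provider's traffic share and user counts are hypothetical:

```python
# Sketch of a South Korean-style content provider fee: providers crossing
# either threshold pay roughly $0.02 per GB delivered. The thresholds and
# rate are from the text; the example traffic figures are made up.

RATE_PER_GB = 0.02               # roughly $20 per terabyte
TRAFFIC_SHARE_THRESHOLD = 0.01   # more than 1% of access network traffic
USER_THRESHOLD = 1_000_000       # one million or more users

def monthly_fee(traffic_gb: float, traffic_share: float, users: int) -> float:
    """Return the monthly fee owed, or 0.0 if neither threshold is met."""
    if traffic_share > TRAFFIC_SHARE_THRESHOLD or users >= USER_THRESHOLD:
        return traffic_gb * RATE_PER_GB
    return 0.0

# A hypothetical provider pushing 50 petabytes (50,000,000 GB) a month:
fee = monthly_fee(50_000_000, traffic_share=0.05, users=20_000_000)
print(f"${fee:,.0f}")  # → $1,000,000
```

At that rate, every additional petabyte of monthly traffic adds roughly $20,000 to the bill, which is why the hyperscalers, as the largest traffic sources, object so strongly.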


Some might argue it is inevitable that European connectivity providers will get government sanction to levy fees on a few hyperscale app and content providers as a matter of industrial policy and faltering economics. The measures are protectionist in that all the proposed app payers are based in the United States. 


Opponents--especially the hyperscalers--view it as an internet app tax. It arguably is all of the above. 


Yet others might note that such a policy undermines the core argument for network neutrality regulations. The fundamental argument for net neutrality was to prevent unequal treatment of bits, no matter who owns them. 


Thursday, September 29, 2022

Can Lumen Increase Consumer FTTH by 10X?

Lumen Technologies is accelerating its fiber-to-home expansion, expecting to boost the availability of fiber-to-premises beyond the 27 percent of homes it already supports. With the caveat that Lumen’s future success rests more with its enterprise portfolio than its consumer broadband business (all mass markets revenue is about 25 percent of total), the fiber upgrades should boost subscription rates. 


Where the fiber access network gets about 27 percent adoption, the copper access network gets about 14 percent take rates. In other words, FTTH gets almost double the adoption of the copper access product, with FTTH average revenue per account of about $59 a month. 


source: Lumen 


In principle, Lumen should be able to gain a price advantage over its key cable TV competitors, at least if most customers on all the networks buy the advertised products at the advertised prices (ignoring promotional or bundle pricing). Whether that is the case in practice is far from clear. 


source: Lumen 


Nor is it always completely clear that 5G and 4G mobile networks are outclassed to the point that they cannot gain significant market share. The early evidence suggests that a significant portion of the consumer market is content with lower-speed service (up to 200 Mbps), and will buy a fixed wireless service. That value segment could represent about 33 percent of the present market. 


As always, that value segment will be offered higher speeds over time, for about the same price (less than $50 a month). 


Some of us would argue that the real advantage over cable will lie in symmetrical broadband features, not price per bit or downstream speed. 


The issue is how fast cable companies will move to boost upstream speeds; how fast Lumen can upgrade its home broadband access facilities and how fast both Lumen and the cable firms can boost downstream speeds. 


Of these three, the first two are likely going to be crucial, as the salient performance advantage Lumen will be able to claim, once facilities are upgraded to fiber, is upstream speed. Lumen believes it will be able to build the network for $1,000 per passing or less, with incremental capital required to activate each customer location. 


The issue then will become the penetration rate (customer adoption rate): can Lumen relatively quickly boost its customer share from 27 percent up closer to 40 percent? Possibly equally important, can Lumen get a higher share of the performance-oriented segment of the market, willing to pay more? 


Much could hinge on whether Lumen can hit its own goal of upgrading to a total of about 12 million FTTH locations over the next six years, at the expected capital investment cost and the expected take rate.
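Those figures allow a back-of-the-envelope payback sketch. The $1,000 per passing and roughly $59 monthly ARPU are from the text; the simplifying assumption that every revenue dollar goes to recovering the build cost (ignoring operating costs and the incremental capex to activate each customer) is mine:

```python
# Rough per-passing payback: how many months of revenue per passing are
# needed to recover the build cost, at a given take rate. Deliberately
# simplified: no opex, no drop costs, no discounting.

def months_to_recover(capex_per_passing: float, take_rate: float, arpu: float) -> float:
    """Months of gross revenue per passing needed to recover the build cost."""
    revenue_per_passing_per_month = take_rate * arpu
    return capex_per_passing / revenue_per_passing_per_month

# Today's ~27% take rate versus a hypothetical ~40% target:
for take_rate in (0.27, 0.40):
    months = months_to_recover(1_000, take_rate, 59)
    print(f"take rate {take_rate:.0%}: ~{months:.0f} months")
```

The sketch shows why the penetration rate matters so much: moving from 27 percent to 40 percent adoption shortens the notional payback from roughly five years to about three and a half, before any operating costs are considered.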


Wednesday, September 28, 2022

Shift from Multicast to Unicast Underlies Access Network Economics

The debate over how to fund access networks, as framed by some policymakers and connectivity providers, relies on how access customers use those networks. The argument is that a disproportionate share of traffic, and therefore demand for capacity investments, is driven by a handful of big content and app providers. 


It is a novel argument, in the area of communications regulation. Business partners (other networks) have been revenue contributors when other networks terminate their voice traffic, for example. 


But some point to South Korea as an example of cost-sharing mechanisms applied to hyperscale app providers.


South Korean internet service providers levy fees on content providers that represent more than one percent of access network traffic or have one million or more users. Fees amount to roughly $20 per terabyte ($0.02 per GB).


The principle is analogous to the bilateral agreements access providers have with all others: when a traffic source sends to a traffic sink (sender and receiver), network resources are used, so compensation is due. 


Such agreements, in the past, have been limited to access provider networks. What is novel in South Korea is the notion that some application sources are equivalent to other historic traffic sources: they generate remote traffic terminated on a local network. 


So far, such claims are not officially bilateral, which is how prior arrangements have worked. The South Korean model is sender pays, similar to a “calling party pays” model. 


Those of you with long memories will recall how the vested interests play out in any such bilateral agreements when there is an imbalance of traffic. Any payment mechanisms based on sender pays (calling party pays) benefit small net sinks and penalize large net sources. 


In other words, if a network terminates lots of traffic, it gains revenue. Large traffic generators (sources) incur substantial operating costs. 


Of course, as with all such matters, it is complicated. There are domestic content implications and industrial policy interests. In some quarters, such rules might be part of other strategies to protect and promote domestic suppliers against foreign suppliers. 


At the level of network engineering, the imbalances and costs are a direct result of choices about network architectures, namely the shift of content delivery from broadcast or multicast to unicast or “on demand” delivery. 


This is a matter of physics. Some networks are optimized for multicast (broadcast). Others are optimized for on-demand and unicast. Satellite networks, TV and radio broadcast networks are optimized for multicast: one copy to millions of recipients. 


Unicast networks (the internet, voice networks) are optimized to support one-to-one sessions. 


So what happens when we shift broadcast traffic (multicast) to unicast and on-demand delivery is that we change the economics. In place of bandwidth-efficient delivery (multicast or broadcast), we substitute bandwidth “inefficient” delivery.


In place of “one message, millions of receivers” we shift to “millions of messages, millions of recipients.” Instead of launching one copy of a TV show--sent to millions of recipients--we launch millions of copies to individual recipients. 


Bandwidth demand grows to match. If a multicast event requires X bandwidth, then one million copies of that same event requires 1,000,000X. Yes, six orders of magnitude more bandwidth is needed. 
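The arithmetic is worth making explicit. A minimal sketch, assuming a hypothetical 5 Mbps per-viewer stream (the bitrate is illustrative, not from the source):

```python
# Multicast sends one copy of a stream regardless of audience size;
# unicast sends one copy per viewer, so demand scales linearly with viewers.

STREAM_MBPS = 5  # assumed per-viewer bitrate (illustrative)

def unicast_demand_mbps(viewers: int) -> int:
    """Aggregate bandwidth when each viewer receives an individual copy."""
    return viewers * STREAM_MBPS

multicast_demand = STREAM_MBPS                    # one copy serves everyone
unicast_demand = unicast_demand_mbps(1_000_000)   # one copy per viewer

print(unicast_demand // multicast_demand)  # → 1000000
```

Whatever the assumed bitrate, the ratio is simply the audience size: a million on-demand viewers require a million times the capacity of a single multicast feed.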


There are lots of other implications. 


Universal service funding in the United States is based on a tax on voice usage and voice lines. You might argue that made lots of sense in prior eras where voice was the service to be subsidized. 


It makes less sense in the internet era, when broadband internet access is the service governments wish to subsidize. Also, it seems illogical to tax a declining service (voice) to support the “now-essential” service (internet access). 


The point is that what some call “cost recovery” and others might call a “tax” is part of a horribly complicated shift in how networks are designed and used.


Monday, September 26, 2022

Home Broadband Costs Keep Falling

In the twelve months to the close of the second quarter of  2022, global fixed-line home broadband subscribers saw their average monthly charges decrease by four percent on copper, cable and fiber-to-home based tariffs, says Point Topic.

source: Point Topic 


Across the three technologies the average bandwidth increased by 22 percent year-on-year. 


source: Point Topic 


Still, the typical cost of each megabit-per-second unit of capacity was markedly lower on hybrid fiber coax and fiber to home networks, compared to slower copper-access networks. In substantial part, that is because of the vast difference in capacity between copper and other networks. 


As speed climbs, cost-per-bit falls. 


Many Power Users Among Low-Income "Home Broadband" Households

Life often is more complicated and surprising than any theory can predict. Consider home broadband consumer behavior, and in particular a recent Openvault analysis of “low-income” home broadband households using subsidy programs.


“Broadband usage patterns of participants in the FCC’s Affordable Connectivity Program are significantly exceeding those of the broader connected population,” according to Openvault. 


Launched in January 2022, the ACP provides low-income households with a $30 per month ($75 for tribal households) subsidy that can be applied towards a monthly internet subscription. 


One might guess that low-income households would sign up for a value package, perhaps often a 100-Mbps connection that might, with the $30 subsidy, be almost--or actually--free. That might also suggest that total household consumption would be lower than usual, simply because speed tends to correlate with total data consumption. 

source: Openvault


But Openvault suggests that often is not the case. ACP participants’ median usage of 499.3 gigabytes per month is almost 60 percent higher than the median of 313.9 GB per month for all subscribers, Openvault says.


ACP participants are 36 percent more likely to be power users of 1 TB or more, and 52 percent more likely to be super power users of 2 TB or more, Openvault notes. Also, ACP participants’ average usage of 654 GB per month is 33.3 percent higher than the average of 490.7 GB for all subscribers.
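The quoted comparisons are easy to verify from the figures themselves; a quick check, using only the medians and averages cited above:

```python
# Verify the Openvault comparisons: how much higher is ACP participant
# usage than usage across all subscribers? Figures in GB per month.

acp_median, all_median = 499.3, 313.9
acp_mean, all_mean = 654.0, 490.7

def pct_higher(a: float, b: float) -> float:
    """Percentage by which a exceeds b."""
    return (a / b - 1) * 100

print(f"median: {pct_higher(acp_median, all_median):.1f}% higher")  # ~59.1%
print(f"mean:   {pct_higher(acp_mean, all_mean):.1f}% higher")      # ~33.3%
```

The medians differ by roughly 59 percent and the averages by about a third, matching the figures Openvault reports.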


There is a fairly simple reason for such data. The ACP can be used by college students. Younger users and college students tend to be heavier consumers of internet data. Eligible participants include people who:


  • Received a Federal Pell Grant during the current award year;

  • Meet the eligibility criteria for a participating provider's existing low-income internet program;

  • Participate in one of these assistance programs:

    • Free and Reduced-Price School Lunch Program or School Breakfast Program, including at U.S. Department of Agriculture (USDA) Community Eligibility Provision schools.

    • SNAP

    • Medicaid

    • Housing Choice Voucher (HCV) Program (Section 8 Vouchers)

    • Project-Based Rental Assistance (PBRA)/202/811

    • Public Housing 

    • Supplemental Security Income (SSI)

    • WIC

    • Veterans Pension or Survivor Benefits

    • or Lifeline;

  • Participate in one of these assistance programs and live on Qualifying Tribal lands:

    • Bureau of Indian Affairs General Assistance 

    • Tribal TANF

    • Food Distribution Program on Indian Reservations

    • Tribal Head Start (income based)

    • Affordable Housing Programs for American Indians, Alaska Natives or Native Hawaiians


That list includes some people and households most people would consider “low income.” But it also includes college students who are only temporarily “low income.” They also are more likely to be “power users.” 


The data do not preclude, however, the notion that many low-income households might be using the subsidies to buy higher-speed service, whether those customers are students, the elderly or others with low income.


Is Web3 the Second Coming of Peer-to-Peer?

Decentralization is the main organizing principle of Web3, often said to support user ownership of content as well as disintermediation of many other operations: banking and finance, for example. 


Those of you who can remember the early days of the internet--before the emergence of the World Wide Web or visual browsers and multimedia, plus the shift to read-write from read-only--might remember the early ethos of the pre-Web internet, which was collaborative. 


The internet was about sharing, not commerce. The push for Web3 has a similar ethos: put control and ownership of assets back in the hands of “users.” Think of the role of peer-to-peer for a Web2 analogy of how things could work. 


source: McKinsey 


Has P2P revolutionized media and content; payments; finance or other functions conducted on the web? Perhaps, to a limited degree, in some areas. But P2P has not revolutionized much of anything related to the web. 


We also may decry the rise of “platforms” owned by corporations or bigger firms. But in what part of the real economy does this not happen? We often applaud and support efforts to reduce inequality in any society; even to the extent of opposing rigid class structures. But where has that ever worked, one might ask. 


That is not to say nothing useful will come of Web3. Blockchain seems destined for wider use, for example. Cryptocurrencies likewise should gain traction, eventually. 


But we might be skeptical of the broader claims about Web3 democratizing all--or most--of the internet, preventing the rise of large new platforms. That is not to say a further balkanization of the internet is unlikely. Indeed, that balkanization has been going on for a couple to a few decades. 


In a sense, Web3 appears to be the second coming of P2P. P2P did not revolutionize the web. Web3 might not do so, either.


Wednesday, September 21, 2022

Vodafone Gets New "Activist" Investor Atlas Investissement and Might See Push for Divesting or Monetizing Some Digital Infrastructure

Atlas Investissement, a private equity firm, has taken a 2.5-percent stake in Vodafone, presumably to push Vodafone into further actions to streamline and consolidate its businesses. 


Among the possible moves is pressure to encourage Vodafone to structurally separate parts of its infrastructure beyond cell towers, which the company already has said it is contemplating. 


“Atlas Investissement is supportive of Vodafone’s publicly-stated intention to pursue consolidation opportunities in selected geographies, as well as its efforts in infrastructure separation,” the firm said in announcing the investment. 


In recent days we have seen former Brazilian incumbent telco Oi essentially adopt a mobile virtual network operator model where it runs on leased facilities owned by a separate entity. 


Oi, which had entered bankruptcy in 2016, is moving ahead with a slimmed-down and “asset light” operating model where it leases wholesale capacity and services from a facilities-based entity rather than owning the assets outright. 


To make that shift, and shed debt, Oi has sold its mobile assets, cell towers, data centers and video entertainment operations. It also is structurally separating its fixed network infrastructure operations from its retail fixed network operations, but will retain a minority stake in the infrastructure assets supplier. 


Structural separation of Telecom Italia’s fixed network also has been the subject of extensive consideration. Mergers also have been discussed for parts of the existing business, including fiber access infrastructure. 


In other cases, joint ventures or co-investment have been the path chosen to reduce capital investment in digital infrastructure, especially access networks. 


The future question is which digital infrastructure assets might eventually be monetized. Data centers, of course, already have been purchased by private equity and other institutional investors. But perhaps there could be some further interest in related assets such as:


  • Structured cabling 

  • Distributed antenna systems (DAS)

  • Electrical, aerial and underground fiber deployment 

  • Civil construction 

  • Small cell or micro cell installations 

  • Indoor DAS and outdoor DAS integration


None of those assets historically have been of interest to buyers who have purchased real estate assets. Lack of scale is an issue for most of these sorts of assets, for example.


Tuesday, September 20, 2022

Customer Satisfaction Falls Off a Cliff

Whatever else one wishes to glean from this data, something has changed, and in a big way. The American Customer Satisfaction Index is a long-term research project measuring customer satisfaction with a wide variety of goods and services. Some say the Covid pandemic plays a role, as any service industry would have been negatively affected. 

 

source: ACSI 


Still, the decline began before the pandemic, with drops in 2018 and 2019. 


The point is that after nearly two decades of climbing, customer satisfaction has plummeted. That will provide little to no comfort for marketers required to assess their own firm’s customer satisfaction scores. Most firms, in most industries, are likely to track the precipitous fall in customer satisfaction scores.


Monday, September 19, 2022

Big Change in Universal Service Mechanisms Coming?

For the first time, both European Union and U.S. regulatory officials are considering whether universal service should be supported by a combination of user and customer fees. The charges would be indirect, rather than direct, in several ways. 


In the past, fees to support access networks in high-cost areas were always based on profits from customers. To be sure, high profits from business services and international long distance voice calls have been the support mechanism. In more recent days, as revenue from that source has kept dropping, support mechanisms have shifted in some markets to flat-fee “per connection” fees. 


But that already seems not to be generating sufficient funds, either, at least in the U.S. market. So in what can only be called a major shift, some regulators are looking at levying fees on some users, who are not actually “customers.” 


Specifically, regulators are looking at fees imposed on a few hyperscale app providers, using the logic that they represent a majority of internet traffic demands on access network providers. Nobody has done so, yet, but the same logic might also be applied to wide area network transport.


Ignore for the moment the obvious violation of network neutrality principles. To be sure, one might argue that net neutrality only applies to consumer services, and hyperscaler access might be viewed as a business service, to the extent data centers or content enterprises connect to any public networks.  


Hyperscale and other data centers now drive as much as half of all data traffic across wide area networks. As of 2021, traffic moving between data centers represents as much volume as all the traffic generated by internet users. 


So half of total demand for WAN capacity now is driven directly by data centers that need to connect to other data centers. To be sure, local access facilities are required, whether traffic is bound for an actual end user location or moving between data centers. 

source: Cisco 


Whether that is logical or “good public policy” can be debated. What cannot be debated is that the internet has essentially destroyed the traditional logic about how to fund universal service access networks. 


European internet service providers, who over the past couple of decades have been severely challenged in terms of their business models, now essentially argue that those broken business models can only be fixed by new taxes on a handful of users of their networks, and not customers. 


And those firms are “users” only because ISP customers make high use of some apps and content providers. That is not to say some big users might, here and there, be customers of access services as well. 


But that is not the argument advanced by proponents of the hyperscaler fees. The argument is a new one: “a few big content and app sources supply value that ‘my’ ISP customers want to use.” 


And because customers keep using more data, ISPs have to keep investing in capacity, but without a direct revenue match correlated to usage. 


Ignore for the moment the way ISPs rate internet access usage (generally flat rate for some bucket of usage), and the ability to change policies to better match usage and cost, as suppliers of virtually all other products tend to do. 


To be sure, streaming video services tend to price based on flat fees as well, with no relationship to consumption in any billing period. In many markets, local or domestic phone calls and text messages also are essentially flat rated. 


But other “public utility” products tend to have a usage-based pricing policy. Use more water, electricity or natural gas and you will pay more, even when there are flat-rate components of overall bills. 


The point is that such “charge users” proposals deviate from past “charge customer” support mechanisms. 


It is not stretching the analogy to note that existing support mechanisms shift payments from some customers to others (heavy usage customers are subsidized by low-usage customers). Some network apps get taxed while others do not. 


A fee to connect to the local network is charged, but not fees based on the number of text messages sent, phone calls made, ISP connection sessions, shows watched, songs listened to or web pages viewed, nor on total connection time or data volume (beyond some reasonable usage limits). 


We can argue about the merits of creating new universal service support mechanisms. But fairness and logic should be part of the discussion. “Because we can” should not be a reason for doing so.


Sunday, September 18, 2022

Why Metaverse Seems Likely to Emerge

Many are skeptical of the idea that "metaverse" will really develop as a three-dimensional, persistent and immersive experience widely used by people, businesses and organizations. It might not be inevitable, but it is probable.


Perhaps it is a form of technological determinism to assume that because a technology exists, it must be inevitable; must succeed in shaping economies, culture and social relationships. It is not a new idea, having gained notoriety in the late 1960s in Marshall McLuhan’s book Understanding Media. 


In the book, a seminal chapter, called The Medium is the Message, makes the argument that new technology reflects human culture and also shapes it. That might seem an unremarkable assertion. 


But, like all assertions that there is one root cause of human social relations, institutions and culture, it can be challenged as being reductionist: explaining wildly-complex outcomes as a result of just one driver. 


McLuhan argued that technology, not content--how we communicate, not what we say--is determinant of impact. In other words, the actual content of media is irrelevant. Only the media form matters. 


source: Alan Hook, Slideshare 


We are never very far from accepting this sort of thinking. 


Consider the way policymakers, regulators, analysts and much of the general public likely agree that “broadband is a necessity” because it causes economic development, education and social inclusion. Policymakers and advocates often argue that faster broadband likewise drives higher economic growth. 


Correlation, though, is not causation. Virtually all government programs to close the digital divide are touted as important because--it is argued--broadband leads to economic growth. In fact, careful reports only use the word correlation, not “causation” when discussing broadband and economic growth. 


Of course, lots of things also correlate with economic growth. The rule of law, population density, educational attainment, household income, household wealth, transportation networks, proximity to the oceans, or other sources of comparative advantage are known to correlate with economic growth. 


The same sort of thinking might arguably be advanced for 5G, personal computing devices, some applications, blockchain, web3 or the metaverse. 


The phrase “X changes everything” is an example of such thinking. In place of “humans use tools” we get “Tools shape humans.” Again, most people would perceive a grain of truth; perhaps many grains. 


One might argue that air conditioning was responsible for the current economic resilience and growth of the U.S. South, for example. 


The point is that it is never inevitable that any technology “must or will succeed,” simply because it can be brought into existence. Any new successful technology succeeds because it solves real problems. 


Computing devices and 5G succeed because they solve real problems: the need to access the internet and communicate in the former case; the ability to support quality experiences in the latter case. 


It is said that the novel Upgrade contains a conversation between two people, discussing two-dimensional media: “I can’t watch the flats. Hurts my eyes.” “Me too. It’s unnatural.”


The novel is a warning about the dangers of the metaverse, to be sure. But the element of realism--whether something seems natural or lifelike or not--is among the reasons some of us might believe the metaverse ultimately will develop. 


Simply, the history of all electronic media is an evolution towards more realism. Realism might be defined as the experience that “you are there”; as it increases, realism approaches “real life” experience: three dimensional, interactive, using multiple senses. 


Think about the experience of participating in a sport, watching others play a sport live in a stadium, watching it on television or listening on radio, viewing a photo of the game or hearing somebody talk about a great play during that game, reading a story about that game or viewing an artist’s rendition of a key moment in a game. 


The point is that there are degrees of immersion and realism, and that the degree of realism has tended to improve in the eras of electronic media. Consider augmented reality and virtual reality as part of that movement towards full metaverse. 


Though not perhaps inevitable, the history of electronic media suggests it is likely, simply because humans prefer greater realism in electronic media. That is why television displaced radio; why sound replaced “silent” movies; why color prevailed; why stereo and surround sound are popular; why HDTV replaced NTSC; why experiments with 3D experiences continue.


Has AI Use Reached an Inflection Point, or Not?

As always, we might well disagree about the latest statistics on AI usage. The proportion of U.S. employees who report using artificial inte...