Monday, September 19, 2022

Big Change in Universal Service Mechanisms Coming?

For the first time, both European Union and U.S. regulatory officials are considering whether universal service should be supported by a combination of user and customer fees. The charges would be indirect, rather than direct, in several ways. 


In the past, fees to support access networks in high-cost areas were always based on profits earned from customers. To be sure, high profits from business services and international long distance voice calls were the support mechanism. More recently, as revenue from that source has kept dropping, support mechanisms have shifted in some markets to flat “per connection” fees. 


But that already seems not to be generating sufficient funds, either, at least in the U.S. market. So in what can only be called a major shift, some regulators are looking at levying fees on some users, who are not actually “customers.” 


Specifically, regulators are looking at fees imposed on a few hyperscale app providers, using the logic that they represent a majority of internet traffic demands on access network providers. Nobody has done so yet, but the same logic might also be applied to wide area network transport.


Ignore for the moment the obvious violation of network neutrality principles. To be sure, one might argue that net neutrality only applies to consumer services, and hyperscaler access might be viewed as a business service, to the extent data centers or content enterprises connect to any public networks.  


Hyperscale and other data centers now drive as much as half of all data traffic across wide area networks. In 2021, data traffic between data centers represented as much traffic as that generated by internet users. 


So half of total demand for WAN capacity now is driven directly by data centers that need to connect to other data centers. To be sure, local access facilities are required, whether traffic is bound for an actual end user location or moving between data centers. 

source: Cisco 


Whether that is logical or “good public policy” can be debated. What cannot be debated is that the internet has essentially destroyed the traditional logic about how to fund universal service access networks. 


European internet service providers, who over the past couple of decades have been severely challenged in terms of their business models, now essentially argue that those broken business models can only be fixed by new taxes on a handful of users of their networks, and not customers. 


And those firms are “users” only because ISP customers make high use of some apps and content providers. That is not to deny that some big users might, here and there, be customers of access services as well. 


But that is not the argument advanced by proponents of the hyperscaler fees. The argument is a new one: “a few big content and app sources supply value that ‘my’ ISP customers want to use.” 


And because customers keep using more data, ISPs have to keep investing in capacity, but without a direct revenue match correlated to usage. 


Ignore for the moment the way ISPs rate internet access usage (generally flat rate for some bucket of usage), and the ability to change policies to better match usage and cost, as suppliers of virtually all other products tend to do. 


To be sure, streaming video services tend to price based on flat fees as well, with no relationship to consumption in any billing period. In many markets, local or domestic phone calls and text messages also are essentially flat rated. 


But other “public utility” products tend to have a usage-based pricing policy. Use more water, electricity or natural gas and you will pay more, even when there are flat-rate components of overall bills. 


The point is that such “charge users” proposals deviate from past “charge customer” support mechanisms. 


It is not stretching the analogy to note that existing support mechanisms shift payments from some customers to others (heavy usage customers are subsidized by low-usage customers). Some network apps get taxed while others do not. 


A fee to connect to the local network is charged, but usage is not: not the number of text messages sent or phone calls made; not the number or volume of ISP connection sessions; not the number of shows watched, songs listened to or web pages viewed; not the total connection time or data volume (beyond some reasonable usage limits). 


We can argue about the merits of creating new universal service support mechanisms. But fairness and logic should be part of the discussion. “Because we can” should not be a reason for doing so.


Sunday, September 18, 2022

Why Metaverse Seems Likely to Emerge

Many are skeptical of the idea that "metaverse" will really develop as a three-dimensional, persistent and immersive experience widely used by people, businesses and organizations. It might not be inevitable, but it is probable.


Perhaps it is a form of technological determinism to assume that because a technology exists, it must be inevitable; must succeed in shaping economies, culture and social relationships. It is not a new idea, having gained notoriety in the 1960s with Marshall McLuhan’s book Understanding Media.


In the book, a seminal chapter, called The Medium is the Message, makes the argument that new technology reflects human culture and also shapes it. That might seem an unremarkable assertion. 


But, like all assertions that there is one root cause of human social relations, institutions and culture, it can be challenged as being reductionist: explaining wildly-complex outcomes as a result of just one driver. 


McLuhan argued that technology, not content--how we communicate, not what we say--is the determinant of impact. In other words, the actual content of media is irrelevant. Only the media form matters. 


source: Alan Hook, Slideshare 


We are never very far from accepting this sort of thinking. 


Consider the way policymakers, regulators, analysts and much of the general public agree that “broadband is a necessity” because it causes economic development, education and social inclusion. Policymakers and advocates often argue that faster broadband likewise drives higher economic growth. 


Correlation, though, is not causation. Virtually all government programs to close the digital divide are touted as important because--it is argued--broadband leads to economic growth. In fact, careful reports only use the word correlation, not “causation” when discussing broadband and economic growth. 


Of course, lots of things also correlate with economic growth. The rule of law, population density, educational attainment, household income, household wealth, transportation networks, proximity to the oceans, or other sources of comparative advantage are known to correlate with economic growth. 


The same sort of thinking might arguably be advanced for 5G, personal computing devices, some applications, blockchain, web3 or the metaverse.


The phrase “X changes everything” is an example of such thinking. In place of “humans use tools” we get “Tools shape humans.” Again, most people would perceive a grain of truth; perhaps many grains. 


One might argue that air conditioning was responsible for the current economic resilience and growth of the U.S. South, for example. 


The point is that it is never inevitable that any technology “must or will succeed,” simply because it can be brought into existence. Any new successful technology succeeds because it solves real problems. 


Computing devices and 5G succeed because they solve real problems: the need to access the internet and communicate in the former case; the ability to support quality experiences in the latter case. 


It is said that the novel Upgrade contains a conversation between two people, discussing two-dimensional media: “I can’t watch the flats. Hurts my eyes.” “Me too. It’s unnatural.”


The novel is a warning about the dangers of the metaverse, to be sure. But the element of realism--whether something seems natural or lifelike or not--is among the reasons some of us might believe the metaverse ultimately will develop. 


Simply, the history of all electronic media is an evolution towards more realism. Realism might be defined as the experience that “you are there,” as the medium approaches “real life” experiences: three dimensional, interactive, using multiple senses. 


Think about the experience of participating in a sport, watching others play a sport live in a stadium, watching it on television or listening on radio, viewing a photo of the game or hearing somebody talk about a great play during that game, reading a story about that game or viewing an artist’s rendition of a key moment in a game. 


The point is that there are degrees of immersion and realism, and that the degree of realism has tended to improve in the eras of electronic media. Consider augmented reality and virtual reality as part of that movement towards full metaverse. 


Though not perhaps inevitable, the history of electronic media suggests it is likely, simply because humans prefer greater realism in electronic media. That is why television displaced radio; why sound replaced “silent” movies; why color prevailed; why stereo and surround sound are popular; why HDTV replaced NTSC; why experiments with 3D experiences continue.


Saturday, September 17, 2022

Is Connectivity Business Ultimately Doomed?

Among the obvious changes in connectivity provider business models over the past 30 years is the diminished role of variable revenue, compared to fixed components. The companion change is a switch from usage-based charging to flat-rate pricing, independent of usage. 


Both those changes have huge consequences. The big change is that variable usage no longer can be matched to comparable revenue upside. In other words, higher usage of network resources does not automatically result in higher revenue, as once was the case. 


And that is why provisioned data capacity of networks keeps growing, even if revenue per account remains relatively flat. That also is why network capital investment has begun to creep up. 


So consider what happens when markets saturate: when every household and person already buys internet access, mobility service and mobile broadband. When every consumer who wants landline voice and linear video already buys it, where will growth come from?


The strategic answer has to be “new services and products.” Tactically, it might not matter whether revenue from such services is based on variable (consumption based) or fixed value (flat rate charge to use). Eventually, it will matter whether usage can be matched to variable charges for usage. 


Consider that most cloud computing services (infrastructure, services or platform) feature variable charges based on usage, even if some features are flat rated. For the most part, data center revenue models are driven by usage and variable revenue models. 


Connectivity providers have no such luxury. 


There always has been a mix: in the voice era, fixed charges for “lines and features” but variable charges for long distance usage. In the internet era, the balance has shifted.


Consider what has happened with long distance voice, which is mostly variable; mobile service, which is partly variable, partly fixed; or internet access and video. 


Globally, mobile subscriptions, largely a “fixed” revenue stream--with flat rate recurring charges--are a key driver of retail revenue growth. And though mobile internet access is mostly a flat rate service (X gigabytes for a flat fee), uptake is variable in markets where most consumers do not routinely use it.


source: Ericsson 


And make no mistake, mobile subscriptions, followed by uptake of mobile broadband, drive retail revenue in the global communications market. Fixed network broadband, the key service now provided by any fixed network, lags far behind. 


As early as 2007, in the U.S. market, long distance voice, which once drove half of total revenue and most of the profit, had begun its decline, with mobility subscriptions rising to replace that revenue source. 

source: FCC  


At a high level, that is mostly a replacement of variable revenue, based on usage, with fixed revenue, insensitive to usage.


As a practical matter, internet access providers cannot price usage of applications consumed as they once charged for international voice minutes of use. For starters, “distance” no longer matters, and distance was the rationale for differentiated pricing. 


Network neutrality rules in many markets prohibit differential pricing based on quality of service, so quality of connections or sessions is not possible, either. Those same rules likely also make any sort of sponsored access illegal, such as when an app provider might subsidize the cost of internet access used to access its own services. 


Off-peak pricing is conceivable, but the charging mechanisms are probably not available. 


It likely also is the case that the cost of metering is higher than the incremental revenue lift that might be possible, even if consumers would tolerate it. 


The competitive situation likely precludes any single ISP from moving to any such differential charging mechanisms, as well.


In other words, the cost of supporting third party or owned services, while quite differentiated in terms of network capacity required, cannot actually be matched by revenue mechanisms that could vary based on anything other than the total amount of data consumption. 


Equally important, most ISPs do not own any of the actual apps used by their access customers, so there is no ability to participate in recurring revenues for app subscriptions, advertising or commerce revenues. 


All of that is part of the drive to raise revenues by having governments allow taxation of a few hyperscale app providers that drive the majority of data consumption, with the proceeds being given to ISPs to fund infrastructure upgrades.   


Ignore for the moment the different revenue per bit profiles of messaging, voice, web browsing, social media, streaming music or video subscriptions. Text messaging has in the past had the highest revenue per bit, followed by voice services.


Subscription video always has had low revenue per bit, in large part because, as a media type, it requires so much bandwidth, while revenue is capped by consumer willingness to pay. Assume the average TV viewer has the screen turned on for five hours a day.


That works out to 150 hours a month. Assume an hour of standard definition video streaming (or broadcasting, in the analog world) consumes about one gigabyte per hour. That represents, for one person, consumption of perhaps 150 Gbytes. Assume overall household consumption of 200 Gbytes, and a monthly data cost of $50 per month.


Bump quality levels up to high definition and one easily can double the bandwidth consumption, up to perhaps 300 GB.  


That suggests a “cost”--to watch 150 hours of video--of about 33 cents per gigabyte, with retail price in the dollars per gigabyte range. 
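The video arithmetic above can be checked with a quick back-of-envelope sketch, using the text's assumed figures (five viewing hours a day, roughly 1 GB per SD hour, 2 GB per HD hour, and a $50 monthly data bill):

```python
# Back-of-envelope revenue per gigabyte for subscription video.
# All inputs are the assumptions stated in the text, not measured values.

HOURS_PER_DAY = 5
DAYS_PER_MONTH = 30
SD_GB_PER_HOUR = 1        # standard definition streaming
HD_GB_PER_HOUR = 2        # high definition roughly doubles consumption
MONTHLY_BILL_USD = 50

hours_per_month = HOURS_PER_DAY * DAYS_PER_MONTH   # 150 hours
sd_gb = hours_per_month * SD_GB_PER_HOUR           # 150 GB at SD quality
hd_gb = hours_per_month * HD_GB_PER_HOUR           # 300 GB at HD quality
usd_per_gb = MONTHLY_BILL_USD / sd_gb              # ~$0.33 per GB

print(f"{hours_per_month} h/month, {sd_gb} GB SD, {hd_gb} GB HD, "
      f"${usd_per_gb:.2f}/GB")
```

The $0.33-per-gigabyte figure matches the "about 33 cents" cited above.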


Voice is something else. Assume a mobile or fixed line account represents about 350 minutes a month of usage. Assume the monthly recurring cost of having voice features on a mobile phone is about $20.


Assume data consumption for 350 minutes (5.8 hours a month) is about 21 MB per hour, or roughly 122 MB per month. That implies revenue of about $164 per consumed gigabyte. 
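The voice arithmetic works the same way; a minimal sketch using the assumed figures from the text (350 minutes a month, about 21 MB per hour, a $20 monthly voice charge):

```python
# Back-of-envelope revenue per gigabyte for voice.
# All inputs are the assumptions stated in the text.

MINUTES_PER_MONTH = 350
MB_PER_HOUR = 21          # assumed voice data rate
VOICE_FEE_USD = 20        # assumed monthly charge for voice features

hours_per_month = MINUTES_PER_MONTH / 60            # ~5.8 hours
mb_per_month = hours_per_month * MB_PER_HOUR        # ~122 MB
usd_per_gb = VOICE_FEE_USD / (mb_per_month / 1000)  # ~$163 per GB

print(f"{mb_per_month:.0f} MB/month -> ${usd_per_gb:.0f}/GB")
```

The small gap versus the $164 figure in the text comes from rounding 122.5 MB down to 122 MB before dividing; either way, voice revenue per gigabyte is roughly 500 times that of subscription video.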


The point is that there are dramatic differences in revenue per bit to support both owned and third party apps and services. 

source: TechTarget 


In fact, the disparity between text messaging and voice and 4K video is so vast it is hard to get them all on the same scale. 


Sample Service and Application Bandwidth Comparisons

Segment         | Application or Service Name                                     | KB
----------------|-----------------------------------------------------------------|----------
Consumer mobile | SMS                                                             | 0.13
Consumer mobile | MMS with video                                                  | 100
Business        | IP telephony (1-hour call)                                      | 28,800
Residential     | Social networking (1 hour)                                      | 90,000
Residential     | Online music streaming (1 hour)                                 | 72,000
Consumer mobile | Video and TV (1 hour)                                           | 120,000
Residential     | Online video streaming (1 hour)                                 | 247,500
Business        | Web conferencing with webcam (1 hour)                           | 310,500
Residential     | HD TV programming (1 hour, MPEG 4)                              | 2,475,000
Business        | Room-based videoconferencing (1 hour, multi-codec telepresence) | 5,850,000

source: Cisco


At a high level, as always is the case, one would prefer to operate a business with the ability to price according to usage. Retail access providers face the worst of all possible worlds: ever-growing usage and essentially fixed charges for that usage. 


Unless variable usage charges return, to some extent, major market changes will keep happening. New products and services can help. But it will be hard for incrementally small new revenue streams to make a dent if one assumes that connectivity service providers continue to lose about half their legacy revenues every decade, as has been the pattern since deregulation began. 


Consolidation of service providers is already happening. A shift of ownership of digital infrastructure assets is already happening. Stresses on the business model already are happening. 


Will we eventually see a return to some form of regulated communications models? And even if that is desired, how is the model adjusted to account for ever-higher capex? Subsidies have always been important. Will that role grow? 


And how might business models adjust to accommodate more regulation or different subsidies? A delinking of “usage” from “ability to charge for usage” makes answers for those questions inevitable, at some point. 


How many businesses or industries could survive 40-percent annual increases in demand and two-percent annual increases in revenue?
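The closing question can be made concrete by compounding the two growth rates over a decade (a sketch; the 40 percent and two percent figures are the text's own):

```python
# Compound a decade of demand growth against a decade of revenue growth.
DEMAND_GROWTH = 0.40    # 40% annual growth in data demand
REVENUE_GROWTH = 0.02   # 2% annual growth in revenue
YEARS = 10

demand_multiple = (1 + DEMAND_GROWTH) ** YEARS    # ~28.9x over ten years
revenue_multiple = (1 + REVENUE_GROWTH) ** YEARS  # ~1.22x over ten years

print(f"demand grows {demand_multiple:.1f}x; revenue grows {revenue_multiple:.2f}x")
```

A roughly 29-fold increase in demand against a 22 percent increase in revenue over ten years is the mismatch the question implies.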


Friday, September 16, 2022

"Hybrid" is a Risk-Reducing Strategy for Nearly All Firms

“Hybrid” is an important business strategy for communications service providers of all types. Hughes Network Systems, for example, has added 4G terrestrial connectivity in some markets to improve latency performance.  


In fact, “hybrid” tends to be a reasonable migration strategy for any firm or industry when new technologies emerge to replace older platforms. If software-defined wide area networks start to displace Multiprotocol Label Switching, most customers and suppliers will sell and support both. 


Initially, especially when MPLS legacy revenues are significant, suppliers will joust about the merits of the newer technology. But if customers start to switch, so will the marketing and sales stances. 


At some point, if and when low earth orbit satellite constellations begin to take market share, suppliers of geosynchronous satellite service will offer both, often acquiring assets to do so. 


Eventually, older platforms shrink so much in terms of revenue and usage that firms shift reliance virtually exclusively to newer platforms. 


If you are familiar with the concept of the S curve, you know that at some point in the product life cycle of any technology, growth halts. So the logical strategy is to begin development and sales of the newer technologies before market saturation hits. 


source: Strategic Thinker
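The S curve described above is conventionally modeled as a logistic function; a minimal sketch (the parameter values here are illustrative, not from the source):

```python
import math

def logistic(t, ceiling=100.0, k=1.0, midpoint=0.0):
    """Cumulative adoption on an S curve: slow start, steep middle, saturation."""
    return ceiling / (1 + math.exp(-k * (t - midpoint)))

# Growth is fastest near the midpoint and stalls a few periods later,
# which is why new platforms must be developed before saturation hits.
early = logistic(-4)   # a small fraction of the ceiling
mid = logistic(0)      # exactly half the ceiling
late = logistic(4)     # most of the ceiling; growth has effectively halted

print(f"early {early:.1f}, midpoint {mid:.1f}, late {late:.1f}")
```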


One sees this sort of thing all the time in the computing, software and connectivity businesses. There never is a flash cut from a legacy platform to a next-generation platform. Hybrid is virtually always the model. 


Users begin by buying the new platform, if they can, for new installations or sites (green field). They continue to operate the legacy platform as well, gradually beginning a replacement process (brown field). 


Cable operators did precisely that when they began grafting optical fiber into their access networks. “Hybrid fiber coax” was the strategy. Eventually, even cable operators are likely to transition to all-fiber access.


But the strategy might still be hybrid. To the extent possible, cable operators will, in some cases, continue to deploy more advanced versions of their DOCSIS access platform, up to the point that upgrades to the physical infrastructure are required. 


They already are starting to deploy FTTH in the dense service areas for business customers. Then FTTH will replace HFC in dense urban areas for home broadband, while keeping HFC in lower-density markets. Over time, FTTH will replace more of the legacy network. 


But “how long” matters. As with development of the metaverse, it matters for suppliers and real-world practitioners how long it takes for various forms of augmented reality and virtual reality to reach commercial volume. 


Being too early can doom a firm. Being too late also can have serious consequences. Hybrid therefore is one way to limit execution risk.


Wednesday, September 14, 2022

Oi Essentially Adopts MVNO Business Model

Brazilian operator Oi, which had entered bankruptcy in 2016, is moving ahead with a slimmed-down operating model roughly analogous to that of a mobile virtual network operator, which leases wholesale capacity and services from a facilities-based service provider. 


To make that shift, and shed debt, Oi has sold its mobile assets, cell towers, data centers and video entertainment operations. It also is structurally separating its fixed network infrastructure operations from its retail fixed network operations, but will retain a minority stake in the infrastructure assets supplier. 


Oi’s mobile assets were sold to TIM (Telecom Italia) as well as Brazilian mobile operators Vivo and Claro Americas. The mobile cell tower assets were sold to Highline, a unit of DigitalBridge, which invests in digital infrastructure. 


source: Oi 


The data center assets were sold to Brazil-based private equity firm Piemonte Holding. The video subscription assets were offloaded to Sky Brasil. 


A controlling stake in its fiber infrastructure business V.tal was sold to a group of investors headed by Globenet Cabos Submarinos and BTG.


After the structural separation of the fixed network business, Oi will continue to hold a minority stake in the infrastructure wholesale business, and operate its retail business--anchored by internet access--as a retail customer of the infrastructure business.


It is akin to the model used in the United Kingdom, Australia, New Zealand and Singapore, for example. 


The moves also illustrate the shift of ownership of digital infrastructure assets from service providers to private equity and other institutional investors that began decades ago with decisions by mobile operators to offload ownership of cell towers. 


Since then, a wider range of assets have begun to shift, including local access networks and data centers. 


In essence, a wider range of physical infrastructure assets are considered for disposal, to enable a lighter-asset business model that has been proposed by many observers for more than a decade.


Tuesday, September 13, 2022

Verizon Uses Owned Optical Fiber for 48% of its Mobile Site Backhaul

Verizon now says it connects 48 percent of its cell sites using owned optical fiber. That might not seem like such a big deal, but consider that Verizon’s fixed network only reaches about 20 percent of U.S. homes. 


That matters because ownership of a fixed network reaching homes and businesses provides cost advantages for deploying optical fiber backhaul to cell towers and sites. AT&T, in contrast, has fixed network assets passing about 44 percent of U.S. homes. That also provides advantages for cell site backhaul. 


Building fiber backhaul to towers outside the Verizon fixed network territory requires construction or long-term leases of capacity from other providers who can provide the access. It appears that a substantial percentage of Verizon backhaul uses built or owned facilities. 


For U.S. cable operators, who prefer to sell mobile service only to their own existing customer base, the same logic applies. Owning their own landline facilities reduces the cost of creating a cell network. 


Of a total of 140 million homes, AT&T’s landline network passes 62 million. Comcast passes (can actually sell service to) about 57 million homes.


The Charter Communications network passes about 50 million homes, the number of potential customer locations it can sell to.


Verizon homes passed might number 27 million. Lumen Technologies never reports its homes passed figures, but likely has 20 million or so consumer locations.
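The homes-passed figures in this post can be restated as shares of the roughly 140 million U.S. homes (a sketch using the post's own estimates; the Lumen figure is an estimate, not a reported number):

```python
# Homes passed as a share of total U.S. homes, using the post's estimates.
TOTAL_US_HOMES_M = 140  # approximate total U.S. homes, in millions

homes_passed_m = {
    "AT&T": 62, "Comcast": 57, "Charter": 50, "Verizon": 27, "Lumen": 20,
}
share = {name: passed / TOTAL_US_HOMES_M for name, passed in homes_passed_m.items()}

for name, pct in share.items():
    print(f"{name}: {pct:.0%} of U.S. homes passed")
```

The computed shares line up with the percentages cited earlier: about 44 percent for AT&T and about 20 percent for Verizon.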

What Will "Return to Normal" Mean for Home Broadband?

“After a tumultuous 2020, in which the COVID-19 pandemic caused internet traffic patterns to shift and volumes to surge, network operators have returned to the business of adding bandwidth and engineering their traffic in a more measured manner,” say researchers at Telegeography. 


Connectivity suppliers have said that the immediate shift to “work from home” and “learn from home” pulled forward some demand, meaning subscription growth that might have taken a year happened in a couple of months. 


Telegeography estimates suggest traffic growth also has returned to more typical levels. 


source: Telegeography 


What might take some time to decipher is whether the remote work gains (subscriptions, for example) will persist permanently or whether an eventual return to the office and schools might actually lead to some reduction of demand for home connections. 


It is too early to tell.


On the Use and Misuse of Principles, Theorems and Concepts

When financial commentators compile lists of "potential black swans," they misunderstand the concept. As explained by Nassim Taleb ...