Thursday, September 29, 2022

Can Lumen Increase Consumer FTTH by 10X?

Lumen Technologies is expanding its fiber-to-home activity, expecting to boost the availability of fiber-to-premises beyond the 27 percent of homes it already supports. With the caveat that Lumen’s future success rests more with its enterprise portfolio than its consumer broadband business (all mass markets revenue is about 25 percent of total), the fiber upgrades should boost subscription rates. 


Where the fiber access network gets about 27 percent adoption, the copper access network gets about 14 percent take rates. In other words, FTTH gets almost double the adoption of the copper access product, with FTTH average revenue per account of about $59 a month. 


source: Lumen 


In principle, Lumen should be able to gain a price advantage over its key cable TV competitors, at least if most customers on all the networks buy the advertised products at the advertised prices (ignoring promotional or bundle pricing). Whether that is the case in practice is far from clear. 


source: Lumen 


Nor is it always completely clear that 5G and 4G mobile networks are outclassed to the point that they cannot gain significant market share. The early evidence suggests that a significant portion of the consumer market is content with lower-speed service (up to 200 Mbps), and will buy a fixed wireless service. That value segment could represent about 33 percent of the present market. 


As always, that value segment will be offered higher speeds over time, for about the same price (less than $50 a month). 


Some of us would argue that the real advantage over cable will lie in symmetrical broadband features, not price per bit or downstream speed. 


The issue is how fast cable companies will move to boost upstream speeds; how fast Lumen can upgrade its home broadband access facilities; and how fast both Lumen and the cable firms can boost downstream speeds. 


Of these three, the first two are likely going to be crucial, as the salient performance advantage Lumen will be able to claim, once facilities are upgraded to fiber, is upstream speed. Lumen believes it will be able to build the network for $1,000 per passing or less, with incremental capital required to activate each customer location. 


The issue then will become the penetration rate (customer adoption rate): can Lumen relatively quickly boost its customer share from 27 percent up closer to 40 percent? Possibly equally important, can Lumen get a higher share of the performance-oriented segment of the market, willing to pay more? 
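The economics of that penetration question can be sketched with simple arithmetic. The $1,000-per-passing figure is Lumen's own; the activation cost per connected customer is a hypothetical placeholder, since the article does not state one.

```python
# Sketch: how take rate changes the effective network cost per subscriber.
# COST_PER_PASSING is Lumen's stated target; ACTIVATION_COST is hypothetical.

COST_PER_PASSING = 1_000   # build cost shared across all homes passed
ACTIVATION_COST = 600      # hypothetical incremental cost per connected home

def cost_per_subscriber(take_rate: float) -> float:
    """Network capital absorbed by each paying subscriber at a given take rate."""
    return COST_PER_PASSING / take_rate + ACTIVATION_COST

for rate in (0.27, 0.40):
    print(f"take rate {rate:.0%}: ${cost_per_subscriber(rate):,.0f} per subscriber")
```

The point of the sketch: moving take rates from 27 percent toward 40 percent spreads the same build cost across more paying customers, cutting per-subscriber capital by roughly a quarter.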


Much could hinge on whether Lumen can hit its own goal of upgrading to a total of about 12 million FTTH locations over the next six years, at the expected capital investment cost and the expected take rate.


Wednesday, September 28, 2022

Shift from Multicast to Unicast Underlies Access Network Economics

The debate over how to fund access networks, as framed by some policymakers and connectivity providers, relies on how access customers use those networks. The argument is that a disproportionate share of traffic, and therefore demand for capacity investments, is driven by a handful of big content and app providers. 


It is a novel argument, in the area of communications regulation. Business partners (other networks) have been revenue contributors when other networks terminate their voice traffic, for example. 


But some point to South Korea as an example of cost-sharing mechanisms applied to hyperscale app providers.


South Korean internet service providers levy fees on content providers that represent more than one percent of access network traffic or have one million or more users. Fees amount to roughly $20 per terabyte ($0.02/GB).
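Assuming the roughly $20-per-terabyte figure, a quick sketch of what such a levy could mean for a large content provider; the monthly traffic volume used here is hypothetical:

```python
# Sketch of the South Korean-style per-terabyte levy described above.
# FEE_PER_TB is from the article; the traffic volume is hypothetical.

FEE_PER_TB = 20.0            # ~ $0.02 per GB
monthly_traffic_tb = 50_000  # hypothetical: 50 PB delivered per month

monthly_fee = monthly_traffic_tb * FEE_PER_TB
print(f"monthly fee: ${monthly_fee:,.0f}")  # $1,000,000 at 50 PB/month
```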


The principle is analogous to the bilateral agreements access providers have with all others: when a traffic source uses a traffic sink (sender and receiver), network resources are used, so compensation is due. 


Such agreements, in the past, have been limited to access provider networks. What is novel in South Korea is the notion that some application sources are equivalent to other historic traffic sources: they generate remote traffic terminated on a local network. 


So far, such claims are not officially bilateral, which is how prior arrangements have worked. The South Korean model is sender pays, similar to a “calling party pays” model. 


Those of you with long memories will recall how the vested interests play out in any such bilateral agreements when there is an imbalance of traffic. Any payment mechanisms based on sender pays (calling party pays) benefit small net sinks and penalize large net sources. 


In other words, if a network terminates lots of traffic, it gains revenue. Large traffic generators (sources) incur substantial operating costs. 


Of course, as with all such matters, it is complicated. There are domestic content implications and industrial policy interests. In some quarters, such rules might be part of other strategies to protect and promote domestic suppliers against foreign suppliers. 


At the level of network engineering, the imbalances and costs are a direct result of choices about network architectures, namely the shift of content delivery from broadcast or multicast to unicast or “on demand” delivery. 


This is a matter of physics. Some networks are optimized for multicast (broadcast). Others are optimized for on-demand and unicast. Satellite networks, TV and radio broadcast networks are optimized for multicast: one copy to millions of recipients. 


Unicast networks (the internet, voice networks) are optimized to support one-to-one sessions. 


So what happens when we shift broadcast traffic (multicast) to unicast and on-demand delivery is that we change the economics. In place of bandwidth-efficient delivery (multicast or broadcast), we substitute bandwidth “inefficient” delivery.


In place of “one message, millions of receivers” we shift to “millions of messages, millions of recipients.” Instead of launching one copy of a TV show, sent to millions of recipients, we launch millions of copies to individual recipients. 


Bandwidth demand grows to match. If a multicast event requires X bandwidth, then one million copies of that same event requires 1,000,000X. Yes, six orders of magnitude more bandwidth is needed. 
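That scaling can be illustrated with a short sketch; the per-stream bandwidth figure is a hypothetical assumption, not from the article:

```python
# Multicast vs. unicast bandwidth for the same event.
# STREAM_MBPS is a hypothetical per-stream rate for illustration.

STREAM_MBPS = 5          # hypothetical bandwidth of one video stream
viewers = 1_000_000

multicast_mbps = STREAM_MBPS          # one copy serves every viewer
unicast_mbps = STREAM_MBPS * viewers  # one copy per viewer

print(f"multicast: {multicast_mbps} Mbps")
print(f"unicast:   {unicast_mbps:,} Mbps ({unicast_mbps // multicast_mbps:,}x)")
```

Whatever per-stream rate one assumes, the unicast total is always the viewer count times the multicast total: six orders of magnitude for a million viewers.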


There are lots of other implications. 


Universal service funding in the United States is based on a tax on voice usage and voice lines. You might argue that made lots of sense in prior eras where voice was the service to be subsidized. 


It makes less sense in the internet era, when broadband internet access is the service governments wish to subsidize. Also, it seems illogical to tax a declining service (voice) to support the “now-essential” service (internet access). 


The point is that what some call “cost recovery” and others might call a “tax” is part of a horribly complicated shift in how networks are designed and used.


Monday, September 26, 2022

Home Broadband Costs Keep Falling

In the twelve months to the close of the second quarter of 2022, global fixed-line home broadband subscribers saw their average monthly charges decrease by four percent on copper, cable and fiber-to-home based tariffs, says Point Topic.

source: Point Topic 


Across the three technologies the average bandwidth increased by 22 percent year-on-year. 


source: Point Topic 


Still, the typical cost of each megabit-per-second unit of capacity was markedly lower on hybrid fiber coax and fiber to home networks, compared to slower copper-access networks. In substantial part, that is because of the vast difference in capacity between copper and other networks. 


As speed climbs, cost-per-bit falls. 
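Combining the two Point Topic figures gives a quick sketch of the implied year-over-year change in cost per Mbps:

```python
# Tariffs down ~4%, average bandwidth up ~22% (Point Topic figures above)
# implies the cost of each Mbps fell by roughly a fifth year over year.

price_change = -0.04      # average tariff change
bandwidth_change = 0.22   # average bandwidth change

cost_per_mbps_ratio = (1 + price_change) / (1 + bandwidth_change)
decline = 1 - cost_per_mbps_ratio
print(f"cost per Mbps fell about {decline:.0%}")  # about 21%
```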


Many Power Users Among Low-Income "Home Broadband" Households

Life often is more complicated and surprising than any theory can predict. Consider home broadband consumer behavior, as seen in a recent Openvault analysis of “low-income” home broadband households using subsidy programs.


“Broadband usage patterns of participants in the FCC’s Affordable Connectivity Program are significantly exceeding those of the broader connected population,” according to Openvault. 


Launched in January 2022, the ACP provides low-income households with a $30 per month ($75 for tribal households) subsidy that can be applied towards a monthly internet subscription. 


One might guess that low-income households would sign up for a value package, perhaps often a 100-Mbps connection that might, with the $30 subsidy, be almost--or actually--free. That might also suggest that total household consumption would be lower than usual, simply because speed tends to correlate with total data consumption. 

source: Openvault


But Openvault suggests that often is not the case. ACP participants’ median usage of 499.3 gigabytes per month is almost 60 percent higher than the median of 313.9 GB per month for all subscribers, Openvault says.


ACP participants are 36 percent more likely to be power users of 1 TB or more, and 52 percent more likely to be super power users of 2 TB or more, Openvault notes. Also, ACP participants’ average usage of 654 GB per month is 33.3 percent higher than the average of 490.7 GB for all subscribers.
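Restating the Openvault figures as ratios confirms the percentages quoted:

```python
# The Openvault usage figures quoted above, restated as ratios.
median_acp, median_all = 499.3, 313.9  # GB per month
avg_acp, avg_all = 654.0, 490.7        # GB per month

print(f"median: {median_acp / median_all - 1:.1%} higher")  # ~59.1%
print(f"average: {avg_acp / avg_all - 1:.1%} higher")       # ~33.3%
```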


There is a fairly simple reason for such data. The ACP can be used by college students. Younger users and college students tend to be heavier consumers of internet data. Eligible participants include people who:


  • Received a Federal Pell Grant during the current award year;

  • Meet the eligibility criteria for a participating provider's existing low-income internet program;

  • Participate in one of these assistance programs:

    • Free and Reduced-Price School Lunch Program or School Breakfast Program, including at U.S. Department of Agriculture (USDA) Community Eligibility Provision schools.

    • SNAP

    • Medicaid

    • Housing Choice Voucher (HCV) Program (Section 8 Vouchers)

    • Project-Based Rental Assistance (PBRA)/202/811

    • Public Housing 

    • Supplemental Security Income (SSI)

    • WIC

    • Veterans Pension or Survivor Benefits

    • or Lifeline;

  • Participate in one of these assistance programs and live on Qualifying Tribal lands:

    • Bureau of Indian Affairs General Assistance 

    • Tribal TANF

    • Food Distribution Program on Indian Reservations

    • Tribal Head Start (income based)

    • Affordable Housing Programs for American Indians, Alaska Natives or Native Hawaiians


That list includes some people and households most people would consider “low income.” But it also includes college students who are only temporarily “low income.” They also are more likely to be “power users.” 


The data do not preclude, however, the notion that many low-income households might be using the subsidies to buy higher-speed service, whether those customers are students, the elderly or others with low income.


Is Web3 the Second Coming of Peer-to-Peer?

Decentralization is the main organizing principle of Web3, often said to support user ownership of content as well as disintermediation of many other operations: banking and finance, for example. 


Those of you who can remember the early days of the internet--before the emergence of the World Wide Web or visual browsers and multimedia, plus the shift to read-write from read-only--might remember the early ethos of the pre-Web internet, which was collaborative. 


The internet was about sharing, not commerce. The push for Web3 has a similar ethos: put control and ownership of assets back in the hands of “users.” Think of the role of peer-to-peer for a Web2 analogy of how things could work. 


source: McKinsey 


Has P2P revolutionized media and content; payments; finance or other functions conducted on the web? Perhaps, to a limited degree, in some areas. But P2P has not revolutionized much of anything related to the web. 


We also may decry the rise of “platforms” owned by corporations or bigger firms. But in what part of the real economy does this not happen? We often applaud and support efforts to reduce inequality in any society; even to the extent of opposing rigid class structures. But where has that ever worked, one might ask. 


That is not to say nothing useful will come of Web3. Blockchain seems destined for wider use, for example. Cryptocurrencies likewise should gain traction, eventually. 


But we might be skeptical of the broader claims about Web3 democratizing all--or most--of the internet, preventing the rise of large new platforms. That is not to say a further balkanization of the internet is unlikely. Indeed, that balkanization has been going on for a couple to a few decades. 


In a sense, Web3 appears to be the second coming of P2P. P2P did not revolutionize the web. Web3 might not do so, either.


Wednesday, September 21, 2022

Vodafone Gets New "Activist" Investor Atlas Investissement and Might See Push for Divesting or Monetizing Some Digital Infrastructure

Atlas Investissement, a private equity firm, has taken a 2.5-percent stake in Vodafone, presumably to push Vodafone into further actions to streamline and consolidate its businesses. 


Among the possible moves is pressure to encourage Vodafone to structurally separate parts of its infrastructure beyond cell towers, which the company already has said it is contemplating. 


“Atlas Investissement is supportive of Vodafone’s publicly-stated intention to pursue consolidation opportunities in selected geographies, as well as its efforts in infrastructure separation,” the firm said in announcing the investment. 


In recent days we have seen former Brazilian incumbent telco Oi essentially adopt a mobile virtual network operator model where it runs on leased facilities owned by a separate entity. 


Oi, which had entered bankruptcy in 2016, is moving ahead with a slimmed-down and “asset light” operating model where it leases wholesale capacity and services from a facilities-based entity rather than owning the assets outright. 


To make that shift, and shed debt, Oi has sold its mobile assets, cell towers, data centers and video entertainment operations. It also is structurally separating its fixed network infrastructure operations from its retail fixed network operations, but will retain a minority stake in the infrastructure assets supplier. 


Structural separation of Telecom Italia’s fixed network also has been the subject of extensive consideration. Mergers also have been discussed for parts of the existing business, including fiber access infra. 


In other cases, joint ventures or co-investment have been the path chosen to reduce capital investment in digital infrastructure, especially access networks. 


The future question is whether other digital infrastructure assets might eventually be monetized. Data centers, of course, already have been purchased by private equity and other institutional investors. But perhaps there could be some further interest in related assets such as:


  • Structured cabling 

  • Distributed antenna systems (DAS)

  • Electrical, aerial and underground fiber deployment 

  • Civil construction 

  • Small cell or micro cell installations 

  • Indoor DAS and outdoor DAS integration


None of those assets historically have been of interest to buyers who have purchased real estate assets. Lack of scale is an issue for most of these sorts of assets, for example.


Tuesday, September 20, 2022

Customer Satisfaction Falls Off a Cliff

Whatever else one wishes to glean from this data, something has changed, and in a big way. The American Customer Satisfaction Index is a long-term research project measuring customer satisfaction with a wide variety of goods and services. Some say the Covid pandemic plays a role, as any service industry would have been negatively affected. 

 

source: ACSI 


Still, the decline began before the pandemic, with scores falling in 2018 and 2019. 


The point is that after nearly two decades of climbing, customer satisfaction has plummeted. That will provide little to no comfort for marketers required to assess their own firm’s customer satisfaction scores. Most firms, in most industries, are likely to track the precipitous fall in customer satisfaction scores.


Monday, September 19, 2022

Big Change in Universal Service Mechanisms Coming?

For the first time, both European Union and U.S. regulatory officials are considering whether universal service should be supported by a combination of user and customer fees. The charges would be indirect, rather than direct, in several ways. 


In the past, fees to support access networks in high-cost areas were always based on profits from customers. To be sure, high profits from business services and international long distance voice calls have been the support mechanism. In more recent days, as revenue from that source has kept dropping, support mechanisms have shifted in some markets to flat-fee “per connection” fees. 


But that already seems not to be generating sufficient funds, either, at least in the U.S. market. So in what can only be called a major shift, some regulators are looking at levying fees on some users, who are not actually “customers.” 


Specifically, regulators are looking at fees imposed on a few hyperscale app providers, using the logic that they represent a majority of internet traffic demands on access network providers. Nobody has done so, yet, but the same logic might also be applied to wide area network transport.


Ignore for the moment the obvious violation of network neutrality principles. To be sure, one might argue that net neutrality only applies to consumer services, and hyperscaler access might be viewed as a business service, to the extent data centers or content enterprises connect to any public networks.  


Hyperscale and other data centers now drive as much as half of all data traffic across wide area networks. By 2021, data traffic between data centers represented as much volume as that generated by internet users. 


So half of total demand for WAN capacity now is driven directly by data centers that need to connect to other data centers. To be sure, local access facilities are required, whether traffic is bound for an actual end user location or moving between data centers. 

source: Cisco 


Whether that is logical or “good public policy” can be debated. What cannot be debated is that the internet has essentially destroyed the traditional logic about how to fund universal service access networks. 


European internet service providers, who over the past couple of decades have been severely challenged in terms of their business models, now essentially argue that those broken business models can only be fixed by new taxes on a handful of users of their networks, and not customers. 


And those firms are “users” only because ISP customers make high use of some apps and content providers. That is not to say some big users might, here and there, be customers of access services as well. 


But that is not the argument advanced by proponents of the hyperscaler fees. The argument is a new one: “a few big content and app sources supply value that ‘my’ ISP customers want to use.” 


And because customers keep using more data, ISPs have to keep investing in capacity, but without a direct revenue match correlated to usage. 


Ignore for the moment the way ISPs rate internet access usage (generally flat rate for some bucket of usage), and the ability to change policies to better match usage and cost, as suppliers of virtually all other products tend to do. 


To be sure, streaming video services tend to price based on flat fees as well, with no relationship to consumption in any billing period. In many markets, local or domestic phone calls and text messages also are essentially flat rated. 


But other “public utility” products tend to have a usage-based pricing policy. Use more water, electricity or natural gas and you will pay more, even when there are flat-rate components of overall bills. 
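A minimal sketch of how such utility-style pricing might look if applied to broadband; every rate and allowance here is hypothetical, not any actual ISP tariff:

```python
# Hypothetical utility-style broadband bill: a flat connection fee plus a
# per-GB charge above an included allowance. All numbers are illustrative.

FLAT_FEE = 30.0        # hypothetical monthly connection charge
INCLUDED_GB = 300      # hypothetical usage allowance
OVERAGE_PER_GB = 0.10  # hypothetical per-GB charge above the allowance

def monthly_bill(usage_gb: float) -> float:
    overage = max(0.0, usage_gb - INCLUDED_GB)
    return FLAT_FEE + overage * OVERAGE_PER_GB

for usage in (150, 300, 650):
    print(f"{usage:>4} GB -> ${monthly_bill(usage):.2f}")
```

Under such a tariff, the heavy users who drive capacity investment would generate matching incremental revenue, which is precisely the link flat-rate broadband pricing lacks.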


The point is that such “charge users” proposals deviate from past “charge customer” support mechanisms. 


It is not stretching the analogy to note that existing support mechanisms shift payments from some customers to others (heavy usage customers are subsidized by low-usage customers). Some network apps get taxed while others do not. 


A fee to connect to the local network is charged, but no fees are based on the number of text messages sent, phone calls made, ISP connection sessions, shows watched, songs listened to or web pages viewed, nor on total connection time or data volume (within some reasonable usage limits). 


We can argue about the merits of creating new universal service support mechanisms. But fairness and logic should be part of the discussion. “Because we can” should not be a reason for doing so.


Sunday, September 18, 2022

Why Metaverse Seems Likely to Emerge

Many are skeptical of the idea that "metaverse" will really develop as a three-dimensional, persistent and immersive experience widely used by people, businesses and organizations. It might not be inevitable, but it is probable.


Perhaps it is a form of technological determinism to assume that because a technology exists, it must be inevitable; must succeed in shaping economies, culture and social relationships. It is not a new idea, having gained notoriety in the late 1960s in Marshall McLuhan’s book Understanding Media.


In the book, a seminal chapter, called The Medium is the Message, makes the argument that new technology reflects human culture and also shapes it. That might seem an unremarkable assertion. 


But, like all assertions that there is one root cause of human social relations, institutions and culture, it can be challenged as being reductionist: explaining wildly-complex outcomes as a result of just one driver. 


McLuhan argued that technology, not content--how we communicate, not what we say--is determinant of impact. In other words, the actual content of media is irrelevant. Only the media form matters. 


source: Alan Hook, Slideshare 


We are never very far from accepting this sort of thinking. 


Consider the way policymakers, regulators, analysts and much of the general public agree that “broadband is a necessity” because it causes economic development, education and social inclusion. Policymakers and advocates often argue that faster broadband likewise drives higher economic growth. 


Correlation, though, is not causation. Virtually all government programs to close the digital divide are touted as important because--it is argued--broadband leads to economic growth. In fact, careful reports only use the word correlation, not “causation” when discussing broadband and economic growth. 


Of course, lots of things also correlate with economic growth. The rule of law, population density, educational attainment, household income, household wealth, transportation networks, proximity to the oceans, or other sources of comparative advantage are known to correlate with economic growth. 


The same sort of thinking might arguably be advanced for 5G, personal computing devices, some applications, blockchain, Web3 or the metaverse.


The phrase “X changes everything” is an example of such thinking. In place of “humans use tools” we get “Tools shape humans.” Again, most people would perceive a grain of truth; perhaps many grains. 


One might argue that air conditioning was responsible for the current economic resilience and growth of the U.S. South, for example. 


The point is that it is never inevitable that any technology “must or will succeed,” simply because it can be brought into existence. Any new successful technology succeeds because it solves real problems. 


Computing devices and 5G succeed because they solve real problems: the need to access the internet and communicate in the former case; the ability to support quality experiences in the latter case. 


It is said that the novel Upgrade contains a conversation between two people, discussing two-dimensional media: “I can’t watch the flats. Hurts my eyes.” “Me too. It’s unnatural.”


The novel is a warning about the dangers of the metaverse, to be sure. But the element of realism--whether something seems natural or lifelike or not--is among the reasons some of us might believe the metaverse ultimately will develop. 


Simply, the history of all electronic media is an evolution towards more realism. Realism might be defined as the sense that “you are there”; as realism increases, the experience approaches “real life”: three-dimensional, interactive, using multiple senses. 


Think about the experience of participating in a sport, watching others play a sport live in a stadium, watching it on television or listening on radio, viewing a photo of the game or hearing somebody talk about a great play during that game, reading a story about that game or viewing an artist’s rendition of a key moment in a game. 


The point is that there are degrees of immersion and realism, and that the degree of realism has tended to improve in the eras of electronic media. Consider augmented reality and virtual reality as part of that movement towards full metaverse. 


Though not perhaps inevitable, the history of electronic media suggests it is likely, simply because humans prefer greater realism in electronic media. That is why television displaced radio; why sound replaced “silent” movies; why color prevailed; why stereo and surround sound are popular; why HDTV replaced NTSC; why experiments with 3D experiences continue.


Saturday, September 17, 2022

Is Connectivity Business Ultimately Doomed?

Among the obvious changes in connectivity provider business models over the past 30 years is the diminished role of variable revenue, compared to fixed components. The companion change is a switch from usage-based charging to flat-rate pricing, independent of usage. 


Both those changes have huge consequences. The big change is that variable usage no longer can be matched to comparable revenue upside. In other words, higher usage of network resources does not automatically result in higher revenue, as once was the case. 


And that is why provisioned data capacity of networks keeps growing, even if revenue per account remains relatively flat. That also is why network capital investment has begun to creep up. 


So consider what happens when markets saturate: when every household and person already buys internet access, mobility service and mobile broadband. When every consumer who wants landline voice and linear video already buys it, where will growth come from?


The strategic answer has to be “new services and products.” Tactically, it might not matter whether revenue from such services is based on variable (consumption based) or fixed value (flat rate charge to use). Eventually, it will matter whether usage can be matched to variable charges for usage. 


Consider that most cloud computing services (infrastructure, services or platform) feature variable charges based on usage, even if some features are flat rated. For the most part, data center revenue models are driven by usage and variable revenue models. 


Connectivity providers have no such luxury. 


There always has been a mix: fixed charges for “lines and features” but variable charges for long distance usage in the voice era. In the internet era, the balance has shifted.


Consider what has happened with long distance voice, which is mostly variable, mobile service, which is partly variable, partly fixed, or internet access or video. 


Globally, mobile subscriptions, largely a “fixed” revenue stream--with flat rate recurring charges-- are a key driver of retail revenue growth. And though mobile internet access is mostly a flat rate service (X gigabytes for a flat fee), uptake is variable in markets where most consumers do not routinely use it.


source: Ericsson 


And make no mistake, mobile subscriptions, followed by uptake of mobile broadband, drive retail revenue in the global communications market. Fixed network broadband, the key service now provided by any fixed network, lags far behind. 


As early as 2007, in the U.S. market, long distance voice, which once drove half of total revenue and most of the profit, had begun its decline, with mobility subscriptions rising to replace that revenue source. 

source: FCC  


At a high level, that is mostly a replacement of variable revenue, based on usage, with fixed revenue, insensitive to usage.


As a practical matter, internet access providers cannot price usage of applications consumed as they once charged for international voice minutes of use. For starters, “distance” no longer matters, and distance was the rationale for differentiated pricing. 


Network neutrality rules in many markets prohibit differential pricing based on quality of service, so quality of connections or sessions is not possible, either. Those same rules likely also make any sort of sponsored access illegal, such as when an app provider might subsidize the cost of internet access used to access its own services. 


Off-peak pricing is conceivable, but the charging mechanisms are probably not available. 


It likely also is the case that the cost of metering is higher than the incremental revenue lift that might be possible, even if consumers would tolerate it. 


The competitive situation likely precludes any single ISP from moving to any such differential charging mechanisms, as well.


In other words, the cost of supporting third party or owned services, while quite differentiated in terms of network capacity required, cannot actually be matched by revenue mechanisms that vary based on anything other than the total amount of data consumed. 


Equally important, most ISPs do not own any of the actual apps used by their access customers, so there is no ability to participate in recurring revenues for app subscriptions, advertising or commerce revenues. 


All of that is part of the drive to raise revenues by having governments allow taxation of a few hyperscale app providers that drive the majority of data consumption, with the proceeds being given to ISPs to fund infrastructure upgrades.   


Ignore for the moment the different revenue per bit profiles of messaging, voice, web browsing, social media, streaming music or video subscriptions. Text messaging has in the past had the highest revenue per bit, followed by voice services.


Subscription video always has had low revenue per bit, in large part because, as a media type, it requires so much bandwidth, while revenue is capped by consumer willingness to pay. Assume the average TV viewer has the screen turned on for five hours a day.


That works out to 150 hours a month. Assume an hour of standard definition video streaming (or broadcasting, in the analog world) consumes about one gigabyte per hour. That represents, for one person, consumption of perhaps 150 Gbytes. Assume overall household consumption of 200 Gbytes, and a data cost of $50 per month.


Bump quality levels up to high definition and one easily can double the bandwidth consumption, up to perhaps 300 GB.  


That suggests a “cost”--to watch 150 hours of video--of about 33 cents per gigabyte, with retail price in the dollars per gigabyte range. 


Voice is something else. Assume a mobile or fixed line account represents about 350 minutes a month of usage. Assume the monthly recurring cost of having voice features on a mobile phone is about $20.


Assume data consumption for 350 minutes (5.8 hours a month) is about 21 MB per hour, or roughly 122 MB per month. That implies revenue of about $164 per consumed gigabyte. 
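The back-of-envelope arithmetic above can be made explicit; all inputs are the assumptions already stated (150 viewing hours, one gigabyte per SD hour, $50 a month; 350 voice minutes, about 21 MB per hour, $20 a month):

```python
# The article's own revenue-per-gigabyte arithmetic, made explicit.

# Video: cost per gigabyte consumed
video_hours = 150
video_gb = video_hours * 1.0       # ~1 GB per SD viewing hour
video_cost_per_gb = 50 / video_gb  # ~ $0.33 per GB

# Voice: revenue per gigabyte consumed
voice_hours = 350 / 60              # ~5.8 hours per month
voice_gb = voice_hours * 21 / 1000  # ~0.122 GB per month
voice_rev_per_gb = 20 / voice_gb    # ~ $163 per GB

print(f"video: ${video_cost_per_gb:.2f}/GB, voice: ${voice_rev_per_gb:.0f}/GB")
print(f"voice revenue per GB is ~{voice_rev_per_gb / video_cost_per_gb:,.0f}x video")
```

Voice thus earns on the order of 500 times more revenue per gigabyte than subscription video, which is the disparity the next paragraph describes.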


The point is that there are dramatic differences in revenue per bit to support both owned and third party apps and services. 

source: TechTarget 


In fact, the disparity between text messaging and voice and 4K video is so vast it is hard to get them all on the same scale. 


Sample Service and Application Bandwidth Comparisons

Segment | Application or Service Name | KB consumed
Consumer mobile | SMS | 0.13
Consumer mobile | MMS with video | 100
Business | IP telephony (1-hour call) | 28,800
Residential | Social networking (1 hour) | 90,000
Residential | Online music streaming (1 hour) | 72,000
Consumer mobile | Video and TV (1 hour) | 120,000
Residential | Online video streaming (1 hour) | 247,500
Business | Web conferencing with webcam (1 hour) | 310,500
Residential | HD TV programming (1 hour, MPEG 4) | 2,475,000
Business | Room-based videoconferencing (1 hour, multi codec telepresence) | 5,850,000

source: Cisco


At a high level, as always is the case, one would prefer to operate a business with the ability to price according to usage. Retail access providers face the worst of all possible worlds: ever-growing usage and essentially fixed charges for that usage. 


Unless variable usage charges return, to some extent, major market changes will keep happening. New products and services can help. But it will be hard for incrementally small new revenue streams to make a dent if one assumes that connectivity service providers continue to lose about half their legacy revenues every decade, as has been the pattern since deregulation began. 


Consolidation of service providers is already happening. A shift of ownership of digital infrastructure assets is already happening. Stresses on the business model already are happening. 


Will we eventually see a return to some form of regulated communications models? And even if that is desired, how is the model adjusted to account for ever-higher capex? Subsidies have always been important. Will that role grow? 


And how might business models adjust to accommodate more regulation or different subsidies? A delinking of “usage” from “ability to charge for usage” makes answers for those questions inevitable, at some point. 


How many businesses or industries could survive 40-percent annual increases in demand and two-percent annual increases in revenue?
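A quick sketch of that closing question, compounding the two growth rates over a decade:

```python
# Compounding 40% annual demand growth against 2% annual revenue growth.

years = 10
demand_growth = 1.40 ** years   # ~29x the starting demand
revenue_growth = 1.02 ** years  # ~1.22x the starting revenue

revenue_per_unit_demand = revenue_growth / demand_growth
print(f"demand: {demand_growth:.0f}x, revenue: {revenue_growth:.2f}x")
print(f"revenue per unit of demand falls ~{1 - revenue_per_unit_demand:.0%}")
```

After ten years, revenue earned per unit of demand falls by roughly 96 percent: the squeeze no ordinary business could survive.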


Whatever the Eventual Impact, Telecom Execs Say They are Investing in AI

With the caveat that early reported interests, tests, trials and investments in new technology such as artificial intelligence--especially t...