Sunday, October 31, 2021

Big Change in FTTH Business Case in Rural Areas?

Could the business case for U.S. fiber to the home in historically-challenged settings improve sharply because of government action? Certainly. All that matters is the level of subsidies we are willing to provide. 


George Ford, economist at the Phoenix Center for Advanced Legal and Economic Public Policy Studies, argues that about 9.1 million U.S. locations are “unserved” by any fixed network provider.  


“According to my calculations, if the average subsidy is $2,000 (which is the average of the RDOF auction), then the additional subsidy required to reach unserved households is $18.2 billion,” Ford argues. “If the average subsidy level is $3,000, then $22.8 billion is needed. And at a very high average subsidy of $5,000, getting broadband to every location requires approximately $45.5 billion.”
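
The arithmetic is simple multiplication of unserved locations by the assumed per-location subsidy. A minimal sketch in Python, using Ford's 9.1 million unserved locations (the per-location amounts are illustrative), reproduces the $18.2 billion and $45.5 billion figures:

# Rough check of the subsidy totals; inputs are estimates from the quoted analysis.
unserved_locations = 9_100_000          # Ford's estimate of unserved U.S. locations

for per_location_subsidy in (2_000, 5_000):
    total = unserved_locations * per_location_subsidy
    print(f"${per_location_subsidy:,} per location -> ${total / 1e9:.1f} billion")

# $2,000 per location -> $18.2 billion
# $5,000 per location -> $45.5 billion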


That amount possibly could be authorized by the U.S. Congress, and would dramatically change the business case for FTTH in rural areas. 


Given a big-enough subsidy, service providers would have a far easier time justifying FTTH even in quite rural areas where the present subsidy system has been deemed insufficient to incentivize construction. 


source: Cartesian


Such an extensive subsidy system would change the FTTH business model for all telcos operating in rural areas, might affect cable operators and also could reduce demand for satellite and fixed-wireless services. 


All that is just a reminder that for every public policy there are corresponding private interests that are helped or harmed.


"Digital Infrastructure" Can Mean "Everything" But Then Means "Nothing"

Digital infrastructure is a term increasingly used by suppliers of connectivity services. But digital infrastructure arguably includes the rest of the information and communications ecosystem as well. To the extent that digital infrastructure includes connectivity, computing, applications and services as well as devices, it is synonymous with “internet ecosystem.”

source: AIIB


When used in the narrower sense of “connectivity infrastructure,” however, some connectivity assets are of growing interest to private equity, institutional investors and others, as a form of alternative investment providing diversification. 


Just as investment organizations sometimes wish to hold assets such as land or real estate, they now sometimes wish to own infrastructure assets that throw off predictable cash flows, enjoy business moats and see stable, recurring demand. 


That interest on the part of buyers also accounts for service provider interest in monetizing some parts of their access infrastructure, steps that might not have been deemed wise decades ago, when ownership of scarce facilities was viewed as a primary source of business advantage. 



source: AIIB


But digital infrastructure now sees a confluence of supply and demand interest, as much of the private equity industry views investments in digital infrastructure the same way it views other long-lived infrastructure (transportation, utilities, real estate): as a reliable provider of long-term cash flows. 


So in addition to investments in infrastructure related to clean energy, water, and wastewater, some investors see digital infrastructure as part of the mix. 


The capital-intensive nature of these assets also creates barriers to competition, while demand growth is robust. Still, infrastructure investing--digital or not--is an asset class expected to deliver low returns, but with low volatility as well. 


At the same time, interest in alternative asset classes and low dividend yields on bonds contribute to the interest in digital infrastructure asset investment, and the trend to monetize such assets on the part of service providers. 


There also is corresponding interest on the part of infrastructure owners in monetizing assets, driven in large part by increasing capital requirements and declining returns on investment. 


In Asia, for example, communications infrastructure faces higher capital intensity and yet lower returns on invested capital. In that sense, it faces potential investment gaps similar to those of other infrastructure categories. 


source: AIIB


That accounts for the prevalence of interest in “capital light” business models, public-private partnerships, wholesale access service models and privatization of assets. 


Saturday, October 30, 2021

Data Centers Now Drive Half of Demand for WAN Capacity

There are lots of reasons why hyperscale data centers are such a huge driver of global wide area network business models. 


Hyperscale and other data centers now drive as much as half of all data traffic across wide area networks. In 2021, traffic moving between data centers represents roughly as much traffic as that generated by internet users. 


So half of total demand for WAN capacity now is driven directly by data centers that need to connect to other data centers. To be sure, local access facilities are required, whether traffic is bound for an actual end user location or moving between data centers. 

source: Cisco 


But access facilities supporting data-center-to-data-center traffic are quite concentrated. Access facilities supporting consumers are ubiquitous.  


That was not the case two decades ago. 


Also, hyperscale “computing/storage/software as a service” suppliers now directly supply some “connectivity” services that otherwise might be sold by public telecommunications providers. 


AWS, for example, sells private connections, content delivery and virtual private network services. Some estimate that such networking services represent as much as 25 percent of total AWS revenues.


In other words, AWS earns significant revenues from selling WAN networking services to its customers. That drives even more hyperscaler activity in the WAN market. 


Maybe Quality, not Quantity, Now is the Issue for Broadband Access

Figuring out what percentage of persons or homes have “broadband” access requires several assumptions about locations (households, occupation rates, household density) and buy rates. 


To really understand adoption rates, one also has to back out business accounts and locations. Also, one has to adjust for platforms, since fixed networks provide one solution, but mobility is a substitute in some cases. In other words, some homes and people use a “mobile-only” approach to their broadband needs. 


Bulk accounts (colleges and universities with living facilities) also play some part in adding to “homes with broadband services being used.” It is probably less material than the errors from all the other assumptions. 


“Quality” and “price” require more assumptions. Even homes buying--or able to buy--fiber to home services are not the same. Some connections support gigabit per second speeds, others perhaps offer speeds that top out in the hundreds of megabits per second. So “speed” and “access media” are not synonymous, at least at the moment.


Nor does access media tell us much about upstream bandwidth. 


“Affordability” and “price” likewise require assumptions, since nobody has granular full market data about the actual packages customers buy, and whether posted retail rates actually reflect the real prices paid, including all promotions and discounts. 


Beyond that, “price” has a relative element. Some would argue that what matters is the relative cost of broadband access compared to household income or disposable income. Is it a larger or smaller percentage of household budgets?


Nor do we necessarily need to rely on government data that often is inaccurate to some degree because it is two years old or uses a methodology which is not granular enough. 


There were perhaps 111.9 million U.S. fixed network broadband accounts at the end of the second quarter 2021, according to Leichtman Research Group. That presumably includes both consumer and business accounts. 


Point Topic estimates there are a total of 120 million or so U.S. fixed broadband accounts in service. Separately, consumer adoption is estimated at about 86 percent of households or persons, depending on which methodology is used (active connections to places, or people with active connections).


Some estimate occupied households to number about 125 million units.   

source: Point Topic 


If so, then about 107.5 million accounts are consumer subscriptions, not counting perhaps 4.4 million fixed network business accounts. 


But it also is important to remember that fixed network subscriptions understate the number of people with broadband at their homes, as most U.S. households are multi-person. Perhaps a quarter of all occupied housing has a single occupant. The other 75 percent are multi-person households.


Using a 2.6-persons-per-household average, for example, would suggest 279.5 million people have broadband access at home. Total U.S. population in 2020 is estimated at 334.5 million, roughly confirming the estimate that 84 percent or so of people buy broadband. 
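
A minimal sketch in Python of that conversion, assuming the 107.5 million consumer accounts and 2.6 persons per household cited above (both are estimates):

# Convert consumer fixed broadband accounts into people covered at home.
consumer_accounts = 107_500_000     # estimated U.S. consumer fixed broadband subscriptions
persons_per_household = 2.6         # assumed average household size
population_2020 = 334_500_000       # estimated total U.S. population

people_covered = consumer_accounts * persons_per_household
print(f"{people_covered / 1e6:.1f} million people, "
      f"{people_covered / population_2020:.0%} of the population")
# 279.5 million people, 84% of the population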


Whether calculating fixed broadband access by locations with connections or persons with connections, the estimates are congruent, which I find somewhat surprising (again, looking only at fixed network accounts, and ignoring mobile access). 


source: Statista 


It has been estimated that 15 percent to 20 percent of homes are mobile-only for internet access. In the U.S. market that might mean 18.8 million to 25 million homes. Adding those figures to the estimated 107.5 million consumer fixed network accounts yields totals that exceed the total number of U.S. homes. 


So one or more of our assumptions must be off: there are more occupied homes than we think; there are more homes than we think; there are fewer mobile-only homes than we think; the fixed network accounts are overstated; or some combination of those issues applies. Also, we tend to ignore some percentage of highly-rural consumer locations that rely on satellite access, as well as the changes in that market as new satellite constellations go commercial. 
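
A quick sketch of that reconciliation, using the estimates above (125 million occupied homes, 107.5 million fixed consumer accounts, 15 percent to 20 percent mobile-only homes), shows how the totals overshoot:

# Reconcile fixed accounts plus mobile-only homes against occupied homes.
occupied_homes = 125_000_000
fixed_consumer_accounts = 107_500_000

for mobile_only_share in (0.15, 0.20):
    mobile_only_homes = occupied_homes * mobile_only_share
    implied_total = fixed_consumer_accounts + mobile_only_homes
    print(f"{mobile_only_share:.0%} mobile-only: "
          f"{implied_total / 1e6:.1f} million connected homes vs "
          f"{occupied_homes / 1e6:.0f} million occupied homes")

# 15% mobile-only: 126.2 million connected homes vs 125 million occupied homes
# 20% mobile-only: 132.5 million connected homes vs 125 million occupied homes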


Or perhaps the issue now is the quality of connections, not the coverage. Some people and some households simply do not wish to use the internet, though the percentage seems to shrink every year. That being the case, a “100-percent take rate” is some number less than the total number of homes. 


Perhaps the issue in the U.S. market is more “quality” than “availability.”


Friday, October 29, 2021

Metaverse Drives Computing to the Edge

The name change from Facebook to Meta illustrates why remote computing and computing as a service are incorporating computing at the edge. 


“The metaverse is a shared virtual 3D world, or worlds, that are interactive, immersive, and collaborative,” says Nvidia. 


Facebook says “the metaverse will feel like a hybrid of today’s online social experiences, sometimes expanded into three dimensions or projected into the physical world.”


As 3D in the linear television world was highly bandwidth intensive, so metaverse applications are expected to need lots of bandwidth. As fast-twitch videogaming relies on low-latency response, so metaverse applications will require very low latency. And as web pages are essentially custom built for each individual viewer based on past experience, so metaverse experiences will be custom built for each user, in real time, often requiring content and computing resources from different physical locations. 


All of that places new emphasis on low-latency response and high-bandwidth computing and communications network support. Metaverse experiences also will be highly compute intensive, often requiring artificial intelligence. 


As with other earlier 3D, television, high-quality video conferencing apps and immersive games, metaverse experiences also require choices about where to place compute functions: remote or local. Those decisions in turn drive decisions about required communications capabilities. 


Those choices always involve cost and quality decisions, even as computational and bandwidth costs have fallen roughly in line with Moore’s Law for the last 70 years. 


source: Economist, Whats the Big Data


As low computational costs created packet switching and the internet, so low computational costs support both remote and local computing. Among the choices app designers increasingly face are latency performance and communications cost. Local resources inherently have the advantage in latency performance, and the cost of wide area bandwidth also can be a material issue for remote computing. Energy footprint also varies between local and remote computing.  


On the other hand, remote computing means less investment in local servers. The point is that “remote computing plus wide area network communications” is a functional substitute for local computing, and vice versa. When performance is equivalent, designers have choices about when to use remote computing and local, with communications cost being an integral part of the remote cost case. 
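
A toy Python comparison of the two placements, with every cost figure below a hypothetical placeholder rather than any provider's actual pricing, illustrates how WAN cost shifts the answer:

# Hypothetical, illustrative numbers only -- not real vendor or cloud pricing.
monthly_traffic_gb = 50_000           # traffic the workload moves over the WAN if remote
local_monthly_cost = 1_200            # amortized on-premises server, power, maintenance
remote_compute_monthly_cost = 900     # equivalent remote (cloud) compute
wan_cost_per_gb = 0.02                # assumed WAN / egress charge per gigabyte

remote_total = remote_compute_monthly_cost + monthly_traffic_gb * wan_cost_per_gb
print(f"local: ${local_monthly_cost:,}/month  remote: ${remote_total:,.0f}/month")
# With these placeholders the WAN charge ($1,000) makes remote the costlier option;
# cut the traffic by a factor of ten and remote becomes cheaper.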


Metaverse use cases, on the other hand, are driven to the edge (local) for performance reasons. Highly compute-intensive use cases with low-latency requirements are, in the first instance, about performance, and then secondarily about cost. 


In other words, fast compute requirements and the volume of requirements often dictate the choice of local computing. And that means metaverse apps drive computing to the edge. 


source: Couchbase

Thursday, October 28, 2021

Virtualized Networks Add Complexity, in Some Ways

Virtualized networks--including 5G--are in some ways more complex than legacy, single-vendor networks. Separating hardware platforms from software often involves the physical equivalent of containerizing and segmenting functions, decomposing them into additional "layers."


Reduced cost and complexity are among the hoped-for advantages of virtualization and open approaches to infrastructure.


But at least early on, integration costs and chores could be substantial, undermining the business cases. There is a cost to integrating hardware and software elements from many suppliers while still achieving the same performance and time to market as an integrated, single-vendor network. 

source: Ericsson, Analysys Mason


To be sure, complexity varies. Virtualizing software functions and separating those functions from the hardware, in a single-vendor environment, is one thing. Supporting multiple vendors on a mix-and-match basis--such as is desired for open radio access networks--adds more complexity. 


Open radio access networks have faced some commercial skepticism from service providers that typically need “bulletproof” infrastructure and find current open RAN platforms incomplete, not fully vetted and tested.


The new cost and complexity management related to open network approaches therefore remains an issue, early in the adoption process.

And while cost savings are a key driver of open approaches, lower capex is not the only attraction. Agility is the other hoped-for advantage.


source: Ericsson, Analysys Mason


Still, service providers are going to be careful about rushing into more-open networks, for the simple reason that the reliability, availability and consistency of their networks are deemed so essential. 


So some might not be surprised if it is another decade before open radio networks, for example, are considered typical. The same is likely to apply to many other efforts to create open functions across the rest of the network. 



Wednesday, October 27, 2021

Cable TV and ISPs Still Get No Love from Customers

For as long as I can remember, and that is 40 years, cable TV services have been unpopular and unloved in customer satisfaction surveys. That same unhappiness now applies to internet service providers as well. One way of illustrating that is the net promoter score. 


The foundation of the net promoter score is the answer to the question “how likely is it that you would recommend this event to your colleagues and friends?” 


source: Customer Experience Update 


The score is calculated by comparing numbers of “promoters” to the numbers of “detractors.”


For example, if 10 percent of respondents are Detractors, 20 percent are Passives and 70 percent are Promoters, the NPS score would be 70 - 10 = 60.
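
A minimal Python sketch of that calculation (the respondent percentages are the illustrative ones above):

# Net promoter score: percent promoters minus percent detractors.
def nps(promoters: int, passives: int, detractors: int) -> int:
    total = promoters + passives + detractors
    return round(100 * (promoters - detractors) / total)

print(nps(promoters=70, passives=20, detractors=10))  # -> 60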


source: Attendeaze 


Expected NPS scores for events tend to be high, compared to many business-to-consumer products. An event tends to get 45 percent “excellent” scores, for example. 

source: Retently 


AI Will Improve Productivity, But That is Not the Biggest Possible Change

Many would note that the internet impact on content media has been profound, boosting social and online media at the expense of linear form...