Saturday, April 15, 2023

Unknown "Homes Passed" Data Hampers Revenue Growth Estimates

Some important types of statistics and data are not collected because governments do not require firms or industries to collect them. For example, many governments think it is important to track data on where home broadband exists, where it does not, how fast it operates and who buys and who does not. 


Private firms often have important incentives to track and measure their own revenues, sales, profit margins and growth rates. Financial markets and accounting rules often require measurement of this sort. 


AT&T, for example, reports revenues for mobility, fixed network business revenues and consumer fixed network revenues from internet access, voice and other sources. But those are traditional financial metrics, not operating indices such as penetration or take rates, churn rates and new account gains. 

source: AT&T 


Nobody seemingly believes the same effort should be made to measure the number of home broadband provider locations or dwellings reached by various networks. Better mapping, yes. Metrics on locations passed? No. 


And yet “locations passed” is a basic and essential input to accurately determine take rates (percent of potential customers who actually buy). That input matters quite a lot to observers when evaluating the growth prospects of competitors, even if that figure does not matter much for policymakers, who mainly care about the total degree of home broadband take rates, on an aggregate basis. 


The U.S. Census Bureau, for example, reported some 140.5 million housing units as part of the 2020 census. The estimate for 2021 is 142.2 million units. Assume 1.5 million additional units are added each year, for a 2022 total of about 143.6 million dwelling units.


Assume vacancy rates of about six percent. That implies about 8.6 million unoccupied units that would not be assumed to be candidates for active home broadband subscriptions. The U.S. Census Bureau, though, estimates there are about 11 million unoccupied units when looking at full-time occupied status. That figure presumably includes vacation homes.


Deducting the unoccupied dwellings gives us a potential home broadband buyer base of about 132.6 million locations. 
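The arithmetic above reduces to a quick calculation. A minimal sketch, using the article's own estimates (the unit counts and vacancy figures are estimates, not authoritative data):

```python
# Estimate the potential U.S. home broadband buyer base.
units_2022 = 143.6e6   # ~140.5M in the 2020 census, plus ~1.5M added per year
unoccupied = 11.0e6    # Census Bureau full-time-occupancy estimate; includes vacation homes
potential_buyers = units_2022 - unoccupied
print(f"Potential buyer base: {potential_buyers / 1e6:.1f} million occupied homes")
```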


More difficult is the degree to which access networks operated by any single contestant actually pass those locations, as firms generally do not report such numbers in quarterly financial or annual reports (they do not have to do so). 


And that is where estimations must be made. AT&T’s 2022 10-K report cites 14.2 million customer locations connected. Assume AT&T has about 20 percent take rates for its home broadband services where it operates. That implies a housing unit coverage of about 71 million dwellings. 


Assume AT&T has a higher take rate of about 39 percent where it operates fixed networks. That implies housing coverage of about 36 million dwellings. 


The estimate of 71 million home passings strikes me as too high, but the estimate of 36 million seems too low. In the past I have used the figure of 62 million homes passed for AT&T. 


Assume Verizon has about 10 million home broadband accounts, with a take rate of 40 percent (a bit high, probably, if we include copper access). That implies housing coverage of some 25.3 million dwellings. 


Leichtman Research Group has estimates of home broadband accounts that vary from company reports. LRG estimates that AT&T has some 15.4 million internet access accounts. The variance might come from business accounts not enumerated. 


Verizon’s consumer accounts might be overstated, as LRG estimates Verizon has about 7.5 million home broadband accounts, not 10 million. Using the LRG account figures, we might estimate Verizon home coverage of about 18.8 million homes, on the high side. 
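Since homes passed is simply subscribers divided by an assumed take rate, the estimates above reduce to one line of arithmetic. A sketch using the figures cited in the text (the take rates are assumptions, not reported data):

```python
def homes_passed(subscribers: float, take_rate: float) -> float:
    """Implied homes passed, given subscriber counts and an assumed take rate."""
    return subscribers / take_rate

# AT&T: 14.2M connected locations, at assumed 20 percent or 39 percent take rates
print(f"AT&T, low take rate:  {homes_passed(14.2e6, 0.20) / 1e6:.0f} million passed")
print(f"AT&T, high take rate: {homes_passed(14.2e6, 0.39) / 1e6:.0f} million passed")

# Verizon, using the LRG account estimate of about 7.5M at a 40 percent take rate
print(f"Verizon (LRG basis):  {homes_passed(7.5e6, 0.40) / 1e6:.1f} million passed")
```

The spread between the low and high take-rate assumptions is exactly why the 71 million and 36 million AT&T figures bracket, rather than pin down, the real number.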


ISPs                        Subscribers at end of 2022    Net Adds in 2022

Cable Companies
Comcast                     32,151,000                    250,000
Charter                     30,433,000                    344,000
Cox*                        5,560,000                     30,000
Altice                      4,282,900                     (103,300)
Mediacom*                   1,468,000                     5,000
Cable One**                 1,060,400                     14,400
Breezeline**                693,781                       (22,997)
Total Top Cable             75,649,081                    517,103

Wireline Phone Companies
AT&T                        15,386,000                    (118,000)
Verizon                     7,484,000                     119,000
Lumen^                      3,037,000                     (253,000)
Frontier                    2,839,000                     40,000
Windstream*                 1,175,000                     10,300
TDS                         510,000                       19,700
Consolidated**              367,458                       724
Total Top Wireline Phone    30,798,458                    (181,276)

Fixed Wireless Services
T-Mobile                    2,646,000                     2,000,000
Verizon                     1,452,000                     1,171,000
Total Top Fixed Wireless    4,098,000                     3,171,000

Total Top Broadband         110,545,539                   3,506,827

source: Leichtman Research Group 


Assume Comcast has 31.2 million accounts, with take rates for home broadband of about 52 percent. That implies something on the order of 60 million households. 


Assume Charter Communications has a take rate of about 45.5 percent where it operates fixed networks. Assume Charter has approximately 30.8 million home broadband accounts. That implies a homes-passed figure of about 67.7 million homes. 


If there are 132.6 million U.S. occupied home locations, then Comcast and Charter can reach about 127.7 million of those locations, or about 96 percent of total, as Comcast and Charter essentially have unduplicated networks, not competing in the same geographies. 


That strikes me as unlikely, on the high side. An older rule of thumb is that Comcast and Charter reach about a third of total U.S. locations, each, for a possible reach of up to 66 percent of total U.S. home locations. 
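The sanity check in the last two paragraphs can be made explicit. A sketch comparing the take-rate-implied passings against the one-third rule of thumb (account figures and take rates are the estimates used above):

```python
occupied_homes = 132.6e6

# Homes passed implied by accounts / assumed take rate
comcast = 31.2e6 / 0.52      # about 60 million
charter = 30.8e6 / 0.455     # about 67.7 million

implied_share = (comcast + charter) / occupied_homes
print(f"Implied combined coverage: {implied_share:.0%}")   # ~96 percent, implausibly high

# Older rule of thumb: each reaches roughly one third of U.S. locations
rule_of_thumb_share = 2 / 3
print(f"Rule-of-thumb coverage:    {rule_of_thumb_share:.0%}")
```

When the implied figure lands well above the rule of thumb, the likely culprit is an overstated take rate, an understated overlap, or both.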


Using different methodologies, I have in the past estimated that Comcast has (can actually sell service to) about 57 million homes passed, while the Charter Communications network passes about 50 million homes, the number of potential customer locations it can sell to.


Verizon homes passed might number 18.6 to 20 million. To be generous, use the 20 million figure. 


AT&T’s fixed network represents perhaps 62 million U.S. homes passed. CenturyLink never reports its homes-passed figures, but likely has 20 million or so consumer locations it can market services to. 


Ignoring the variance in potential customer locations passed, AT&T would seem to have the greatest opportunity in the home broadband space, if it can build optical access connections faster, as it has the biggest home footprint and low home broadband market share. 


On the other hand, AT&T revenue is driven by mobility, not the consumer fixed network. So the question has to be posed as "how much to invest in the consumer fixed network?" compared to other opportunities. A rational person might argue the answer is "not so much."


Capital availability--and financial returns--are always the issue. Even if it dramatically escalated fiber-to-home capital investment, it is not clear AT&T would gain as much new revenue, compared to investing in mobility or business services, for example.


The point of the wider exercise is that we are forced to guess about how many homes each of the major fixed network contestants actually can reach. That, in turn, affects our ability to estimate adoption rates and potential growth opportunities. 


The key point is that the estimates are imprecise. Pinning down the “homes passed” figure, essential as the denominator in any calculation of take rates, requires estimations with variable degrees of uncertainty, especially for the larger networks.


Product and Customer "Who, What, Where, When, Why" is Evolving

What you sell matters. How you sell it also often matters. Who you sell it to, and where you sell it, also matters. How much you sell always matters. What it costs you to sell those items also always matters.


The rise of digital infrastructure partly raises the issue of how best to organize the production and sale of internet access and other connectivity products, as cloud computing has changed the way we think about how to procure and supply computing and applications.


To a greater extent than ever, asset owners and analysts evaluate the merits of asset-light or asset-lighter approaches. That changes the answers to the questions of what, who, where, when, why products get sold, as well as how much and how profitably.


Access providers tend to trade at valuation multiples of operating earnings as much as three times lower than infrastructure-only providers such as tower companies. 


Several aspects seem to account for the disparities. Tower companies sell to all competitors in a market, and therefore are viewed as representing less risk, as the tower companies can theoretically address nearly 100 percent of the market.


No single retail telco or internet service provider ever can claim to acquire as customers more than a fraction of the total market. Additionally, tower companies sell multi-year contracts, often with price escalator clauses to protect against inflation. 


That offers the sort of cash flow predictability that investors value in utility type businesses ranging from electrical and natural gas retailers to airports and toll roads. Also, cell tower assets offer some protection against unrestrained new competition. 


source: Deloitte 


Data center assets also are viewed as having similar characteristics, though perhaps with less moat protection, as, in principle, additional data centers can be built at the same locations. 


Still, there are but a handful of hyperscalers who are potential data center tenants, so there are some moats in that regard. But the total range of enterprise and business tenants is far broader. 


While additional cell towers can be built at similar locations, the range of potential tenants is more limited: there might be only a handful in any market. 


For such reasons, data center assets might show a broader range of valuations, but still be much higher than EBITDA valuations for access providers. 


source: Oliver Wyman

Friday, April 14, 2023

Why Many "Digital" Firms Must Use Non-Traditional Proxies for Success

There is a good reason why many applications use non-traditional metrics for measuring progress: the sources of their value do not map with traditional financial metrics in a direct way. We sometimes think that is true of “software” companies or “digital” firms, but that is partly a coincidence. 


The internet, layered software and virtualization allow most firms to conduct their existing businesses in different ways, without changing the traditional performance metrics. 


Platform businesses, on the other hand, necessarily must account for their success at creating ecosystems of value creation, which are not measurable using standard accounting conventions. 


That is why we often see metrics that are proxies for user engagement, such as daily active users or monthly active users. We might see citations of time spent on the platform. Perhaps we see data on conversions of visits to sales of merchandise. 


Uber might cite gross bookings paid by rider, or the number of trips riders take, the number of drivers or the number of riders per day or month. 


Airbnb might cite nights booked, gross booking value, host earnings or average daily rates as evidence of success. 


Amazon Marketplace and eBay will cite gross merchandise volume. Amazon might point to units sold or customer satisfaction ratings. 


Ebay might track active buyers or seller ratings. 


Since network effects are critical, we might see numbers about growth in the number of producers, merchants, properties, drivers, listings. We might see evidence of success in terms of growing gross merchandise sales, rides, rentals or other metrics about buying volume. 


User abandonment of the platform also could matter, so we might see evidence provided about churn rates declining. 


Connecting domains in the internet era provides an example of the “death of distance” in wide area networking. 


But non-GAAP metrics have grown in importance, even for firms not using platform business models. Competitive local service providers once cited metrics such as “voice grade equivalents” to show sales progress, at a time when service providers were early into the process of measuring bandwidth supplied--rather than voice lines in service--as a proxy for performance. 


Average revenue per account, or average revenue per unit, now are proxies for progress in boosting revenue. Churn rates also became important in competitive markets, where lost customer accounts also tended to mean that a competitor gained an account. 


For similar reasons, customer acquisition cost became an important and relevant metric. 
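These competitive-market proxies are straightforward to compute. A sketch, with figures invented purely for illustration:

```python
# Hypothetical monthly figures, for illustration only
starting_accounts = 1_000_000
lost_accounts = 15_000
gross_adds = 25_000
sales_and_marketing_spend = 10_000_000   # dollars
monthly_revenue = 60_000_000             # dollars

churn_rate = lost_accounts / starting_accounts      # share of accounts lost per month
arpu = monthly_revenue / starting_accounts          # average revenue per account
cac = sales_and_marketing_spend / gross_adds        # customer acquisition cost per gross add

print(f"Churn: {churn_rate:.1%}, ARPU: ${arpu:.0f}, CAC: ${cac:.0f}")
```

Note that in a saturated, competitive market, churn and CAC together largely determine whether an account is ever profitable, which is why these proxies displaced simple line counts.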


These days, marketing battles are fought over metrics such as internet access speed and outage performance, rather than voice quality. 


Consider also that wide area transport of data was charged using distance and capacity as the cost drivers. These days, distance is basically not a significant driver of cost. Instead, interconnection bandwidth tends to drive prices. 


In fact, large domains often agree to “peer” without major recurring cost, exchanging traffic between domains without costs related to traffic volume, as it is expected that inbound and outbound traffic will roughly balance on an annual basis. 
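The settlement-free peering logic can be sketched as a simple ratio test. Real peering policies vary by network, so the 2:1 threshold here is illustrative only:

```python
def peering_balanced(inbound_tb: float, outbound_tb: float,
                     max_ratio: float = 2.0) -> bool:
    """Illustrative test: settlement-free peering is typically offered only when
    annual traffic exchanged stays within some ratio (a 2:1 limit is assumed here)."""
    hi = max(inbound_tb, outbound_tb)
    lo = min(inbound_tb, outbound_tb)
    return hi / lo <= max_ratio

print(peering_balanced(900, 800))   # roughly balanced -> True
print(peering_balanced(900, 300))   # 3:1 asymmetry -> False
```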


None of those are standard financial reporting categories, but they are important proxies for business success.


Thursday, April 13, 2023

Do ISPs Really Suffer From High Data Demand?

When access providers complain that their revenues are not keeping pace with the infrastructure they must build to support their own customers' requirements, that argument arguably makes more sense for fixed network home broadband than for mobile internet access, judging by mobile data consumption and mobile operator service revenues as analyzed by Omdia. 


Granted, mobile networks are less capital intensive than fixed access networks, and traffic is growing faster than revenue. But mobile data consumption and mobile data revenue growth are not inversely related: the revenue trend still is positive. 


source: LightReading, Omdia


Some of us might argue that even that trend is misleading, in the sense that it is mobile operator customers who are invoking delivery of nearly all that data, as they watch streaming video, listen to streaming audio or interact with image-rich social media sources.  


To the extent that networks are supposed to compensate each other for use of facilities, the asymmetrical flow of data between customers on one network--who request content delivery from sources on other networks--might be viewed as an application of the “calling party pays” arrangement. 


In such content sessions, it is the calling party that creates the data demand by invoking delivery of content from a remote network. Only by ignoring that reality can it be claimed that the content delivery network is the “calling party.”


Assume the internet value chain represented about $6.7 trillion in annual revenues, as estimated by Kearney and GSMA. Assume internet access revenue was about 15 percent of that total amount, or about $1 trillion earned each year providing internet access. 


Assume global revenue earned by “telcos” was about $1.5 trillion, as estimated by IDC. That implies that as much as 66 percent of total telco and ISP revenue earned annually was generated by internet access.
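The arithmetic behind the 66 percent figure is as follows (the Kearney/GSMA and IDC totals are the estimates cited above, used as assumptions):

```python
value_chain_revenue = 6.7e12    # internet value chain, annual revenue (Kearney/GSMA estimate)
access_share = 0.15             # internet access share of the value chain
access_revenue = value_chain_revenue * access_share   # about $1 trillion per year

telco_revenue = 1.5e12          # global telco revenue (IDC estimate)
share_of_telco = access_revenue / telco_revenue
print(f"Access share of telco/ISP revenue: {share_of_telco:.0%}")   # roughly two thirds
```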


Even if that is a high estimate, it suggests the importance of internet access for telco and ISP revenue and profits. 


It is impressionistic, but even if data demand grows at faster rates than access revenue, logic would suggest that profit margins for internet access are likely higher than for many other services including business data networking, video services or mobile messaging and voice. 


Perhaps only legacy voice services, which generally speaking are harvested, requiring minimal new capital investment, might have margins higher than internet access. 


Some 40 years ago, linear video might have produced margins in the 40-percent range, where today most providers would be lucky to see 10-percent margins. 


By some estimates voice service margins in 1980 were as high as 50 percent. Mobile voice margins might have been in the 30-percent range in 1980 and might be as low as five percent today. 


But even if we use a blended rate of 10 percent for ISP and telco service margins, internet access, as the largest product category, still produces the greatest volume of profits. 


Again, it might only be illustrative, but ISPs might well be earning only average profits on their internet access services, while deriving up to two thirds of gross revenues from those services. 


We can argue about the cost of delivering capacity now, compared to 49 years ago, but nobody would question that the cost to deliver a bit has declined dramatically over that period. 


In that sense, the total capacity demand generated by an ISP’s customers might not matter as much as often portrayed. What matters more is the contribution internet access makes to total revenues and profit margins. 


To the extent that traffic asymmetries exist between access providers in different regions, those traffic flows are mostly dictated by the location of content servers and the end user locations where people are requesting delivery of content. 


So whether one agrees that content delivery is a remote response to a local customer’s requests, or is an unrelated part of a single session, it is not so clear that ISPs literally have a broken business model. As content servers are deployed “closer to the edge” over time, asymmetrical data flows arguably could be reduced.


Tuesday, April 11, 2023

Video Calling Now Among the Top-three Mobile Phone Activities

Technology forecasts can be notoriously incorrect, often not on substance but on timing and adoption. Consider video calling, something that consumers do fairly regularly now.


Despite the many predictions about video calling, that use case was not a mass market and routine use case until recently, when the Covid pandemic forced people to start using it. 


AT&T, for example, demonstrated Picturephone at the 1964 World’s Fair, but adoption never happened. 


source: Time


Not until the Covid pandemic forced people to work from home and students to learn from home was there mass adoption of video calling services such as Zoom.


A recent survey of user behavior in 2021, however, shows that video calling follows only instant messaging and voice calling as an activity on a mobile phone.   


source: GSMA Intelligence

How Much is Home Broadband About Physical Media?

Knowing what physical media is used by an access network does not necessarily tell one much about actual capacity or expected customer speed experiences, on any access network. Nor does physical media necessarily drive customer choices in an exclusive way. 


Personally, I’d buy a gigabit service provided over any network rather than an FTTH service supplying less capacity than that. Media does not matter, in that regard. Of course, price, upstream capacity and other issues play a part in such decisions. 


The point is that we sometimes fetishize FTTH, when we should be looking also at speed and other elements of the customer experience. Before FTTH became available, I’d assumed most people would prefer to buy it. In the abstract, that makes a good deal of sense: it’s the better network, right?


But price-value relationships matter. FTTH availability is one matter; buying decisions are driven by a much-wider set of considerations. 


Even though we conventionally assume fiber to home is much faster than copper access, with other platforms such as geostationary satellite, low earth orbit satellite, fixed wireless or hybrid fiber coax somewhere between copper and fiber home broadband platforms, FTTH networks can be activated at a range of speeds. In some cases, FTTH might not represent the fastest-available home broadband choice. 


So comparisons and targets are, in one sense, better evaluated in terms of speed capabilities and price-value relationships, matched by consumer buying behavior. What a policymaker wants is gigabit speeds or multi-gigabit per second speeds, not access media as such. 


There always seems to be a gap between customer preferences and internet service provider offers. In markets with strong cable operator competition, for example, FTTH tends to reach between 40 percent and 45 percent adoption after about three years of marketing. Some FTTH ISPs hope to reach a terminal adoption rate of 50 percent, but that is about the extent of expectations. 


source: IDATE, TelecomTV


Data from other European markets shows similar gaps between facilities deployment and take rates, where take rates hover between 45 percent and 47 percent. And that is a view of physical media choices, not necessarily speed tiers chosen by customers. 


In the U.S. markets, as well, many consumers choose not to buy the “fastest” tiers, but rather tiers someplace in the middle between fastest and slowest. 


source: OpenVault


The point is that enabling fast home broadband networks is one matter; customer demand is another matter. At any given point in time, it is likely that a majority of customers buy services in the middle ranges of capability; not the fastest and not the slowest. 


Consider U.K. fiber to premises networks, where “superfast” networks, by definition, operate at a minimum of 24 Mbps to 30 Mbps. Perhaps 42 percent of U.K. premises can buy FTTH-supplied home broadband. 

source: Uswitch 


Project Gigabit is a U.K. government program aimed at bringing £5 billion worth of investment to the country’s home broadband infrastructure. The aim is to bring gigabit-capable coverage to 85 percent of the U.K. and maximize coverage in the 20 percent of hardest-to-reach locations by 2025. 


Based on past experience, it is safe to predict that, at some point, most customers will buy services at the gigabit per second level, just as most now buy services operating at about 30 Mbps. Just as safely, we can predict that, at some point, most customers will buy multi-gigabit per second services as well. 


We sometimes forget that during the dial-up era, people bought services topping out at perhaps 56 kbps in 1997. By 2000, typical speeds had climbed to 256 kbps; by 2002 reaching 2.544 Mbps. 


source: NCTA 


By 2005, typical speeds were in the 8 Mbps range; by 2007 speeds had climbed to about 16 Mbps. By about 2015 we began seeing advertised speeds of 1 Gbps. 
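That progression implies a surprisingly steady growth rate. A sketch computing the compound annual growth rate between the two endpoints taken from the text (56 kbps dial-up in 1997, 1 Gbps advertised by 2015):

```python
# Advertised-speed endpoints from the text, in bits per second
speed_1997 = 56e3
speed_2015 = 1e9
years = 2015 - 1997

cagr = (speed_2015 / speed_1997) ** (1 / years) - 1
print(f"Implied compound annual speed growth: {cagr:.0%}")
```

The implied rate lands in the neighborhood of 70 percent per year, sustained for nearly two decades, which is why "most customers will eventually buy gigabit" is a safe extrapolation.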


In all those eras save the dial-up period, the top speeds were not purchased by most people. Capabilities are important, to be sure. But consumer demand also matters. 


It is not necessarily a policy failure if most customers choose not to buy a particular product. 

source: Uswitch 


In competitive markets where gigabit alternatives are available on other platforms, FTTH take rates often hover around 40 percent of locations passed. If FTTH were clearly the superior choice, in terms of price-value, take rates would be higher. 


How that changes in the future is a reasonable question, especially in markets with facilities-based competition. In markets with but a single network provider, but multiple retail competitors using one network, FTTH take rates could be much higher, even if market share held by any single contestant remains limited. 


Monday, April 10, 2023

Why Industry Rebranding Will Mostly Fail

"The more things change, the more they stay the same" might well apply to the connectivity business, despite all efforts to rebrand and reposition the industry as having a value proposition based on something more than "we connect you."


Consider the notion that 6G mobile networks will be about “experience,” as suggested by Interdigital and analysts at Omdia. At some level, this is the sort of advice and thinking we see all the time in business, where suppliers emphasize “solutions” rather than products and virtually all suppliers seek to position themselves as providers of higher-order value. 


The whole point of home broadband or smartphones with fast internet access is that those capabilities support the user experience of applications. 


And many have been talking about that concept for a while. “The end of communications services as we know them” is the way analysts at the IBM Institute for Business Value talk about 5G and edge computing, for example. 


To be sure, connectivity is not the only business where practitioners are advised to focus on end user benefits, solutions or experiences. But connectivity is among businesses where perceived value, though always said to be “essential” to modern life, also is challenged by robust competition and the ability to create product substitutes. 


One of the realities of the internet era is that although end user data consumption keeps climbing, monetization of that usage by communications service providers is problematic. Higher usage might lead to incremental revenue growth, but at a rate far less than the consumption rate of growth.


That is the opposite of the relationship between consumption and revenue in the voice era, when linear consumption growth automatically entailed linear revenue growth. Though there was some flat-rate charging, most of the revenue was generated by usage of long-distance calling services. 


“On the surface, exponential increases in scale would seem like a good thing for CSPs, but only if pricing keeps pace with the rate of expansion,” the institute says. “History and data suggest it will not.”


Indeed, Nokia Bell Labs researchers have been saying for some time that "creating time" is one way of illustrating the difference between core value propositions in the legacy market and today’s market. “We connect you” has been the traditional value prop. But that could shift to something else as “connectivity” continues to face commoditization pressures. 

source: Nokia Bell Labs  


The growing business model issue in the internet era is that conventional illustrations of the computing stack refer only to the seven or so layers that pertain to the “computing” or “software” functions. Human beings have experiences at some level above the “applications” layer of the software stack, and business models reside above that. 


Likewise, physical layer communication networks and devices make up a layer zero that underpins the use of software that requires internet or IP network access. The typical illustration of how software is created using layers only pertains to software as a product. 


Software has to run on appliances or machines, and products are assembled or created using software, at additional layers above the seven-layer OSI or TCP/IP models, in other words.


So we might call physical infrastructure and connectivity services a “layer zero” that supports software layers one to seven. And software itself supports products, services and revenue models above layer seven of the software stack. 


Some humorously refer to “layer eight” as the human factors that shape the usefulness of software, for example. Obviously, whole business operating models can be envisioned using eight to 10 layers as well.  
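The extended stack described here might be sketched as follows. Layers one through seven are the standard OSI reference model; zero and eight are the informal extensions this discussion adds:

```python
# Business stack: OSI layers 1-7, bracketed by the informal "layer zero"
# (physical infrastructure and connectivity) and "layer eight" (human
# factors, experiences and business models) discussed in the text.
business_stack = {
    0: "Physical infrastructure and connectivity services",
    1: "Physical",
    2: "Data link",
    3: "Network",
    4: "Transport",
    5: "Session",
    6: "Presentation",
    7: "Application",
    8: "Human factors, experiences and business models",
}

for layer in sorted(business_stack):
    print(layer, business_stack[layer])
```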


source: Twinstate 


The point is that the OSI software stack only applies to the architecture for creating software. It does not claim to show the additional ways disaggregated and layered concepts apply to firms and industries. 


source: Dragon1 


Many have been using this general framework for many decades, in the sense of business outcomes driving information and computing architecture, platforms, hardware, software and communications requirements. 


source: Wikipedia

 

In a nutshell, the connectivity industry’s core problem in the internet era is the relative commoditization of connectivity, compared to the perceived value created at other layers. 


Layer zero is bedeviled by nearly-free or very-low-cost core services that have developed over the last several decades as both competition and the internet have come to dominate the business context. 


Note that the Bell Labs illustration based the software stack on the use of “free Wi-Fi.” To be sure, Wi-Fi is not actually free. And internet connectivity still is required. But you get the idea: the whole stack (software and business) rests, in part, on connectivity that has become quite inexpensive, on an absolute basis or in terms of cost per bit.  


Hence the language shift from “we connect you” to other values. That might include productivity or experience, possibly shifting beyond sight and sound to other dimensions such as taste and touch. The point is that the industry will be searching for better ways to position its value beyond “we connect you.”


And all that speaks to the relative perception of the value of “connections.” As foundational and essential as that might be, “mere” connectivity is not viewed as an attractive value proposition going forward. 


It remains to be seen how effective such efforts will be. The other argument is that, to be viewed as supplying more value, firms must actually become suppliers of products and solutions with higher perceived value.


And that tends to mean getting into other parts of the value chain recognized to supply such value. If applications generally are deemed to drive higher financial valuations, for example, then firms have to migrate into those areas, if higher valuations are desired.


If applications are viewed as scaling faster, and producing more new revenue than connectivity, and if suppliers want to be such businesses, then ways should be sought to create more ownership of such assets. 


The core problem, as some might present it, is that the “experience” benefits are going to be supplied by the apps themselves, not the network transporting the bits. It is fine to suggest that value propositions extend beyond “connectivity.” 


source: Interdigital, Omdia


The recurring problem is that, in a world where layers exist, where functions are disaggregated, connectivity providers are hard pressed to become the suppliers of app and business layer value. So long as connectivity exists, value and experience drivers will reside at higher layers of the business stack. 


Unless connectivity providers become asset owners at the higher levels, they will not be able to produce the sensory and experience value that produces business benefits. Without such ownership, the value proposition remains what it has always been: “we connect you.”


If so, the rebranding will fail. Repositioning within the value chain, even if difficult, is required, if different outcomes are to be produced.


So long as humans view the primary communications industry value as "connections," all rebranding focusing on higher-order value will fail. It is the apps that will be given credit for supplying all sorts of new value.

