Friday, December 16, 2022

Lack of Profits Drives Video Ecosystem Evolution

Professionally produced, long-form original video content is expensive to create. Combine that with high promotion costs and small audiences and you have economic pressures that are simple to understand, whether you are in the production, packaging or distribution parts of the ecosystem (studio, network or internet service provider). 


Big audiences are required to generate the big revenues that support big production budgets and lots of productions. Small audiences can only be supported if subsidized by profits from serving big audiences. 


And if profit margins get squeezed at every segment of the ecosystem, the profits to subsidize niche programming go away. 


There’s no point in lamenting this fact. It is simple economics. Whatever participants once believed, video streaming seemingly is a case of exchanging “analog dollars for digital dimes.” It has become clear that revenue does not match cost. So video streaming suppliers are retrenching on the amount of money they spend to create original programming.


That leads observers to lament the content cutbacks. Similar concerns often are raised about the costs of internet access or linear video packages, but cost pressures related to programming rights play a role there as well. 


You might complain about the “below the line” surcharges on a linear video bill for the cost of carrying local broadcast signals, but those charges reflect a significant and growing cost of gaining the rights to retransmit that programming. Sports content is a key part of that cost structure. 


source: Bloomberg  


Costs to carry local TV stations likewise have been rising rapidly for two decades. Costs for a cable operator for local ABC, CBS, FOX, and NBC broadcast stations grew more than  600 percent since 2006, one observer noted in 2019. And that has happened as broadcast audiences shrink. 


One might well note that it is precisely those shrinking audiences--and the revenue losses they imply--that have caused local TV stations to lean on cable operator fees as compensation. 


source: Armstrong 


All those costs get passed on to subscribers. Blame the cable companies if you want, but they must pay broadcasters for those content rights, whether the charges appear above the line or below it. And so long as marketing battles pivot on content and price, and so long as search engines can rank services by price, every service provider has an incentive to quote the lowest-possible price. 


So those content fees get shown below the line, as add-on fees, and not as part of the headline monthly recurring cost. We might all agree it is a bit deceptive. But we might also agree that there are clear business reasons why the policy exists. 


Music streaming provides an analogy, but one that will be hard for video programmers to support: buy the song, not the album. That shift to “a la carte” generally is resisted by networks and distributors alike. 


Such distribution wreaks havoc with brand investments, reduces the profit upside of bundling and erodes the power and leverage of networks. So do not count on a la carte (“buy the song; buy the movie, the show or episode”) becoming ubiquitous. Still, the pressure in that direction remains. 


Linear video subscriptions are losing distribution share because the value proposition keeps getting worse, in comparison to video streaming alternatives. So value propositions will evolve; content strategies will shift; bundles will be redefined; business models will have to change.  So long as professional content production remains expensive, it is inevitable.


Thursday, December 15, 2022

Risk of Stranded Assets Now Looms Large

Stranded assets--capital deployed in infrastructure that does not produce revenue--now have become a key business issue for internet service providers upgrading their networks. 


As a rule, policymakers tend to overstate customer demand for high-speed internet access. “Everyone” laments slow speeds or non-existent service. But even when service availability is high, and speeds similarly are at gigabit levels, actual customer demand is relatively low for those headline speed services. 


In a capital-intensive business such as access networks, it is dangerous to deploy capital to create facilities that are too far ahead of expected demand. Some of you might remember the European overspending on 3G spectrum, for example. Others will remember the vast overspending on wide area optical networks that happened around the turn of the century. 


That is not to say policymakers are wrong to push for ubiquitous availability and quality speeds. It is to note that policies can push beyond actual customer demand, and endanger the sustainability of the internet service providers charged with delivering the services. 


Note, for example, the share of home broadband accounts now supplied by upstarts and attackers, compared to incumbent “telcos.”  In a variety of markets, attackers hold close to 40 percent share on the low end, and nearly 90 percent share on the high end. 


The United States and India have the greatest amount of share held by attackers. The point is that capital investment has to lead demand, but not by too much. Among the reasons are stranded asset dangers. Where share is 40 percent, 60 percent of invested assets are stranded (they produce no revenue). In markets where share is 10 percent or less, 90 percent of investment might be stranded. 


source: Ofcom 
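The stranded-asset arithmetic is simple: if investment scales with homes passed but revenue accrues only from homes connected, the stranded fraction is one minus the take rate. A minimal sketch of that relationship (my own illustration, not an Ofcom model):

```python
def stranded_fraction(take_rate: float) -> float:
    """Fraction of per-home-passed investment that produces no revenue,
    assuming cost scales with homes passed and revenue with homes connected."""
    if not 0.0 <= take_rate <= 1.0:
        raise ValueError("take_rate must be between 0 and 1")
    return 1.0 - take_rate

# The cases cited above: 40 percent share strands 60 percent of assets;
# 10 percent share strands 90 percent.
for share in (0.40, 0.10):
    print(f"take rate {share:.0%} -> stranded {stranded_fraction(share):.0%}")
```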


According to Ofcom, gigabit-per-second home broadband now is available to 70 percent of U.K. residences. The issue now is what percentage of homes actually buy service at gigabit speeds. Not all that many, it appears. According to Ofcom data, uptake at gigabit speeds might be as low as one percent to two percent. 


About 42 percent of U.K. homes are passed by fiber-to-home facilities. Where fiber-to-home service is available, about 25 percent of homes buy the service. Only nine percent of those customers buy service at the fastest-available speed, however. 


source: Ofcom 


That relatively small percentage of customers buying the fastest home broadband service is not unusual. Across a range of nations, uptake of gigabit services tends to be in the single digits to low double digits. 

source: Ofcom 


All of which should remind us that there is a big difference between making a product available and actual demand for that product. Broadband policy success might be defined as supplying X amount of bandwidth to Y homes. 


It is less clear that “success” can be measured by the degree of uptake of services at any given level. Customers still have the right to buy the products they believe represent the best overall value proposition. In most cases, that means buying a level of service someplace between the lowest and highest speed tiers. 


For policymakers, making service available is what matters. Uptake is a matter for customers to decide. Take rates for the fastest speed access services always have been lowish. So the risk of ISPs over-investing is quite significant. 


Policymakers who are unrealistic about customer demand can jeopardize the sustainability of their ISPs. ISPs who over-invest can put themselves out of business.



Wednesday, December 14, 2022

Platform, Techco or Digital Services Provider Vision is Not Often Matched By Reality

A recurring discussion over the past couple of decades has centered on whether telcos could become platforms. Perhaps the latest version of that debate asks whether telcos can become techcos or digital services providers. 


None of these concepts are easy to explain, or commonly understood. Some might argue the “digital services provider” evolution means offering services such as internet of things, connected cars, smart cities or smart homes. Presumably the concept is that a connectivity provider supplies the apps that provide the value. 


So a connectivity provider acts as a home security firm, an industrial IoT system supplier, a connected car app, traffic management system or energy management firm. In other words, the connectivity provider is the branded developer and provider of the application, not just the connectivity provider supporting any app or use case. 


It is no easier to explain--and have people agree upon--what a “platform” or “techco” evolution means. 


It never is completely clear what telco executives really mean in touting the transformation from telco to “techco.”


Many telcos--or those who advise and sell to them--say telcos need to become techcos. So what does that mean?


At least as outlined by Mark Newman, chief analyst, and Dean Ramsay, principal analyst, at TM Forum, there are two key implications: a culture shift and a business model shift.


The former is more subjective: telco organizations need to operate “digitally.” The latter is harder: can telcos really change their business models; the ways they earn revenue; their customers and value propositions?


source: TM Forum


It might be easier to describe the desired cultural or technology changes. Digital touchpoints, higher research and development spending, use of cloud-native computing, a developer mindset, data-driven product development and use of artificial intelligence all might be said to be part of becoming a “techco.”


Changing the business model is the more-problematic objective. 


As helpful as it should be to adapt to cloud-native, developer-friendly applications and networks, to use data effectively or to boost research and development, none of those attributes or activities necessarily changes the business model. 


If “becoming a techco” means lower operating costs; lower capital investment; faster product development or happier customers, that is a good thing, to be sure. Such changes can help ensure that a business or industry is sustainable. 


The change to “techco” does not necessarily boost the equity valuation of a “telco,” however. To accomplish that, a “telco” would have to structurally boost its revenue growth rates to gain a higher valuation; become a supplier of products with a higher price-to-earnings profile, higher profit margins or business moats. 


What would be more relevant, then, is the ability of the “change from telco to techco” to serve new types of customers; create new and different revenue models; develop higher-value roles and products or add new roles  “telcos” can perform in the value chain or ecosystem. 


We face the same sorts of problems when trying to explain what a “platform” looks like. 


Korea Telecom wants to become a digital platform company, not a telco. That ambition arguably is shared somewhat widely among tier-one connectivity service providers globally and has been a strategy recommended in some form by most bigger consulting companies. 


Simply, becoming a platform company changes the business model from direct supplier of products to a role as an ecosystem organizer or marketplace. That arguably is an aspirational goal more than anything else. 


What that aspiration means in practice is that KT as a digico “will shift our focus from the telecommunications sector, where growth is stalled due to government regulations, to artificial intelligence (AI), big data, and cloud computing businesses to become the nation's number-one platform operator in the B2B market," said KT CEO Koo Hyun-mo.


So there are qualifications. KT, if successful, would become a platform in the business market, not the consumer market. It would explicitly aim to become the center and organizer of an ecosystem for artificial intelligence, big data analytics and cloud computing. 


Purists and researchers will likely argue about whether all of that actually adds up to KT becoming a platform, in the sense that Amazon, eBay, Alibaba, ridesharing or lodging apps  might be considered platforms. 


A platform, definitionally, makes its money putting buyers and sellers and ecosystem participants together. In computing, a platform is any combination of hardware and software used as a foundation upon which applications, services, processes, or other technologies are built, hosted or run.


Operating systems are platforms, allowing software and applications to be run. Devices are platforms. Cloud computing might be said to be a platform, as systems are said to be platforms. 


Standards likely are thought of as platforms by some. 


In other cases components such as central processing units, physical or software interfaces (Ethernet, Wi-Fi, 5G, application programming interfaces) are referred to as platforms. Browsers might be termed platforms by some. Social media apps are seen as platforms as well. 


The platform business model requires creation of a marketplace or exchange that connects different participants: users with suppliers; sellers with buyers. A platform functions as a matchmaker, bringing buyers and sellers together, but classically not owning the products sold on the exchange. 


A platform orchestrates interactions and value. In fact, a platform’s value may derive in large part from the actions and features provided by a host of ecosystem participants. Facebook’s content is created by user members. Amazon’s customer reviews are a source of value for e-tailing buyers. 


Consumers and producers can swap roles on a platform. Users can ride with Uber today and drive for it tomorrow; travelers can stay with Airbnb one night and serve as hosts for other customers the next. Customers of pipe businesses--airlines, router or phone suppliers, grocery stores--cannot do so. 


So KT can increase the percentage of revenue it earns from supplying digital, computing, application or non-connectivity services without becoming a platform. As a practical matter, that is what most telco executives have in mind when talking about becoming platforms. 


For KT, even limiting its ambitions to generating more digital and non-connectivity revenue does not make it a platform. That would still be an important, valuable and value-sustaining move. But KT has a very long way to go, even in its stated objective of becoming a B2B platform.


Total KT revenue is about 24 trillion won. All B2B revenues at the end of 2020 were about 2.78 trillion won (about 11.5 percent). Information technology services were about 1 trillion won, or about four percent of total revenues. AI and other digital services were about 0.5 trillion won, or about two percent of total revenues. 
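Those shares are simple arithmetic against the roughly 24-trillion-won total; a minimal sketch, using the figures as stated above (trillions of won):

```python
total_revenue = 24.0  # approximate total KT revenue, trillion won
segments = {
    "All B2B": 2.78,
    "Information technology services": 1.0,
    "AI and other digital services": 0.5,
}
for name, revenue in segments.items():
    # Each segment's share of total revenue
    print(f"{name}: {revenue / total_revenue:.1%} of total")
# All B2B: 11.6%; IT services: 4.2%; AI and other digital: 2.1%
```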


It might be a long time before non-connectivity revenues in the B2B part of its business amount to as much as half of total revenues. And those revenues might not represent a platform transformation of the business model.


KT could win significantly without ever becoming a platform. And some might argue few telcos can ever actually hope to become platforms in the classic sense. Perhaps the more important goal is simply to reduce reliance on traditional connectivity revenues.


Unfortunately, what platform, techco or digital services provider actually means in practice falls far short of the grander visions.


Monday, December 12, 2022

"Telecom" is No Longer Seen as a "Natural Monopoly," But Might Some View it as a Functional Monopoly?

Before the 1980s, global telecom regulators universally considered telecommunications to be a “natural monopoly.” 


Nobody uses the term anymore, as it is obvious connectivity services are not a “natural” monopoly. In most countries, an oligopoly tends to exist at the top of the market, though there can be hundreds to thousands of smaller contestants in large continental markets. 


On the other hand, in some markets we might find policymakers concluding that access services (“telecommunications”) are a functional monopoly, if not necessarily a “natural” one.


In the monopoly era, the owner and operator often was a national government. Then began a worldwide shift to deregulation and privatization as a prelude to allowing more competition in formerly-restricted monopoly telecommunications. Often that takes the form of promoting wholesale arrangements that allow retailers to use a single national network. 


Relatively fewer countries have seen significant fixed network competition based on alternate facilities, but facilities-based competition has been the norm for mobile services.


New wireless licenses issued to many smaller firms will be cited as potential new sources of competition as well, since most connectivity services competition has occurred on the mobile networks. 


Of course, access services remain a scale game. There is no contradiction between services provided by hundreds of small firms with small customer bases and domination of the market by three to four providers. 


But several decades of competition at scale have produced a business where profits are hard to come by, while heavy capital investment, if anything, seems to be increasing. The near-term result has been waves of consolidation and a market structure that is oligopolistic.


There always are at least three sets of opinions  in that regard. Some believe monopoly is the ultimate outcome, as fixed networks will simply be too expensive, with too little revenue, in a facilities-based competitive scenario. 


Some who view facilities monopoly as inevitable therefore argue for a robust wholesale monopoly to support retail competition using the one network. 


Others argue  that a duopoly based on facilities ownership might be the best sustainable outcome, and might produce more innovation than a wholesale approach. By definition, in a wholesale-only framework, the capabilities of the network can be purchased by all retailers, at prices that are differentiated only by possible volume discounts. 


Competitive differentiation then mostly occurs when contestants bundle other non-access services or can leverage some operational or marketing advantage. 


Perhaps the easiest way to illustrate that potential is to note that telcos using fiber to the home, cable operators using hybrid fiber coax and mobile operators using distinct physical platforms can create services aimed at different market segments and customers precisely because their platforms are distinct in terms of cost and capabilities. 


So the issue is how much consolidation will happen, and at what point--if at all--supply and demand in the connectivity business are at equilibrium. In other words, what structure will emerge that allows service providers to sustain themselves with adequate profit levels, while still ensuring the benefits of competition for consumers of those services?


If access services are not a "natural monopoly," might they be oligopolies in most cases (mobile and fixed providers able to sustain themselves)? Still, in some cases, policymakers might conclude that either mobility or fixed services are better supported by a monopoly provider, albeit with strong wholesale arrangements.


That might especially be the case in markets where new services beyond bandwidth are likely to be big opportunities.


Sunday, December 11, 2022

How Big a Deal is Edge Computing as a Revenue Driver for Connectivity Providers?

Edge computing possibly can grow to generate a minimum of $1 billion in annual new revenues for some tier-one service providers. The same might be said for service-provider-delivered and operated  private networks, internet of things services or virtual private networks. 


But none of those services seem capable of driving the next big wave of revenue growth for connectivity providers, as their total revenue contribution does not seem capable of driving 80 percent of total revenue growth or representing half of the total installed base of revenue. 


In other words, it does not appear that edge computing, IoT, private networks or network slicing can rival the revenue magnitude of voice, texting, video subscriptions, home broadband or mobile subscription revenue. 


It is not clear whether any of those new revenue streams will be as important as MPLS or SD-WAN, dedicated internet access or Ethernet transport services, for example. All of those can be created by enterprises directly, on a do-it-yourself basis, from the network edge. 


The point is that even when some new innovations are substantial generators of revenue and activity, it is not automatically connectivity providers who benefit, in terms of direct revenue. 


One rule of thumb I use for determining whether any proposed new line of business makes sense for tier-one connectivity providers is whether the new line has potential to produce a minimum of $1 billion in annual revenues for a single provider in some definable time span (five years for a specific product, say). 


By that rule of thumb, tier-one service providers might be able to create edge computing revenue streams that amount to as much as $1 billion in annual revenue for some service providers. But most will fail to achieve that level of return in the next five to seven years.


That is not to say “computing at the edge” will be a small business. Indeed, it is likely to account for a growing part of public cloud computing revenues, eventually. And that is a big global business, already representing more than $400 billion in annual revenues, including public cloud revenues as well as infrastructure spending to support cloud computing, plus the value of business applications and associated consulting and services to implement cloud computing.


The leading public cloud computing hyperscalers themselves represent about $72 billion or more in annual revenues already. All the rest of the revenue in the ecosystem comes from sales of software, hardware and services to enable cloud computing, both public and private.




source: IoT Analytics


It is likely a reasonable assumption that most public edge computing revenue is eventually earned by the same firms leading public cloud computing as a service.


Perhaps service provider revenues from edge computing could reach at least $20 billion, in about five years. By that standard, multi-access edge computing barely qualifies as "something worth pursuing," at least for tier-one connectivity service providers.


In other words, MEC is within the category of products that offers reasonable hope of payback, but is not yet in the category of “big winners” that add at least $100 billion to $200 billion in global service provider revenues. 


Put another way, MEC is not “mobile phone service” or “home broadband.” Perhaps it will be as big as MPLS or SD-WAN. For tier-one connectivity providers, perhaps MEC is more important than business voice (unified communications as a service). 


source: STL, KBV Research 


As with many other products, including Wi-Fi, SD-WAN, MPLS, 4G or 5G private networks, local area networks in general and  enterprise voice services, most of the money is earned by suppliers of software (business functionality) and hardware platforms, not end-user-facing services. 


The reason is that such solutions can be implemented on a do-it-yourself basis, directly by enterprises and system integrators, without needing to buy anything from tier-one connectivity providers but bandwidth or capacity. 


So one reason why I believe that other new connectivity services enabled by 5G likely do not have the potential to substantially move the industry to the next major revenue model is that none of those innovations are very likely to produce much more than perhaps one percent of total service revenues for the typical tier-one service provider. 


The opportunity for big public connectivity providers lies in use cases related to the wide area network rather than the domain of indoor and private networks. That is why the local area networks industry has always been dominated by infra providers (hardware platforms) and users who build and own their own networks (both enterprise and consumer). 


And most of the proposed “new revenue sources” for 5G are oriented towards private networks, such as private enterprise local area networks. Many of the other proposed revenue generators can be done by enterprises on a DIY basis (edge computing, internet of things). Some WAN network services--such as network slicing--attack problems that can be solved with DIY solutions.


Edge computing is a solution for some problems network slicing is said to solve, for example. 


None of the new 5G services--or new services in aggregate-- is believed capable of replacing half of all current mobile operator revenues, for example. And that would be the definition of a “new service” that transforms the industry. 


All of which suggests there is something else, yet to be discovered, that eventually drives industry revenue forward once mobility and home broadband have saturated. So far, nobody has a plausible candidate for that new service.


Edge computing might be helpful. So might network slicing, private networks or internet of things. But not even all of them together are a solution for industry revenue drivers once home broadband and mobile service begin to decline as producers of at least half of industry revenues.


It already seems clear that others in the edge computing ecosystem--including digital infra providers and hyperscale cloud computing as a service suppliers--will profit most from edge computing.


Friday, December 9, 2022

Power Users Aren't What They Used to Be

“Power users,” defined as accounts using far more data than the typical home broadband user, are not necessarily what you might think. Though we might traditionally have thought of such power users as major content creators or users with extraordinary downloading behavior, that arguably is no longer the case. 


The phrase “yesterday’s power user is today’s typical user” is apt. Power users consume at least a terabyte of data each month, while the typical or average account used perhaps 496 gigabytes in the third quarter of 2022, with median consumption of perhaps 324 gigabytes. 


According to Openvault, 13.7 percent of accounts in the third quarter of 2022 were “power users” consuming at least a terabyte of data per month. Perhaps we once thought of power users as people with much higher than average computing skills: software code writers, very active content creators and sharers or online gaming enthusiasts. 


These days, the popularity of video streaming adds a more mundane class of users: people who watch lots of entertainment video. Perhaps a working definition is a person or household that streams at least eight hours of video each day. Do that and it is easy to top a terabyte of usage in a month. 

source: T-Mobile 
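The back-of-the-envelope math is easy to check. A minimal sketch, with illustrative streaming bitrates (roughly 5 Mbps for HD and 16 Mbps for 4K; the bitrates are assumptions, not Openvault figures):

```python
def monthly_gb(hours_per_day: float, mbps: float, days: int = 30) -> float:
    """Approximate monthly data consumption for steady video streaming."""
    seconds = hours_per_day * 3600 * days
    bits = mbps * 1_000_000 * seconds
    return bits / 8 / 1e9  # bits to gigabytes

print(f"8 h/day HD (5 Mbps): {monthly_gb(8, 5):,.0f} GB/month")    # ~540 GB
print(f"8 h/day 4K (16 Mbps): {monthly_gb(8, 16):,.0f} GB/month")  # ~1,728 GB
```

At HD bitrates, eight hours a day already exceeds the 496-gigabyte average; at 4K bitrates it comfortably tops a terabyte.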


It should then come as no surprise that Openvault data shows a continued increase in gigabit service plan adoption, as well as migration of subscribers to speeds of 200 Mbps or higher. Though there is no linear causal relationship between access speeds and total data consumption, the two phenomena are correlated. 


Faster speeds allow more to be done in any given amount of time, so more data can be consumed in that time. Over time, applications also are designed to take advantage of higher bandwidth, such as embedding autoplay full-motion video into apps. That increases “involuntary” data consumption. 


Some 15.4 percent of U.S. households purchased gigabit-tier plans in the third quarter of 2022, an increase of 35 percent over the 11.4 percent market share 12 months prior, says Openvault. 


Speeds purchased by typical accounts also increased. The percentage of accounts buying service in the 200 Mbps to 400 Mbps range doubled to 54.8 percent from 27.4 percent over the last year. 


At the end of the third quarter, only 4.7 percent of all subscribers were provisioned for speeds of less than 50 Mbps, a reduction of more than half from the third quarter 2021 figure of 9.8 percent. 


Average monthly usage of 495.5 GB was up 13.9 percent from 3Q21’s average of 434.9 GB, and represented a slight increase over 2Q22’s 490.7 GB. Median usage was up 14.3 percent year over year, reflecting broader growth across all subscribers.
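Those year-over-year changes are easy to verify; a quick sanity check of the figures as stated (a sketch, not Openvault’s own methodology):

```python
# Year-over-year Openvault figures cited above: (year-ago value, current value)
pairs = {
    "Gigabit-tier share (%)": (11.4, 15.4),
    "200-400 Mbps share (%)": (27.4, 54.8),
    "Sub-50 Mbps share (%)": (9.8, 4.7),
    "Average monthly usage (GB)": (434.9, 495.5),
}
for label, (year_ago, now) in pairs.items():
    change = (now - year_ago) / year_ago * 100
    print(f"{label}: {year_ago} -> {now} ({change:+.1f}%)")
# Gigabit share +35.1%; 200-400 Mbps +100.0% (doubled);
# sub-50 Mbps -52.0% (more than halved); average usage +13.9%
```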


Year-over-year growth of power users of 1TB or more was 18 percent, to 13.7 percent of all subscribers, while the super power user category of consumers of 2 TB or more rose almost 50 percent during the same time frame. 


source: Openvault  


Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...