Tuesday, November 8, 2022

Would Home Broadband "Utility" Regulation Lead to Lower Prices?

It never is entirely clear to me what people mean when they argue internet access or home broadband “should be a utility,” or that such services already are a utility similar to electricity, gas, water, sewers, phone lines, roads, seaports or airports. 


Some might mean home broadband should be, or is, a public utility in the sense of “common carrier,” with obligations to serve the general public. Though most of us would undoubtedly agree with that notion, telecom policy already has such goals. That is why we have universal service funds and subsidies for operators in high-cost areas. 


Others might mean regulated in terms of price or conditions of service, which would imply government-set prices and terms and conditions of service. 


Others might fix on the everyday sense of the term, which is that internet access is fundamental for inclusion in normal life, as are electricity, fresh water, wastewater services and garbage collection. It might mean that home broadband is essential in the same way that roads, schools, medical care, food supply, airports and seaports are necessary to support life. 


None of that seems to capture the implied meaning that home broadband should be a utility. More likely, there is some expectation that things would be better if prices, coverage, terms and conditions of service were regulated in ways that led to lower prices, less competition or some combination of the two. 


And that should raise serious questions. There was a time when all “telecom services” were regulated as monopoly public utilities. But under that framework, prices were high and innovation low. Ironically, if what people mean is that internet access should be a regulated monopoly, the outcome would almost certainly be higher prices, less innovation and lower rates of improvement in quality and other forms of customer value. 


Were home broadband regulated, we would see less innovation and investment as well, as potential suppliers would find they could not make a positive business case. 



source: Market Business News 


As it pertains to “home broadband,” the term generally refers to fixed network internet access, not mobile access. 


The expectation that utility regulation would lead to lower prices is almost certainly wrong.


Most of us are too young ever to have experienced “connectivity services” as a public utility. But prices were not uniformly low. 


In 1984, before the breakup of the U.S. AT&T monopoly, calling between states cost about 90 cents a minute. In 1955, a phone call between Los Angeles and San Francisco (not even interstate) cost about 70 cents a minute, not adjusted for inflation.


In 2022 currency that would be about $7.75 per minute. So, no, prices were not uniformly lower under monopoly or public utility regulation. 
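
For those who want to check the arithmetic, the conversion is just a ratio of consumer price index values. A minimal sketch in Python, using approximate CPI figures (the values below are my assumptions, not from a cited source):

    # Inflation adjustment by CPI ratio. The CPI values are approximate
    # annual averages, assumed here for illustration.
    CPI_1955 = 26.8
    CPI_2022 = 292.7

    price_1955 = 0.70  # dollars per minute, Los Angeles to San Francisco, 1955
    price_2022 = price_1955 * (CPI_2022 / CPI_1955)

    print(f"${price_2022:.2f} per minute")  # about $7.65 in 2022 dollars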


Of course, that was by policy design. High long distance charges and high business service prices were intended to subsidize consumer local calling. 


Were home broadband to become a regulated service, something similar would happen: while some features and plans might be price controlled, other elements of value would increase sharply in price. 


And price is only one element of value. Service innovation was sharply limited in the monopoly era. In the U.S. market, consumers could not own their own phones or attach third-party devices to the network. All consumer premises gear had to be leased from the phone company, for example. 


To be sure, AT&T Bell Labs produced many innovations: Unix, satellite communications, the laser, the solar cell, the transistor, the cellular phone network and early demonstrations of television, including television with sound. But they were not directly applied to the “telephone service” experience. 


Though ultimately quite important, none of those innovations arguably applied directly to the consumer experience of the “phone network” or its services. 


The point is that monopoly regulation tends to produce varied prices for different products (some subsidized products, some high-cost products), but also low rates of innovation in the core services. 


Utility regulation would likely not wind up being as beneficial as some seem to believe. Be careful what you wish for.


Sunday, November 6, 2022

"Sending Party Pays" is a Classic Example of Channel Conflict

Whatever positions one takes on whether a few hyperscale app providers ought to pay fees to internet service providers, there is no question that the emergence of the internet as the next-generation “telco” platform raises tricky issues about business models, competitive dynamics and available supplier responses. 


Differences in regulation of “public telephone networks,” radio and TV broadcast, cable TV and data networks always have existed. Those differences are exacerbated now that the internet has effectively become a universal distribution system for all content, communications and media. 


“Sending party pays” is a new concept that would make a few hyperscalers pay ISPs for usage by ISP customers. Ignore for the moment whether that is just, fair or reasonable. The concept highlights new business model strains in the internet ecosystem between content owners and distributors. 


Sending party pays also illustrates how regulators might--or could--change their thinking about how to regulate various communication networks. There also are major issues around how much value chain participants can, or should, work out business agreements among themselves. 


That also necessarily raises questions about where value lies in the ecosystem, and what policies best promote the growth and health of the ecosystem. Industrial policy also is inextricably interwoven in those choices. 


Value chains are different for the internet, compared to traditional “telecommunications.” Traditional voice is a vertically-integrated app created, controlled and sold by telcos over their own networks. Enterprise wide area data networks provide another example. 


The internet is different: it consists of loosely-coupled ecosystem partners operating on “open” rather than “closed” networks. No app or content or commerce provider needs an internet service provider’s permission to be used by any internet-connected user (government permission is another matter). 


In other words, an ISP’s customer buys internet access service. The ISP does not control access to any internet-available app, service or site, and does not participate in a direct way in monetization of those apps, services and sites. 

source: Kearney 


Like it or not, an ISP’s role in the ecosystem lies in supplying internet access to its own customers. Some ISPs might also participate in other roles, but in their role as access provider, their revenues are based on access customer payments, supplemented in some cases by universal service payments, government subsidies or, in a few cases, advertising. 


That does not mean ISPs are barred from other roles and revenue streams. It does mean that in their role as access providers, their customers are the revenue drivers. 


That has been the general pattern for home broadband and mobile internet access: customers pay based on consumption, or potential consumption, with mobile services having the clearest consumption-based pricing. 


Mobile buckets of usage, differentiated by potential consumption limits, have been the norm, while for fixed networks “speed” has been the mechanism for price differentiation. 
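
To make the distinction concrete, here is a minimal sketch of the two pricing mechanisms, with all tiers and prices being hypothetical values chosen only for illustration:

    # Hypothetical mobile usage buckets: price varies with the consumption cap.
    mobile_plans = {5: 30, 20: 45, 100: 60}     # GB allowance -> monthly price

    # Hypothetical fixed-network tiers: price varies with speed, not usage.
    fixed_plans = {100: 50, 500: 70, 1000: 90}  # Mbps downstream -> monthly price

    def cheapest_mobile_plan(expected_gb: float) -> tuple[int, int]:
        """Return the smallest bucket (allowance, price) covering expected usage."""
        for allowance, price in sorted(mobile_plans.items()):
            if expected_gb <= allowance:
                return allowance, price
        return max(mobile_plans.items())  # fall back to the largest bucket

    print(cheapest_mobile_plan(12))  # (20, 45): pay for the bucket, not the bytes
    print(fixed_plans[500])          # 70: flat price at a speed tier, usage-insensitive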


The big principle is that the usage is paid for by the access customer. The proposed new taxes on content providers move beyond that framework, making a few content providers liable for usage, not just the access customers. 


At a high level, this is a somewhat normal sparring between buyers and sellers in a value chain, where one partner’s costs are another partner’s revenue. But there are issues. If an electrical utility requires more generation capacity, it has to build new power plants, encourage conservation or take other steps to match generation with consumption. 


If a water utility has to support more customers, homes and businesses, it has to increase supply--by building dams or acquiring new rights to tap aquifers or other bodies of water--or encourage consumption restraint, or both. 


There is an obvious remedy that ISPs have not taken, possibly because they feel they cannot do so: raise subscriber prices to levels that recover the costs of network capacity. Nor do ISPs generally take any measures to encourage conservation. They could do so; they simply do not. 


With the caveat that there are revenue or business reasons for such inaction, it nevertheless remains the case that ISPs could act on their own to better match capacity supply with customer demand.


Assuming network neutrality rules are not a fundamental issue, ISPs also could institute policies for trading partners that likewise discourage “wasteful” bandwidth consumption practices, such as enabling autoplay video. 


ISPs arguably need the right to do so if such practices benefit their customers by reducing the need to invest in new capacity without compensation. 


To be sure, the problem results from the economics of delivery networks. Content delivery networks are most efficient when they can operate in multicasting mode (broadcasting). Those networks are least efficient when they must operate in unicast mode (traditional voice sessions or any form of on-demand access). 
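
The capacity arithmetic explains why: unicast load scales linearly with audience size, while a multicast stream is sent once. A sketch, assuming an illustrative 5 Mbps stream:

    # Access capacity needed to serve N simultaneous viewers of one stream.
    # The 5 Mbps bitrate is an assumption for illustration.
    BITRATE_MBPS = 5

    def unicast_capacity(viewers: int) -> int:
        # On-demand: every viewer receives an individual copy.
        return viewers * BITRATE_MBPS

    def multicast_capacity(viewers: int) -> int:
        # Broadcast: one copy serves every viewer.
        return BITRATE_MBPS if viewers > 0 else 0

    print(unicast_capacity(10_000))    # 50,000 Mbps
    print(multicast_capacity(10_000))  # 5 Mbps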


In principle, edge-based content delivery networks help reduce wide area network capacity demand. It is never so clear that content delivery networks alleviate access network congestion, though. 


That leaves a few levers yet to be pulled: raise subscriber prices to approach the full costs of actual usage, and create incentives for conservation. Subscribers could be rewarded for downloading content overnight (when networks have spare capacity), storing it locally and consuming it later. 
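
One way such an incentive could work, sketched below with a hypothetical off-peak window and discount (a real ISP would set these values from its own traffic curves), is to count off-peak bytes at a discount against the usage bucket:

    from datetime import datetime

    # Hypothetical off-peak window and discount; both are assumptions.
    OFF_PEAK_START, OFF_PEAK_END = 1, 6  # 1 a.m. to 6 a.m.
    OFF_PEAK_DISCOUNT = 0.5              # off-peak bytes count half

    def billable_gb(gb: float, when: datetime) -> float:
        """Count transferred gigabytes against the usage bucket,
        discounting transfers made during the overnight trough."""
        if OFF_PEAK_START <= when.hour < OFF_PEAK_END:
            return gb * OFF_PEAK_DISCOUNT
        return gb

    print(billable_gb(10.0, datetime(2022, 11, 6, 3)))   # 5.0 (off-peak)
    print(billable_gb(10.0, datetime(2022, 11, 6, 20)))  # 10.0 (peak)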


Stripped to its essentials, channel conflict is what the telco-hyperscaler “sending party pays” proposals are about.


Saturday, November 5, 2022

Big Companies Good at Innovation are Rarities

Practitioners of innovation almost always believe their chances of succeeding are quite high. They would not make the effort unless they believed that was the case. But, statistically, innovation tends to be quite hard. Consider venture capital, which is innovation with clear metrics for success. 


A general rule of thumb for venture capitalists is that 75 percent of venture capital startups fail completely. Of every 10 funded startups, another three or four return the original investment, and one or two produce virtually all the significant financial returns. 


Also, keep in mind that perhaps one percent of proposals actually wind up getting funding. 


According to Cambridge Associates, information technology and digital media startups funded from 2001 to 2011 produced uneven multiples of the original investment. In more than 60 percent of cases, the startups did not earn enough to produce a return on invested capital. About seven percent of all funded companies produced returns in excess of five times the original investment. 

VC hit rate

source: jtangoVC 


So outright failure is the case at least 63 percent of the time. Another 30 percent produce an actual return. Fewer than one in 10 are big winners. 
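
Those buckets imply a highly skewed return distribution in which a few winners must carry the whole portfolio. A toy expected-multiple calculation, with bucket shares and multiples assumed only for illustration:

    # Toy portfolio math: (share of funded startups, multiple returned).
    # The shares and multiples are illustrative assumptions, not Cambridge data.
    outcomes = [
        (0.63, 0.0),  # outright failures
        (0.30, 1.0),  # roughly return the original investment
        (0.07, 8.0),  # the few big winners
    ]

    expected_multiple = sum(share * multiple for share, multiple in outcomes)
    print(f"{expected_multiple:.2f}x")  # 0.86x: unless the winners return very
                                        # large multiples, the portfolio loses money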


Some studies suggest 74 percent of digital transformation efforts fail. Historically, most big information technology projects fail. BCG research suggests that 70 percent of digital transformations fall short of their objectives. 


From 2003 to 2012, only 6.4 percent of federal IT projects with $10 million or more in labor costs were successful, according to a study by Standish, noted by Brookings. IT project success rates range between 28 percent and 30 percent, Standish also notes. The World Bank has estimated that large-scale information and communication projects (each worth over U.S. $6 million) fail or partially fail at a rate of 71 percent. 


McKinsey says that big IT projects also often run over budget: roughly half of all large IT projects--defined as those with initial price tags exceeding $15 million--do so. On average, large IT projects run 45 percent over budget and seven percent over schedule, while delivering 56 percent less value than predicted. 
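
Compounding a cost overrun against a value shortfall makes the damage vivid. A worked example applying the McKinsey averages to an assumed project (the budget and predicted-value figures are illustrative):

    # Applying the McKinsey averages to an assumed project.
    budget = 15_000_000          # initial price tag (the "large project" threshold)
    planned_value = 30_000_000   # predicted business value (assumed)

    actual_cost = budget * 1.45          # 45 percent over budget
    actual_value = planned_value * 0.44  # 56 percent less value than predicted

    print(f"cost:  ${actual_cost:,.0f}")   # $21,750,000
    print(f"value: ${actual_value:,.0f}")  # $13,200,000, less than the final cost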


Beyond IT, virtually all efforts at organizational change arguably also fail. The rule of thumb is that 70 percent of organizational change programs fail, in part or completely. 


Of the $1.3 trillion that was spent on digital transformation--using digital technologies to create new or modify existing business processes--in 2018, it is estimated that $900 billion went to waste, say Ed Lam, CFO of Li & Fung; Kirk Girard, former Director of Planning and Development in Santa Clara County; and Vernon Irvin, Lumen Technologies president of Government, Education, and Mid & Small Business. 


All that accumulated experience helps us understand why innovation so often comes from the young, who have less to lose; from small firms rather than big, established firms; from outside an industry rather than from within it. 


A rational actor in any large, established industry or firm has more to lose than to gain from an attempt at innovation: odds of success are three in 10. A small attacker might well conclude that those odds are worth the effort, especially if the attacker is led by young people who can survive an early failure or two with little long-term damage. 


Quite the opposite is true for older leaders who have risen to the top precisely because they know how the legacy business runs, and benefit from it. A professional manager who expects to remain in the top post for less than a decade has much more to lose than to gain by any serious effort to transform the existing business model. 


When the person at the top of any big organization is three to five years away from retirement, what else would you expect, other than behavior that is basically “do not mess it up?” 


The upshot is that innovation is risky, destined to fail seven times out of 10. “Letting someone else take the risk of attempting innovation” therefore can appear a wise strategy. The exceptions often occur when a firm’s core business model is unraveling. Then the risk of trying to innovate is less than the risk of staying a failing course. 


There seems to be far less research done on how successful firms are at rescuing themselves from failing business models. Impressionistically, the failure odds are even worse than seven in 10, as the common remedy is a sale of the asset to some other entity, assuming outright bankruptcy is avoided.


Thursday, November 3, 2022

After the Big Fiber Builds, Consolidation

Somewhere in excess of 68 percent of U.K. consumers now can buy home broadband services at gigabit speeds, though fiber to the premises covers only about 37 percent of U.K. homes and business locations, according to Ofcom. 

source: Ofcom 


More important, from BT’s standpoint, is actual retail customer adoption, which seems to be about 27 percent of homes passed. That is a problem, balanced somewhat by Openreach wholesale sales. 


If one assumes that any fiber-to-home network is sustainable with a minimum of about 30 percent take rates (actual paying customers as a percentage of locations passed), then BT has a bit of a way to go to reach sustainability. 


“For their models to work, most operators assume a 40 to 50 percent penetration rate,” say analysts at Kearney. That arguably applies to larger internet service providers with legacy operations, rather than upstarts that typically have lower operating costs. 


“The difference in net present value between a 50 percent and a 30 percent penetration rate may well be the difference between a positive NPV and a loss,” they note. 
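
A simplified net present value sketch shows how sensitive the business case is to take rate. All inputs below are illustrative assumptions, not Kearney’s model:

    # Simplified FTTH business case: NPV per home passed over 10 years.
    # All inputs are illustrative assumptions.
    COST_PER_HOME_PASSED = 1_000  # build cost per location passed
    ARPU_ANNUAL = 600             # revenue per paying customer per year
    MARGIN = 0.5                  # operating margin on that revenue
    DISCOUNT_RATE = 0.08
    YEARS = 10

    def npv_per_home(take_rate: float) -> float:
        # Expected annual cash flow per home passed scales with take rate.
        cash_flow = take_rate * ARPU_ANNUAL * MARGIN
        pv = sum(cash_flow / (1 + DISCOUNT_RATE) ** t for t in range(1, YEARS + 1))
        return pv - COST_PER_HOME_PASSED

    print(f"30% take rate: {npv_per_home(0.30):.0f}")  # about -396: a loss
    print(f"50% take rate: {npv_per_home(0.50):.0f}")  # about +7: roughly break-even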


That can be difficult in a multi-supplier market with three or more competent suppliers of fixed access, plus two or three fixed wireless suppliers targeting the same customers. 


If one believes that three to six suppliers is too many in the home broadband business, then consolidation seems inevitable.


Saturday, October 29, 2022

Linear TV Value Prop Keeps Getting Worse

Linear TV subscriptions have been cannibalized by video streaming alternatives for a decade, partly because on-demand provides more value; partly because some like the lower cost per service; partly because linear value now increasingly comes down to sports, news and unscripted reality content. 


But lower overall cost seems less and less a value driver, as many consumers buy four or more streaming services. Scripted content has largely shifted to streaming delivery. 


source: Ark Investment 


A recent survey by FinanceBuzz of 1,000 U.S. adults found that 24 percent of households are buying “at least three additional streaming services than they did one year ago,” while another 21 percent of respondents are now paying for two more streaming services.


The point is that consumers are buying multiple services at a higher rate than they did five to 10 years ago. 

source: FinanceBuzz 


About 25 percent of respondents spend more than $75 per month on streaming subscriptions. If you assume the typical linear service costs between $60 and $80 a month, it is clear that consumers are not buying streaming services because they necessarily save money. 


The issue is where linear TV value will lie, as the shift of scripted content to streaming continues. Even sports now are shifting to streaming delivery, leaving unscripted reality shows and news as the anchors for linear. That will be a declining value proposition for a greater range of customers over time. 


I, for example, never watch anything but sports or news on linear, so the entire value proposition comes from just those two types of programming. Most channels never get watched, which always was true, even before streaming. 


But the whole value-price relationship keeps getting worse. Not that many younger people watch news channels, and more sports content is moving to streaming, perhaps leaving unscripted reality TV as the last bastion of “value” for linear TV, assuming one watches that genre. 


The point is that it is fairly hard to justify spending $60 to $80 a month for what boils down to regular viewing of two news channels and one-season viewing of up to two sports channels. The only channels that get watched all year are the two news channels. Looked at that way, the value proposition is even worse.


Is Net Neutrality a Problem Creator?

Network neutrality has been worse than a solution in search of a genuine problem: it allows actual problems to fester. Consider auto-run video. 


Consider the amount of data consumption by most customers of any access network. How much of that consumption is actually unintentional, caused by auto-run video? Quite a lot, as it turns out. 


You might argue that unwanted data consumption would be lessened if internet service providers could prevent auto-run videos from playing “automatically.” But that would be considered traffic shaping that network neutrality does not allow. 


But some forms of traffic shaping already seem to be quite common, despite net neutrality rules. Many mobile service plans specifically limit video resolution, for example, even if content providers and advertisers create their content at high-definition and 4K image quality levels. In principle, limiting the resolution of some content could be considered a net neutrality violation, as it shapes traffic (imposes rate limits), though it is applied in a way that affects all content suppliers equally. 


Though advertisers and content owners arguably prefer auto-run, that practice increases data consumption for customers and ISPs who do not benefit from the practice. For both access network customers and ISPs, it is unwanted data consumption.


And the data usage is “unintentional” only on the customer’s part. That unwanted usage is quite intentional on the advertiser’s part. Advertisers and content providers like--and embed--auto-run functions to increase “views.”


Keep in mind the various constituencies in the content value chain whose behavior shapes data consumption. To limit auto-run requires lots of agreement to do so, and such agreements are not necessarily in an advertiser’s or content provider’s or commerce platform’s interest. 


In principle, the ecosystem could agree to reduce auto-run video and take other measures to reduce customer or ISP unwanted data consumption. But such agreements require key stakeholders to agree, even when auto-run furthers some business interest. 


And network neutrality rules make some decisions impossible, such as adjusting quality differentially. That is more a problem on fixed networks, which bear the brunt of net neutrality rules, than on mobile networks, where the rules are less stringent in application. 


Devices with various screen sizes consume different amounts of data, but user experience can only benefit so much from higher-resolution settings that are more data-intensive. On a small mobile phone screen, the human eye cannot discern the difference between 4K and high-definition quality. And HD arguably does not improve user experience over standard definition.
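
The data consumption stakes are easy to quantify. Assuming typical streaming bitrates (the values below are rough assumptions), an hour of video varies widely by resolution:

    # Approximate streaming bitrates in Mbps (rough assumptions).
    bitrates = {"SD": 2, "HD": 5, "4K": 18}

    def gb_per_hour(mbps: float) -> float:
        # Mbps * seconds per hour / bits per byte / MB per GB
        return mbps * 3600 / 8 / 1000

    for quality, mbps in bitrates.items():
        print(f"{quality}: {gb_per_hour(mbps):.1f} GB/hour")
    # SD: 0.9, HD: 2.2, 4K: 8.1 -- capping resolution cuts consumption sharply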


So it is possible for many mobile service plans to deliberately degrade image resolution without running afoul of net neutrality rules. 


Network management rules also are allowed, even when net neutrality rules hold. Downloads and software updates can be managed to avoid doing so at times and locations of peak congestion, for example, without necessarily violating net neutrality rules. 


So is traffic shaping of this sort simple "network management," or is it a violation of network neutrality?


The point is that net neutrality represents a policy that apparently was not needed, aimed to solve a problem that did not arise, and more importantly, prevents ISPs and other partners from taking measures that enhance user experience while avoiding unnecessary and unwanted data consumption. 


Some argue voluntary agreements can be crafted. Sure, Wi-Fi offload helps. More-affordable infrastructure helps. But the internet ecosystem necessarily is loosely-coupled. In a closed ecosystem, everything might be optimized to avoid excessive and unwanted data consumption. That is not how the internet ecosystem works. 


Voluntary agreements can be crafted, but only when all the affected parties agree it is in their own interests. To be honest, app and content providers, and their business model partners, likely have no reason to limit auto-run video. 


Network neutrality rules arguably impede creation of access rules that would reduce unwanted data consumption, helping ISPs on the cost side without harming customer experience. But it is not so clear advertisers and content owners would agree. Auto-run video exists because it has a perceived business benefit. 


How often in life do we see ecosystem partners voluntarily harming their business models to make others in the ecosystem happy by aiding the models of the other partners?


We might need to be rid of network neutrality to allow ISPs to craft policies that avoid significant costs on their part, without harming their customers' user experience. Voluntary rules are possible if we can resolve issues of vested business interests of others in the value chain.


Friday, October 28, 2022

Metaverse or AR/VR? It Matters Which Definition One Uses

With the caveat that we all can be wrong when predicting the future, a new study of 350 technology leaders--chief technology officers, chief information officers and IT directors--from the U.S., U.K., China, India and Brazil suggests the 2023 technologies of note are cloud computing (40 percent), 5G (38 percent), the metaverse (37 percent), electric vehicles (EVs) (35 percent) and the Industrial Internet of Things (IIoT) (33 percent).


What might be shocking is the appearance of "metaverse" on the list of 2023 priorities; all the others seem uncontroversial. But the key to understanding the response is to note that the survey's functional representation of the metaverse is goggles, headsets or glasses used for VR or AR experiences and content.


Many of us would not consider those to be "metaverse."


As you might expect, respondents identified 5G and ubiquitous connectivity (71 percent), virtual reality (VR) headsets (58 percent) and augmented reality (AR) glasses (58 percent) as the near-term key technologies. 


The IEEE study probed for views about 2023 technologies expected to be important even for longer-run developments, including the metaverse.


It is unclear how aggressively respondents will pursue what we might term “pre-metaverse” tools in 2023, however. As with many relatively open-ended surveys of attitudes, respondents did not have to make firm predictions about how, and when, various technologies would correlate with actual information technology spending. 


Views about use of artificial intelligence also vary, but might also be considered less likely to drive major investments in 2023. 


And despite slow going at first, fully 97 percent of survey respondents expect 5G to affect vehicle connectivity and automation in 2023. 


Most affected in 2023 are: 

  • (56 percent) remote learning and education

  • (54 percent) telemedicine, including remote surgery, health record transmissions

  • (51 percent) entertainment, sports, and live event streaming

  • (49 percent) personal and professional day-to-day communications

  • (29 percent) transportation and traffic control

  • (25 percent) manufacturing/assembly

  • (23 percent) carbon footprint reduction and energy efficiency


Also, 95 percent believe satellites for remote mobile connectivity will be a game-changer in 2023. 


As always, when there are no consequences for being “wrong,” the predictions are to be considered indicative of possible future trends rather than correlated directly with 2023 spending. 


All of us who have had to make technology forecasts have a poor track record, as predicting the future is inherently difficult. Nor did the survey force respondents to consider all the other assumptions a fully-formed forecast would require. 


It might not be wrong to argue that most predictions are wrong, not only in terms of what happened, but also how long it took to get there. There are many examples of how we get it wrong all the time.    

Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...