Saturday, December 17, 2022

Marginal Cost Pricing and "Near Zero Pricing" are Correlated

Digital content and related businesses, such as data transport and access, often face profitability issues because of marginal cost pricing, in a broad sense. Marginal cost pricing is the practice of setting the price of a product equal to the extra cost of producing one extra unit of output.


Of course, digital goods are prime examples. What is the additional cost of delivering one more song, one more text message, one more email, one more web page, one more megabyte, one more voice conversation? What is the marginal cost of one more compute cycle, one more gigabyte of storage, one more transaction? 


Note that entities often use marginal cost pricing during recessions or in highly competitive markets where price matters. Marginal cost pricing also happens in zero-sum markets, where a unit sold by one supplier must come at the expense of some other supplier.


In essence, marginal cost pricing is the underlying theory behind the offering of discounts. Once a production line is started, or a network or product is built, there often is little additional cost to sell the next unit. If marginal cost is $1 and the retail price is $2, any sale at a price above $1 adds net profit. 
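The discount logic can be sketched in a few lines of code (a hypothetical illustration, using the $1 marginal cost and $2 list price from the example above):

```python
# Incremental profit from selling extra units, once fixed costs are sunk:
# any price above marginal cost contributes positive margin.
def contribution(price, marginal_cost=1.00, units=1):
    """Incremental profit from selling `units` at `price`."""
    return (price - marginal_cost) * units

full_price = contribution(2.00, units=100)  # $100 of margin at list price
discounted = contribution(1.25, units=100)  # $25 -- discounted, still profitable
```

This is why discounting below list price, but above marginal cost, looks rational to each individual supplier, even as it drags market prices downward.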


Of course, price wars often result, changing expected market prices in a downward direction. 


But marginal cost pricing has a flaw: it only recovers the cost of producing the next unit. It does not aid in the recovery of sunk and capital costs. Sustainable entities must recoup their full costs, including capital and other sunk costs, not simply their cost to produce and sell one more unit. 


So the core problem with pricing at marginal cost (the cost to produce the next unit), or close to it, is that the actual recovery of sunk costs does not happen. Sometimes we are tempted to think the problem is commoditization, or low perceived value, and that also can be an issue.


One arguably sees this problem in wide area data transport and internet transit pricing, for example. 


Software suppliers have an advantage compared to producers of physical products, as the marginal cost to replicate one more instance is quite low compared to the cost of adding another production line or facility, or of building additional access networks or devices. A company that is looking to maximize its profits will produce “up to the point where marginal cost equals marginal revenue.” In a business with economies of scale, increasing scale tends to reduce marginal costs. Digital businesses, in particular, have marginal costs quite close to zero.


source: Praxtime
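The “produce up to the point where marginal cost equals marginal revenue” rule can be illustrated with a toy model (all numbers hypothetical): given the same downward-sloping demand curve, a near-zero marginal cost producer maximizes profit at a much higher volume and a much lower price than a high marginal cost producer.

```python
# Toy linear demand: price p = a - b*q, so marginal revenue is a - 2*b*q.
# A profit maximizer produces where marginal revenue equals marginal cost.
def optimum(a, b, mc):
    q = (a - mc) / (2 * b)  # solve a - 2*b*q = mc for quantity
    p = a - b * q           # price the market bears at that volume
    return q, p

q_digital, p_digital = optimum(a=10, b=0.5, mc=0.0)   # q=10, p=5
q_physical, p_physical = optimum(a=10, b=0.5, mc=6.0) # q=4,  p=8
```

With marginal cost near zero, the optimum moves far down the demand curve: more units, lower price, which is the “near zero pricing” pressure the post describes.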


The practical result is a drop in retail pricing, such as music streaming average revenue per account, mobile service average revenue per user, the cost of phone calls, sending text, email or multimedia messages, the cost of videoconferencing or price of products sold on exchanges and marketplaces. 


Of course, there are other drivers of cost, and therefore pricing. Marketing, advertising, power costs, transportation and logistics, personnel and other overhead do matter as well. But most of those are essentially sunk costs. Many of those costs do not change as one incremental unit is produced and sold. 


Which is why some argue the price of digital goods tends toward zero, considering only production costs. Most of the other price drivers for digital goods might not be related directly to production cost, however. Competition, brand investments and bargaining power can be big influences as well. 


Still, marginal cost pricing is a major reason why digital goods prices can face huge pricing pressure.


Friday, December 16, 2022

"Churn and Return" is a Big Reason Why Video Analog Dollars Become Digital Dimes

“Churn and return” seems to be a key part of the reason video streaming business models have become more challenging. In a nutshell, the problem is that customers sign up on any particular service to watch a new hit series, then churn off once they have finished watching it. 


That, in turn, is related to the traditional difficulty of creating a hit series in the first place. No single streaming service has found a way to consistently produce new “must see” content on a repeatable basis. And if that is what drives subscriptions and retention, no single service is going to have predictable cash flows. 


The older model is the “catalog” approach, which relies on a deep, broad catalog of archived content to anchor customers and create some amount of loyalty (repeat buying). 


The exceptions are the video streamers that have some other business model. YouTube makes money from advertising, not subscriptions. Amazon Prime arguably makes money from e-commerce. Apple makes money selling devices. In all three cases, there is some other business model that streaming helps support, without the full requirement to throw off enough cash to be a significant, stand-alone revenue driver. 


Netflix is changing its revenue model to incorporate advertising revenue. Warner Brothers Discovery has to find a balance between subscriptions and advertising, as do the other smaller providers. 


The point is that the subscription-driven services have work to do, to compete with the “I have some other revenue model” providers.


Was There Ever a Strategy That Could Have Saved AT&T, MCI or Sprint?

I realize it is a contrarian and extreme minority view, but perhaps no company I have followed for decades has had more notable “failed” growth strategies than AT&T, even when the broad strategy was undeniably correct. We might blame execution, but the big problem always seems to be the debt load required to implement the strategy.


In other words, AT&T was right about the strategy, perhaps ineffective at tactics, but in the end simply could not afford to implement the strategy without incurring debt loads so heavy the investments had to be unwound. 


As they say in sports, “a win is a win.” By the same token, a loss is still a loss; there are no “moral victories.” 


Most recently, AT&T has “shed” its ambitions in content ownership, linear video and video streaming by spinning out DirecTV and Warner Media. Keep in mind that AT&T shareholders still owned about 70 percent of each asset after the spinouts. 


That is universally described as a move by AT&T to get out of content and video and focus on its mobility and fixed network business. That is true in a large sense. Where I disagree is with the characterization of the strategy of getting into linear video, video streaming and content ownership as a “wrong strategy.”


AT&T made a similar huge move into the cable TV business two decades ago, as a way of getting into the local access business at a time when it was essentially only a provider of long distance telephone service, albeit a provider with a huge brand reputation. 


All that likewise was unwound as the debt burden became the issue, not the strategy. But the success of cable operators in seizing overwhelming and leading market share in the home broadband business, as well as healthy shares of the landline voice business and a rapidly-growing share of the mobile business, suggests that the strategy of building an advanced “communications” business on the cable platform was not wrongheaded. 


AT&T later became the largest linear video subscription provider when it acquired DirecTV. And AT&T essentially emulated the successful Comcast model in acquiring Warner Brothers for its content capabilities. Comcast now has distinct revenue streams as a content producer, content distributor and communications provider. 


AT&T was simply following that playbook in amalgamating content, video distribution and local access network assets. 


All those initiatives were eventually reversed because the company could not bear the debt burden. One might argue that the strategy was wrong in all instances precisely because AT&T could not take on the debt to do so. Or, one might argue the strategy was correct, but execution wrong, as some lower-debt option had to be undertaken.


But it can be argued there simply was no “manageable debt” option. In fact, AT&T eventually failed outright as a stand-alone entity because it could not create some sustainable means of evolving from a long distance voice provider into something that looked more like Comcast. 


As it turned out, AT&T sold itself to SBC, so the whole company “failed,” living on only because the brand name was adopted by SBC.


MCI, AT&T’s main competitor, likewise went out of business, acquired by Verizon; Sprint’s long distance business was acquired first by T-Mobile and then essentially given away to Cogent. None of the three big long distance service providers managed to save itself. 


Perhaps one might argue there was no strategy that would have worked. Perhaps all three were destined to be acquired as long distance voice ceased to be the industry revenue driver, and none of those firms had the capital to build or acquire their own local access facilities. 


I would agree that “no possible survival strategy” is an apt “in retrospect” assessment. I do not know what else they could have done other than harvest revenues for as long as possible before selling. 


That noted, the only conceivable survival strategy was that embraced by AT&T several times: acquire big stakes in existing businesses with sufficient cash flow to offer a hope of paying back borrowed money. 


Sure, it never worked. But I’ve never found other “strategies” for any of the firms that did not involve selling the companies or their asset bases, with one exception. Sprint eventually became a mobility company that also owned wide area network assets. But Sprint never found a way to break into leadership of mobile service provider market share. 


To reiterate, AT&T’s strategy was not conceptually flawed. It simply could not generate enough cash flow, fast enough, to handle the debt it took on to drive the strategy. One might well argue AT&T essentially had no strategy. And that is fair enough. But no public company executive can actually say in public that “the company is doomed and our only long term option is to sell the assets.”


Lack of Profits Drives Video Ecosystem Evolution

Professionally-produced long-form original video content is expensive to produce. Combine that with high promotion costs and small audiences and you have economic pressures that are very simple to understand, whether you are in the production, packaging or distribution parts of the ecosystem (studio, network or internet service provider). 


Big audiences are required to generate the big revenues that support big production budgets and lots of productions. Small audiences can only be supported if subsidized by profits from serving big audiences. 


And if profit margins get squeezed at every segment of the ecosystem, the profits to subsidize niche programming go away. 


There’s no point in lamenting this fact. It is simple economics. Whatever participants once believed, video streaming seemingly is a case of exchanging “analog dollars for digital dimes.” Revenue does not match cost, it has become clear. So video streaming suppliers are retrenching on the amount of money they spend to create original programming.


That leads observers to lament the content cutbacks. Similar concerns often are raised about the costs of internet access or linear video packages, but cost pressures related to programming rights play a role there as well. 


You might complain about the “below the line” surcharges on a linear video bill for the cost of carrying local broadcast signals, but those charges are a significant and growing part of the cost of gaining rights to retransmit that programming. Sports content is a key part of that cost structure. 


source: Bloomberg  


Costs to carry local TV stations likewise have been rising rapidly for two decades. Costs for a cable operator to carry local ABC, CBS, FOX, and NBC broadcast stations grew more than 600 percent since 2006, one observer noted in 2019. And that has happened as broadcast audiences shrink. 


One might well note that it is precisely those shrinking audiences--with revenue implications--that have caused local TV stations to lean on cable operator fees to compensate for the lost revenue. 


source: Armstrong 


All those costs get passed on to subscribers. Blame the cable companies if you want, but they must pay broadcasters for those content rights, whether the charges appear above the line or below it. And so long as marketing battles pivot on content and price, and so long as search engines can rank services by price, every service provider has an incentive to quote the lowest-possible price. 


So those content fees get shown below the line, as add-on fees, and not as part of the headline monthly recurring cost. We might all agree it is a bit deceptive. But we might also agree that there are clear business reasons why the policy exists. 


Music streaming provides an analogy, but one that will be hard for video programmers to support: buy the song, not the album. That shift to “a la carte” generally is resisted by networks and distributors alike. 


Such distribution wreaks havoc with brand investments, reduces the profit upside of bundling and reduces the power and leverage of networks. So do not count on a la carte (“buy the song; buy the movie, the show or episode”) becoming ubiquitous. Still, the pressure in that direction remains. 


Linear video subscriptions are losing distribution share because the value proposition keeps getting worse, in comparison to video streaming alternatives. So value propositions will evolve; content strategies will shift; bundles will be redefined; business models will have to change.  So long as professional content production remains expensive, it is inevitable.


Thursday, December 15, 2022

Risk of Stranded Assets Now Looms Large

Stranded assets--capital deployed in infrastructure that does not produce revenue--now have become a key business issue for internet service providers upgrading their networks. 


As a rule, policymakers tend to overstate customer demand for high-speed internet access. “Everyone” laments slow speeds or non-existent service. But even when service availability is high, and speeds similarly are at gigabit levels, actual customer demand is relatively low for those headline speed services. 


In a capital-intensive business such as access networks, it is dangerous to deploy capital to create facilities that are too far ahead of expected demand. Some of you might remember the overspending on 3G spectrum in Europe, for example. Others will remember the vast overspending on wide area optical networks around the turn of the century. 


That is not to say policymakers are wrong to push for ubiquitous availability and quality speeds. It is to note that policies can push beyond actual customer demand, and endanger the sustainability of the internet service providers charged with delivering the services. 


Note, for example, the share of home broadband accounts now supplied by upstarts and attackers, compared to incumbent “telcos.” In a variety of markets, attackers hold close to 40 percent share on the low end, and nearly 90 percent share on the high end. 


The United States and India have the greatest amount of share held by attackers. The point is that capital investment has to lead demand, but not by too much. Among the reasons are stranded asset dangers. Where share is less than 40 percent, more than 60 percent of invested assets are stranded (they produce no revenue). Where share is 10 percent or less, 90 percent or more of the investment might be stranded. 
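The stranded-asset arithmetic is simple enough to state directly (a minimal sketch, using the share figures cited above):

```python
# If a network passes homes but only a fraction subscribe, the plant
# serving the rest is stranded: invested, but producing no revenue.
def stranded_fraction(take_rate):
    """Fraction of passed-home investment producing no revenue."""
    return 1.0 - take_rate

moderate_share = stranded_fraction(0.40)  # 40% share -> 60% of plant stranded
tiny_share = stranded_fraction(0.10)      # 10% share -> 90% stranded
```

The revenue-producing minority of homes must therefore carry the cost of the entire network, which is why low take rates threaten ISP sustainability.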


source: Ofcom 


According to Ofcom, gigabit-per-second home broadband now is available to 70 percent of U.K. residences. The issue now is what percentage of homes actually buy service at gigabit speeds. Not all that many, it appears. According to Ofcom data, uptake at gigabit speeds might be as low as one percent to two percent. 


About 42 percent of U.K. homes are passed by fiber-to-home facilities. Where fiber-to-home service is available, about 25 percent of homes buy the service. Only nine percent of those customers buy service at the fastest-available speed, however. 


source: Ofcom 


That relatively small percentage of customers buying the fastest home broadband service is not unusual. Across a range of nations, uptake of gigabit services tends to be in the single digits to low double digits. 

source: Ofcom 


All of which should remind us that there is a big difference between making a product available and demand for those products. Broadband policy success might be defined as supplying X amount of bandwidth to Y homes. 


It is less clear that “success” can be measured by the degree of uptake of services at any given level. Customers still have the right to buy the products they believe represent the best overall value proposition. In most cases, that means buying a level of service someplace between the lowest and highest speed tiers. 


For policymakers, making service available is what matters. Uptake is a matter for customers to decide. Take rates for the fastest speed access services always have been lowish. So the risk of ISPs over-investing is quite significant. 


Policymakers who are unrealistic about customer demand can jeopardize the sustainability of their ISPs. ISPs who over-invest can put themselves out of business.


U.K. Gigabit Availability Reaches 70%, Uptake 1% to 2%

According to Ofcom, gigabit-per-second home broadband now is available to 70 percent of U.K. residences. The issue now is what percentage of homes actually buy service at gigabit speeds. Not all that many, it appears. According to Ofcom data, uptake at gigabit speeds might be as low as one percent to two percent. 


About 42 percent of U.K. homes are passed by fiber-to-home facilities. Where fiber-to-home service is available, about 25 percent of homes buy the service. Only nine percent of those customers buy service at the fastest-available speed, however. 


source: Ofcom 


That relatively small percentage of customers buying the fastest home broadband service is not unusual. Across a range of nations, uptake of gigabit services tends to be in the single digits to low double digits. 

source: Ofcom 


All of which should remind us that there is a big difference between making a product available and demand for those products. Broadband policy success might be defined as supplying X amount of bandwidth to Y homes. 


It is less clear that “success” can be measured by the degree of uptake of services at any given level. Customers still have the right to buy the products they believe represent the best overall value proposition. In most cases, that means buying a level of service someplace between the lowest and highest speed tiers.


Availability is one matter; uptake quite another.  

Wednesday, December 14, 2022

Platform, Techco or Digital Services Provider Vision is Not Often Matched By Reality

A recurring discussion over the past couple of decades has centered on the question of whether telcos could become platforms. Perhaps the latest version of this debate asks whether telcos can become techcos or digital services providers. 


None of these concepts are easy to explain, or commonly understood. Some might argue the “digital services provider” evolution means offering services such as internet of things, connected cars, smart cities or smart homes. Presumably the concept is that a connectivity provider supplies the apps that provide the value. 


So a connectivity provider acts as a home security firm, an industrial IoT system supplier, a connected car app, traffic management system or energy management firm. In other words, the connectivity provider is the branded developer and provider of the application, not just the connectivity provider supporting any app or use case. 


It is no easier to explain--and have people agree upon--what a “platform” or “techco” evolution means. 


It never is completely clear what telco executives really mean in touting the transformation from telco to “techco.”


Many telcos--or those who advise and sell to them--say telcos need to become techcos. So what does that mean?


At least as outlined by Mark Newman, Technotree chief analyst and Dean Ramsay, principal analyst, there are two key implications: a culture shift and a business model.


The former is more subjective: telco organizations need to operate “digitally.” The latter is harder: can telcos really change their business models; the ways they earn revenue; their customers and value propositions?


source: TM Forum


It might be easier to describe the desired cultural or technology changes. Digital touchpoints; higher research and development spending; use of native cloud computing; a developer mindset; data-driven product development and use of artificial intelligence all might be said to be part of becoming a “techco.”


Changing the business model is the more-problematic objective. 


As helpful as it should be to adapt to native cloud, developer-friendly applications and networks, use data effectively or boost research or development, none of those attributes or activities necessarily changes the business model. 


If “becoming a techco” means lower operating costs; lower capital investment; faster product development or happier customers, that is a good thing, to be sure. Such changes can help ensure that a business or industry is sustainable. 


The change to “techco” does not necessarily boost the equity valuation of a “telco,” however. To accomplish that, a “telco” would have to structurally boost its revenue growth rates to gain a higher valuation; become a supplier of products with a higher price-to-earnings profile, higher profit margins or business moats. 


What would be more relevant, then, is the ability of the “change from telco to techco” to serve new types of customers; create new and different revenue models; develop higher-value roles and products or add new roles “telcos” can perform in the value chain or ecosystem. 


We face the same sorts of problems when trying to explain what a “platform” looks like. 


Korea Telecom wants to become a digital platform company, not a telco. That ambition arguably is shared somewhat widely among tier-one connectivity service providers globally and has been a strategy recommended in some form by most bigger consulting companies. 


Simply, becoming a platform company changes the business model from direct supplier of products to a role as an ecosystem organizer or marketplace. That arguably is an aspirational goal more than anything else. 


What that aspiration means in practice is that KT as a digico “will shift our focus from the telecommunications sector, where growth is stalled due to government regulations, to artificial intelligence (AI), big data, and cloud computing businesses to become the nation's number-one platform operator in the B2B market," said KT CEO Koo Hyun-mo.


So there are qualifications. KT, if successful, would become a platform in the business market, not the consumer market. It would explicitly aim to become the center and organizer of an ecosystem for artificial intelligence, big data analytics and cloud computing. 


Purists and researchers will likely argue about whether all of that actually adds up to KT becoming a platform, in the sense that Amazon, eBay, Alibaba, ridesharing or lodging apps might be considered platforms. 


A platform, definitionally, makes its money putting buyers and sellers and ecosystem participants together. In computing, a platform is any combination of hardware and software used as a foundation upon which applications, services, processes, or other technologies are built, hosted or run.


Operating systems are platforms, allowing software and applications to be run. Devices are platforms. Cloud computing might be said to be a platform, as systems are said to be platforms. 


Standards likely are thought of as platforms by some. 


In other cases components such as central processing units, physical or software interfaces (Ethernet, Wi-Fi, 5G, application programming interfaces) are referred to as platforms. Browsers might be termed platforms by some. Social media apps are seen as platforms as well. 


The platform business model requires creation of a marketplace or exchange that connects different participants: users with suppliers; sellers with buyers. A platform functions as a matchmaker, bringing buyers and sellers together, but classically not owning the products sold on the exchange. 


A platform orchestrates interactions and value. In fact, a platform’s value may derive in large part from the actions and features provided by a host of ecosystem participants. Facebook’s content is created by user members. Amazon’s customer reviews are a source of value for e-tailing buyers. 


Consumers and producers can swap roles on a platform. Users can ride with Uber today and drive for it tomorrow; travelers can stay with AirBNB one night and serve as hosts for other customers the next. Customers of pipe businesses--an airline, router or phone suppliers, grocery stores-- cannot do so. 


So KT can increase the percentage of revenue it earns from supplying digital, computing, application or non-connectivity services without becoming a platform. As a practical matter, that is what most telco executives have in mind when talking about becoming platforms. 


For KT, even limiting its ambitions to generating more digital and non-connectivity revenue does not make it a platform. That would still be an important, valuable and value-sustaining move. But KT has a very long way to go, even toward its stated objective of becoming a B2B platform.


Total KT revenue is about 24 trillion won. All B2B revenues at the end of 2020 were about 2.78 trillion won (about 11.5 percent). Information technology services were about 1 trillion won, or about four percent of total revenues. AI and other digital services were about 0.5 trillion won, or about two percent of total revenues. 
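Those shares can be checked directly from the reported figures (amounts in trillions of won, as cited above):

```python
# KT revenue mix at the end of 2020, per the figures in the text.
total_revenue = 24.0   # total KT revenue, trillions of won
b2b = 2.78             # all B2B revenue
it_services = 1.0      # information technology services
ai_digital = 0.5       # AI and other digital services

b2b_share = 100 * b2b / total_revenue         # ~11.6 percent of total
it_share = 100 * it_services / total_revenue  # ~4.2 percent
ai_share = 100 * ai_digital / total_revenue   # ~2.1 percent
```

Even the broadest category, all B2B revenue, is barely more than a tenth of the business, which is the gap between the platform rhetoric and the current revenue mix.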


It might be a long time before non-connectivity revenues in the B2B part of its business are as much as half of total revenues. And those revenues might not represent a platform transformation of the business model.


KT could win significantly without ever becoming a platform. And some might argue few telcos can ever actually hope to become platforms in the classic sense. Perhaps the more important goal is simply to reduce reliance on traditional connectivity revenues.


Unfortunately, what platform, techco or digital services provider actually means in practice falls far short of the grander visions.


It Will be Hard to Measure AI Impact on Knowledge Worker "Productivity"

There are over 100 million knowledge workers in the United States, and more than 1.25 billion knowledge workers globally, according to one A...