
Tuesday, April 18, 2023

Non-Linear Development and Even Near-Zero Pricing are Normal for Chip-Based Products

It is clear enough that Moore’s Law played a foundational role in the creation of Netflix, indirectly led to the founding of Microsoft and underpins the development of the internet and its leading applications.


All consumer electronics, including smartphones, automotive features, GPS and location services; all leading apps, including social media, search, shopping, and video and audio entertainment; plus cloud computing, artificial intelligence and the internet of things are built on a foundation of ever-more-capable and ever-cheaper computing, communications and storage.


For connectivity service providers, the implications are similar to the questions others have asked. Reed Hastings asked whether, and when, home broadband speeds would be high enough to allow Netflix to build a video streaming business.


Microsoft essentially asked itself whether dramatically-lower hardware costs would create a new software business that did not formerly exist. 


In each case, the question is what business is possible if a key constraint is removed. For software, assume hardware is nearly free, or so affordable it poses no barrier to software use. For applications or computing instances, remove the cost of wide area network connections. For artificial intelligence, remove the cost of computing cycles.


In almost every case, Moore’s Law removes barriers to commercial use of technology and enables different business models. We now use millimeter wave radio spectrum to support 5G precisely because cheap signal processing allows us to do so. Previously, we could not make use of radio signals that dropped to almost nothing after traveling less than a hundred feet.


Reed Hastings, Netflix founder, based the viability of video streaming on Moore’s Law. At a time when dial-up modems were running at 56 kbps, Hastings extrapolated from Moore's Law to understand where bandwidth would be in the future, not where it was “right now.”


“We took out our spreadsheets and we figured we’d get 14 megabits per second to the home by 2012, which turns out is about what we will get,” says Reed Hastings, Netflix CEO. “If you drag it out to 2021, we will all have a gigabit to the home." So far, internet access speeds have increased at just about those rates.
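A back-of-envelope version of that extrapolation is easy to reproduce. The sketch below assumes a roughly 50 percent annual growth rate in home bandwidth, starting from 56 kbps dial-up around 1997; the growth rate and start year are illustrative assumptions, not figures from the quote.

```python
# Back-of-envelope extrapolation of home bandwidth, in the spirit of the
# Hastings "spreadsheet" exercise. The 50 percent annual growth rate and the
# 1997 starting point are assumptions for illustration.

def projected_mbps(start_mbps: float, start_year: int, target_year: int,
                   annual_growth: float = 0.50) -> float:
    """Compound a starting speed forward at a fixed annual growth rate."""
    years = target_year - start_year
    return start_mbps * (1 + annual_growth) ** years

dialup_mbps = 0.056  # 56 kbps dial-up modem

for year in (2012, 2021):
    print(f"{year}: ~{projected_mbps(dialup_mbps, 1997, year):,.0f} Mbps")

# Prints roughly 25 Mbps for 2012 and roughly 940 Mbps for 2021, the same
# order of magnitude as the 14 Mbps and "gigabit to the home" figures cited.
```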


The point is that Moore’s Law enabled a product and a business model that were not possible earlier, simply because computation and communications capabilities had not yet developed.


Likewise, Microsoft was founded with an indirect reliance on what Moore’s Law meant for computing power. 


“As early as 1971, Paul (Allen) and I had talked about the microprocessor,” Bill Gates said in a 1993 interview for the Smithsonian Institution, recalling what the chip would mean for the cost of computing. "Oh, exponential phenomena are pretty rare, pretty dramatic,” Gates recalls saying.


“Are you serious about this? Because this means, in effect, we can think of computing as free," Gates recalled. 


That would have been an otherwise ludicrous assumption upon which to build a business. Back in 1970 a “computer” would have cost millions of dollars. 

source: AEI 


The original insight for Microsoft was essentially the answer to the question "What if computing were free?" Recall that Micro-Soft (later changed to MicroSoft before becoming today’s Microsoft) was founded in 1975, not long after Gates apparently began to ponder the question.


Whether that was a formal acknowledgement of Moore’s Law or not is a question I’ve never been able to firmly pin down, but the salient point is that the microprocessor meant “personal” computing and personal computers were possible.


A computer “in every house” meant appliances costing not millions of dollars but only thousands. So price improvements of three orders of magnitude were required, within something like half a decade to a decade.
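As a back-of-envelope check, moving from millions of dollars to thousands is about a 1,000-fold price drop, or roughly ten cost halvings. The sketch below simply works out how long that takes at a few assumed halving intervals; the intervals themselves are illustrative assumptions, not figures from the text.

```python
import math

# Three orders of magnitude is a ~1,000x price drop, or about ten halvings.
# The halving intervals below are illustrative assumptions.

halvings = math.log2(1_000)   # ~9.97 halvings

for months_per_halving in (12, 18, 24):
    years = halvings * months_per_halving / 12
    print(f"halving every {months_per_halving} months: ~{years:.0f} years")
```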


“Paul had talked about the microprocessor and where that would go and so we had formulated this idea that everybody would have kind of a computer as a tool somehow,” said Gates.


Exponential change dramatically extends the possible pace of development of any technology trend. 


Each deployed use case, capability or function creates a greater surface for additional innovations. Futurist Ray Kurzweil called this the law of accelerating returns. Rates of change are not linear because positive feedback loops exist.


source: Ray Kurzweil  


Each innovation leads to further innovations and the cumulative effect is exponential. 


Think about ecosystems and network effects. Each new applied innovation becomes a new participant in an ecosystem. And as the number of participants grows, so do the possible interconnections between the discrete nodes.  

source: Linked Stars Blog 

 

So network effects underpin the difference in growth rates or cost reduction we tend to see in technology products over time, and make linear projections unreliable.
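To make the network-effect arithmetic concrete, here is a minimal sketch of how possible pairwise connections grow much faster than the number of participants. The n(n-1)/2 count is the standard Metcalfe-style illustration, used here as an assumption rather than anything specific to Kurzweil’s formulation.

```python
# Possible pairwise connections among n participants grow as n * (n - 1) / 2.
# The participant counts below are arbitrary, chosen only to show the shape.

def pairwise_connections(n: int) -> int:
    """Number of distinct links possible among n nodes."""
    return n * (n - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} participants -> {pairwise_connections(n):>12,} possible connections")

# A 10x increase in participants yields roughly a 100x increase in possible
# connections, which is one reason growth and cost curves do not look linear.
```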


Saturday, December 17, 2022

Marginal Cost Pricing and "Near Zero Pricing" are Correlated

Digital content and related businesses such as data transport and access often face profitability issues because of marginal cost pricing, in a broad sense. Marginal cost pricing is the practice of setting the price of a product equal to the extra cost of producing one more unit of output.


Of course, digital goods are prime examples. What is the additional cost of delivering one more song, one more text message, one more email, one more web page, one more megabyte, one more voice conversation? What is the marginal cost of one more compute cycle, one more gigabyte of storage, one more transaction? 


Note that entities often use marginal cost pricing during recessions or in highly-competitive markets where price matters. Marginal cost pricing also happens in zero-sum markets, where a unit sold by one supplier must come at the expense of some other supplier.


In essence, marginal cost pricing is the underlying theory behind the offering of discounts. Once a production line is started, once a network or product is built, there often is little additional cost to sell the next unit. If marginal cost is $1, and retail price is $2, any sale above $1 represents a net profit gain. 


Of course, price wars often result, changing expected market prices in a downward direction. 


But marginal cost pricing has a flaw: it only recovers the cost of producing the next unit. It does not aid in the recovery of sunk and capital costs. Sustainable entities must recoup their full costs, including capital and other sunk costs, not simply their cost to produce and sell one more unit. 
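A minimal sketch makes the flaw visible. All of the figures below are hypothetical, chosen only to show that pricing at or near marginal cost can look profitable unit by unit while never recovering sunk investment.

```python
# Hypothetical digital-goods business. Pricing at (or near) marginal cost
# covers the cost of the next unit, but the sunk and capital costs remain.
# Every figure here is an illustrative assumption.

sunk_and_capital_costs = 10_000_000   # network build, licenses, development
marginal_cost_per_unit = 1.00
units_sold = 5_000_000

def total_profit(price_per_unit: float) -> float:
    """Profit after covering marginal costs and the sunk/capital investment."""
    contribution = (price_per_unit - marginal_cost_per_unit) * units_sold
    return contribution - sunk_and_capital_costs

for price in (1.00, 1.50, 2.00, 3.50):
    print(f"price ${price:.2f}: total profit ${total_profit(price):>13,.0f}")

# At $1.00, pure marginal cost pricing, every sale "covers its cost" yet the
# business still absorbs the full $10 million of sunk investment.
```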


So the core problem with pricing at marginal cost (the cost to produce the next unit), or close to it, is that the actual recovery of sunk costs does not happen. Sometimes we are tempted to think the problem is commoditization, or low perceived value, and that also can be an issue.


One arguably sees this problem in wide area data transport and internet transit pricing, for example. 


Software suppliers have an advantage over producers of physical products: the marginal cost of replicating one more instance is quite low, compared to the cost of adding another production line or facility, or of building additional access networks or devices. A company looking to maximize its profits will produce “up to the point where marginal cost equals marginal revenue.” In a business with economies of scale, increasing scale tends to reduce marginal costs. Digital businesses, in particular, have marginal costs quite close to zero.


source: Praxtime


The practical result is a drop in retail pricing: music streaming average revenue per account; mobile service average revenue per user; the cost of phone calls; the cost of sending text, email or multimedia messages; the cost of videoconferencing; or the price of products sold on exchanges and marketplaces.


Of course, there are other drivers of cost, and therefore pricing. Marketing, advertising, power costs, transportation and logistics, personnel and other overhead matter as well. But most of those are essentially fixed or sunk costs: they do not change much as one incremental unit is produced and sold.


Which is why some argue the price of digital goods tends toward zero, considering only production costs. Most of the other price drivers for digital goods might not be related directly to production cost, however. Competition, brand investments and bargaining power can be big influences as well. 


Still, marginal cost pricing is a major reason why digital goods prices can face huge pricing pressure.


Saturday, November 12, 2022

Dramatic Rethinking of Where FTTH Makes Sense Earns Mercury Broadband PE Investment

Mercury Broadband, an internet service provider serving rural customers near Topeka, Kansas, has secured an investment of up to $230 million from private equity firm Northleaf Capital Partners.


The investment is part of a larger trend of private equity firms investing in digital infrastructure assets. The latest wrinkle is an increase in investment in rural ISPs able to leverage new government funding in rural areas where the bigger ISPs are unlikely to want to compete. 


Mercury Broadband is among the firms that have secured Rural Digital Opportunity Fund (RDOF) grants from the Federal Communications Commission to bring fiber optic broadband to underserved communities, especially in rural areas.


The construction is expected to add more than 12,000 plant miles and pass tens of thousands of locations. 


As a rule, the availability of RDOF grants lowers the fiber access plant capital investment hurdle by 30 percent or so. That, among other things, has dramatically reshaped thinking about when FTTH deployments make financial sense. 


The economics of connectivity provider fiber to the home have always been daunting, but they are, in some ways, more daunting in 2022 than they were a decade ago. The biggest new hurdle is that expected revenue per account has been cut by half to two thirds. That would be daunting for any supplier in any industry.


These days, the expected revenue contribution from a home broadband account hovers around $50 per month to $70 per month. Some providers might add smaller contributions from linear video, voice or text messaging.


But that is a huge change from revenue expectations in the 1990 to 2015 period, when $150 per customer was the possible revenue target.  


You might well question a payback model for new fiber-to-home networks that assumes recurring revenue of $50 to $70 per account, per month, with little voice revenue and close to zero video revenue; take rates in the 40-percent range; network capital investment of $800 to $1,000 per passing; and connection costs of perhaps $300 per customer.
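Plugging those figures into a simple payback sketch shows why the model gets questioned. The midpoints and the 50 percent cash margin on revenue are assumptions added for illustration; the ranges themselves come from the paragraph above.

```python
# Rough FTTH payback model using the ranges cited above. The 50 percent
# cash margin is an assumption added for illustration.

capex_per_passing = 900          # midpoint of the $800 to $1,000 range
take_rate = 0.40                 # share of passed homes that subscribe
connect_cost_per_customer = 300
monthly_arpu = 60                # midpoint of the $50 to $70 range
cash_margin = 0.50               # assumed share of revenue left after operating costs

# Each connected customer has to carry the cost of homes passed but not sold.
capex_per_customer = capex_per_passing / take_rate + connect_cost_per_customer
monthly_cash_flow = monthly_arpu * cash_margin
payback_years = capex_per_customer / monthly_cash_flow / 12

print(f"capex per connected customer: ${capex_per_customer:,.0f}")
print(f"simple payback: ~{payback_years:.1f} years")

# Roughly $2,550 per connected customer and a payback on the order of seven
# years, before subsidies or ancillary revenue, which is why those now matter.
```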


But that is the growing reality. Among the reasons: higher government subsidies; indirect revenue contributions and a different investor base. 


All that has shifted fiber-to-home business models in ways that might once have been thought impossible. 


In the face of difficult average revenue per account metrics, co-investment and ancillary revenue contributions have become key. Additional subsidies for home broadband also will reduce FTTH deployment costs. All that matters as revenue expectations are far different from assumptions of two decades ago. 


“Our fiber ARPU was $61.65, up 5.3 percent year over year, with gross addition intake ARPU in the $65 to $70 range,” said John Stankey, AT&T CEO, of second quarter 2022 results. “We expect overall fiber ARPU to continue to improve as more customers roll off promotional pricing and on to simplified pricing constructs.”


Lumen reports its fiber-to-home average revenue per user at about $58 per month. For those of you who have followed fiber-to-home payback models for any length of time, and especially for those of you who have followed FTTH for many decades, that level of ARPU might come as a shock. 


Some honest--and typically off the record--evaluations by telco executives 25 years ago described the FTTH business model as “you get to keep your business” rather than “we boost revenue.” Few financial analysts would have been impressed with that argument.


But revenue expectations were quite different back then. The thinking was that per-home revenue could range as high as $130 to $200 per month. 


Contrast that with today’s view, which is that revenue per account is closer to $60 a month. And yet investors believe that provides a reasonable business case, in the right markets, with subsidies and a chance to create competitive moats.


Saturday, June 12, 2021

"The End" of Connectivity Services Revenue Model is Coming

“The end of communications services as we know them” is coming, argues the IBM Institute for Business Value. That puts the institute on one pole of an argument that never seems settled: should service providers build strategies around connectivity revenue sources, or look elsewhere for growth?


In other words, should service providers focus horizontally on connectivity services or vertically integrate?  


In part, the study team--consultants Chad Andrews, Steve Canepa, Bob Fox and Marisa Viveros--base those recommendations on shrinking profit margins and lessened value of core connectivity services. That is what I have called near zero pricing.  


“There is evidence that connectivity may commoditize more suddenly and dramatically than expected,” the team argues. In other words, both lower prices per bit and lower demand for other legacy services will continue as a fundamental trend. 


“Margins for connectivity are likely to fall,” they note, as data consumption keeps increasing. “On the surface, exponential increases in scale would seem like a good thing for CSPs, but only if pricing keeps pace with the rate of expansion,” they say. “History and data suggest it will not.”


Near zero pricing is the term I use to describe the larger framework of connectivity provider pressures towards ever-lower prices. Others might prefer to emphasize marginal cost pricing. The point is that there is a reason the phrase “dumb pipe” exists. What we need to remember is that the dumb pipe now is the foundation of the whole connectivity business.


A caveat is that what people usually mean by “dumb pipe” is that a product has low value, is sold at low prices and generates low profit margins. That always is positioned as a bad thing. 


But think about it: industry revenue growth now is led by broadband services (internet access), which is, by definition, a dumb pipe service. It is a way to get access to applications, not an actual application itself. Nor are profit margins always low. Broadband access now routinely has higher profit margins than entertainment subscriptions, voice services or messaging can generate.


The other issue is that value and revenue within the information technology and communications spheres keeps shifting away from “data access and transport.”



“In just three to four years,” sustaining growth may require “most CSPs (communication service providers)...to develop new competencies and assert themselves in new roles in value chains,” the IBM Institute for Business Value says. 


To be sure, that is not a novel bit of advice. “Think beyond connectivity” is about as standard a recommendation as one is likely to find from any analyst or consultant proposing a revenue growth strategy.


“If CSPs are to thrive, most will need to develop new competencies and assert themselves in new roles in value chains,” the institute also says. As always, that advice runs counter to that of virtually all equity analysts, who always seem to want service providers to “stick to their connectivity roots.” 


Where the institute urges pursuit of different and new roles within the internet ecosystem, equity analysts want a focus on connectivity services. “Up the stack” or not remains a key argument. 


“CSPs should seek new ways to make money, beyond metering connectivity and access to data, as these traditional mainstays of CSP business models are likely to commoditize,” the institute says. 


The institute identifies 5G and edge computing as key platforms, in that regard. Again, rather obvious suggestions. The institute also recommends a hybrid cloud strategy. That is not too surprising, given IBM’s commitment to hybrid cloud as its core strategy.


The institute also argues that telcos must become platforms. Again, not a novel view. The term platform is misunderstood by most, however. It has a different meaning when referring to computing than to business models. It is the former instance, not the latter, that seems generally meant by the phrase “becoming a platform.”


In the sense the institute sees matters, becoming a platform means becoming a computing platform.


A platform in that sense is a group of technologies that are used as a base upon which other applications, processes, services, or technologies are developed. Platforms can be hardware (e.g., chips, devices) or software. Types of software platforms include operating systems, development environments (e.g., Java, .NET), and digital platforms. Digital platforms are highly configurable/extensible software tools that sit above traditional development platforms. 


Most observers would agree that core connectivity revenue streams are under pressure and are likely to stay that way. Where opinion really diverges is “what to do” about those circumstances. 


As wise as diversifying into new roles might be, history suggests how difficult that will prove to be for most service providers. That noted, most would also agree that opportunity exists in a number of areas including internet of things, edge computing and advanced networks. How to seize opportunities remains a subject of debate.


Saturday, January 2, 2021

Even as a "Platform," Telcos Would Not Escape Near Zero Pricing

The reality of very low and declining per-unit prices is well attested in the connectivity business. Many suggest a way out of the conundrum is for at least some connectivity providers to transform themselves into platforms.


Ignore for the moment whether this is generally possible, and to what extent. 


Life as a platform would ultimately be based on very low per-unit prices. In fact, as many platforms feature zero marginal cost, they also tend towards near zero pricing.


Virtually all platforms feature lower prices per unit than rival pipe businesses, for a number of reasons. Typically making extensive use of internet and computing resources to radically lower transaction and information discovery costs, etailing platforms inevitably push cost out of retail transactions. Platforms reduce friction. 


In other cases, platforms are able to mobilize and put into commercial use assets that otherwise lie fallow. Uber provides a good example. Personally-owned vehicles tend to sit parked and unused 95 percent of the time. Uber allows those otherwise idle assets to be put to commercial use. 


And though firms often are urged to become platforms, few actually can do so, and not for reasons of technology deployment, skill or type of product. Successful platforms are relatively rare because they require scale, and few businesses can afford the investment needed to reach scale.


Most firms in the connectivity business will not be able to transform as platforms, leaving only other possible options. If one believes that prices for telecom products are destined to keep declining, or that more for the same price is the trend, then there are a couple of logical ways to “solve” such problems. 


Firms might try to gain scale to lower unit costs, change the cost model in other ways to enhance profitability, exit the business or change the game being played. Moving “up the stack,” across the ecosystem or into new or adjacent roles within the value chain can “change the game.” That is the strategy behind Comcast and AT&T moving into the content ownership business, or moves by other tier-one service providers into new lines of business outside the connectivity core. 


That is one way to attempt to escape the trap of marginal cost pricing, which might be the connectivity industry’s existential problem.


But it also is reasonable to assume that even a successful shift to a platform model will be based on near zero marginal cost, and near zero pricing. The reason is simply that most platforms also feature near zero pricing.

Wednesday, December 2, 2020

Telcos Not Much Closer to Success in Enterprise Solutions Market, ABI Research Says

Telcos might not be much closer in 2021 to the goal of becoming relevant in the cloud-based enterprise solutions market, Don Alusha, ABI Research senior analyst, suggests.


That should surprise almost nobody, as the application value chain now rests on the internet, software platforms, and the cloud. That allows near-zero distribution and near-zero transactional costs, Alusha says. 


Direct competition from hyperscalers is a factor as well. Amazon offers multiple devices for either edge or on-premises deployments: Snowcone, Snowball, Wavelength, Outposts, and Greengrass IoT, for example. 


Hyperscalers also are already deploying enterprise digital solutions, most of which are usage- or subscription-based rather than the upfront, capex-based model that telco solutions arguably require, says Dimitris Mavrakis, ABI Research senior research director.


Telco operators will need to adapt to opex models to survive, especially in the small and medium enterprise (SME) segments of the market, he says.


“2020 has seen AWS, Google, and Microsoft all advancing and underlining their telco ambitions to provide enterprise connectivity solutions,” says Mavrakis.


“Their existing ties with enterprises for cloud storage, as well as their general openness toward service-based offerings, will make them particularly attractive to enterprises,” says Leo Gergs, ABI analyst. 


That might also include hyperscalers supplying access services as well, he says. 


Hyperscalers are leading the market in consumption economics, says Alusha. In a sense, that is a continuation of a trend we saw a decade or two ago, as the enterprise software business became “consumerized,” with many consumer apps adopted first by employees and then supported by enterprise IT. 


Enterprise IT of the past was based on high switching costs, relatively low volume, high price, and a pay-up-front capital investment model. 


The future purchase pattern will instead be based on high volumes, low pricing, and an opex model. 


Telcos and their suppliers are not yet ready to fully embrace consumption economics, says Alusha. 


Telecommunications is an asset-intensive industry with expertise in managing factories and supply chains, developing technologies, and understanding the cost of goods sold, inventory turns, and manufacturing. 


Human-intensive services are entirely different, Alusha says. For example, in services and opex-based models, technology providers do not manufacture a product to then sell it. Instead, they sell a capability or knowledge, created “at the same time they deliver it,” he says. In other words, software is sold as a service, not a product. 


Tuesday, November 3, 2020

Competition and New Technology Underpin Near-Zero Pricing Trend

It is a truism that competition and new technology, in combination, have fundamentally changed the global telecom business. We all intuitively understand that competition leads to lower prices, or that technology allows disintermediation of value chains, which removes cost. 

source: A.D. Little 


One of the few core assumptions I always have used in my analytical work concerning the connectivity business is that near zero pricing is a foundational trend for all connectivity products, as it tends to be for computing products as well. Consider internet transit pricing, for example.


Back in 2014, Cloudflare estimated the cost of wide area network bandwidth as being lowest in Europe, in large part because so much internet traffic used peering rather than transit. 


source: Cloudflare


Two years later, in 2016, costs had dropped. The Middle East had the lowest WAN costs, and costs in other regions had dropped significantly. Where Australia’s costs had been as much as 20 times higher than Europe’s, two years later they were six times higher.

source: Cloudflare


None of you would be surprised if transit prices continued to fall. Transit to Sydney, for example, had declined to about $5 per Mbps, down from about $100 per Mbps back in 2014.

source: TeleGeography
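As a rough check on the pace of that decline, the implied annual rate can be worked out directly. The end year is an assumption; the text gives only the 2014 starting point and a then-current figure of about $5 per Mbps.

```python
# Implied annual rate of decline in Sydney transit prices. The 2020 end year
# is an assumption; the text gives $100 per Mbps in 2014 and about $5 "now."

start_price, start_year = 100.0, 2014   # USD per Mbps
end_price, end_year = 5.0, 2020

years = end_year - start_year
annual_decline = 1 - (end_price / start_price) ** (1 / years)

print(f"implied decline: ~{annual_decline:.0%} per year over {years} years")
# Roughly a 39 percent price decline per year, sustained for six years.
```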


Both the Netflix and Microsoft business models seem to have been built on an expectation of near-zero pricing for a core input: computing cost for Microsoft, bandwidth cost for Netflix.


The most-startling strategic assumption ever made by Bill Gates was his belief that horrendously-expensive computing hardware would eventually be so low cost that he could build his own business on software for ubiquitous devices.


How startling was the assumption? Consider that, in constant dollar terms, the computing power of an Apple iPad 2 would have cost between US$100 million and $10 billion when Microsoft was founded in 1975.


source: Hamilton Project
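The implied rate of improvement is easy to sanity-check. The sketch below assumes an iPad 2 retail price of roughly $500 in 2011 and takes the low end of the $100 million estimate; both specifics are assumptions for illustration.

```python
import math

# Implied cost-halving interval between a 1975-equivalent machine and an
# iPad 2. The $500 iPad 2 price and the low-end $100 million estimate are
# illustrative assumptions.

cost_1975 = 100_000_000     # low end of the $100 million to $10 billion range
cost_2011 = 500             # approximate iPad 2 retail price
years = 2011 - 1975

halvings = math.log2(cost_1975 / cost_2011)
months_per_halving = years * 12 / halvings

print(f"~{halvings:.0f} cost halvings over {years} years "
      f"(one roughly every {months_per_halving:.0f} months)")

# About 18 halvings in 36 years, or one every two years or so, consistent
# with the Moore's Law cadence the post describes.
```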


The point is that the assumption by Gates that computing operations would be so cheap was an astounding leap. But my guess is that Gates understood Moore’s Law in a way that the rest of us did not.


Reed Hastings, Netflix founder, apparently made a similar decision. For Bill Gates, the insight that free computing would be a reality meant he should build his business on software used by computers.


Reed Hastings came to the same conclusion as he looked at bandwidth trends in terms both of capacity and prices. At a time when dial-up modems were running at 56 kbps, Hastings extrapolated from Moore's Law to understand where bandwidth would be in the future, not where it was “right now.”


“We took out our spreadsheets and we figured we’d get 14 megabits per second to the home by 2012, which turns out is about what we will get,” says Reed Hastings, Netflix CEO. “If you drag it out to 2021, we will all have a gigabit to the home." So far, internet access speeds have increased at just about those rates.


The scary point is that prices in the telecom business seem to have a “near-zero” trend. That does not mean absolute zero, but simply prices so low users and customers do not have to think much about using the products. 


That, of course, has fundamental implications for owners of connectivity businesses. Near-zero pricing helps create demand for internet access services, even as substitutes emerge for core voice and messaging services. 


Near-zero pricing enables the construction and operation of the networks and creation of the apps and services delivered over the networks. Near-zero pricing also enables new business models that were impossible in the analog era.


AI Will Improve Productivity, But That is Not the Biggest Possible Change

Many would note that the internet impact on content media has been profound, boosting social and online media at the expense of linear form...