Wednesday, April 19, 2023

Who Sells a Product--Not Just What the Product Is--Matters

Firms whose core business model relies on network effects--including all firms using a platform business model--are essentially forced to develop proxies for network effects that drive their success, as network effects cannot be directly measured using standard financial reporting metrics. 


Airbnb, for example, might track the total number of lodging listings; the number of guest reviews; the number of booked stays; or the number of site visits or completed transactions. 


Uber might track the number of drivers; the number of riders or trips. E-commerce sites such as eBay might measure the number of buyers, the number of sellers or the number of listed items for sale or the number of completed transactions. 


Social media networks such as Meta might measure the number of users; the amount of time users spend on the platform or the number of interactions between users. 


PayPal might measure the number of active registered accounts and the volume of transactions. Amazon might measure network effects for its third-party merchant services by noting the number of sellers; the number of products listed for sale; the number of reviews or the volume of product sales.


Connectivity providers using platform business models might track  the number of users; the amount of bandwidth or computing services purchased; or the number of transactions.
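

To make the idea concrete, here is a minimal sketch, with hypothetical field names and placeholder values, of how such proxy metrics might be organized. Nothing here reflects any firm's actual figures.

```python
# A minimal sketch, with hypothetical field names and placeholder values,
# of the kinds of proxy metrics a platform might track in place of direct
# "network effect" measurements. Not any firm's actual figures.

from dataclasses import dataclass

@dataclass
class NetworkEffectProxies:
    """Operational counts that stand in for hard-to-measure network effects."""
    supply_side_participants: int   # e.g. listings, drivers, sellers
    demand_side_participants: int   # e.g. guests, riders, buyers
    completed_transactions: int     # e.g. booked stays, trips, purchases
    engagement_events: int          # e.g. reviews, messages, site visits

example_platform = NetworkEffectProxies(
    supply_side_participants=1_000_000,     # placeholder values only
    demand_side_participants=20_000_000,
    completed_transactions=50_000_000,
    engagement_events=200_000_000,
)
print(example_platform)
```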


And though such “network effects” performance metrics are not often directly reflected in firm valuations using traditional accounting metrics, it is reasonable to assume the effects nonetheless show up in valuations. 


For example, some might argue that Equinix's EV/EBITDA multiple was 25.7 times in 2022, while AT&T's EV/EBITDA multiple was 12.5 times. 


Equinix's price/sales multiple might have been 18.4 times in 2022, while AT&T's price/sales multiple was 5.7 times. 


Equinix’s price/earnings multiple might have reached 100.5 times in 2022, while AT&T's price/earnings multiple was 15.4 times, at least at market highs for that year. 


Equinix's multiple of annual revenue was 13.8 times in 2022, while AT&T's multiple of annual revenue was 6.4 times.


Since AT&T’s revenue is driven primarily by connectivity services, while Equinix is valued as a data center, digital infrastructure or real estate asset, each firm’s total valuation tends to reflect its core business. 


But each firm sells interconnection or connectivity services that are functionally the same: dark fiber, lit fiber, Ethernet transport and optical wave services. 


The point is that the same product (interconnection) is valued differently at each firm, since each is in a different category of business. 
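

To illustrate the point with the multiples cited above, here is a rough sketch (illustrative only) of how the same dollar of interconnection revenue gets capitalized differently depending on which firm books it, using price/sales as the yardstick.

```python
# Illustrative only: using the 2022 price/sales multiples cited above,
# show how the same dollar of interconnection revenue is implicitly
# capitalized differently depending on which firm books it.

price_to_sales = {
    "Equinix (valued as digital infrastructure)": 18.4,
    "AT&T (valued as a connectivity provider)": 5.7,
}

interconnection_revenue = 1_000_000  # a hypothetical $1 million of identical services

for firm, multiple in price_to_sales.items():
    implied_value = interconnection_revenue * multiple
    print(f"{firm}: ~${implied_value:,.0f} of implied market value")

# The same revenue implies roughly $18.4 million of market value at
# Equinix-style multiples, but only about $5.7 million at AT&T-style multiples.
```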


For executives constantly concerned about the commoditization of their connectivity services, such differences show that the value of particular interconnections can vary based on who is offering the services, what connections are made and where they are made. 


Of course, larger firms often are awarded higher multiples than smaller companies. Networking specialist Megaport's EV/EBITDA multiple was 10.5 times in 2022, while AT&T's EV/EBITDA multiple was 12.5 times.


On the other hand, Megaport's price/sales multiple was 12.4 times in 2022, while AT&T's price/sales multiple was 5.7 times.


Megaport's price/earnings multiple was 24.3 times in 2022, while AT&T's price/earnings multiple was 15.4 times. 


Megaport's multiple of annual revenue was 8.1 times in 2022, while AT&T's multiple of annual revenue was 6.4 times.


So even when a smaller firm competes with a larger and more-established firm, valuation multiples for the smaller firm can be higher.


Tuesday, April 18, 2023

Non-Linear Development and Even Near-Zero Pricing are Normal for Chip-Based Products

It is clear enough that Moore’s Law played a foundational role in the founding of Netflix, indirectly led to Microsoft and underpins the development of all things related to use of the internet and its lead applications. 


All consumer electronics, including smartphones, automotive features, GPS and location services; all leading apps, including social media, search, shopping and video and audio entertainment; plus cloud computing, artificial intelligence and the internet of things are built on a foundation of ever-more-capable and ever-cheaper computing, communications and storage. 


For connectivity service providers, the implications are similar to the questions others have asked. Reed Hastings asked whether enough home broadband speed would exist, and when, to allow Netflix to build a video streaming business. 


Microsoft essentially asked itself whether dramatically-lower hardware costs would create a new software business that did not formerly exist. 


In each case, the question is what business is possible if a key constraint is removed. For software, assume hardware is nearly free, or so affordable it poses no barrier to software use. For applications or computing instances, remove the cost of wide area network connections. For artificial intelligence, remove the cost of computing cycles.


In almost every case, Moore’s Law removes barriers to commercial use of technology and to different business models. We now can use millimeter wave radio spectrum to support 5G precisely because cheap signal processing allows us to do so; previously, we could not make use of radio signals that fade to almost nothing after traveling less than a hundred feet. 


Reed Hastings, Netflix founder, based the viability of video streaming on Moore’s Law. At a time when dial-up modems were running at 56 kbps, Hastings extrapolated from Moore's Law to understand where bandwidth would be in the future, not where it was “right now.”


“We took out our spreadsheets and we figured we’d get 14 megabits per second to the home by 2012, which turns out is about what we will get,” says Reed Hastings, Netflix CEO. “If you drag it out to 2021, we will all have a gigabit to the home." So far, internet access speeds have increased at just about those rates.
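

A minimal sketch of that kind of extrapolation, assuming a 56 kbps starting point in 1997 and roughly 50-percent annual growth in home access speeds (my assumption for illustration, not Hastings's actual spreadsheet):

```python
# A minimal sketch of the extrapolation described above.
# Assumptions (mine, not Hastings's actual spreadsheet): a 56 kbps
# starting point in 1997 and roughly 50 percent annual growth in
# typical home access speeds.

START_YEAR = 1997
START_KBPS = 56.0
ANNUAL_GROWTH = 1.5  # assumed ~50% per year

def projected_speed_kbps(year: int) -> float:
    """Compound the assumed growth rate forward from the starting year."""
    return START_KBPS * ANNUAL_GROWTH ** (year - START_YEAR)

for year in (2012, 2021):
    mbps = projected_speed_kbps(year) / 1000
    print(f"{year}: ~{mbps:,.0f} Mbps")

# Prints roughly 25 Mbps for 2012 and roughly 940 Mbps for 2021: the same
# order of magnitude as the 14 Mbps and gigabit figures cited in the quote.
```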


The point is that Moore’s Law enabled a product and a business model that was not possible earlier, simply because computation and communications capabilities had not yet developed far enough. 


Likewise, Microsoft was founded with an indirect reliance on what Moore’s Law meant for computing power. 


“As early as 1971, Paul (Allen) and I had talked about the microprocessor,” Bill Gates said in a 1993 interview for the Smithsonian Institution, in terms of what it would mean for the cost of computing. "Oh, exponential phenomena are pretty rare, pretty dramatic,” Gates recalls saying. 


“Are you serious about this? Because this means, in effect, we can think of computing as free," Gates recalled. 


That would have been an otherwise ludicrous assumption upon which to build a business. Back in 1970 a “computer” would have cost millions of dollars. 

source: AEI 


The original insight for Microsoft was essentially the answer to the question "What if computing were free?" Recall that Micro-Soft (later changed to MicroSoft before becoming today’s Microsoft) was founded in 1975, not long after Gates apparently began to ponder the question. 


Whether that was a formal acknowledgement about Moore’s Law or not is a question I’ve never been able to firmly pin down, but the salient point is that the microprocessor meant “personal” computing and computers were possible. 


A computer “in every house” meant appliances costing not millions of dollars but only thousands. So price improvements of roughly three orders of magnitude were required, within half a decade to a decade. 


“Paul had talked about the microprocessor and where that would go and so we had formulated this idea that everybody would have kind of a computer as a tool somehow,” said Gates.


Exponential change dramatically extends the possible pace of development of any technology trend. 


Each deployed use case, capability or function creates a greater surface for additional innovations. Futurist Ray Kurzweil called this the law of accelerating returns. Rates of change are not linear because positive feedback loops exist.


source: Ray Kurzweil  


Each innovation leads to further innovations and the cumulative effect is exponential. 


Think about ecosystems and network effects. Each new applied innovation becomes a new participant in an ecosystem. And as the number of participants grows, so do the possible interconnections between the discrete nodes.  

source: Linked Stars Blog 

 

So network effects underpin the difference in growth rates or cost reduction we tend to see in technology products over time, and make linear projections unreliable.
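

To see why interconnections outrun participants, consider a simple sketch: with n participants, the number of possible pairwise links is n(n-1)/2.

```python
# Illustration of why possible interconnections grow much faster than
# the number of participants: n participants allow n*(n-1)/2 pairwise links.

def possible_links(n: int) -> int:
    """Number of distinct pairwise connections among n nodes."""
    return n * (n - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} participants -> {possible_links(n):>12,} possible links")

# Participants grow 1,000-fold here, while possible links grow roughly
# 1,000,000-fold, which is one reason linear projections break down.
```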


Monday, April 17, 2023

How Much Value Might Network APIs Unlock?

Most consumers indirectly use application programming interfaces every day: to check weather, make travel arrangements, use social networks or search, make payments or log in to sites using another existing account. Google Maps and online shopping provide other everyday examples of APIs at work. 


Basically, APIs allow different applications to exchange data.


Connectivity providers now believe they can generate new revenue, increase the value of their services and possibly build bigger ecosystems by exposing network functions as APIs. How much value those functions add for third-party developers will determine how successful operators are at monetizing them.


source: Nokia 


In the internet era, connectivity is essential for almost all applications, use cases and functions. So we should never be surprised that app developers note the value of access to the internet, and connectivity in general. Without internet access, most apps will not work. 


source: STL Partners 


But there is a key distinction. Internet access does not hinge on a business relationship between the app and the access or transport networks. By design, any app compliant with TCP/IP, and any user with access to TCP/IP networks, will be able to use internet-supported apps, so long as they have the credentials to do so. 


So access providers hope that new network APIs will create a new revenue stream, while possibly increasing the value of network features. 


Initially, for example, the GSMA Open Gateway APIs support SIM Swap, Quality on Demand, Device Status (Connected or Roaming Status), Number Verify, Edge Site Selection and Routing, Number Verification (SMS 2FA) and Carrier Billing features such as Check Out and Device Location (Verify Location). Other APIs will be added, of course. 
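

As a rough illustration of how a developer might consume such a capability, here is a sketch of a number-verification style call. The endpoint URL, token handling and field names are hypothetical placeholders, not the actual GSMA Open Gateway or CAMARA definitions.

```python
# Hypothetical sketch of consuming a network API such as number verification.
# The base URL, path and field names below are placeholders, not the actual
# GSMA Open Gateway / CAMARA definitions.

import requests

API_BASE = "https://api.example-operator.com"   # hypothetical operator endpoint
ACCESS_TOKEN = "..."                            # obtained via the operator's OAuth flow

def verify_number(phone_number: str) -> bool:
    """Ask the network whether the supplied number matches the active SIM."""
    response = requests.post(
        f"{API_BASE}/number-verification/v1/verify",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"phoneNumber": phone_number},
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("devicePhoneNumberVerified", False)

if __name__ == "__main__":
    print(verify_number("+14155550100"))
```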


The issue, as with all engineering choices, is whether there are other proxies or substitutes for those sorts of functions. Payments can be made in other ways than using carrier billing. Device location might be available using device or application features.


In other cases, the use of an API might, or might not, be a feature of some other service, such as a network slice. In some cases, quality issues will be rectified by use of edge computing or content delivery networks. 


In other words, it remains to be seen whether, and how much, app developers see value in the new APIs, and whether comparable value can be created using substitute methods. 


source: RingCentral 


APIs have been used to “voice enable” other apps, for example. The hope is that many other network-based parameters or functions likewise can be exposed, with access to those features generating revenue for network operators.


Architecturally, of course, APIs are a way of reintegrating functions that are separated into layers. Also, to the extent that many industries evolve towards ecosystems, APIs add value in allowing ecosystem partners to interoperate. 


source: McKinsey 


The issue for connectivity providers is the additional value apps gain when network functions are available to exchange data. Generally speaking, it is application-to-application data that adds the most value. 


The thinking is that latency-sensitive functions and apps will benefit most from edge computing, as they have benefitted from content delivery networks. Perhaps such apps also will benefit from network APIs. 


Everything hinges on value added. So the big question right now is where value can be generated when network functions are available.


Sunday, April 16, 2023

We Will Overestimate what Generative AI can Accomplish Near Term

For most people, it seems as though artificial intelligence has suddenly emerged as an idea and set of possibilities. Consider the explosion of interest in large language models or generative AI.


In truth, AI has been gestating for many decades. And forms of AI already are used in consumer products such as smart speakers, recommendation engines and search functions.


What seems to be happening now is some inflection point in adoption. But the next thing to happen is that people will vastly overestimate the degree of change over the near term, as large language models get adopted, even as they underestimate what will happen longer term.


That is an old--but apt--story.


“Most people overestimate what they can achieve in a year and underestimate what they can achieve in ten years” is a quote whose provenance is unknown, though some attribute it to Stanford computer scientist Roy Amara. Some people call it “Gates’s Law.”


The principle is useful for technology market forecasters, as it aligns with other patterns, including the S curve of product adoption. The expectation for virtually all technology forecasts is that actual adoption tends to resemble an S curve: slow adoption at first, then eventually rapid adoption, and finally market saturation. 


That sigmoid curve describes product life cycles, suggests how business strategy changes depending on where on any single S curve a product happens to be, and has implications for innovation and start-up strategy as well. 


source: Semantic Scholar 


Some say S curves explain overall market development, customer adoption, product usage by individual customers, sales productivity, developer productivity and sometimes investor interest. It often is used to describe adoption rates of new services and technologies, including the notion of non-linear change rates and inflection points in the adoption of consumer products and technologies.


In mathematics, the S curve is a sigmoid function. The Gompertz function, one such sigmoid, can be used to predict new technology adoption, as can the related Bass model.
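

For reference, a minimal sketch of those two curve families, using made-up parameter values purely for illustration:

```python
# Minimal sketch of the two sigmoid families mentioned above, with
# illustrative (made-up) parameters: saturation level, growth rate and midpoint.

import math

def logistic(t: float, saturation: float = 1.0, rate: float = 0.8, midpoint: float = 10.0) -> float:
    """Classic S curve: slow start, rapid middle, saturation."""
    return saturation / (1.0 + math.exp(-rate * (t - midpoint)))

def gompertz(t: float, saturation: float = 1.0, b: float = 5.0, rate: float = 0.3) -> float:
    """Gompertz curve: a similar shape, but asymmetric, approaching saturation more slowly."""
    return saturation * math.exp(-b * math.exp(-rate * t))

for year in range(0, 21, 5):
    print(f"t={year:>2}  logistic={logistic(year):.2f}  gompertz={gompertz(year):.2f}")
```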


Another key observation is that some products or technologies can take decades to reach mass adoption.


It also can take decades before a successful innovation actually reaches commercialization. The next big thing will have first been talked about roughly 30 years before it arrives, says technologist Greg Satell. IBM’s Arthur Samuel coined the term machine learning in 1959, for example, and machine learning is only now coming into widespread use. 


Many times, reaping the full benefits of a major new technology can take 20 to 30 years. Alexander Fleming discovered penicillin in 1928, but it did not arrive on the market until 1945, nearly 20 years later.


Electricity, it can be argued, did not have a measurable impact on the economy until the early 1920s, some 40 years after Edison’s first power plant.


It was not until the late 1990s, or about 30 years after 1968, that computers had a measurable effect on the US economy, many would note.



source: Wikipedia


The S curve is related to the product life cycle, as well. 


Another key principle is that successive product S curves are the pattern. A firm or an industry has to begin work on the next generation of products while existing products are still near peak levels. 


source: Strategic Thinker


There are other useful predictions one can make when using S curves. Suppliers in new markets often want to know “when” an innovation will “cross the chasm” and be adopted by the mass market. The S curve helps there as well. 


Innovations reach an adoption inflection point at around 10 percent. For those of you familiar with the notion of “crossing the chasm,” the inflection point happens when “early adopters” drive the market. The chasm is crossed at perhaps 15 percent of persons, according to technology theorist Geoffrey Moore.

source 


For most consumer technology products, the chasm gets crossed at about 10 percent household adoption. Professor Geoffrey Moore does not use a household definition, but focuses on individuals. 

source: Medium


And that is why the saying “most people overestimate what they can achieve in a year and underestimate what they can achieve in ten years” is so relevant for technology products. Linear demand is not the pattern. 


One has to assume some form of exponential or non-linear growth. And we tend to underestimate the gestation time required for some innovations, such as machine learning or artificial intelligence. 


Other processes, such as computing power, bandwidth prices or end user bandwidth consumption, are more linear. But the impact of those linear functions also tends to be non-linear. 


Each deployed use case, capability or function creates a greater surface for additional innovations. Futurist Ray Kurzweil called this the law of accelerating returns. Rates of change are not linear because positive feedback loops exist.


source: Ray Kurzweil  


Each innovation leads to further innovations and the cumulative effect is exponential. 


Think about ecosystems and network effects. Each new applied innovation becomes a new participant in an ecosystem. And as the number of participants grows, so do the possible interconnections between the discrete nodes.  

source: Linked Stars Blog 


Think of that as analogous to the way people can use one particular innovation to create another adjacent innovation. When A exists, then B can be created. When A and B exist, then C and D and E and F are possible, as existing things become the basis for creating yet other new things. 


So we often find that progress is slower than we expect, at first. But later, change seems much faster. And that is because non-linear change is the norm for technology products.


Saturday, April 15, 2023

5G Leaky Bucket Problems

What happens with legacy services is arguably more important, near term, than what happens with new services created by 5G networks. The reasons are obvious: the new services represent smallish revenues while the legacy services represent most of the total revenue.


Small percentage declines in core legacy services have more revenue and profit-margin impact than all the new services put together. The image of a hamster running on a wheel might not be appealing, but that is the situation connectivity providers face.


Or, if you like, a leaky water bucket where new water is poured into the bucket as water continues to leak from holes.


Most connectivity service providers in mature, nearly-saturated mass markets would be happy if annual revenue growth chugged along at about a two-percent rate. Service providers in some markets can expect higher growth rates, but the global average will probably be in the two-percent range. 


Given some deterioration in legacy lines of business (negative growth rates), growth in one or more new areas might have to run well above two percent to maintain an overall growth rate of two percent. 


And that is the problem for new 5G services in the edge computing, private networks or internet of things areas, for example. The new revenue streams will be small in magnitude, while even a modest decline in a legacy service can--because of the larger size of the existing revenue streams--pose big problems. 
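

A bit of made-up arithmetic shows why. Assume a hypothetical operator with $95 billion of legacy revenue shrinking two percent a year and $5 billion of new-services revenue:

```python
# Made-up illustration of the "leaky bucket": legacy revenue is large and
# shrinking slightly, new 5G-era revenue is small and growing quickly.

legacy_revenue = 95.0       # hypothetical, in billions; most of the base
new_revenue = 5.0           # hypothetical new services (edge, IoT, private networks)
legacy_growth = -0.02       # legacy lines shrink 2% per year
target_total_growth = 0.02  # operator wants 2% overall growth

total = legacy_revenue + new_revenue
required_new_gain = total * target_total_growth - legacy_revenue * legacy_growth
required_new_growth_rate = required_new_gain / new_revenue

print(f"New lines must add ${required_new_gain:.1f}B, i.e. grow {required_new_growth_rate:.0%} in one year")
# With these numbers, the small new lines would have to grow roughly 78
# percent in a single year just to hold overall growth at two percent.
```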


Many service providers, for example, expect big opportunities in business services, which underpins hopes for private networks, edge computing and IoT. But revenue magnitudes matter. 


Consumer revenue always drives the bulk of mobile operator service revenues. And revenue growth is the key issue.  


But it will be hard for new 5G services for enterprises and business to move the revenue needle. 


Edge computing possibly can grow to generate a minimum of $1 billion in annual new revenues for some tier-one service providers. The same might be said for service-provider-delivered and operated  private networks, internet of things services or virtual private networks. 


But none of those services seem capable of driving the next big wave of revenue growth for connectivity providers, as their total revenue contribution does not seem capable of driving 80 percent of total revenue growth or representing half of the total installed base of revenue. 


In other words, it does not appear that edge computing, IoT, private networks or network slicing can rival the revenue magnitude of voice, texting, video subscriptions, home broadband or mobile subscription revenue. 


It is not clear whether any of those new revenue streams will be as important as MPLS or SD-WAN, dedicated internet access or Ethernet transport services, for example. All of those can be created by enterprises directly, on a do-it-yourself basis, from the network edge. 


source: STL, KBV Research 


In the forecast shown above, for example, the “services” category includes system integration and consulting, which is certain to be a bigger revenue opportunity than new sales of connectivity services. 


And though it might seem far-fetched, the lead service sold by at least some connectivity providers might not yet have been invented. 


At least so far, 5G fixed wireless is the only new 5G service that is meaningful and material as a revenue source for at least some mobile operators. Even if network slicing, edge computing, private networks and sensor network support generate some incremental revenue, the volume will not be as large as many hope.


It is conceivable that mobile operators globally will make more money providing home broadband using fixed wireless than they will earn from the flashier, trendy new revenue sources such as private networks, edge computing and internet of things. 

source: Ericsson 


Wells Fargo telecom and media analysts Eric Luebchow and Steven Cahall predict fixed wireless access will grow from 7.1 million total subscribers at the end of 2021 to 17.6 million in 2027, growth that largely will come at the expense of cable operators. 

source: Polaris Market Research 

If 5G fixed wireless accounts and revenue grow as fast as some envision, $14 billion to $24 billion in fixed wireless home broadband revenue would be created in 2025. 
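

As a rough sanity check on those magnitudes (using an assumed, illustrative average revenue per account, not a figure from the forecasts above):

```python
# Rough sanity check (illustrative assumptions only): how many fixed wireless
# accounts are needed to produce the revenue range cited above, at an
# assumed monthly average revenue per account.

MONTHLY_ARPU = 50.0  # hypothetical average revenue per account, in dollars

for revenue_target in (14e9, 24e9):
    accounts_needed = revenue_target / (MONTHLY_ARPU * 12)
    print(f"${revenue_target/1e9:.0f}B requires about {accounts_needed/1e6:.0f} million accounts at ${MONTHLY_ARPU:.0f}/month")

# At a $50 monthly ARPU, the $14 billion to $24 billion range implies
# roughly 23 million to 40 million paying accounts.
```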


The point is that the actual amount of new revenue mobile service providers can earn from new services sold to enterprises is more limited than many suspect.
