Sunday, April 23, 2023

Will Investable Categories of Digital Infrastructure Broaden?

It is perhaps predictable that private equity investors would, at some point, broaden the categories of assets they seek to acquire. Some of the largest firms have already begun to invest in assets beyond data centers, cell tower networks or optical fiber access networks.


Giant Blackstone has made investments in software assets, including artificial intelligence and cybersecurity. KKR likewise has taken early steps to broaden beyond data centers and fiber networks, and into software.


Apollo Global Management is making similar moves, as is Macquarie Infrastructure and Real Assets. The issue for many smaller funds and firms is that prospecting and due diligence in the newer areas require expertise not presently on board.


It also will be harder to find assets that offer the same predictable cash flow as physical assets. But assets such as Intrado, which supports emergency calling capabilities and was recently acquired by Stonepeak, provide an illustration.


Still, some investments arguably are riskier from a cash flow predictability perspective.


Apollo Global Management, for its part, had earlier taken Intrado (then West Corporation) private, before the Stonepeak purchase. Intrado is perhaps best known for supplying systems supporting emergency calling features and services. So the issue is how many more such franchises might be available in spaces adjacent to classic digital infra.


At first glance, such franchises might be somewhat rare, though over time some quasi-franchises could develop in specialized operating systems, perhaps database management and likely security. Some might propose virtualization software used by connectivity providers as a possible future opportunity, once marketplace standardization happens.


Others might propose that dedicated internet access networks merit consideration, especially when provided by specialist firms. Content delivery networks are another possibility. And though most of the funding for artificial intelligence firms will come from venture capitalists, eventually some of those assets will mature to the point where PE gets interested.


Hedge funds arguably are more likely than private equity to take stakes in software or hardware firms. Elliott Management, strictly speaking a hedge fund, acquired Gigamon, a supplier of network visibility and security products.


Firms that specialize in software deals also might, at some point, buy assets that overlap with digital infra investors. Coupa, for example, was acquired by software specialist Thoma Bravo. Arguably, most software investments, at least in startups, are made by venture capital firms.


As a rule, the attraction includes the expectation that financial performance, and therefore asset value, can be enhanced by the private equity owners and managers. In some cases, perhaps most, digital infra assets (a mix of data center, cell tower and access network assets) are valued more highly than the operating businesses that use their own infrastructure.


Firms that use platform business models (outside the connectivity business) also tend to have higher valuations than connectivity operating businesses. Since true platform business models are rare in the connectivity business, and tend, even when present, to represent only a small portion of total revenues, it remains unclear how valuations could develop.


But it might be reasonable to expect a further boost in valuation metrics of some magnitude. 


Valuation Metric | Digital Infrastructure | Telco | Platform Business Models
Price-to-earnings (P/E) ratio | 20-25x | 15-20x | 30-35x
Price-to-sales (P/S) ratio | 10-15x | 8-12x | 15-20x
Price-to-book (P/B) ratio | 1-2x | 0.8-1.2x | 1.5-2x
Enterprise value (EV)/EBITDA | 20-25x | 18-22x | 25-30x
Dividend yield | 3-5% | 2-4% | 1-2%


Obviously, any such comparisons are suggestive, as ratios change based on other conditions such as the size of the firm, the geographies where it operates, firm growth rates, product mix and degree of competition in its markets. 


That is why managed service firms in the connectivity industry, deemed to operate with higher or specialized value added, tend to earn higher valuation multiples, all other things being equal. 


Valuation Metric | Telco | Cable TV Operator | Managed Service Provider Businesses
Price-to-earnings (P/E) ratio | 20.25 | 17.93 | 28.10
Price-to-sales (P/S) ratio | 1.65 | 2.29 | 8.31
Enterprise value-to-revenue (EV/R) ratio | 1.73 | 2.56 | 18.07
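
To make the arithmetic concrete, consider a minimal sketch in Python, using purely hypothetical revenue and earnings figures (not drawn from any real firm) and the multiples from the table above:

```python
# Illustrative only: hypothetical financials valued using the
# price-to-sales and price-to-earnings multiples tabled above.

revenue = 1_000   # hypothetical annual revenue, in $ millions
earnings = 100    # hypothetical net income, in $ millions

multiples = {
    "Telco": {"p_s": 1.65, "p_e": 20.25},
    "Cable TV Operator": {"p_s": 2.29, "p_e": 17.93},
    "Managed Service Provider": {"p_s": 8.31, "p_e": 28.10},
}

for category, m in multiples.items():
    by_sales = revenue * m["p_s"]      # implied value from P/S
    by_earnings = earnings * m["p_e"]  # implied value from P/E
    print(f"{category}: ${by_sales:,.0f}M implied by P/S, "
          f"${by_earnings:,.0f}M implied by P/E")
```

The same hypothetical $1 billion of revenue implies roughly $1.7 billion of market value at telco multiples but more than $8 billion at managed service multiples, which is the point of the comparison.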


Wednesday, April 19, 2023

Who Sells a Product--Not Just What the Product Is--Matters

Firms whose core business model relies on network effects--including all firms using a platform business model--are essentially forced to develop proxies for the network effects that drive their success, as network effects cannot be directly measured using standard financial reporting metrics.


Airbnb, for example, might track the total number of lodging listings; the number of guest reviews; the number of booked stays; the number of site visits or completed transactions.


Uber might track the number of drivers; the number of riders or trips. E-commerce sites such as eBay might measure the number of buyers, the number of sellers or the number of listed items for sale or the number of completed transactions. 


Social media networks such as Meta might measure the number of users; the amount of time users spend on the platform or the number of interactions between users. 


PayPal might measure the number of active registered accounts and the volume of transactions. Amazon might measure network effects for its third-party merchant services by noting the number of sellers; the number of products listed for sale; the number of reviews or the volume of product sales.


Connectivity providers using platform business models might track the number of users; the amount of bandwidth or computing services purchased; or the number of transactions.
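
As a sketch of how such proxies might be computed in practice (the metrics and figures below are invented for illustration, not drawn from any actual platform):

```python
# Hypothetical network-effects proxies for a two-sided platform.
# All figures are invented for illustration.

buyers = 50_000
sellers = 2_000
transactions = 120_000

# Transactions per participant on each side: crude engagement proxies.
tx_per_buyer = transactions / buyers
tx_per_seller = transactions / sellers

# A simple liquidity proxy: how often participation converts into
# completed transactions as both sides of the platform grow.
liquidity = transactions / (buyers + sellers)

print(f"Transactions per buyer: {tx_per_buyer:.2f}")
print(f"Transactions per seller: {tx_per_seller:.1f}")
print(f"Transactions per participant: {liquidity:.2f}")
```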


And though such “network effects” performance metrics are not directly captured by traditional accounting measures, it is reasonable to assume they are reflected in firm valuations.


For example, some might argue that Equinix's EV/EBITDA multiple was 25.7 times in 2022, while AT&T's EV/EBITDA multiple was 12.5 times. 


Equinix's price/sales multiple might have been 18.4 times in 2022, while AT&T's price/sales multiple was 5.7 times. 


Equinix’s price/earnings multiple might have reached 100.5 times in 2022, while AT&T's price/earnings multiple was 15.4 times, at least at market highs for that year.


Equinix's multiple of annual revenue was 13.8 times in 2022, while AT&T's multiple of annual revenue was 6.4 times.


Since AT&T’s revenue is primarily driven by connectivity services, while Equinix is valued as a data center, digital infrastructure or real estate asset, each firm’s total valuation tends to reflect its core business.


But each firm sells interconnection or connectivity services that are functionally the same: dark fiber, lit fiber, Ethernet transport and optical wave services. 


The point is that the same product (interconnection) is valued differently at each firm, since each is in a different category of business. 


For executives constantly concerned about the commoditization of their connectivity services, such differences show that the value of particular interconnections can vary based on who is offering the services, or what and where the connections are made. 


Of course, larger firms often are awarded higher multiples than smaller companies, but not always. Networking specialist Megaport's EV/EBITDA multiple was 10.5 times in 2022, while AT&T's EV/EBITDA multiple was 12.5 times.


On the other hand, Megaport's price/sales multiple was 12.4 times in 2022, while AT&T's price/sales multiple was 5.7 times.


Megaport's price/earnings multiple was 24.3 times in 2022, while AT&T's price/earnings multiple was 15.4 times. 


Megaport's multiple of annual revenue was 8.1 times in 2022, while AT&T's multiple of annual revenue was 6.4 times.


So even when a smaller firm competes with a larger and more-established firm, valuation multiples for the smaller firm can be higher.


Tuesday, April 18, 2023

Non-Linear Development and Even Near-Zero Pricing are Normal for Chip-Based Products

It is clear enough that Moore’s Law played a foundational role in the founding of Netflix, indirectly led to Microsoft and underpins the development of all things related to use of the internet and its lead applications. 


All consumer electronics, including smartphones, automotive features, GPS and location services; all leading apps, including social media, search, shopping and video and audio entertainment; and cloud computing, artificial intelligence and the internet of things are built on the foundation of ever-more-capable and ever-cheaper computing, communications and storage.


For connectivity service providers, the implications are similar to the questions others have asked. Reed Hastings asked whether enough home broadband speed would exist, and when, to allow Netflix to build a video streaming business. 


Microsoft essentially asked itself whether dramatically-lower hardware costs would create a new software business that did not formerly exist. 


In each case, the question is what business is possible if a key constraint is removed. For software, assume hardware is nearly free, or so affordable it poses no barrier to software use. For applications or computing instances, remove the cost of wide area network connections. For artificial intelligence, remove the cost of computing cycles.


In almost every case, Moore’s Law removes barriers to commercial use of technology and enables different business models. We now use millimeter wave radio spectrum to support 5G precisely because cheap signal processing allows us to do so. We could not previously make use of radio signals that drop to almost nothing after traveling less than a hundred feet.


Reed Hastings, Netflix founder, based the viability of video streaming on Moore’s Law. At a time when dial-up modems were running at 56 kbps, Hastings extrapolated from Moore's Law to understand where bandwidth would be in the future, not where it was “right now.”


“We took out our spreadsheets and we figured we’d get 14 megabits per second to the home by 2012, which turns out is about what we will get,” Hastings said. “If you drag it out to 2021, we will all have a gigabit to the home.” So far, internet access speeds have increased at just about those rates.
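
A back-of-the-envelope sketch of that extrapolation, taking the endpoints from the quote and assuming 1997 as the dial-up baseline year (the growth rates are computed here, not sourced):

```python
# Implied compound annual growth rates behind the Netflix extrapolation.
# Start and end figures come from the quote above; rates are computed.

def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

# 56 kbps dial-up (assumed circa 1997) to 14 Mbps by 2012.
dialup_to_2012 = cagr(0.056, 14, 2012 - 1997)

# 14 Mbps in 2012 to a projected 1 Gbps (1,000 Mbps) by 2021.
to_gigabit = cagr(14, 1000, 2021 - 2012)

print(f"1997-2012 implied growth: {dialup_to_2012:.0%} per year")
print(f"2012-2021 implied growth: {to_gigabit:.0%} per year")
```

The implied rates, roughly 45 percent a year to 2012 and 60 percent a year thereafter, show why a linear projection from 56 kbps dial-up would have missed the streaming opportunity entirely.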


The point is that Moore’s Law enabled a product and a business model that were not possible earlier, simply because computation and communications capabilities had not yet developed.


Likewise, Microsoft was founded with an indirect reliance on what Moore’s Law meant for computing power. 


“As early as 1971, Paul (Allen) and I had talked about the microprocessor,” Bill Gates said in a 1993 interview for the Smithsonian Institution, in terms of what it would mean for the cost of computing. "Oh, exponential phenomena are pretty rare, pretty dramatic,” Gates recalls saying. 


“Are you serious about this? Because this means, in effect, we can think of computing as free," Gates recalled. 


That would have been an otherwise ludicrous assumption upon which to build a business. Back in 1970 a “computer” would have cost millions of dollars. 

source: AEI 


The original insight for Microsoft was essentially the answer to the question "What if computing were free?" Recall that Micro-Soft (later Microsoft) was founded in 1975, not long after Gates apparently began to ponder the question.


Whether that was a formal acknowledgement about Moore’s Law or not is a question I’ve never been able to firmly pin down, but the salient point is that the microprocessor meant “personal” computing and computers were possible. 


A computer “in every house” meant appliances costing not millions of dollars but only thousands. So three orders of magnitude of price improvement were required, within roughly five to ten years.


“Paul had talked about the microprocessor and where that would go and so we had formulated this idea that everybody would have kind of a computer as a tool somehow,” said Gates.


Exponential change dramatically extends the possible pace of development of any technology trend. 


Each deployed use case, capability or function creates a greater surface for additional innovations. Futurist Ray Kurzweil called this the law of accelerating returns. Rates of change are not linear because positive feedback loops exist.


source: Ray Kurzweil  


Each innovation leads to further innovations and the cumulative effect is exponential. 


Think about ecosystems and network effects. Each new applied innovation becomes a new participant in an ecosystem. And as the number of participants grows, so do the possible interconnections between the discrete nodes.  
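
A minimal sketch of that combinatorial point: among n participants, the number of possible pairwise links grows as n(n-1)/2, which is one common formalization (Metcalfe's Law) of why ecosystem value can scale non-linearly.

```python
# Possible pairwise interconnections among n participants: n(n-1)/2.
# Nodes grow linearly; potential connections grow quadratically.

def possible_links(n):
    return n * (n - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} participants -> {possible_links(n):>11,} possible links")
```

Ten participants allow 45 links; ten thousand allow almost 50 million.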

source: Linked Stars Blog 

 

So network effects underpin the difference in growth rates or cost reduction we tend to see in technology products over time, and make linear projections unreliable.


Monday, April 17, 2023

How Much Value Might Network APIs Unlock?

Most consumers indirectly use application programming interfaces every day, to check weather, make travel arrangements, use social networks or search, make payments or log in to sites using another existing account. Use of Google Maps or shopping provide other examples where APIs are used. 


Basically, APIs allow different applications to exchange data.


Connectivity providers now believe they can earn new revenue, increase the value of their services and possibly build bigger ecosystems by exposing network functions as APIs. How much value is added will determine how successful operators are at monetizing network features exposed for use by third parties.


source: Nokia 


In the internet era, connectivity is essential for almost all applications, use cases and functions. So we should never be surprised that app developers note the value of access to the internet, and connectivity in general. Without internet access, most apps will not work. 


source: STL Partners 


But there is a key distinction. Internet access does not hinge on a business relationship between the app and the access or transport networks. By design, any app compliant with TCP/IP, and any user with access to TCP/IP networks, will be able to use internet-supported apps, so long as they have the credentials to do so. 


So access providers hope that new network APIs will create a new revenue stream, while possibly increasing the value of network features. 


Initially, for example, the GSMA Open Gateway APIs support SIM Swap, Quality on Demand, Device Status (Connected or Roaming Status), Number Verify, Edge Site Selection and Routing, Number Verification (SMS 2FA) and Carrier Billing features such as Check Out and Device Location (Verify Location). Other APIs will be added, of course. 
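
As a purely hypothetical sketch of what consuming one such capability might look like (the endpoint, payload fields and token below are invented for illustration and are not the actual GSMA Open Gateway or CAMARA definitions):

```python
# Hypothetical illustration of an app checking for a recent SIM swap
# before trusting SMS-based two-factor authentication. The endpoint
# and payload are invented; consult the actual GSMA Open Gateway /
# CAMARA specifications for the real definitions.

import requests

API_BASE = "https://api.example-operator.com/sim-swap/v1"  # assumed URL

def recently_swapped(phone_number: str, max_age_hours: int = 240) -> bool:
    """Ask the operator whether the SIM was swapped recently."""
    response = requests.post(
        f"{API_BASE}/check",
        json={"phoneNumber": phone_number, "maxAge": max_age_hours},
        headers={"Authorization": "Bearer <access-token>"},  # placeholder
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("swapped", False)

# A bank, for example, might require stronger authentication when a
# recent swap is detected:
# if recently_swapped("+15551234567"): require_stronger_auth()
```

The value proposition is that this is a signal only the network operator can supply, which is exactly the kind of network feature the APIs are meant to expose.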


The issue, as with all engineering choices, is whether there are other proxies or substitutes for those sorts of functions. Payments can be made in other ways than using carrier billing. Device location might be available using device or application features.


In other cases, the use of an API might, or might not, be a feature of some other service, such as a network slice. In some cases, quality issues will be rectified by use of edge computing or content delivery networks. 


In other words, it remains to be seen whether, and how much, app developers see value in the new APIs, and where value can be created using substitute methods. 


source: RingCentral 


APIs have been used to “voice enable” other apps, for example. The hope is that many other network-based parameters or functions likewise can be exposed, with access to those features generating revenue for network operators.


Architecturally, of course, APIs are a way of reintegrating functions that are separated into layers. Also, to the extent that many industries evolve towards ecosystems, APIs add value in allowing ecosystem partners to interoperate. 


source: McKinsey 


The issue for connectivity providers is the additional value apps gain when network functions are available to exchange data. Generally speaking, it is application-to-application data that adds the most value. 


The thinking is that latency-sensitive functions and apps will benefit most from edge computing, as they have benefitted from content delivery networks. Perhaps such apps also will benefit from network APIs. 


Everything hinges on value added. So the big question right now is where value can be generated when network functions are available.


Sunday, April 16, 2023

We Will Overestimate what Generative AI can Accomplish Near Term

For most people, it seems as though artificial intelligence has suddenly emerged as an idea and set of possibilities. Consider the explosion of interest in large language models or generative AI.


In truth, AI has been gestating for many decades. And forms of AI already are used in consumer applications such as smart speakers, recommendation engines and search functions.


What seems to be happening now is some inflection point in adoption. But the next thing to happen is that people will vastly overestimate the degree of change over the near term, as large language models get adopted, even as they underestimate what will happen longer term.


That is an old--but apt--story.


“Most people overestimate what they can achieve in a year and underestimate what they can achieve in ten years” is a quote whose provenance is unknown, though some attribute it to Stanford computer scientist Roy Amara. Some people call it “Gates’ Law.”


The principle is useful for technology market forecasters, as it seems to illustrate other theorems including the S curve of product adoption. The expectation for virtually all technology forecasts is that actual adoption tends to resemble an S curve, with slow adoption at first, then eventually rapid adoption by users and finally market saturation.   


That sigmoid curve describes product life cycles, suggests how business strategy changes depending on where on any single S curve a product happens to be, and has implications for innovation and start-up strategy as well. 


source: Semantic Scholar 


Some say S curves explain overall market development, customer adoption, product usage by individual customers, sales productivity, developer productivity and sometimes investor interest. It often is used to describe adoption rates of new services and technologies, including the notion of non-linear change rates and inflection points in the adoption of consumer products and technologies.


In mathematics, the S curve is a sigmoid function. The Gompertz function, one such sigmoid, can be used to predict new technology adoption and is related to the Bass model.
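
A minimal sketch of those two curves follows; the parameter values are arbitrary, chosen only to show the characteristic shape:

```python
# Logistic and Gompertz sigmoids, the two S curves mentioned above.
# Parameters are arbitrary illustration values, not fitted to any market.

import math

def logistic(t, saturation=1.0, growth=1.0, midpoint=0.0):
    """Symmetric S curve: slow start, rapid middle, saturation."""
    return saturation / (1 + math.exp(-growth * (t - midpoint)))

def gompertz(t, saturation=1.0, displacement=5.0, growth=1.0):
    """Asymmetric S curve often used for technology adoption."""
    return saturation * math.exp(-displacement * math.exp(-growth * t))

for t in range(0, 11):
    print(f"t={t:2d}  logistic={logistic(t, midpoint=5):.3f}  "
          f"gompertz={gompertz(t):.3f}")
```

Both start near zero, pass through a rapid-growth phase and flatten near saturation, which is the adoption pattern the forecasting models assume.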


Another key observation is that some products or technologies can take decades to reach mass adoption.


It also can take decades before a successful innovation actually reaches commercialization. The next big thing will have first been talked about roughly 30 years earlier, says technologist Greg Satell. IBM coined the term machine learning in 1959, for example, and machine learning is only now coming into widespread use.


Many times, reaping the full benefits of a major new technology can take 20 to 30 years. Alexander Fleming discovered penicillin in 1928, but it did not arrive on the market until 1945, nearly 20 years later.


It can be argued that electricity did not have a measurable impact on the economy until the early 1920s, some 40 years after Edison’s first power plant.


Many would note that it was not until the late 1990s, about 30 years after 1968, that computers had a measurable effect on the US economy.



source: Wikipedia


The S curve is related to the product life cycle, as well. 


Another key principle is that successive product S curves are the pattern. A firm or an industry has to begin work on the next generation of products while existing products are still near peak levels. 


source: Strategic Thinker


There are other useful predictions one can make when using S curves. Suppliers in new markets often want to know “when” an innovation will “cross the chasm” and be adopted by the mass market. The S curve helps there as well. 


Innovations reach an adoption inflection point at around 10 percent. For those of you familiar with the notion of “crossing the chasm,” the inflection point happens when “early adopters” drive the market. The chasm is crossed at perhaps 15 percent of persons, according to technology theorist Geoffrey Moore.



For most consumer technology products, the chasm gets crossed at about 10 percent household adoption. Professor Geoffrey Moore does not use a household definition, but focuses on individuals. 

source: Medium


And that is why the saying “most people overestimate what they can achieve in a year and underestimate what they can achieve in ten years” is so relevant for technology products. Linear demand is not the pattern. 


One has to assume some form of exponential or non-linear growth. And we tend to underestimate the gestation time required for some innovations, such as machine learning or artificial intelligence. 


Other processes, such as improvements in computing power, bandwidth prices or end user bandwidth consumption, proceed at steadier, more predictable rates. But the impact of those functions also tends to be non-linear.


Each deployed use case, capability or function creates a greater surface for additional innovations. Futurist Ray Kurzweil called this the law of accelerating returns. Rates of change are not linear because positive feedback loops exist.


source: Ray Kurzweil  


Each innovation leads to further innovations and the cumulative effect is exponential. 


Think about ecosystems and network effects. Each new applied innovation becomes a new participant in an ecosystem. And as the number of participants grows, so do the possible interconnections between the discrete nodes.  

source: Linked Stars Blog 


Think of that as analogous to the way people can use one particular innovation to create another adjacent innovation. When A exists, then B can be created. When A and B exist, then C and D and E and F are possible, as existing things become the basis for creating yet other new things. 


So we often find that progress is slower than we expect, at first. But later, change seems much faster. And that is because non-linear change is the norm for technology products.

