Friday, July 1, 2022

Experts Say Metaverse Will Not be Common in Consumer Life in 2040. Why?

Experts surveyed by Pew Research believe that augmented- and mixed-reality applications will dominate over fully immersive virtual reality environments in 2040. But half of the experts also believe the “metaverse” will not be a common part of most consumers’ lives by that point. 

A table showing two meta themes that anchored many experts' predictions: the reasons the metaverse will fully emerge as its advocates predict, and the reasons it will not fully emerge in the way today’s advocates hope

source: Pew Research 


This will be unwelcome news for many metaverse proponents. But it is historically realistic. 


Major technology transitions typically take much longer than proponents expect. One common facet of new technology adoption is that change often comes with a specific pattern: a sigmoid curve such as the Gompertz model or Bass model. 


S curves describe overall market development, customer adoption, product usage by individual customers, sales productivity, developer productivity and sometimes investor interest. They often are used to describe adoption rates of new services and technologies, capturing the non-linear change rates and inflection points typical of consumer product and technology adoption.


In mathematics, the S curve is a sigmoid function. The Gompertz function, one such sigmoid, can be used to predict new technology adoption, as can the related Bass model.


Such curves suggest a longish period of low adoption, followed by an inflection point leading to rapid adoption.
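That shape is easy to see numerically. The sketch below uses a Gompertz curve with illustrative parameter values (assumptions chosen for the example, not fitted to any real market):

```python
import math

def gompertz(t, b=12.0, c=0.18):
    """Gompertz adoption curve as a share of the addressable market.
    b shifts the curve in time and c sets the growth rate; both values
    here are illustrative assumptions, not a fitted forecast."""
    return math.exp(-b * math.exp(-c * t))

# Adoption stays low for years, then passes an inflection point
# and climbs rapidly toward saturation.
for year in (0, 5, 10, 15, 20, 25, 30):
    print(f"year {year:2d}: adoption {gompertz(year):5.1%}")
```

With these parameters, adoption is negligible at first, passes its inflection point around year 14 (a Gompertz curve inflects at ln(b)/c, at roughly 37 percent adoption), then climbs steeply.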


That leads supporters to overestimate early adoption and vastly underestimate later adoption. Mobile phone adoption, and smart phone adoption, illustrate the process. You might think adoption is a linear process. In fact, it tends to be non-linear.


Also, the more fundamental the change, the longer it takes to reach mass adoption. Highly useful “point technologies” such as telephones, electricity, mobile phones, smart phones, the internet and so forth can easily take a decade to reach 10-percent adoption. Reaching 40 percent of people can take another decade to 15 years. And moving beyond 40 percent of people can take yet another decade to 15 years. 


source: MIT Technology Review 


That suggests a 30-year adoption cycle for a high-value innovation to reach 40 percent to 70 percent of people. Something such as the metaverse, which is far more complicated, could easily take 30 years to reach 40 percent of people in ordinary use. 
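A sigmoid model also makes those threshold timelines concrete. This sketch uses an illustrative Gompertz curve (the parameters are assumptions, picked only to echo the decade-scale stages described above):

```python
import math

def gompertz(t, b=12.0, c=0.18):
    # Share of the addressable market at year t; the parameters are
    # illustrative assumptions, not a fitted forecast.
    return math.exp(-b * math.exp(-c * t))

def year_reaching(share, horizon=60):
    """First whole year at which modeled adoption meets `share`."""
    return next(t for t in range(horizon) if gompertz(t) >= share)

# With these assumed parameters the 10%, 40% and 70% thresholds
# fall roughly at years 10, 15 and 20.
for share in (0.10, 0.40, 0.70):
    print(f"{share:.0%} adoption at about year {year_reaching(share)}")
```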


That might mean at least a decade before metaverse apps are in common use by 10 percent of people. Even then, use cases are likely to be dominated by gaming, business communications and video entertainment. 


source: Robert Patterson 


The sigmoid function arguably is among the most important mathematical expressions one encounters in the telecom, application and device businesses. It applies to business strategy overall, new product development, strategy for legacy businesses, customer adoption rates, marketing messages and capital deployment, for example. 


The sigmoid function applies to startups as well as incumbents; software and hardware; products and services; new and legacy lines of business. 

source: Innospective


The concept has been applied to technology adoption in the notion of “crossing the chasm”: any technology represents different value for different users. Mainstream users have different values than early adopters, so value propositions must be adjusted as any new technology product exhausts the market of early adopters. Early adopters can tolerate bugs, workarounds or incomplete on-boarding and support experiences, and they tend to be price insensitive. 


It always takes longer than one expects for a major new innovation to become ubiquitous. The metaverse, being a complicated development, might take longer than any point innovation.

Thursday, June 30, 2022

FiberLight Sold to Institutional Investors

A consortium led by H.R.L. Morrison & Co, the Australian Retirement Trust and a managed client of UBS Asset Management is acquiring FiberLight from energy firm Thermo Companies. 


Headquartered in Atlanta, Georgia, FiberLight is a pure-play, top-ten fiber infrastructure provider in the U.S. market, featuring 18,000 route miles of fiber infrastructure reaching customers in over 30 metropolitan areas, principally in the major markets of Texas and the Northern Virginia area.


FiberLight’s seasoned management team, led by Chief Executive Officer Christopher Rabii, will continue to lead the business, the new owners say. 


This latest deal is another example of institutional investors snapping up digital infrastructure assets. Generally speaking, the attraction is that the assets are “alternative” holdings in portfolios offering the prospect of predictable cash flows over time. 


source: Asian Infrastructure Investment Bank


The interest in selling on the part of digital infrastructure owners reflects a desire to exit a business that is getting more capital intensive at the same time that return on invested capital is shrinking. 


As with all trends, there are reasons why buyers and sellers are motivated.



Are Infrastructure Correlations With Economic Growth Actually Causal?

Virtually all of us act as though better broadband has a positive impact on economic growth, in the same way that electricity is correlated with economic growth. Indeed, historically, higher energy consumption has been correlated with economic growth. 


source: Telecom Advisory Services 


Likewise, it can be argued that transportation also is correlated with higher economic growth. Even there, however, there are questions about whether better transportation creates or merely redistributes economic growth from one place to others. 


The same can be argued for all other forms of infrastructure, physical or social, ranging from water supplies to education and medical care. There almost always is some degree of correlation between higher economic growth and higher inputs of physical or social infrastructure.   


So everyone might agree that broadband quality and economic growth tend to be correlated, though the degree of correlation varies. But as with other sorts of infrastructure, it is hard to argue with certainty that any particular infrastructure investment actually “causes” economic growth. 


That is why economists often use terms such as “impact” to describe such correlations. So better broadband might be said to “drive” economic growth. That is another way of suggesting causation, but less directly. 


Contribution, component or contributor might be other ways economists describe the relationship of better broadband, transportation, electricity, water systems, education or income and wealth to economic development. 


Correlation is clear. What is unclear is “causation” of any of the inputs.


When All Home Platforms are the Same, Marketing and Operating Prowess Still Will Matter

Virtually all observers believe hybrid fiber coax eventually will be replaced by fiber to the home. The issue always is the time frame for the transition. What is not so clear is the eventual impact on cable operator market share. Some forecasts might reflect use of HFC platforms rather than internet access (home broadband) market shares, as cable operators gradually move to FTTH themselves. 

source: Point Topic 


Assuming cable operators remain astute operators, a case can be made that even after the shift to FTTH platforms, they should still be able to maintain leading shares in the home broadband markets, much as one now sees in mobility markets globally. Platforms are the same; what matters is operating and marketing prowess. 

source: Point Topic

Wednesday, June 29, 2022

How "Reliable" or "Available" is the Internet? Nobody Can Really Know

End user experience of the internet is virtually impossible to quantify in terms of total availability; it cannot really be measured objectively. And whatever the objective estimates might be, they are conditioned by subjective issues. 


For example, even if we can estimate that the total availability of all internet apps is collectively 90 percent, meaning something, somewhere is not available about 10 percent of the time, no user is actually trying to use all those potential apps. 


Outages that happen do not matter for users not using any particular app, data center, internet service provider or backbone network. Furthermore, nobody is actively interacting with internet apps 24 hours a day, seven days a week.


So outages that happen when one is not interacting with an app effectively “do not matter” for that user. 


We long have known that anything related to the internet is not as “reliable” (service availability) as the old public switched telephone network. The telephone network uptime standard was 99.999 percent availability, representing annual downtime of about five minutes.

source: AWS 


Availability for consumer internet apps and services tends to be far lower, in large part because the entire end-to-end transmission chain is not under any single entity’s control. Consider recent availability data for U.K. internet service providers. In all cases, availability was in the range of 90 percent to 97 percent.  


source: Uswitch 


But that was only for the local access link. Those availability figures do not take into account any other sources of availability loss: far end access; app availability; device availability; local power availability or any other platform availability. Since there is no “chain of custody,” it is virtually impossible to estimate average availability for any particular configuration of hardware, software and platforms at any single location or for any single device. 


It is safe to say that 99.999 percent availability for consumer internet “anything,” end to end, is impossible. 


It probably goes without saying that the Internet is a complex system, with lots of servers, transmission paths, networks, devices and software all working together to create a complete value chain.


And since the availability of any complex system is the combined performance of all cumulative potential element failures, it should not come as a surprise that a complete end-to-end user experience is not “five nines.”


Consider a 24×7 e-commerce site with many single points of failure. Note that no single part of the whole delivery chain has availability of more than 99.99 percent, and some portions have availability as low as 85 percent.


In principle, availability could be quite low, without redundancy built in. 


The expected availability of the site would be 85% × 90% × 99.9% × 98% × 85% × 99% × 99.99% × 95%, or 59.87 percent. Redundancy is the way performance typically is enhanced at a data center or on a transmission network, to avoid such a state. 


Component       Availability
Web             85%
Application     90%
Database        99.9%
DNS             98%
Firewall        85%
Switch          99%
Data Center     99.99%
ISP             95%


source: IP Carrier 
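The arithmetic is just the product of the component availabilities in series; a quick check using the figures from the table above:

```python
import math

# Component availabilities from the example e-commerce chain.
components = {
    "Web": 0.85,
    "Application": 0.90,
    "Database": 0.999,
    "DNS": 0.98,
    "Firewall": 0.85,
    "Switch": 0.99,
    "Data Center": 0.9999,
    "ISP": 0.95,
}

# A serial chain is only as available as the product of its parts.
end_to_end = math.prod(components.values())
print(f"End-to-end availability: {end_to_end:.2%}")  # about 59.87%
```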


In practice, cloud data centers have made great strides when it comes to availability, so that dire situation virtually never happens. The other issue is that “user experienced” availability is better than actual end-to-end availability, since users are not actually connected and engaged with internet experiences and apps all day, every day. 


A simple table showing downtime per year for 1 through 5 9s
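The downtime implied by each “nines” level follows directly from the definition: downtime per year is (1 − availability) × minutes in a year. A quick sketch:

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

for nines in range(1, 6):
    availability = 1 - 10 ** -nines            # 90%, 99%, ..., 99.999%
    downtime = (1 - availability) * MINUTES_PER_YEAR
    if downtime >= 60 * 24:
        label = f"{downtime / (60 * 24):.1f} days"
    elif downtime >= 60:
        label = f"{downtime / 60:.1f} hours"
    else:
        label = f"{downtime:.1f} minutes"
    print(f"{nines} nines: about {label} of downtime per year")
```

One nine (90 percent) allows roughly 36.5 days of downtime a year, while five nines allows only about 5.3 minutes, which matches the old telephone network standard cited above.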


Outages can occur without any user-experienced downtime if any particular user is not actually interacting with the internet at the moments when the outages occur. Even if end-to-end experience is at the 90-percent level, implying outages about 10 percent of the time, it does not affect a user if those outages occur when a person is sleeping or otherwise logged off.


Tuesday, June 28, 2022

WAN Services Now as Little as 3% of Telco Revenues?

Most global telco revenue is earned providing mobile services. Contributions by local fixed network services are lower. Perhaps lowest of all are wide area connectivity services. If total service provider revenue is about $1.6 trillion, mobility represents perhaps 57 percent of the total.  


source: IDC


That is just as true in the consumer segment, which itself represents perhaps 60 percent of total service revenues for a typical service provider. In the United States, for example, fixed network communication services might represent only about 20 percent of total consumer segment revenues. As much as 20 percent of total revenue might be generated by video entertainment subscriptions. 

source: Mobile Experts


Wide area network services sold to businesses are a small revenue contributor, overall. In 2020, for example, wide area network services might have represented $46 billion in service revenue, or possibly three percent of total service provider revenue.  

source: Telegeography 


Total enterprise connectivity revenues might have been in the $70 billion range in 2018, with roughly 30 percent coming in the form of local access.  


The point is that wide area network services--for consumers or businesses--are a small part of total telecom service provider revenues, in the three-percent range in most cases.
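The three-percent figure follows directly from the revenue estimates quoted above:

```python
# Rough revenue figures quoted above (estimates, not audited totals).
total_service_revenue = 1.6e12  # ~$1.6 trillion, all service providers
wan_service_revenue = 46e9      # ~$46 billion, WAN services (2020)

share = wan_service_revenue / total_service_revenue
print(f"WAN share of total revenue: {share:.1%}")  # roughly three percent
```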


Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...