Friday, April 2, 2021

Why Millimeter Wave Matters

Propagation issues notwithstanding, millimeter wave frequencies will be vital for mobile operators. The pressure to achieve lower cost per delivered bit will not cease, forcing service providers to continually deploy new solutions for bandwidth with a lower cost per bit profile. 


Millimeter wave does that. Eventually, so will terahertz frequencies.


 

source: GSMA Intelligence 


To be sure, 4G capacity increases will continue for a while. Eventually, though, 4G runs out of gas. Fundamentally, that is why 5G is “necessary.” Beyond all the other new use cases enabled by vastly lower latency, core network virtualization, 5G-enabled edge computing or the internet of things, 5G will supply bandwidth at lower costs than 4G networks.


Cost per bit matters because customer bandwidth demand grows as much as 40 percent a year, while consumer willingness to pay is limited, essentially remaining flat, year over year. 


If access providers must supply 40 percent more bandwidth per year, while revenue grows one percent per year, bandwidth efficiency must increase significantly. That is the value of millimeter wave spectrum. 
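A back-of-envelope sketch, assuming the 40 percent demand growth and one percent revenue growth figures cited above hold steady, shows how quickly revenue per delivered bit must fall:

# Sketch: if delivered bits grow 40% a year while revenue grows 1% a year,
# revenue earned per bit must fall by the ratio of the two growth rates.
demand_growth = 0.40    # annual traffic growth cited above
revenue_growth = 0.01   # annual revenue growth cited above

revenue_per_bit = 1.0   # indexed to 1.0 in year zero
for year in range(1, 6):
    revenue_per_bit *= (1 + revenue_growth) / (1 + demand_growth)
    print(f"Year {year}: revenue per bit index = {revenue_per_bit:.2f}")

# After five years the index sits near 0.20: each bit must be carried for
# roughly a fifth of its year-zero revenue, which is why cheaper capacity
# such as millimeter wave spectrum matters.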


That need for efficiency would exist even if access providers owned all the apps used by their customers. In the internet era, access providers own almost none of the apps their customers use.


So connectivity providers generate relatively small amounts of revenue from applications they own, and at the same time must supply bandwidth for third party apps at prices their customers consider fair. 


In that context, since most of the bandwidth consumed is video entertainment, and since video is the most bandwidth-intensive app, prices per bit must be low, and constantly get lower. Video economics are dominated by the fact that users will not pay very much for video entertainment, in relation to the bandwidth consumed to support its use. 


For owned apps, revenue per bit for messaging and voice can be as much as two or more orders of magnitude higher than for full-motion video or internet apps. By some estimates, where voice might earn 35 cents per megabyte, internet apps might generate only a few cents per megabyte.


The cost of consuming a bit is infinitesimally small. Assume an internet access plan costing $50 a month, with a usage allowance of a terabyte. That works out to about $0.05 per gigabyte, or roughly $0.00005 per megabyte. And even that cost will have to keep dropping.
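The arithmetic, for a hypothetical plan with those terms:

# Unit-cost arithmetic for a hypothetical $50-a-month plan with a 1 TB allowance.
monthly_price = 50.0                 # dollars
allowance_bytes = 1e12               # 1 TB, using decimal (SI) units

cost_per_gb = monthly_price / 1000            # $0.05 per gigabyte
cost_per_mb = monthly_price / 1e6             # $0.00005 per megabyte
cost_per_byte = monthly_price / allowance_bytes

print(f"${cost_per_gb:.2f} per GB, ${cost_per_mb:.5f} per MB, "
      f"${cost_per_byte:.1e} per byte")       # $0.05 per GB, $0.00005 per MB, 5.0e-11 per byte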


The reason is that consumer propensity to pay is only so high. Essentially, internet service providers must continually supply more bandwidth for about the same prices. 

source: GSMA Intelligence

Thursday, April 1, 2021

Why T-Mobile Has the Easier Route to Profitable 5G Revenue Than AT&T or Verizon

In any market, attackers often have strategy options that incumbents do not have. In the 5G-related revenue growth areas, for example, incumbents are looking at internet of things, edge computing and private networks.

Attackers can choose to look elsewhere, as T-Mobile is doing in the areas of home broadband and business services. In the former market T-Mobile has zero market share, and only has to take a couple of share points to build a substantial new revenue stream. In the latter market, T-Mobile has been under-represented, compared to its two main rivals.

And it is almost always easier to take market share than to create brand new markets. To take share, an attacker does not have to guess about the market size, the value proposition, the distribution channels or pricing.

To create or enter a new market, a firm must make guesses about all those matters.

One of the issues for connectivity providers trying to create new revenue streams--aside from a reputation for not being good at innovation--is the challenge of finding innovations that represent enough incremental revenue to justify the cost of developing them. 

It is one thing to see projections of the new revenue from private 5G networks; something else to figure out how much of that opportunity realistically can be addressed by connectivity providers. 


We face the same problem when trying to estimate the value of edge computing or internet of things markets as well. How much of that opportunity realistically could be converted into revenue for connectivity providers?


Since estimates of edge computing, unified communications, IoT and private 5G always involve a mix of infrastructure sold to create the networks; management solutions of some type; design, installation and operating support; and some connectivity revenue, the issue is how to estimate realistic connectivity service provider roles, and therefore revenues. 


History suggests connectivity providers might have a role earning up to five percent of revenue in any of those proposed new areas of business, based on past experience with local area networks in general, or with business services such as enterprise voice, conferencing and collaboration.


The global unified communications and collaboration market might have reached about $47.2 billion in 2020, IDC says. But most of that revenue was earned by entities other than connectivity providers. 


For example, revenue booked by Microsoft, Cisco, Zoom, Avaya and RingCentral totaled about $26 billion for the year. Those five firms represent 55 percent of total UCC revenues for the year, IDC figures suggest. 


Relatively little UCC market revenue is earned by connectivity service providers. 
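As a rough sketch, applying the five percent rule of thumb mentioned above to IDC's 2020 UCC figure suggests how modest the connectivity provider opportunity is next to what the largest suppliers already book:

# Rough scale check: five percent of IDC's 2020 UCC market versus the roughly
# $26 billion booked by the five largest suppliers cited above.
ucc_market_2020 = 47.2       # $ billions, IDC
telco_share_ceiling = 0.05   # rough historical rule of thumb suggested above
top_five_suppliers = 26.0    # $ billions (Microsoft, Cisco, Zoom, Avaya, RingCentral)

telco_opportunity = ucc_market_2020 * telco_share_ceiling
print(f"Telco opportunity at 5%: ~${telco_opportunity:.1f}B "
      f"vs ~${top_five_suppliers:.0f}B for the top five suppliers")
# ~$2.4B vs ~$26B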


Direct connectivity provider revenue from local area networks is almost completely related to broadband access bandwidth sold to enterprises, smaller businesses and consumers. Almost all the rest of the revenue is earned by hardware and software suppliers, third party design, installation and maintenance firms, chip and device vendors.  


The point is that the traditional demarcation point between cabled public networks and private networks--wide area and local networks--happens at the side of a building or in the basement. WAN and connectivity service providers make their revenue on the public network side of that demarcation.


The demarcation point between mobile customers and the public networks is the device. The capacity services supplier owns everything from spectrum to tower, then tower to switches and other controllers, then the core network. The consumer owns the phone. 


Traditionally, the “private network” has been the province of different firms than public networks, which is why interconnect firms and system integrators or LAN specialists exist. 


Even in some “core” WAN areas--including virtual private networks--third party specialists and infrastructure suppliers dominate the revenue production. Software-defined WANs, for example, can be created at the edge using gear owned by the enterprises that set up the SD-WANs. 


SD-WANs can also be created by managed services firms, which includes connectivity providers. But most of the revenue is earned by infrastructure suppliers or managed services specialists, not connectivity providers. 


Much the same can be said for internet of things revenue upside. Most of the revenue will be earned by LAN hardware and software suppliers, sensor and devices suppliers and app providers. WAN connectivity will be a contest between specialized WAN providers using unlicensed spectrum and mobile operators using licensed spectrum. 


But all WAN connectivity collectively will be a small part of the IoT revenue opportunity. 


 

source: IoT Analytics 


In edge computing, most of the actual “computing” will be done by hyperscalers and others, even when mobile and fixed network operators supply real estate or access connections. It already seems clear that most telcos are not going to try to challenge hyperscalers for the actual “edge computing” function.  


Private 5G is mostly going to create revenue for infrastructure sales (hardware and software), as private 5G and 4G networks are local area networks, like Wi-Fi. The enterprise or the consumer “owns” that network.  


All of which raises an interesting question. “Everybody” seems to concur that businesses and enterprises will drive most of the incremental new revenue from 5G. What if that expectation is wrong? And it could be wrong, in the early days.            


Consider private 5G or edge computing or IoT opportunities. How much enterprise or business revenue do you actually believe connectivity providers in any single country can generate, compared to any other initiative in consumer segments?


Consider fixed wireless, for example, in the U.S. market. 


You can get a robust debate pretty quickly when asking “how important will 5G fixed wireless be?” in the consumer home broadband market. Will it matter? 


Keep in mind that the fixed network home broadband market presently generates some $195 billion worth of annual revenue. Comcast and Charter Communications alone book roughly $150 billion in combined annual revenue, an increasing share of it earned from home broadband customers. 


Mobile service providers have close to zero--and in some cases actually zero--market share. 


Taking just two percent of that market means new revenues of perhaps $4 billion annually, within a couple of years. How long do you think it will take T-Mobile to earn that much money from IoT, edge computing or 5G private networks? T-Mobile’s effective answer is “too long,” as it is not actively pursuing those lines of business. 
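The share-shift arithmetic is simple, using the market size cited above:

# Two points of share in a roughly $195B annual home broadband market.
market_revenue_b = 195.0   # $ billions, annual
share_taken = 0.02

print(f"~${market_revenue_b * share_taken:.1f}B in new annual revenue")  # ~$3.9B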


T-Mobile is launching new initiatives for consumer home broadband and business mobility services, though. 


And the growth path for T-Mobile is clear. Instead of supplying new products to new customers with new needs, T-Mobile in its home broadband push only has to take a few points of market share in an established market. 


So it is possible that early incremental new revenue will be found by at least some mobile operators not in the sexy IoT, edge computing or private networks but in the less-sexy business of home broadband. 


Not to mention profits. The cost of creating a $1 billion revenue stream in IoT, edge computing or private networks--within a few years--will be somewhat daunting. The cost of creating $4 billion in home broadband revenues in the same time frame might be a simpler matter of applying marketing effort.


Wednesday, March 31, 2021

Algorithmic Bias and Algorithmic Choice

Some might agree that algorithmic choice is a reasonable way to deal with fake news or false information on social media sites. Algorithmic choice is inherent in sorting, ranking and targeting of content. To use a simple example, a search engine returning results for any user search has to rank order the items.


Algorithmic choice is essential for online and content businesses that try to tailor information for individuals based on their behavior or stated preferences. And most of us would likely agree that neutral curation, without obvious or intentional bias, is the preferred way of culling and then presenting information to users.


That is a growing issue for hyperscale social media firms, as they face mounting objections to the neutrality of their curation algorithms and practices. It is a delicate issue, to be sure. Decades ago, online sites operated with a loose "community standards" approach that relied on common courtesy and manners.


Today's hyperscale social media platforms seem intentionally to stoke outrageous behavior and the dissemination of arguably false or untrue information. Some refer to this as disinformation: the deliberate spreading of claims known to be untrue, with the intention to deceive.


This is not the same thing as a mere difference of opinion--the expression of “an idea I abhor” or “an idea I disagree with.” Disinformation is a matter of manipulation and deception; a disagreeable idea is merely an expressed difference of opinion.


Some might argue that allowing more personalized control by users will help alleviate the problem of false or fake information. Perhaps that can help, somewhat, to the extent that people can block items and content they disagree with. 
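A minimal sketch of what user-level control amounts to (the data structures here are hypothetical, purely for illustration): a personal block list hides items from one user's feed, but the items themselves remain in circulation for everyone else.

# Hypothetical sketch of user-defined filtering. Each user supplies a set of
# blocked topics; the filter hides matching items from that user's feed only.
# Nothing here evaluates whether an item is true or false.
from dataclasses import dataclass

@dataclass
class Item:
    headline: str
    topics: set

def personal_feed(items, blocked_topics):
    """Return only items that do not touch a topic this user blocks."""
    return [item for item in items if not (item.topics & blocked_topics)]

items = [
    Item("Disputed claim", {"politics"}),
    Item("Weather report", {"weather"}),
]

# User 1 blocks politics; user 2 blocks nothing. The disputed item still
# reaches user 2 untouched: personal control changes exposure, not content.
print([i.headline for i in personal_feed(items, {"politics"})])
print([i.headline for i in personal_feed(items, set())])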


That does not address the broader problem of potential bias in the creation and application of algorithmic systems, including rules about what content infringes on community standards, is “untrue” or “misleading” or “false.” 


It is akin to the odd notion of subjective "truth," as in "my truth" and "your truth." If something is objectively "true," my subjective opinion about it matters not.


In that sense, user-defined algorithms do not "solve" the problem of fake news and false information. The application of such algorithms by users only prevents them from exposure to ideas they do not prefer. It is not the same as designing search or culling mechanisms in a neutral and objective way, to the extent possible.


Less charitably, user algorithmic control is a way of evading responsibility for neutral curation by the application provider.


The bigger problem is that any such algorithm has to be designed to separate “truth” from “falsehood.” And that is a judgment call. Suppressing “ideas we disagree with” is a form of bias that seems to be put to work deliberately by many social media platforms. 


Aside from the observation that “ideas” might best be determined to be more true or more false only when there is open and free debate about those ideas, algorithms are biased when certain ideas are deemed to be “false,” even when they clearly are matters of political, cultural, social, economic, scientific or moral and religious import where we all know people disagree. 


And that means algorithm designers must make human judgments about what is “true” and what is “false.” It is an inherently biased process to the extent that algorithm designers are not aware of their own biases. 


And that leads to banning the expression of ideas, not because they are forms of disinformation but simply because the ideas themselves are deemed "untrue" or "dangerous." The issue is that separating the "untrue" from the merely "different" involves choice.




Unified Communications Market Reaches $47 Billion, IDC Says

The global unified communications and collaboration market grew 29.2 percent year over year and 7.1 percent quarter over quarter, to $13.1 billion in the fourth quarter of 2020, according to IDC. Revenue for the full year 2020 grew 24.9 percent, to $47.2 billion, IDC notes. 


For the full year 2020, public cloud UCaaS revenue increased 21.2 percent to $16.4 billion.


Collaboration (including video conferencing software and cloud services) revenue increased 45 percent annually to reach $22.1 billion. In fact, for a market historically driven by business voice products (phone systems), revenue now is driven by conferencing. 


source: GM Insights 


Sales of IP phones declined 20.4 percent year over year, to about $1.9 billion, IDC reports. 

Enterprise videoconferencing systems (such as video conference room endpoints) increased 12.4 percent to almost $2.6 billion.
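Summing the segments cited here (and treating them as non-overlapping, which may not exactly match IDC's taxonomy) gives a rough sense of the composition:

# Rough composition of the $47.2B 2020 UCC total, from the figures cited above
# ($ billions). The remainder covers categories not broken out in this post.
segments = {
    "Public cloud UCaaS": 16.4,
    "Collaboration (video conferencing software and cloud services)": 22.1,
    "IP phones": 1.9,
    "Enterprise videoconferencing systems": 2.6,
}
total = 47.2

accounted = sum(segments.values())
print(f"Cited segments: ${accounted:.1f}B of ${total}B; "
      f"~${total - accounted:.1f}B in other categories")   # $43.0B; ~$4.2B other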


The unified communications market always is difficult to explain, as it is a mix of many services and products, ranging from business phones to hosted communications services to enterprise hardware and software to access services such as SIP trunks. 


For example, revenue booked by Microsoft, Cisco, Zoom, Avaya and RingCentral totaled about $26 billion for the year. Those five firms represent 55 percent of total UCC revenues for the year, IDC figures suggest. 


Relatively little UCC market revenue is earned by connectivity service providers.


How Much Does Fixed Wireless Matter?

You can get a robust debate pretty quickly when asking “how important will 5G fixed wireless be?” in the consumer home broadband market. Will it matter? 


Probably. But it also matters more to some than to others, and will matter even if the net result is installed base market share shifts of just a few percentage points. So there is no actual contradiction between cable operators saying “fixed wireless is not a threat” and a few firms arguing it will be highly significant as a driver of revenues. 


Keep in mind that the home broadband market generates some $195 billion worth of annual revenue. Comcast and Charter Communications alone book roughly $150 billion in combined annual revenue, an increasing share of it earned from home broadband customers. 


T-Mobile has zero market share in that market. Taking just two percent means new revenues of perhaps $4 billion annually. That really matters, even if cable operators minimize the threat. 


“Addressable market” is a key phrase. Right now, Comcast passes about 57 million homes to which it can actually sell service.


The Charter Communications network passes about 50 million homes, the number of potential customer locations it can sell to.


Verizon homes passed might number 18.6 to 20 million. To be generous, use the 20 million figure. 


AT&T’s fixed network represents perhaps 62 million U.S. homes passed. CenturyLink never reports its homes-passed figures, but likely has 20 million or so consumer locations it can market services to. 


Up to this point, T-Mobile has had zero addressable home broadband market to chase. Verizon has had 20 million homes to market for that purpose. AT&T has been able to market to perhaps 62 million homes; Comcast 57 million homes and Charter about 50 million homes. 
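A sketch of what low single-digit share shifts would mean across those footprints (the five percent figure below is illustrative, not a forecast):

# Homes-passed figures cited above (millions of homes) and the number of homes
# a given share of each footprint would represent.
homes_passed_m = {
    "AT&T": 62,
    "Comcast": 57,
    "Charter": 50,
    "Verizon": 20,
    "T-Mobile (before fixed wireless)": 0,
}
illustrative_share = 0.05   # an assumed low single-digit share

for provider, homes in homes_passed_m.items():
    print(f"{provider}: {homes}M passed -> "
          f"{homes * illustrative_share:.1f}M homes at {illustrative_share:.0%} share")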


So T-Mobile and Verizon have the most market share to gain by deploying fixed wireless. And the value will not necessarily be that fixed wireless allows those two providers to “take half the market.” The revenue upside from share shifts in low single digits will be meaningful. 


Some might counter that early fixed wireless will not match the top cabled network speeds. That is true. But it also is true that half of U.S. households buy broadband services running between 100 Mbps and 200 Mbps, with perhaps 20 percent of demand requiring lower speeds than that. 


So even if fixed wireless offers lower speeds than cable hybrid fiber coax or telco FTTH, it might arguably still address 70 percent of the U.S. market.


It is conceptually possible that untethered access could eventually displace a substantial portion of the fixed networks business, longer term. 


Up to this point, mobile networks have not been able to match fixed network speeds or costs per gigabit of usage. But that should change. 


Mobile network speeds will increase at high rates, with a rule of thumb being that speeds grow by an order of magnitude every 10 years. One might argue that is less capacity growth than typically happens with fixed networks.

 

source: Voyager8 


But that might not be the relevant context. What will matter is how much speed, at what price points, mobile or fixed wireless solutions must offer before becoming a reasonable choice, compared to fixed access. 


Assume that in its last release, 5G offers a top speed of 20 Gbps. The last iteration of 6G should support 200 Gbps. The last upgrade of 7G should support 2 Tbps. The last version of 8G should run at a top speed of 20 Tbps.
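Those figures are simply the order-of-magnitude rule of thumb applied generation by generation, starting from 5G's nominal 20 Gbps peak; a quick extrapolation, not a standards commitment:

# Extrapolation of peak headline speeds, growing tenfold per generation.
peak_gbps = 20.0   # nominal 5G peak rate
for generation in ("5G", "6G", "7G", "8G"):
    print(f"{generation}: ~{peak_gbps:,.0f} Gbps peak ({peak_gbps / 1000:g} Tbps)")
    peak_gbps *= 10
# 5G: 20 Gbps, 6G: 200 Gbps, 7G: 2 Tbps, 8G: 20 Tbps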


At that point, the whole rationale of fixed network access will have been challenged, in many use cases, by mobility, as early as 6G. By about that point, average mobile speeds might be so high that most users can easily substitute mobile for fixed access.


To be sure, cost per GB also has to be roughly comparable. But, at some point, useful bandwidth at a reasonable enough price could allow wireless solutions to take lots of market share from cabled network providers. 

 

We never get away from debates about “which is the better choice?” in the connectivity or computing industries. Nor do we generally remember that “one size fits all” rarely is the case. Additionally, all choices are conditioned by “when, where, by whom and why” technology must be deployed. 


The global choice of internet protocol rather than asynchronous transfer mode as the foundation for all next-generation networking is among the exceptions. That really did result in an “all or nothing” outcome. 


But few other choices are so stark. Consider access network platforms. Decades ago there were serious--if brief--debates about whether “fiber or satellite” technologies were “better” for wide area networks. There was speculation about whether “Wi-Fi or mobile” was the better platform for phone connectivity.


There were debates about whether fiber to the home or hybrid fiber coax was “better” for consumer broadband access. 


Now there are arguments about whether local connections, unlicensed wide area low power networks or mobile networks are “better” for internet of things sensors. 


Such questions, while valid, always have to be qualified by the issue of “better for whom?” It might not make sense for a telco to adopt HFC as a foundation access technology, yet HFC virtually always is a logical choice for a cable operator, for the moment.


At some point, optical fiber is almost universally seen as the technology of choice for telcos and other “cabled media” providers. But wireless remains the key approach for satellite, wireless ISP and mobile operators. 


What is “better” cannot be determined without knowing the “for whom” part of the business context; the “when?” part of the discussion or the “under what other circumstances?” detail. Fiber to the home might be the “ultimate” choice, but “when to deploy” or “where to deploy” also matter. 


U.S. cable operators in 2020 had at least 69 percent share of the installed base of accounts, according to Leichtman Research Group. Telcos likely had something less than 28 percent of the installed base, accounting for share held by independent internet service providers (wireless, fixed and satellite). 


source: FCC, Bloomberg 


Without government support, FTTH might never make business sense in some locations. In other cases the business case is so marginal and risky that an alternative, such as fixed or mobile wireless, might well be the chosen platform. For a telco, a “fiber” upgrade might make sense when existing copper facilities must be retired in any case, where the need is driven not by revenue upside but merely by facilities replacement. 


For a cable operator, an FTTH overlay could make near-term sense to support business customers, but not yet consumers. But fixed wireless might also make sense for cable operator “edge out” operations, and for the same financial reasons that telcos used wholesale as a way to enter geographically-adjacent markets. 


The questions are even broader when looking at total demand for broadband access. In terms of total connections, in the U.S. market 75 percent of all internet access connections use mobile networks. Just 16 percent use cable HFC, while perhaps 8.6 percent use either fiber or copper telco connections; everything else--including satellite and fixed wireless--represents less than one percent. 


source: FCC


The point is, how much faster do untethered services need to be--assuming roughly equivalent terms and conditions of usage and price--before a significant percentage of home broadband users consider an untethered solution a functional substitute for fixed network access?


Matching headline speeds might not matter, as most consumers do not buy those services. Untethered options simply have to be “fast enough, priced well enough” to contend for significant share of the home broadband market.


Tuesday, March 30, 2021

5G Private Network Infrastructure Might Grow at 40% CAGR

The global private 5G market is better evaluated as representing sales of local area network infrastructure than connectivity provider services, even if there is overlap between the two markets. 


The 5G portion of the private networks market is estimated to be $924 million in 2020, growing at a 40-percent compound annual growth rate to 2028, when annual sales might reach $13.92 billion.  


That is best understood as the value of infrastructure products and services sold to enterprises running such networks, even if there will be some connectivity provider revenue from acting as system integrators, and a bit of additional connectivity revenue overall. 

source: Polaris Market Research 
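The forecast arithmetic, reproduced as a quick check:

# Compound growth check on the figures cited above: $924M in 2020 at roughly
# a 40 percent CAGR through 2028.
base_2020_b = 0.924   # $ billions
cagr = 0.40
years = 2028 - 2020

projected_2028 = base_2020_b * (1 + cagr) ** years
print(f"Projected 2028 market: ~${projected_2028:.2f}B")
# ~$13.64B at an even 40% CAGR; the cited $13.92B implies a CAGR just above 40 percent.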


Sunday, March 28, 2021

Hard to Know Long-Term Impact of Remote Processes

Nobody yet knows the mix of positive and negative long-term impacts of remote working and learning. In the short term, the impact is widely deemed to be far better than expected. Many employees and employers report their belief that productivity, for example, is as good as it was in the pre-Covid-19 setting.


The unknown issue is long-term effect on employee skill development, enculturation of new employees, innovation, applied creativity and team building. In the near term, all firms are running off of accumulated social capital: already-formed relationships, business culture understanding (“how we do things”) and social and professional networks. That is as true in the connectivity and data center business as in any other industry.


Every entity can, in the short term, sacrifice the intangibles provided by face-to-face interactions, both internally and in terms of relationships with customers and prospects. What remains untested is the long-term impact, as social capital decays. 


Consider opinions on remote learning. A recent survey by McKinsey found that, on average, teachers in all eight countries ranked online instruction at a score of five out of ten. In Japan and the United States, nearly 60 percent of respondents rated the effectiveness of remote learning at between one and three out of ten. 


source: McKinsey 


As always, “averages” can obscure big differences. In Japan, only two percent of teachers felt that online classes were comparable to learning in person; most felt it was much worse. So did most U.S. teachers. 


Just five percent of U.S. teachers agreed that online and remote instruction was as good as in-person teaching.


source: McKinsey 


Conversely, 32 percent of Australian and German teachers deemed remote learning to be as effective as in-person learning. Some 33 percent of Canadian teachers and 30 percent of Chinese teachers thought online instruction was as good as in-person teaching. 


The larger point is that the long-term impact of virtual or remote processes--ranging from education to sales--cannot yet be assessed. Results may well vary by industry, job roles and functions, worker age and experience, cultures and nations. 


Equally challenging will be an assessment of widespread hybrid or flexible work patterns. Knowledge worker or office worker productivity is notoriously hard to measure and the range of hybrid work scenarios might be quite disparate. 


DIY and Licensed GenAI Patterns Will Continue

As always with software, firms are going to opt for a mix of "do it yourself" owned technology and licensed third party offerings....