Tuesday, January 25, 2022

5G Private Networks Do Not Necessarily Displace Wi-Fi

Private 5G networks--where an enterprise uses 5G as a local area network platform--are a "new use case" for 5G networks, beyond the obvious support of the latest-generation mobile network. But different industries seem to have different appetites for private 5G.


Nor, in principle, does a private 5G network necessarily displace the need for Wi-Fi. As virtually all observers note, private 5G might have application where process controls, privacy concerns or extremely low-latency applications must be supported.


In many instances, private 5G will also be used because local computing is desirable, typically for reasons of latency performance, high compute intensity or bandwidth needs as well. So demand for private 5G networks tends to correlate with edge computing, in many cases.


In other cases, network slicing using the public network might also be favored.


In the new Real Wireless white paper, How Ready is the UK for 5G?, it is clear that private network use cases predominate across a range of industries, especially manufacturing and healthcare. 


Transport and logistics as well as creative industries show a mix of private 5G and public 5G modes. 


In manufacturing, private networks are expected to dominate. 

Source: Real Wireless


Healthcare shows a different pattern: private networks delivered using a public network.


Source: Real Wireless


Transport and logistics shows a mix of private and public modes. 


Source: Real Wireless


Creative industries likewise show a mix of private and public networks. 


Source: Real Wireless


Monday, January 24, 2022

AT&T Launches Commercial 2-Gbps Symmetrical and 5-Gbps Symmetrical Fixed Network Internet Access

As predicted by Edholm’s Law, internet access speeds continue to climb. AT&T, for example, just activated symmetrical 2-Gbps and symmetrical 5-Gbps service for 5.2 million locations across 70 U.S. markets, with plans to deploy across the whole footprint in 2022 and later years. 


Edholm’s Law states that internet access bandwidth at the top end increases at about the same rate as Moore’s Law suggests computing power will increase. 


Nielsen's Law, like Edholm’s Law, suggests a headline speed of 10 Gbps will be commercially available by about 2025, so the commercial offering of 2-Gbps and 5-Gbps services is right on the path to 10 Gbps. 

source: NCTA  


Headline speeds in the 100-Gbps range should be commercial sometime around 2030. 


How fast will the headline speed be in most countries by 2050? Terabits per second is the logical conclusion. Though the average or typical consumer does not buy the “fastest possible” tier of service, the growth of headline tier speeds since the days of dial-up access has been remarkably steady (linear on a logarithmic scale). 


And the growth trend of 50-percent-per-year speed increases--known as Nielsen’s Law--has operated since the days of dial-up internet access. Even if the “typical” consumer buys speeds an order of magnitude less than the headline speed, that still suggests the typical consumer--at a time when the fastest-possible speed is 100 Gbps to 1,000 Gbps--will be buying service operating at speeds not less than 1 Gbps to 10 Gbps. 


Though typical internet access speeds in Europe and other regions are not yet routinely in the 300-Mbps range, gigabit-per-second speeds eventually will be the norm globally--as crazy as that might seem--by perhaps 2050. 


The reason is simply that the historical growth of retail internet bandwidth suggests it will happen. Over any decade, internet speeds have grown about 57 times. Since 2050 is three decades off, headline speeds of tens to hundreds of terabits per second are easy to predict. 
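For what it is worth, the arithmetic behind those figures is easy to check. A minimal sketch follows; the 50-percent annual rate is Nielsen's, while the 2025 baseline of 10 Gbps and the 2050 horizon come from the discussion above.

```python
# Nielsen's Law sketch: headline speed grows 50 percent per year.
# The rate is from the text; baseline year and speed are illustrative.
RATE = 1.5

def decade_multiple(rate=RATE, years=10):
    """How much headline speed grows over one decade."""
    return rate ** years

def project(speed_gbps, start_year, end_year, rate=RATE):
    """Project a headline speed forward at Nielsen's Law rates."""
    return speed_gbps * rate ** (end_year - start_year)

print(round(decade_multiple(), 1))            # → 57.7, the "57 times" figure
print(round(project(10, 2025, 2050) / 1000))  # → 253 Tbps: hundreds of terabits
```

Compounding at 50 percent per year is all that is needed to reach the "57 times per decade" figure and the terabit-class 2050 projection.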

source: Nielsen Norman Group, Future Speaker

 

Of course, the typical consumer does not buy the top available speed at any given point in time. 


AT&T Fiber 2 GIG costs $110 per month plus taxes. AT&T Business Fiber 2 GIG costs $225 per month plus taxes. The “typical” consumer household probably now pays about $50 a month for internet access. 


AT&T Fiber 5 GIG costs $180 per month plus taxes. AT&T Business Fiber 5 GIG costs $395 per month plus taxes. 


The point is not that so many consumer households will buy the top offers. The point is that--with higher “fastest” speed tiers--the typical buyer also tends to move up. As more customers buy the 2-Gbps or 5-Gbps services, a change in the adoption rate will happen when we hit about 10-percent adoption of either service. 


Adoption of 1-Gbps service now has passed the 10-percent point, so we should see accelerated adoption of gigabit services. Where half of U.S. consumers now buy services in the 100 Mbps to 200 Mbps range, we should see a shift of those buyers to higher-speed services over the next few years, as the top end also continues to move. 


Once 2-Gbps adoption hits about 10 percent of households, half of the rest of the market will start to move to speeds more in the range of 400 Mbps. Once 5-Gbps adoption hits about 10 percent, half the market will start a move towards gigabit service, history suggests.
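The "10-percent tipping point" pattern is the familiar S-curve of technology adoption. A minimal logistic sketch--with illustrative parameters, not fitted market data--shows why annual gains are small before roughly 10-percent penetration and then accelerate:

```python
import math

def logistic_adoption(t, k=0.6, t_mid=8.0):
    """Cumulative adoption share at year t along an S-curve.

    k (steepness) and t_mid (year of 50-percent adoption) are
    illustrative assumptions, not fitted to any market data.
    """
    return 1.0 / (1.0 + math.exp(-k * (t - t_mid)))

def yearly_gain(t):
    """Share of the market that adopts between year t and t + 1."""
    return logistic_adoption(t + 1) - logistic_adoption(t)

# Gains are modest below roughly 10-percent penetration, then
# accelerate sharply through the middle of the curve.
for year in range(0, 13, 2):
    print(f"year {year:2d}: {logistic_adoption(year):6.1%} adopted, "
          f"+{yearly_gain(year):.1%} over the next year")
```

The exact timing depends on the assumed parameters, but the shape of the curve--slow, then fast, then saturating--is what the 10-percent rule of thumb captures.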


Like it or Not, "More Freedom" Might be the Only Way to Deal with Perceived Unfairness

The U.S. Supreme Court, in New York Times Co. v. Sullivan, 376 U.S. 254 (1964), established a principle that public officials cannot claim damages for libel against media entities for defamatory falsehoods relating to their official conduct unless they prove that the statement was made with “actual malice.”


In other words, a plaintiff cannot sue simply because a media entity published something found to be incorrect, but only if the published material was created with “malice.” It is not enough that something we might consider “libelous” was said; the plaintiff must show it was said intentionally, with malice and a “reckless disregard” for the facts of the matter. 


When a statement concerns a public figure, according to the Court, it is not enough to show that it is false for the press to be liable for libel. Instead, the target of the statement must show that it was made with knowledge of or reckless disregard for its falsity.


Hence the difficulty public officials (or public figures generally) face when suing a media outlet for libel: the plaintiff must prove malicious intent, not simply that something is “false.” 


Of course, such Constitutionally-based protections and rulings were developed before social media. Whether social media calls for a rethink is an open question. Should social media be covered by the same rules as “media?” Is the “malicious intent” requirement too broad? 


More broadly, does the First Amendment protection of free speech only apply to government suppression of free speech, or to suppression by other means? Right now, social media are considered covered by the First Amendment in terms of all content moderation. They are classified as “speakers” for purposes of First Amendment protections. 


Perhaps there is no other way to consider them. Even if they act in ways many consider biased or unfair, there is a First Amendment argument that this is their right. 


And earlier government efforts to mandate a “Fairness Doctrine” have generally been considered failures, and even unnecessary in an era where so many outlets for speech exist. Mandating “fairness” might prove more onerous or ineffective than not mandating it. 


Still, media law and First Amendment law have evolved with the emergence of new media. The initial responses have often been to “limit freedom.” Over time, law has generally moved in the direction of “more freedom.” 


Even if not considered by many to be “fair,” this might be the only Constitutionally-correct approach: more freedom.


How Much Mobile Substitution for Home Broadband by Fixed Networks?

Mobile substitution has been an ongoing trend in the connectivity business for decades. Mobile became the preferred method for consumer voice communications early on. Text messaging and then multimedia messaging displaced other forms of text communication. Then mobile became the dominant way people use social media.


Now more entertainment video is consumed on mobile devices than used to be the case, displacing use of televisions or PC screens.


Next up we might see mobile substitution for fixed network internet access (home broadband).


Though situations will vary from place to place, one potential development with 5G has been its potential to replace some Wi-Fi use, especially where tariffs allow unlimited data usage of 5G. And Wi-Fi is simply the local distribution of fixed network broadband networks.


In New Zealand, which has had 5G live since December 2019, a new Opensignal analysis shows that smartphone users see much faster speeds connected to 5G than when using Wi-Fi. 


“And, for multiplayer mobile gaming 5G is on par with Wi-Fi, unlike older 4G which was noticeably inferior to Wi-Fi,” Opensignal says. 


That could be a significant driver of user behavior in any market where unlimited 5G data usage is typical, as such plans eliminate the cost savings that shifting to Wi-Fi offers. If the experience is equivalent to Wi-Fi, and there is no cost saving when switching to Wi-Fi, many users might conclude it makes sense to remain connected to 5G even when Wi-Fi offload is available. 


“Additionally, the quality of the 5G experience shows that 5G has the potential to offer better connectivity inside homes, shops, offices and other locations where Wifi is available,” Opensignal says.


source: Opensignal 


Opensignal tests showed 5G users getting an average 240.7 Mbps on 5G, 4.8 times and 5.7 times faster than the average speeds seen when connected to Wi-Fi or 4G, respectively. 
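Working backwards from those multiples gives the implied Wi-Fi and 4G averages. This is a quick sanity check; the derived numbers are back-calculations, not figures Opensignal reports directly.

```python
# Implied average speeds behind the Opensignal comparison. Only the
# 240.7 Mbps 5G average and the 4.8x / 5.7x multiples are from the
# report; the Wi-Fi and 4G numbers below are derived.
avg_5g_mbps = 240.7

implied_wifi = avg_5g_mbps / 4.8  # average Wi-Fi speed implied by the 4.8x claim
implied_4g = avg_5g_mbps / 5.7    # average 4G speed implied by the 5.7x claim

print(f"implied Wi-Fi average: {implied_wifi:.1f} Mbps")  # → 50.1
print(f"implied 4G average: {implied_4g:.1f} Mbps")       # → 42.2
```

In other words, the tested Wi-Fi connections averaged roughly 50 Mbps, which is why 5G at 240 Mbps looks so much better.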


New Zealand placed in the top 15 markets for 5G download speed and for the improvement in speeds using 5G relative to 4G in recent Opensignal studies. 



source: Opensignal 


Multiplayer mobile games on smartphones showed insignificant differences in experience when on Wi-Fi or 5G, Opensignal says. 


“The arrival of 5G means that cellular connections are no longer always inferior to Wi-Fi,” Opensignal concludes. 


As always, a new platform, coupled with the right pricing, can be disruptive in the access market. 


Potentially, in markets such as New Zealand--where 5G performance is better than Wi-Fi, and unlimited usage, “no overage charge” pricing or “plans big enough to handle all your needs” are in effect--value can shift towards mobile access and away from fixed access, in at least some cases.


Single-user households would seem most vulnerable to mobile substitution. But even multi-user households could benefit, when all mobile users are on a single account, providing price discounts, and where users think using their phones as hotspots is not worth the bother, compared to using Wi-Fi for game players, security devices, audio players and so forth.


Connectivity providers selling both fixed and mobile internet access likely will continue to discourage such forms of mobile substitution by limiting the amount of data that can be used each month by mobile devices acting as hotspots.


Still, depending on each household's usage profile, full mobile substitution might still make financial sense. But that has to be balanced, case by case, against higher mobile tariffs for unlimited usage, hotspot usage limits and the range of other devices to be supported.


Users of home automation and security products might find mobile substitution operationally difficult.


Saturday, January 22, 2022

Supply Chain Issues Highlight AI Cost Savings More than Revenue Gains in 2020

A McKinsey survey of global enterprise executives suggests they believe artificial intelligence is boosting bottom line results by about five percent. Some 27 percent of respondents agree with that claim. 


source: McKinsey 


And while AI’s revenue benefits have held steady or even decreased since the previous survey—especially for supply-chain management, where AI was unlikely to compensate for the pandemic era’s global supply-chain challenges—the opposite is true of costs, McKinsey says. 


Respondents report significantly greater cost savings from AI than they did previously in every function, with the biggest year-over-year changes in the shares reporting cost takeout from using AI in product and service development, marketing and sales, and strategy and corporate finance.


Will Disaggregation Create New Business Models?

Some 40 years ago, we would have noted that connectivity service providers purchased their networking systems from a mere handful of suppliers. 


That is a change from some 50 years ago, when they built their own systems and gear, some of us also would note. 


In the next phase of connectivity industry development, new providers are likely to emerge. Some private networks might use 5G or other protocols in place of Wi-Fi, for example, to support local area networks. 


From a connectivity service provider’s perspective, the bigger challenge might be new entrants we already are seeing more of: home broadband overbuilders (independent internet service providers); app providers as access providers; system integrators taking a bigger role in running wide area networks and possibly app providers as WAN providers and access providers. 


All of that could happen because of the growing role now played by virtualized and open technologies in networks. Once roles and functions are disaggregated--broken into distinct layers--it is possible for entities to build and operate networks that take advantage of the disaggregation. 


What are the implications of everything as a service for service provider organizations, data center operators and system integrators? Can new suppliers emerge using different combinations of functions and acting as integrators?


What does it mean to say we are moving to a world where all capabilities are location independent; where anything we associate with cloud computing is available as a service? What are the implications for our notions of “connectivity provider” and “app provider?”


How might business models change? What if more of the value supplied to customers and end users is a mix of features that include connectivity? As we might argue Google, Facebook and others give away computing services and apps for free and monetize with advertising, becoming computing service suppliers with advertising business models, might we also argue that Amazon has become a computing supplier with a commerce business model, augmented by advertising?


What would it mean if computing, software, hardware and communications are available as a service? Are there new mixes of value and revenue where internet access is simply a feature of the product? In other words, is it possible that some forms of internet access become “free?”


What if there is no need to worry about what is private cloud and what is public cloud; what is the “access” and what is the “application?”


Computing Improves Linearly; Social, Economic, Political, Behavior Not So Much

Occasionally it is helpful to step back from the day-to-day and review your business, firm, industry or situation with a longer time frame. Sometimes we can only assess where we have been by doing so.


The exercise arguably is more difficult when trying to extrapolate where we are going. “Predictions are hard, especially about the future,” many, including physicist Niels Bohr, have quipped. 


“Extrapolations from techno-scientific innovations have a distressing capacity to be deterministic,” says historian Amanda Rees. One example might be the impact of computing and communications evolution on social, economic, political or scientific endeavors. 


It is easier to describe and predict some changes in computing capability than to predict how they might affect changes in the biological or social world. 


Still, to the extent that any specific problem can be solved if sufficient computing power is available, at low cost, there are at least some indicators of potential. 


Many of us might note that we are able to use millimeter wave radio frequency spectrum for consumer and business communications only because reductions in computing cost and form factor, along with increases in capability, allow us to do so much sophisticated signal processing that the spectrum can be made to work for consumer communications. 


I have argued in the past that an understanding of Moore's Law “saved” the U.S. cable TV industry in the 1980s when high-definition television was developed. 


Perhaps we might also say that those same developments in performance made possible streaming video services that now are cannibalizing cable TV. 


The point is that it is difficult to extrapolate future developments in a linear way from linear improvements in computing capability. But it sometimes helps to think about the application of computing in situations where business models formerly unthinkable can become quite practical. 


Anything we see in consumer internet applications--where capabilities are supplied at no cost to users--provides an excellent illustration. The classic question is: what does your business look like if a key cost constraint is removed? 


Though we might have mischaracterized key elements of the argument, ride sharing did raise questions about what it would mean if “cars were free.” They obviously are not “free,” but musings about changes in personal transportation have arisen because of the existence of ride sharing.  


The difficulty always is that other drivers of behavior also exist. Consider consumer demand for mass transit, which seems to be falling as other options--and social changes--develop. Many riders had less need--or no need--during Covid-19 pandemic restrictions on “going to work or school.” But lower mass transit ridership trends were in place even before Covid, both internationally and in the United States.  


But many speculate that the availability of ride sharing has diminished use of public transportation, though other social forces also seem to be operating. 


Likewise, we might argue that vastly-improved computing and storage price-performance curves are good enough to allow applied artificial intelligence in a growing range of use cases. Most of those use cases involve inferences about future impact based on historical metrics. 


Letting farmers know when to water or apply fertilizer, and in what quantities, should lead to improved crop production. Industrial processes likewise should be improved when we can predict when a particular machine will fail, or what must be adjusted in real time to optimize output. 


Lots of other supply chain and industrial processes likewise should benefit from cheap and ubiquitous ways to manage and optimize present flows of resources, whether that be people walking, cars on highways or other logistics-related issues. 


Computing progress means new applications or use cases can develop in a non-linear way, even when computing rates of development are linear. 


Technologist Ray Kurzweil noted in 2005 that “in 1968, you could buy one (Intel) transistor for a dollar. You could buy 10 million in 2002.”


Looking at the cost of a single compute cycle, Kurzweil also noted in 2005 that “the cost of a cycle of one transistor has been coming down with a halving rate of 1.1 years.”


“You get a doubling of price-performance of computing every one year,” he said. 
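Those statements are mutually consistent, as a quick check shows. This is a sketch: only the 1.1-year halving rate and the 1968-to-2002 transistor figures are Kurzweil's; the rest is arithmetic.

```python
import math

# Kurzweil: the cost of one transistor compute cycle halves every 1.1 years.
HALVING_YEARS = 1.1

def cost_multiple(years, halving=HALVING_YEARS):
    """Relative cost per compute cycle after a given number of years."""
    return 0.5 ** (years / halving)

# A 1.1-year halving rate implies price-performance improves by
# 1 / cost_multiple(1) each year -- roughly a doubling, as he said.
print(round(1 / cost_multiple(1), 2))  # → 1.88

# 1968 to 2002: going from one transistor per dollar to 10 million per
# dollar implies a halving time for transistor cost of about:
print(round((2002 - 1968) / math.log2(10_000_000), 2))  # → 1.46 (years)
```

The per-transistor figure halves a bit more slowly than the per-cycle figure, which is consistent with Kurzweil's point that cycle cost improves through clock speed gains as well as transistor density.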


One likely impact in the global communications industry is the effect of AI-assisted networks on worker skill requirements, to say nothing of the improvements in network performance or availability. 


Some argue that skills will need to be upgraded as networks get smarter. The countervailing argument is that skill requirements might change, but not as much as people think. When the networks are smarter, they will be able to predict potential outages or degradations, allowing automatic or manual changes to prevent problems. 


Outside plant or core networking work might not become more complicated at all; the work might become less complicated, in terms of adjustments and maintenance. That might shift priorities in other ways that involve different tasks and skills, though not necessarily “higher” skills (depending on how one defines “higher”). 


Perhaps more effort shifts to marketing and away from plant maintenance. That might involve different skills, but not necessarily “higher” skills for most people. When social media algorithms dictate marketing actions, the heavy lifting is done by the algorithms. 


People at firms simply need to know what outcomes they wish to achieve.


Can Blockchain Remedy Some of the Internet's "Anonymity" Problems?

The value of data transport protocols other than TCP/IP has been growing for the past decade. Those of you with long memories might recall that the global telecom industry debated the protocols for its next-generation network in the 1990s, with many favoring asynchronous transfer mode. 


To make a long story short, the industry chose TCP/IP, itself once believed to be a “transition” protocol. There were lots of reasons, but chief among them was the cost of connecting. ATM was relatively expensive; TCP/IP was radically cheaper. And not even volume deployment was going to eradicate the price differential. 


But there is another sense--beyond transmission costs--that is likely to become even more important in the decade ahead: the business value. We often forget that TCP/IP is based on “layers” that separate functions from each other.


That has led to the “over the top” way applications are created and owned on all networks using TCP/IP, and at this point that is virtually all networks. In part, that is because all networks now are computer networks, and TCP/IP was originally conceived as a data networking protocol. 


As we create applications on computing networks--applications separated from connectivity--so we also create applications running on public and private wide area networks. 


“Anonymity” has been one feature of the internet that can be troublesome, for social reasons such as enabling bullying, financial fraud, phishing, spoofing and catfishing, to note a few problems. 


But “trust” has become a significant business issue in the internet era. Is it possible that business models that inherently have better “trust” attributes could supplant much of the “zero trust” nature of the internet?


Some might argue blockchain is a candidate to change the “trust” dimension of the internet, for consumers and business users. It once was argued that anonymity was important for political dissidents living under repressive regimes, and there is logic there. 


But that same anonymity arguably makes the consumer internet less useful, and positively harmful, for a similar reason: anonymity frees human beings from the in-person courtesy and respect they might otherwise show people. Anonymity encourages extreme expressions. 


Known identities have become more important, for all sorts of good reasons. Maybe a shift to blockchain--with a new emphasis on verifiable identities--will be a good thing.


$50 a Month for Speeds Between 100 Mbps and 200 Mbps is the "Sweet Spot" for U.S. Home Broadband

The “sweet spot” for U.S. home broadband is a monthly recurring cost around $50 and speeds between 100 Mbps and 200 Mbps, which is purchased by about half of all U.S. home broadband customers. 


Pricing by independent internet service provider Vyve Broadband shows the packaging reflecting buying patterns. The 200-Mbps package sells for $50 a month. The gigabit package, likely bought by about 11 percent to 12 percent of homes, sells for about $70 a month. 


The lowest tier offers 100 Mbps for $40 a month. 


 

source: Vyve Broadband 


It might seem curious, but the new payback analysis for home broadband using fiber to the home also works at per-customer revenue close to $50 a month, according to AT&T.


For those of you who follow the payback models for FTTH, that is somewhat shocking, as models from 20 years ago would have assumed per-customer revenue closer to $100 to $130 per month, to make the model work. 


That the revenue assumptions have changed so much reflects secular changes (declining demand for fixed network voice and linear video entertainment) as well as changes in cost structure related to operating cost and capital requirements for home broadband. 


It also is noteworthy that T-Mobile’s 5G home broadband service is priced at $50 a month. Though T-Mobile no longer seems to emphasize “speed,” it had in April 2021 talked about speeds up to 100 Mbps. 


Verizon’s 5G fixed wireless has recently been repriced to $50 per month, with speeds up to perhaps 300 Mbps. 


All that tells you where the mass market demand is believed to exist. The packaging will change, of course, in terms of typical speeds and prices. "More speed for the same price" as well as "significantly faster speeds for a higher price" are the two trends that will likely remain in place.


Friday, January 21, 2022

IP Was a Business Model Change, Not Just a New Networking Platform

Almost always, big changes in networking architecture and platform change the range of possible business models and market shares in the application, connectivity and infrastructure businesses. 


For example, the move to disaggregated, open and virtual networks automatically creates new potential roles for system integrators. Where platforms could be purchased monolithically, new networks can be assembled from various suppliers. 


To note only the most-obvious possible changes, monolithic platform suppliers could lose some market share to new suppliers and network integrators who supply the network components and the complete networks. 


In other words, when we move to disaggregate functions and elements, we automatically create a new need for system integration. 


So we must now look for the emergence of new names in the system integration business, as it applies to core networks and access networks. At some point, if they are willing or forced to concede some roles and revenues, the legacy monolithic network suppliers also are in line to act as system integrators, using elements and software sourced from any number of possible suppliers. 


Consider an earlier change that produced precisely those results. Because “layers” are the technology architecture, disaggregation is both possible and desirable. 


Look at the data center or server businesses. Hyperscalers now build their own servers; they do not buy them. They can do so because layers make it possible. Custom software can run on commodity hardware; and commodity hardware can be built “in house.”


Hyperscalers build and own their own wide area networks; they do not have to buy services from WAN suppliers on their core data center to data center routes. 


Hyperscalers build their computing fabrics from modular arrays of servers, not monolithic mainframes. Whenever possible, they virtualize both compute and storage operations, rather than dedicating hardware to those functions. 


In a related process, “everything” is moving to virtualized supply. Enterprises and consumers can buy “services” rather than owning their own hardware and software licenses. Customers can buy computing or storage features “by the instance,” as a service, rather than building and operating their own data centers. 


It is an under-appreciated fact that when the global telecom industry selected internet protocol as its next generation platform, it also--knowingly or not--chose a layered business model. 


IP is not simply a framework for moving bits around; it is a business and revenue architecture as well, separating logical functions in ways that allow whole industry segments to emerge in a disaggregated way. 


In other words, the salience of the term “over the top” is precisely the result of a “layered” approach to building communication networks. When we disaggregate edge devices and functions from transport layer functions, and those from application functions, the revenue streams and possible business models also are disaggregated. 


That is why Meta, Amazon, Netflix and others can build businesses using networks without owning networks. 


IP was not just a technology platform change. It was a profound business model change.


"Open:" How We Got Here

Among the various conversations people had at the #PTC’22 conference are those about where networks are going, where the business is going and where revenue is to be found. Among the topics, 5G and Wi-Fi 6, network slicing, edge networks and the complications of in-home environments have been prominent. 


Among the potentially most far-ranging were questions posed by Robert Pepper, Meta head of global connectivity policy. Use of open technology is simply the latest in a series of transitions that have happened in the networking business over the last 40 years, Pepper said. 


“Disaggregated network elements are 40 years in the making,” Pepper said. The industry transitioned from analog to digital; then hardware to software functions, he noted.


The “next transition is from integrated and proprietary to open and modular networks,” he said. 


There will be big repercussions for suppliers of networking infrastructure. Where telcos 50 years ago developed and made their own gear, they then switched to buying complete networks from a handful of global suppliers. That obviously created huge new businesses, but also made telcos “captive” to a few suppliers and subject to “vendor lock-in.”


Suppliers might like that state of affairs, but buyers (telcos) hate it, it is fair to say. In a broad sense, the shift to open and modular networks also represents a shift from vendor-led to operator-led infrastructure development and supply. 


It also is fair to note that there always are private interests that benefit from any wider shift in framework. Perceived benefit hinges on where a firm or industry segment operates in the complete value chain. 


Application supplier business models depend on ubiquitous, high-quality and low-cost  internet access. Access providers are not similarly situated within the value chain. For app providers, high-quality, low-cost internet access is a prerequisite for business. For connectivity providers, access is the business. 


For an app provider, internet access is a cost of doing business. For a connectivity provider, access is the core revenue stream. The former wants lowest-possible cost and highest-possible quality; the latter wants highest-possible revenue with minimum-possible cost. 


You might argue it is in Meta’s interest for internet access to be universal and good, as it is in a connectivity provider’s interest to reap the highest revenue from access services, with the highest margins consistent with long-term sustainability. 


If Meta is right, economics are moving in the direction of what is favorable for application creators. 


There are clear analogies in the data center or server businesses as well. Hyperscalers build their own servers; they do not buy them. Hyperscalers build and own their own wide area networks; they do not buy services from WAN suppliers on their core data center to data center routes. 


Hyperscalers build their computing fabrics from modular arrays of servers, not monolithic mainframes. Whenever possible, they virtualize both compute and storage operations, rather than dedicating hardware to those functions. 


Moderator Gary Kim, a PTC volunteer and consultant, noted that when the global telecom industry selected internet protocol as its next generation platform, it also--knowingly or not--chose a layered business model. 


IP is not simply a framework for moving bits around; it is a business and revenue architecture as well, separating logical functions in ways that allow whole industry segments to emerge in a disaggregated way.


In other words, the salience of the term “over the top” is precisely the result of a “layered” approach to building communication networks. When we disaggregate edge devices and functions from transport layer functions, and those from application functions, the revenue streams and possible business models also are disaggregated. 


That is why Meta, Amazon, Netflix and others can build businesses using networks without owning networks. 


IP was not just a technology platform change. It was a profound business model change.




On the Use and Misuse of Principles, Theorems and Concepts

When financial commentators compile lists of "potential black swans," they misunderstand the concept. As explained by Nassim Taleb ...