Thursday, July 8, 2021

Can Broadband Definitions be Changed in a Platform-Neutral Way?

In principle, broadband speeds will keep increasing almost indefinitely. A reasonable projection is that headline speeds in most countries will be in the terabits-per-second range by 2050.


Though the average or typical consumer does not buy the “fastest possible” tier of service, headline tier speed has grown at a remarkably consistent rate since the days of dial-up access.


And the growth trend of 50 percent per year speed increases, known as Nielsen’s Law, has operated since the days of dial-up internet access. Even if the “typical” consumer buys speeds an order of magnitude lower than the headline speed, that still suggests the typical consumer, at a time when the fastest-possible speed is 100 Gbps to 1,000 Gbps, will be buying service at speeds of not less than 10 Gbps to 100 Gbps.


Though typical internet access speeds in Europe and other regions are not yet routinely in the 300-Mbps range, gigabit-per-second speeds eventually will be the norm globally, perhaps by 2050, as crazy as that might seem.


The reason is simply that the historical growth of retail internet bandwidth suggests it will: a 50 percent annual increase compounds to roughly 57 times over any given decade. Since 2050 is three decades off, headline speeds of tens to hundreds of terabits per second are easy to predict.

source: FuturistSpeaker 
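
To see how quickly that compounding adds up, here is a minimal sketch of the arithmetic; the 1-Gbps headline tier assumed for 2021 is purely illustrative, not a figure from the chart above.

```python
# Minimal sketch of the compounding implied by Nielsen's Law (50 percent per year).
# The 1-Gbps headline tier assumed for 2021 is illustrative, not a cited figure.

ANNUAL_GROWTH = 1.5                      # Nielsen's Law: 50 percent per year
per_decade = ANNUAL_GROWTH ** 10         # growth over any ten-year span
print(f"Per-decade growth: about {per_decade:.0f}x")        # roughly the 57x cited above

start_year, start_gbps = 2021, 1.0       # assumed 2021 headline tier
headline_2050_gbps = start_gbps * ANNUAL_GROWTH ** (2050 - start_year)
print(f"2050 headline: about {headline_2050_gbps / 1000:.0f} Tbps")   # roughly 128 Tbps
```

Even starting from a modest 1-Gbps headline tier, three decades of 50 percent annual growth lands at roughly 100 Tbps or more, consistent with the tens-to-hundreds-of-terabits projection above.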


Some will argue that Nielsen’s Law cannot continue indefinitely, just as most would agree Moore’s Law cannot continue unchanged. Even with some significant tapering of the rate of progress, the point is that headline speeds in the hundreds of gigabits per second still are feasible by 2050. And if the typical buyer still prefers services an order of magnitude less fast, that still indicates typical speeds of 10 Gbps to 30 Gbps or so.
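
To make the tapering point concrete, the same sketch can be rerun with an assumed slower growth rate; both the 20 percent annual rate and the 1-Gbps 2021 starting point below are illustrative assumptions, not figures from the post.

```python
# Illustrative tapered scenario: Nielsen's Law slows from 50 percent per year
# to an assumed 20 percent per year; the 1-Gbps 2021 starting tier is also assumed.

TAPERED_GROWTH = 1.20                    # assumed slower growth rate
start_year, start_gbps = 2021, 1.0       # assumed 2021 headline tier

headline_2050 = start_gbps * TAPERED_GROWTH ** (2050 - start_year)
typical_2050 = headline_2050 / 10        # typical buyer an order of magnitude behind

print(f"2050 headline: about {headline_2050:.0f} Gbps")   # roughly 200 Gbps
print(f"2050 typical:  about {typical_2050:.0f} Gbps")    # roughly 20 Gbps
```

Even with the growth rate cut by more than half, the headline tier still lands in the hundreds of gigabits per second, with typical purchased speeds in the tens of gigabits.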


Speeds of a gigabit per second might be the “economy” tier as early as 2030, when headline speed might be 100 Gbps and the typical consumer buys a 10-Gbps service. 


source: Nielsen Norman Group 


So there is logic to altering minimum definitions over time, as actual usage changes. In fact, typical speeds might be increasing faster than anticipated. 


Also, there is a difference between availability and actual customer demand. In the U.S. market, fewer than 10 percent of potential customers who can buy gigabit service actually do so at the moment.


Perhaps 30 percent of customers buying service at 100 Mbps or higher believe it is more than they need. Perhaps 40 percent of customers buying gigabit services believe it is more than they need. 


The issue is that no definition can be technologically neutral, for either upstream or downstream speeds. Since 75 percent to 80 percent of U.S. customers already buy fixed network service at speeds of 100 Mbps to 1,000 Mbps, a minimum definition of “broadband” set at 100 Mbps is not unreasonable.


The problem is that some platforms, including satellite, fixed wireless and digital subscriber line, will have a tough time meeting those minimums.


Cable operators are virtually assured they can do so with no extra effort. 


Upstream bandwidth poses issues for most platforms other than modern fiber-to-home networks if, for example, the definition were set at 100 Mbps upstream.


It’s tricky, and almost impossible, to reset broadband definitions in a platform-neutral way.

