Monday, April 8, 2019

Can 5G Providers Sell QoS?

Can 5G service providers charge a premium for low-latency performance guarantees when the stated best-effort latency is already so low? That is a question that might also be asked in other ways.

Will 5G best-effort service be good enough, in terms of latency and bandwidth, to obviate the need for any additional quality-of-service features to preserve low latency and high bandwidth?

Is there a market for quality of service when delivered bandwidth rates are so high, and latency performance so much better than 4G? In a broader sense, as network performance keeps getting better on both latency and bandwidth dimensions, can connectivity providers actually sell customers on QoS-assured services?

Also, some would argue, it becomes problematic to try to maintain QoS when packets are encrypted at the edge. A service provider cannot prioritize what it cannot see. And encryption is the growing trend, as most traffic now is encrypted.

By about 2020, estimates Openwave Mobility, fully 80 percent of all internet traffic will be encrypted. In 2017, more than 70 percent of all traffic was encrypted.
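
To illustrate the “cannot prioritize what it cannot see” point, here is a minimal sketch (hypothetical field names and traffic classes, not any operator’s actual traffic-management logic): once the payload is encrypted, a classifier can key only on flow metadata such as ports or a visible TLS server name, not on content.

```python
# Toy sketch: with encrypted payloads, a QoS classifier sees only flow
# metadata (ports, perhaps the TLS server name), never the content itself.
# Field names and traffic classes are hypothetical, for illustration only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Flow:
    dst_port: int
    tls_sni: Optional[str]   # server name from the TLS handshake, if visible
    payload_visible: bool    # False for encrypted (HTTPS/QUIC) traffic

def classify(flow: Flow) -> str:
    if flow.payload_visible:
        return "content-based classification"   # possible only for cleartext
    if flow.tls_sni and "video" in flow.tls_sni:
        return "guess: streaming video"         # coarse inference from the SNI
    if flow.dst_port == 443:
        return "opaque encrypted web traffic"
    return "unknown"

print(classify(Flow(443, "video.example.com", payload_visible=False)))
print(classify(Flow(443, None, payload_visible=False)))
```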

The other change is the emergence of edge computing for latency-sensitive applications. We can assume that the whole point of edge computing is to provide a level of quality assurance that cannot otherwise be obtained.

As content delivery networks provide such assurances to enterprises and content suppliers for a fee, so it is likely that edge computing networks, or networks relying on network slicing to maintain low-latency performance, will be sold as a service to enterprises that want that latency protection.

Such deals do not violate network neutrality rules, which do not apply to business services such as content delivery networks. So, ultimately, between encryption, network slicing, edge computing and CDNs, there might actually not be much of a market for consumer services featuring QoS.

Best-effort-only never has been part of the vision for next-generation networks, whatever might have been proposed for the public internet. According to the International Telecommunication Union, “a Next Generation Network (NGN) is a packet-based network able to provide services including Telecommunication Services and able to make use of multiple broadband, QoS-enabled transport technologies and in which service-related functions are independent from underlying transport-related technologies.”

Telecom Industry About to be Amazoned

Question: What happens to any market Amazon enters? Answer: public market valuations drop; market share shifts (or at least people expect that to happen with time).

Rhetorical question: What happens when Amazon enters the telecom business? And it will. Amazon is moving to commercialize its own fleet of nearly 4,000 low earth orbit satellites, to provide internet access to literally every square inch of the earth’s surface.

Amazon wants to be an ISP for the same reason Google and Facebook do, with one major difference: the revenue model. As any ad-supported app provider’s success hinges on the total number of people able to use the apps, so any commerce supplier’s fortunes rest on the number of consumers it can establish direct or indirect relationships with.

And Amazon believes it will do better when more connections can be made directly, without relying on the goodwill of governments or other private firms.

“Four billion new customers” is a big enough carrot to justify launching a constellation of nearly 4,000 low earth orbit satellites, making Amazon one of the world’s potentially biggest internet service provider firms.

It is easy to predict what the implications are for others in the connectivity services ecosystem.

The “Amazon effect” refers to the impact created by the online, e-commerce or digital marketplace on the traditional brick-and-mortar business model, due to the change in shopping patterns, customer expectations and a new competitive landscape (https://www.investopedia.com/terms/a/amazon-effect.asp).

Some note that Amazon’s business activities dampen inflation. That is another way of saying prices cannot rise much.

Surveys tend to show that contestants in any business believe entry by Amazon into their own markets affects gross revenue, profit margins and distribution channels.

To be sure, margin pressure already is a huge issue in the global telecom business for other reasons (competition, changes in end user demand). Amazon’s entry into one or more parts of the connectivity value chain will only worsen those pressures.

The big observation is that the telecom industry is about to be “Amazoned.”

Saturday, April 6, 2019

Will ITU Refarm All C-Band Spectrum for 5G?

At the World Radio Conference later this year (WRC-19), it is perhaps likely that the entire C-band spectrum presently used by satellite operators will be reallocated for 5G use.

That is some of the backdrop to current discussions by satellite, cable TV and other interests, including the Federal Communications Commission, about reallocating up to 500 MHz of spectrum presently allocated to C-band satellite operations for 5G, itself a component of the FCC’s overall 5G FAST plan.

“I believe the best option would be to pursue a proposal put forth by a large, ad hoc coalition of equipment manufacturers, wireless providers, and unlicensed users,” said FCC Commissioner Michael O'Rielly. “They recommend that the FCC allocate spectrum now used for satellite C-Band downlinks (3.7 to 4.2 GHz) for licensed mobile communications and designate 6 GHz spectrum (5.925 to 7.125), which includes the C-Band uplink, for unlicensed use.”

If approved, this approach would free up 1700 megahertz of spectrum, 500 megahertz for licensed and up to 1.2 gigahertz for unlicensed purposes.
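
The totals follow directly from the band edges in the quote above; a quick arithmetic check (nothing here beyond the figures already cited):

```python
# Bandwidth arithmetic using the band edges cited above (GHz).
licensed_ghz = 4.2 - 3.7          # C-band downlink proposed for licensed mobile
unlicensed_ghz = 7.125 - 5.925    # 6 GHz band proposed for unlicensed use

print(f"Licensed:   {licensed_ghz * 1000:.0f} MHz")     # ~500 MHz
print(f"Unlicensed: {unlicensed_ghz * 1000:.0f} MHz")   # ~1,200 MHz
print(f"Total:      {(licensed_ghz + unlicensed_ghz) * 1000:.0f} MHz")  # ~1,700 MHz
```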

Satellite and mobile interests always seem to be at odds over spectrum allocation, so the positions taken on the latest C-band efforts will not be unfamiliar.

A report by an advisory committee to the U.S. Secretary of Defense, co-written by Milo Medin, vice president of wireless at Google, and tech venture capitalist Gilman Louie, makes the point that such a development hinges on the use of spectrum sharing, as pioneered by the Citizens Broadband Radio Service.

The report recommends the “NTIA, FCC and Department of State should advocate the reallocation of the C-band satellite spectrum to IMT-2000 5G use at the World Radio Conference later this year (WRC-19), and take measures to adopt sharing in all 500 MHz of the band in the United States on an accelerated basis for fixed operations.”
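
CBRS-style sharing rests on tiered access: incumbents (such as federal radar systems) first, Priority Access License (PAL) holders second, General Authorized Access (GAA) users last, with a Spectrum Access System coordinating in practice. The sketch below reduces that idea to a toy priority check; the function and tier names are illustrative, not the actual SAS rules.

```python
# Toy illustration of CBRS-style tiered spectrum sharing. Real coordination
# is handled by a Spectrum Access System (SAS); this reduces the idea to a
# simple priority check with illustrative names.
TIER_PRIORITY = {"incumbent": 0, "PAL": 1, "GAA": 2}  # lower number = higher priority

def may_transmit(requesting_tier: str, active_tiers: set) -> bool:
    """The requester may use the channel only if no higher-priority tier is active."""
    rank = TIER_PRIORITY[requesting_tier]
    return all(TIER_PRIORITY[t] >= rank for t in active_tiers if t != requesting_tier)

# A GAA user must vacate when an incumbent (e.g., naval radar) becomes active.
print(may_transmit("GAA", {"GAA"}))               # True: nothing higher is active
print(may_transmit("GAA", {"incumbent", "GAA"}))  # False: the incumbent has priority
print(may_transmit("PAL", {"GAA", "PAL"}))        # True: GAA ranks below PAL
```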

A shift of former C-band satellite spectrum in the 4-GHz region might also be more important than some believe, if global 5G supply chains and service providers build product volume in the 3-GHz to 4-GHz frequency ranges.

“In the near term, 3 and 4 GHz spectrum will likely serve as the dominant global bands that drive volume in infrastructure and device deployments,” the authors argue.

And that provides some idea of the importance of how the Federal Communications Commission sets policy for refarming as much as 500 MHz of C-band spectrum in the United States, spectrum that falls within the crucial band the authors say will be an area of robust supply chain focus.

Friday, April 5, 2019

Will 6G be Based on Frequencies at and Above 95 GHz?

Some already speculate, based at least in part on actions by the U.S. Federal Communications Commission, that future mobile platforms, including 6G, will use almost-impossibly-high frequencies in the bands above 95 GHz. What applications could develop based on such frequencies is not yet known; something beyond artificial reality, augmented reality and artificially intelligent apps is where we would be looking.

What Digital Transformation Really Means

Digital transformation is one of those somewhat-nebulous terms one hears all the time when it comes to what enterprises need to do to survive and thrive in their future markets. One hears all sorts of near-platitudes about how companies must now be continuously reinventing their business processes.

The not-as-often mentioned reason for all this “digital reinvention” is that firms and industries must adapt to an era of direct-to-consumer business models that disrupt or destroy traditional distribution channels.

Digital connections will “cut out the middleman” while “manufacturers will sell directly to customers,” Forrester Research researchers say. All of that means “changing the economics of selling, service, and fulfillment.”

In other words, the carrot is better performance in a direct-to-consumer world. The stick is business disruption and loss of markets.

The typical way this is stated is that firms must create the ability to deliver easy, effective and emotional customer experiences that customers value. Many would say the winners work from the customer’s perspective, not the organization’s view. That is almost too lyrical.

Digital transformation is much more raw: a response to more-difficult markets characterized by growing transparency of supply and prices, which combine to attack profit margins.

In other words, though we often think of digital transformation as something firms need to do--and though that is an apt characterization--digital transformation speaks to the “Amazoning” or “Alibaba-ing” of virtually all markets.

“Using hardware, software, algorithms, and the internet, it's 10 times cheaper and faster to engage customers, create offerings, harness partners, and operate your business,” say researchers at Forrester Research.

The ability to create and support digital marketplaces is one angle. But it is more than widespread “online commerce.” Nor is it just the ability to create digital products and services.

“You want to be customer obsessed, not competitor obsessed,” Forrester researchers say.

All true. But what is really happening is a drastic altering of the balance of power between customers and companies.

And that means lower revenue and lower profit margins as transparency destroys pricing premiums created by consumer lack of knowledge or accessibility.

Pricing above what an efficient market would allow becomes nearly impossible.

So, ignoring all the fancy phrasing, digital transformation aims to prepare firms for a direct-to-consumer world, with all that implies for the ways services and products are created, marketed, sold and supported.

Thursday, April 4, 2019

Will AR and VR Finally Make the Net Neutrality Debate Superfluous?

“Network neutrality” rules never have been designed to prevent business services from providing different levels of service, such as prioritized delivery or quality of service. That is precisely why content delivery networks add value.

The issue is whether rules mandating nothing other than “best effort” internet access for consumers actually are good policy, going forward, if one assumes that any number of new apps and services based on augmented reality or virtual reality are going to be important for consumers some day.

With the caveat that there is much nonsense in the arguments made in favor of network neutrality rules (“save the internet” being among the most obvious examples), it seems clear that if VR and AR require stringent control of latency, then forbidding anything other than best-effort internet access is going to be an obstacle to AR and VR apps and services.

For gaming apps, a human requires 13 milliseconds or more to detect an event. A motor response by a gamer might add 100 ms of latency, just to react. But then consider artificial reality or augmented reality use cases.

To be nearly indistinguishable from reality, one expert says, a VR system should ideally have a delay of seven milliseconds to 15 milliseconds between the time a player moves their head and the time the player sees a new, corrected view of the scene.

The Oculus Rift can achieve latency of about 30 ms or 40 ms under perfectly optimized conditions, according to Palmer Luckey.

There also are other latency issues, such as display latency. A mobile phone, for example, might add 40 ms to 50 ms to render content on the screen. Any display device is going to add about that much latency, in all likelihood.
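
Putting those pieces together, a rough motion-to-photon tally (every value below is an assumption for illustration, not a measurement) shows why so little of a tight budget is left for the network leg:

```python
# Rough, illustrative motion-to-photon latency tally for a VR frame.
# All figures are assumptions for illustration only; the point is that the
# render and display legs consume most of any tight latency budget, leaving
# the network leg very little room.
ideal_ms = (7, 15)   # the ideal range cited above

components_ms = {
    "head tracking / sensor fusion": 2,
    "render + encode": 10,
    "network (radio + transport to compute)": 5,  # the leg 5G and edge caching must protect
    "display refresh + scan-out": 11,             # roughly one frame at 90 Hz
}

total = sum(components_ms.values())
for name, ms in components_ms.items():
    print(f"  {name}: {ms} ms")
print(f"Total: {total} ms, vs. an ideal of {ideal_ms[0]}-{ideal_ms[1]} ms")
```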

The point is that end-to-end latency is an issue for VR apps, and edge computing helps address a potentially-important part of that latency.  

To have any hope of reducing latency to tolerances imposed by the displays themselves, VR and AR content will have to have extensive forms of quality of service guarantees, almost certainly by caching content at the very edges of the network, and using networks such as 5G with very low latency.

To be sure, it is not clear that anything beyond 5G best-effort latency is a problem, if the edge data centers are close enough to the radio sites. On the other hand, neither is it obvious that an edge services provider can be legally barred from charging for the use of what is, in effect, a next-generation content delivery network.
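
A back-of-the-envelope check suggests what “close enough” means in kilometers. Light in fiber covers roughly 200 km per millisecond (a common rule of thumb); the distances below are illustrative, not tied to any actual deployment:

```python
# Back-of-the-envelope one-way propagation delay over fiber.
# Rule of thumb: light in fiber covers roughly 200 km per millisecond.
FIBER_KM_PER_MS = 200.0

def one_way_delay_ms(distance_km: float) -> float:
    return distance_km / FIBER_KM_PER_MS

# Edge site near the radio, metro data center, distant regional data center.
for km in (10, 100, 1000):
    print(f"{km:>5} km -> ~{one_way_delay_ms(km):.2f} ms one way")
```

At 1,000 km, propagation alone consumes several milliseconds each way, before any processing, which is why caching and compute close to the radio site matter for tight latency budgets.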

And that might ultimately be the practical resolution of the “best effort only” conundrum. Perhaps standard best-effort delivery on a 5G network is good enough to support VR and AR, so long as content is edge cached. So there are no fast lanes or slow lanes: all lanes are fast.

On the other hand, edge computing services can charge a market rate for use of their edge computing networks.

Net AI Sustainability Footprint Might be Lower, Even if Data Center Footprint is Higher

Nobody knows yet whether higher energy consumption to support artificial intelligence compute operations will ultimately be offset by lower ...