Friday, April 5, 2019

Will 6G be Based on Frequencies at and Above 95 GHz?

Some already speculate, based at least in part on actions by the U.S. Federal Communications Commission, that future mobile platforms, including 6G, will use almost-impossibly-high frequencies in the bands above 95 GHz. What applications could develop on such frequencies is as yet unknown. Something beyond virtual reality, augmented reality and artificially-intelligent apps is where we'd be looking.

What Digital Transformation Really Means

Digital transformation is one of those somewhat-nebulous terms one hears all the time when it comes to what enterprises need to do to survive and thrive in their future markets. One hears all sorts of near-platitudes about how companies must now be continuously reinventing their business processes.

The not-as-often mentioned reason for all this “digital reinvention” is that firms and industries must adapt to an era of direct-to-consumer business models that disrupt or destroy traditional distribution channels.

Digital connections will “cut out the middleman” while “manufacturers will sell directly to customers,” researchers at Forrester Research say. All of that means “changing the economics of selling, service, and fulfillment.”

In other words, the carrot is better performance in a direct-to-consumer world. The stick is business disruption and loss of markets.

The typical way this is stated is that firms must create the ability to deliver easy, effective and emotionally engaging experiences that customers value. Many would say the winners work from the customer’s perspective, not the organization’s. That is almost too lyrical.

Digital transformation is much more raw: a response to more-difficult markets, characterized by growing transparency of supply and prices, which combine to attack profit margins.

In other words, though we often think of digital transformation as something firms need to do--and though that is an apt characterization--digital transformation speaks to the “Amazoning” or “Alibaba-ing” of virtually all markets.

“Using hardware, software, algorithms, and the internet, it's 10 times cheaper and faster to engage customers, create offerings, harness partners, and operate your business,” say researchers at Forrester Research.

The ability to create and support digital marketplaces is one angle. But it is more than widespread “online commerce.” Nor is it just the ability to create digital products and services.

“You want to be customer obsessed, not competitor obsessed,” Forrester researchers say.

All true. But what is really happening is a drastic altering of the balance of power between customers and companies.

And that means lower revenue and lower profit margins, as transparency destroys pricing premiums created by consumers’ lack of knowledge or access.

Inefficient pricing becomes nearly impossible.

So, stripped of the fancy language, digital transformation aims to prepare firms for a direct-to-consumer world, with all that implies for the ways services and products are created, marketed, sold and supported.

Thursday, April 4, 2019

Will AR and VR Finally Make the Net Neutrality Debate Superfluous?

“Network neutrality” rules never have been designed to prevent business services from offering different levels of service, such as prioritized delivery or quality of service. That is precisely why content delivery networks add value.

The issue is whether rules mandating nothing other than “best effort” internet access for consumers actually are good policy, going forward, if one assumes that any number of new apps and services, based on augmented reality or virtual reality, are going to be important some day for consumers.

With the caveat that there is much nonsense in the arguments made in favor of network neutrality rules--“save the internet” being among the most obvious examples--it seems clear that if VR and AR require stringent control of latency, forbidding anything other than best-effort internet access is going to be an obstacle to AR and VR apps and services.

For gaming apps, a human requires 13 milliseconds or more to detect an event. A motor response by a gamer might add 100 ms of latency, just to react. But then consider virtual reality or augmented reality use cases.

To be nearly indistinguishable from reality, one expert says a VR system should ideally have a delay of seven milliseconds to 15 ms between the time a player moves their head and the time the player sees a new, corrected view of the scene.

The Oculus Rift can achieve latency of about 30 ms or 40 ms under perfectly optimized conditions, according to Palmer Luckey.

There also are other latency issues, such as display latency. A mobile phone, for example, might add 40 ms to 50 ms to render content on the screen. Any display device is going to add about that much latency, in all likelihood.

The point is that end-to-end latency is an issue for VR apps, and edge computing helps address a potentially important part of that latency.
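A back-of-the-envelope sketch of that budget arithmetic, using the figures cited above, makes the point concrete. The 100 ms "tolerable" target below is an assumption for illustration, not a cited number:

```python
# Rough motion-to-photon latency budget for a VR session, using the
# figures cited in this post; the 100 ms target is an assumption.

display_ms = 45        # mobile display rendering: ~40 ms to 50 ms, per above
headset_ms = 35        # tracking and compute: ~30 ms to 40 ms (Oculus Rift)
ideal_target_ms = 15   # the 7 ms to 15 ms "indistinguishable" ideal above
loose_target_ms = 100  # assumed, looser "tolerable" target

# Against the ideal target, the display alone blows the budget:
print(ideal_target_ms - display_ms - headset_ms)  # -65: nothing left for the network

# Against the looser target, the network gets whatever remains:
network_ms = loose_target_ms - display_ms - headset_ms
print(network_ms)  # 20 ms, round trip

# Light in fiber covers roughly 200 km per millisecond, so a 20 ms round
# trip caps the server at about 2,000 km, before any queuing, processing
# or radio-link delay is counted. Hence the case for edge caching.
print(network_ms / 2 * 200)  # 2000.0 km, one way
```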

To have any hope of reducing latency to the tolerances imposed by the displays themselves, VR and AR content will require extensive quality-of-service guarantees, almost certainly by caching content at the very edges of the network and by using networks, such as 5G, with very low latency.

To be sure, it is not clear that best-effort 5G latency is a problem, so long as the edge data centers are close enough to the radio sites. On the other hand, neither is it obvious that an edge services provider can be legally barred from charging for the use of what amounts to a next-generation content delivery network.

And that might ultimately be the practical resolution of the “best effort only” conundrum. Perhaps standard best-effort delivery on a 5G network is good enough to support VR and AR, so long as content is edge cached. So there are no fast lanes or slow lanes: all lanes are fast.

On the other hand, edge computing services can charge a market rate for use of their edge computing networks.

Will Amazon Get into the LEO Constellation Business?

At a tactical level, connectivity service providers have to define their competition as “other connectivity providers.” When assessing market share, they benchmark against other firms selling the same products.

At a strategic level, few really believe their main competition is “other connectivity firms,” anymore. Instead, the key strategic challengers are big application and platform suppliers who supply the end user value enabled by networks.

But sometimes the strategic foes become tactical realities. So it is that Amazon plans to launch a constellation of low-earth-orbit satellites, which would instantly make Amazon a global internet service provider.

Project Kuiper is the code name for a “big, audacious space project” involving satellites and space-based systems. Filings made with the International Telecommunication Union provide some outlines.

The proposed constellation will feature 3,236 satellites in low Earth orbit — including 784 satellites at an altitude of 367 miles (590 kilometers); 1,296 satellites at a height of 379 miles (610 kilometers); and 1,156 satellites in 391-mile (630-kilometer) orbits.
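Those filing figures are internally consistent; a trivial check of the shell sums and the mile-to-kilometer conversions:

```python
# Kuiper shells as described in the ITU filings cited above:
# (satellite count, altitude in miles)
shells = [(784, 367), (1296, 379), (1156, 391)]

print(sum(count for count, _ in shells))  # 3236, matching the stated total

for _, miles in shells:
    # roughly the filed 590 km, 610 km and 630 km altitudes
    print(f"{miles} miles is about {miles * 1.609:.0f} km")
```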

One might question whether all the major LEO constellations will be sustainable. So far, the big ventures include SpaceX, OneWeb, Telesat and Leosat, though a few other firms, including Boeing and Facebook, also have been talking about the possibilities.

Virtually nobody thinks the market is big enough to support all those ventures. So consolidation will happen, and some of the ventures will not launch.

Still, the point is that sometimes a strategic rival (a firm generating lots of value and revenue in the ecosystem) also becomes a very tactical rival (a firm actually competing in connectivity services).

The emergence of such competition is one reason long-term revenue and profit margin prospects for connectivity providers are decaying.

Some estimate there could be a radical 85 percent reduction in tier-one and tier-two firms globally.

Consolidation lies ahead for most telcos in Asia, according to J.P. Morgan, and possibly for most telcos globally, according to Nokia Bell Labs.

Where there now are 810 telecom service providers, there will be but 105 by 2025, says Bell Labs. That consolidation of about 87 percent in seven or eight years would be beyond comprehension, for most of us, and would be an apocalypse for most in the industry.
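The arithmetic behind that consolidation figure is simple enough to check:

```python
# Bell Labs figures cited above: 810 service providers now, 105 by 2025
now, later = 810, 105
print(f"{(now - later) / now:.0%} reduction")  # 87% reduction
```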

Capgemini foresees an era of massive consolidation on a “spectacular” level. New competition from one or more global LEO constellations is just part of that basic story.

Winning the "5G Race" Will Not Matter, Long Term

5G often is positioned as a “race,” with winners and losers. That way of looking at information and communications technology has been repeated, over the last several decades, about any number of consumer-facing innovations.

The “race” metaphor is no more relevant today than it was then.

It was probably inevitable that some would claim the United States is falling behind in the “race” to 5G.

After all, it has been argued that the United States was behind, or falling behind, in the use of mobile phones, then smartphones; in text messaging, broadband coverage, fiber to the home, broadband speed or broadband price.

Some even have argued the United States was falling behind in spectrum auctions. All of those prior characterizations have proven temporary, or wrong. What such observations often miss is a highly dynamic environment, in which apparent gaps quickly are closed.

And even when national comparisons are made, there often is not a terribly good correlation between high rankings in the use of a technology and the ability to produce value, at scale, from such adoption.

National rankings of adoption of any access technology are likely to prove ephemeral. And even when not ephemeral, there is not a very good correlation between supply, adoption and economic value.

Consider voice adoption, where the best the United States ever ranked was about 15th, among nations of the world, for teledensity.

For the most part, nobody really seemed to think that ranking, rather than a higher one, was a big problem, for several reasons. Coverage always is tougher for continents than for city-states or small countries. Also, coverage always is easier for dense urban areas than for rural areas. The United States, like some other countries (Canada, Australia, Russia), has vast areas of low population density where infrastructure is very costly.

On virtually any measure of service adoption (voice or fixed network broadband, for example), it will be difficult for a continent-sized market, with huge rural areas and lower density, to reach the very-highest ranks of coverage.

For such reasons, no continent-sized country with vast interior and sparsely-settled areas will reach the top of any list of countries with the fastest speeds. Nor is it ever easy to “know” when speeds, prices or availability are a “problem.” Disparities between rural and urban areas almost always are viewed as a problem.

Prices are harder to characterize. When all countries are compared, such prices must be adjusted for purchasing power. In other words, price as a percentage of income provides a better measure of price. In developed markets, for example, internet access costs about 1.7 percent of per-person gross national income.
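A minimal sketch of that affordability arithmetic, assuming a hypothetical income figure (only the 1.7 percent share comes from the text above):

```python
# Price as a share of income. Only the ~1.7 percent developed-market
# share comes from the text above; the income figure is hypothetical.
gni_per_capita = 45_000  # assumed annual per-person GNI, in U.S. dollars
share = 0.017            # about 1.7 percent of income

annual = gni_per_capita * share
print(f"${annual:,.0f} per year, or about ${annual / 12:,.0f} per month")
# -> $765 per year, or about $64 per month, in this example
```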

The International Telecommunication Union has argued that U.S. fixed network internet access prices are among the lowest globally. Mobile internet access provides another view: in perhaps a hundred countries, mobile internet access already costs less than fixed access.

According to the latest survey by Cable, the United States ranks 20th globally for average internet access speed. As always, a significant number of the countries with the highest speeds are small.

Just what winning or losing the 5G race could mean is not simple. Some people think “winning” is a matter of which countries deploy and obtain high adoption first. Others would argue that access does not matter as much as the ability to innovate and create in terms of connected business models, apps, services and processes.

On that score, very few observers would challenge the claim that at-scale innovation leadership in the next phase of applications and technology development is going to happen in China and the United States. That is not to discount the formidable work going on in Israel, Japan or South Korea. It is simply to note that few would doubt progress at global scale will happen either in the United States or in China.

What matters with 5G is not the speed or ubiquity of supply or demand, though that is not immaterial. Instead, what always matters is the ability of people, firms and nations to harness those innovations for economic growth.

Wednesday, April 3, 2019

Edge Computing Facilities Look Different, Depending on Where the "Edge" Is Located

Where is the edge? All over the place, one might say, based on the perceptions of various participants in the content and data ecosystem.

"Web scale players think the edge means regional colocation data centers, hundreds or even thousands of kilometers away from the user,” says Joe Madden, Mobile Experts principal analyst.

Mobile operators see the edge as a location between 2 km and 100 km away from any end user.

REITs and micro data center supporters see the edge close to the radio towers, less than 5 km from most users. Enterprises think the edge needs to be in-building or at the client device.

All those definitions might have relevance for different edge use cases.
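One way to see why those definitions matter is to sketch the fiber propagation delay at each distance, assuming light in fiber covers roughly 200 km per millisecond (the distances are the ones quoted above; queuing, processing and the radio link add more delay on top):

```python
# Round-trip fiber propagation delay for each definition of the "edge"
# quoted above. Assumes ~200 km per millisecond in fiber; queuing,
# processing and radio-link delay come on top of this floor.
FIBER_KM_PER_MS = 200.0

edges = {
    "Regional colo (web-scale view)": 1000,  # hundreds to thousands of km
    "Mobile operator edge": 100,             # 2 km to 100 km
    "Tower-site micro data center": 5,       # less than 5 km
    "On-premises or device": 0.1,            # in-building
}

for name, km in edges.items():
    print(f"{name:32s} ~{2 * km / FIBER_KM_PER_MS:.3f} ms round trip")
```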

And that also means edge computing real estate can have very different footprints. An infrastructure edge data center at the base of a cell tower might look like a Vapor.io deployment.

At other locations, where computing demand is expected to be higher, one might see more than one cargo-container-sized physical building.

Enterprise sites might use a relatively standard cabinet with server racks on the premises.

Much hinges on the business model and the use case. Advertising apps might well be fine with a regional approach, as latency is not much of an issue for many advertising requirements. Infrastructure edge will make sense for latency-sensitive apps outside of main enterprise locations.

Large enterprises often will be able to use on-the-premises edge computing. Right now, it is impossible to quantify the size of revenue opportunities for edge computing “as a service” providers. Lots of servers will be installed, of course. Lots of electrical power will be used. New structures, with racks, air conditioning and electrical supply, will be needed.

So the picks and shovels will be busy. Revenue for edge computing services will take a longer time to build.

Some Possibly Good News for the Communications Industry

As far as analyses go, this positioning of the communications business by consultants at Accenture is somewhat reassuring. Compared with some other industries, including the postal service, energy and transportation, the communications business is less volatile.

Communications also is more durable and less vulnerable. On the other hand, it is not among the most viable, most durable of industries either, such as software and high tech.

The point, Accenture argues, is that connectivity providers need to move beyond connectivity towards orchestration (up the stack). For the past decade or so, consultants and analysts have been telling retail-focused connectivity providers they had to choose: either be an efficient bit pipe or become a value-added solutions provider.

This latest argument by Accenture consultants is in that same vein. There are, fundamentally, not too many choices. Service providers can “stick to their knitting” and become more efficient providers of voice and data services. Or they can move up the stack into applications.

One big question, though, is what that means. Some, including Accenture, seem to argue for becoming a platform, more than becoming a solution provider. “Simply orchestrating vendors in a connectivity network will not add value—moving to a more open and different platform-based play, based on a modern IT architecture, will be the key to enabling growth,” they say.

It seems obvious that managerial skill, and a bit of luck, will be required over the next decade as the industry continues to morph into something different.

DIY and Licensed GenAI Patterns Will Continue

As always with software, firms are going to opt for a mix of "do it yourself" owned technology and licensed third-party offerings....