Wednesday, August 19, 2020

Why Some Execs Do Not See 5G as a Fix for 4G, or Wi-Fi 6 as a Fix for Wi-Fi

Deloitte recently conducted a survey of 415 executives deploying either 5G or Wi-Fi 6, to find out “why are you doing it?” Some 57 percent of respondents report that their organization is currently in the process of adopting 5G and/or Wi-Fi 6, while another 37 percent plan to adopt these technologies within the next year.

Right now, as you would suspect, 4G and Wi-Fi 5 are the mainstays. But respondents expect 5G and Wi-Fi 6 to be mainstays in three years. 


source: Deloitte


Those executives view 5G and Wi-Fi 6 as a force multiplier for other innovative technologies including AI, IoT, cloud, and edge computing. Indeed, 95 percent of respondents believe 5G and Wi-Fi 6 will be important to unleash the value of cloud computing. 


About 83 percent of respondents believe wireless will enable the internet of things, the same percentage that believe edge computing will rely on advanced wireless. 


About 91 percent believe analytics for big data also depend on advanced wireless. The vast majority of enterprises surveyed say they are targeting a blend of scenarios with their adoption of advanced wireless networks.


source: Deloitte


Respondents expect both indoor and outdoor usage, on both stationary and mobile devices, connecting employees, machines, and customers. Employee use cases include workplace communications such as messaging and file sharing; device management; collaboration (video, augmented reality, virtual reality, remote workplaces); analytics; and virtual network support. 


Machine support for sensors and analytics for machine-generated data also will be key. Autonomous vehicles, robots, unmanned aerial vehicles or delivery vehicles are other machine networking use cases. Asset tracking, safety and assembly processes also are expected to be enabled by 5G and Wi-Fi 6. 


Customer behavior analytics (shopping, buying, price trends, recommendations, location-based apps); security and fraud prevention (biometrics, location checking and blockchain); asset tracking; enhanced customer experience and supply chain efficiencies are expected. 


source: Deloitte


“5G is not just a faster and more reliable access technology, but also the genesis of a new communications network architecture,” Deloitte argues. 


What you might find surprising is that 5G and Wi-Fi 6 are not said to be important because the current networks are failing or troublesome. 


More than 80 percent of respondents are “satisfied” or “extremely satisfied” with a range of traditional performance characteristics of their current wireless networks.


Likewise, 80 percent of respondents are “satisfied” or “extremely satisfied” with the security of their networks and data, ability to control and customize their networks, interoperability, scalability, technology maturity, and ease of deployment. 


Nor is network age an issue. Some 75 percent of respondents say their networks are less than three years old. 


Instead, they are hoping to “unlock competitive advantage and create new avenues for innovation in their operations and offerings.” About 57 percent of respondents believe their company’s current networking infrastructure prevents them from addressing the innovative use cases. 


About 87 percent believe their company can create a significant competitive advantage by leveraging advanced wireless technologies.


It perhaps is mildly surprising that so many enterprises envision investments in 5G and Wi-Fi 6 at a time when the current networks actually are working well and are newly deployed. 


To be certain, people and organizations buy “solutions to problems” expressed in concrete software, hardware and connectivity products. In this case, there are no apparent failures to counteract. 


The new investments are not being driven by performance issues, coverage, reliability or other network shortcomings, but by hoped-for business advantages the existing networks cannot support.


Sunday, August 16, 2020

Utility Regulation of Broadband?

It never is entirely clear to me what proponents of regulating broadband “as a utility” have in mind. You might recall that we once regulated telecommunications as a “utility,” with limited market entry and price controls. Over a period of decades, starting in the mid-1980s, U.S. regulators slowly began to loosen those regulations, which originally were put into place because telecom was seen as a “natural monopoly.”

Natural monopolies, it is argued, must be regulated because only one supplier can exist. In such cases, market competition cannot act to restrain predatory behavior. But there is no such consensus anymore. Mobile and fixed communications markets have been proven not to be natural monopolies, at least in the U.S. market. 


As often is the case, good intentions can be thwarted by inappropriate policies that actually create the opposite of intended benefits. You might recall that under monopoly regulation, business communication prices were quite high, to subsidize consumer services, which were moderately priced if not characterized by innovation and creativity. 


Prices fell, and usage rose as competition was introduced for long distance services, even before passage of the Telecommunications Act of 1996, which substantially deregulated the fixed network business. A look at AT&T revenues between 2000 and 2013 illustrates the point. 


Revenues from the deregulated fixed network business dropped about 50 percent. Mobility nearly tripled. Cash flow from fixed network operations was slashed by nearly two thirds. Mobility, historically unregulated, boomed and prices fell. As always, the changes have many drivers. Demand changed as consumers preferred new services. 


source: Deloitte


The same happened in other markets, as deregulation led to lower prices, more innovation and much higher usage, with a huge amount of new investment. Global prices have fallen because of competition. 


To be sure, some prices--such as for consumer fixed network voice service--have risen. That is because the actual cost of service cannot be subsidized any longer by profits from long distance service. That being the case, retail prices must reflect actual costs. 


What is never clear to me is why some regulators and policy advocates think matters would be better if we reversed course and returned to monopoly regulation of fixed network services. That would doom a business with declining revenues and slim to no profits to further decline, were prices to be regulated. 


Unable to raise prices, ISPs would logically allow service quality to degrade, reduce costs, continue to downsize employment and slice investment, as profits would be very difficult to earn.


Has Pandemic Really Slowed 5G?

There is a tendency to impute causation wherever there is correlation, and to assume permanent changes are caused by big--yet transitory--phenomena. We never act as though any single volcanic eruption or hurricane will “forever” change business and life in the affected area. Rather, our assumption is that life will return to normal over a period of months to years.

And yet it is most common to hear arguments that global life and business will never be the same after the Covid-19 pandemic, even as life already is returning to normal levels and behavior in many countries that are further along the recovery curve. 


An analysis of the way 5G is being used to ameliorate pandemic problems might be interpreted as conventional wisdom suggests, namely that the pandemic has slowed down all economic activity and 5G roll outs. 


In fact, the report suggests slowdowns and accelerations both have happened, with the World Economic Forum suggesting that fixed wireless efforts have accelerated. One might have made that case before the pandemic, though. 


source: Maximize Market Research


Likewise, some infer and believe that bandwidth consumption patterns are permanently altered by the pandemic. That might be the case, but not for the “because of the pandemic” reason often cited. Every next-generation mobile platform since 2G has resulted in higher mobile data consumption. 


So we should not be surprised to hear that per-user mobile data consumption has increased 300 percent since 5G was commercially launched in South Korea. That is what we should expect. It has almost nothing to do with permanent changes directly caused by the pandemic, though people forced to stay at home from work and school have boosted their video streaming hours. 


That will change as they go back to work and school. 


It is likely more accurate to say that the pandemic and forced stay-at-home rules accelerated some already-occurring changes, ranging from the shift from linear TV to video streaming, to more gaming, more work from home and more online shopping. Pushing volume “up and to the right” is a permanent change, to be sure, but not a new trend; it is simply an acceleration of what had already been happening. 


We might ultimately be surprised that many predicted permanent changes did not happen in a way we will be able to capture quantitatively. Though the Great Recession of 2008 caused a massive change in economic activity, it was not “permanent.” Activity more than rebounded. The internet bubble burst of 2001 caused massive asset value changes. But valuations of new and surviving firms rebounded. Smooth out the data by looking at a decade or two of data and one can detect no permanent change. 
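As a minimal sketch of that smoothing argument (the annual figures below are synthetic, chosen only to illustrate the effect, not actual economic data), a trailing multi-year average shows a transient shock as little more than a brief flattening in an otherwise rising series:

```python
# Synthetic annual activity index with a one-year shock in year five;
# made-up numbers for illustration only, not real economic data.
activity = [100, 103, 106, 109, 85, 95, 112, 115, 118, 121]

def trailing_average(values, window):
    """Trailing moving average; early points use whatever history exists."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        out.append(round(sum(chunk) / len(chunk), 1))
    return out

print(trailing_average(activity, 5))
# The smoothed series still trends steadily upward; the transient dip
# shows up only as a brief flattening, not a permanent change in level.
```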


Neither 5G nor other ongoing trends will be immune from that reversion to the mean.


Saturday, August 15, 2020

CBRS is "Mobile Plus" Spectrum

As U.S. auctions of 3.5-GHz Citizens Broadband Radio Service spectrum have gone through 47 rounds of bidding, we now have a better idea of how buyers value priority access licenses. To wit, those PAL licenses already are valued very close to 2-GHz mobile spectrum, and the auctions are not over yet.

At a high level, that suggests buyers see CBRS as having as much value as 2-GHz mobile spectrum. And, as was the case for the early 2-GHz spectrum awards, that is the ticket to market entry on a facilities basis for new competitors. Both Sprint and what became T-Mobile US were launched on the basis of new 2-GHz spectrum allotments. 
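As a rough sketch of how such valuation comparisons are commonly made (the figures below are hypothetical placeholders, not actual Auction 105 results), license prices can be normalized as dollars per MHz-POP: the price paid divided by the licensed bandwidth times the population covered.

```python
def price_per_mhz_pop(price_usd, bandwidth_mhz, population):
    """Normalize a spectrum license price as dollars per MHz-POP.

    A common yardstick for comparing licenses of different bandwidths
    and coverage areas; the example inputs below are illustrative only.
    """
    return price_usd / (bandwidth_mhz * population)

# Hypothetical example: a 10 MHz county-level PAL covering 500,000 people
# that sold for $1,000,000 works out to $0.20 per MHz-POP, a figure that
# could then be set alongside historical 2-GHz award prices.
print(price_per_mhz_pop(1_000_000, 10, 500_000))  # 0.2
```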

CBRS also will be supported by best effort spectrum access without a license, on the pattern of Wi-Fi. How big a revenue opportunity that activity might create is another question. Some internet service providers will undoubtedly explore the use of unlicensed CBRS to support rural internet access. That is something many wireless ISPs have done before.

But CBRS is expected to support at least some private networks as well. Such private networks offer value to users and operators, but often no direct revenue upside for the network operator.

Of course, there are many in the value chain who are not in the "connectivity as a service" role. All such private networks will create demand for infrastructure, maintenance, upgrades, design, perhaps connectivity, and other services supporting other parts of the ecosystem. Consider radio infrastructure.

A 2019 report on the indoor market opportunity for CBRS, from Maravedis and EJL Research, predicted that the CBRS radio node market will grow from revenues of about $3 million to $100 million by 2024, “driven primarily by private LTE deployments.” Keep in mind that is a prediction about mobile-type radio infrastructure used to support indoor communications. 


Other estimates of CBRS radio capex illustrate the fact that--interesting as it is for some parts of the ecosystem--CBRS represents a very small percentage of total mobile radio spending. According to Mobile Experts, CBRS radio infrastructure spending will not hit $1 billion annually for five years or so. 


That makes CBRS an interesting and important market for some, not for all, on a global level. 


source: Dell'Oro Group


Other parts of the CBRS value chain represent various amounts of new sales activity as well, but it might be fair to note that many opportunities which are transformative or important for some suppliers (access system administrators, infrastructure, software, integration and consulting) might not prove especially large for connectivity service providers.


The value of CBRS obviously is a non-zero number, though, and the value often will come in the form of avoided cost, not direct incremental revenue. Cable TV operators are expected to benefit primarily from avoided mobile wholesale capacity costs.


Some internet service providers, though, may be able to use CBRS to support their internet access businesses, using CBRS for fixed wireless access. System integrators, network designers and consultants might in some cases see meaningful revenue upside as well. 


Of course, not all CBRS spectrum use will rely on PAL licenses. As with Wi-Fi, CBRS also includes “best effort” access without a license. For some use cases, best effort access might be sufficient, especially for many private networks. 


Also possible are many collaborative ventures where a PAL license holder might be willing to allow use of its license for a big private network, in some business arrangement. That might be interesting for large areas such as port facilities where a mobile operator can expect little financial return for providing direct service. 


In some cases, apps or use cases might benefit from licensed access, with less risk of signal interference and therefore more predictable performance. Collaborative networks (private on the premises but connected to the public network; private networks with rights to use PAL on the site; private networks built and maintained by a public provider) might make sense in those cases. 


The point is that the new commercial value of CBRS networks will have a wider range of value drivers than has been typical for mobile spectrum.


Friday, August 14, 2020

Second Law of Motion, Second Law of Thermodynamics Provide Key Business Analogies

Newton's second law of motion explains that the acceleration of an object as produced by a net force is directly proportional to the magnitude of the net force, in the same direction as the net force, and inversely proportional to the mass of the object. 


A corollary of sorts is that “friction” is an uncorrelated or unbalanced force. In other words, friction always acts in the direction opposing motion. 


If friction is present, it counteracts and cancels some of the force causing the motion when any object is being accelerated. That means a reduced net force and a smaller acceleration. 
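In symbols, for the simple case of a single applied force opposed by friction (the standard textbook statement, not anything specific to the business analogy that follows):

```latex
F_{\text{net}} \;=\; F_a - F_{\text{friction}} \;=\; m\,a
\qquad\Longrightarrow\qquad
a \;=\; \frac{F_a - F_{\text{friction}}}{m}
```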


In this illustration, Fa is the applied force, which faces uncorrelated forces, including gravity and friction, that resist it. 


source


And that is a similar concept to the idea of friction in business and life. No matter what resources are mobilized in pursuit of some objective, friction will reduce yield, effectiveness and impact. All organizational effort therefore must overcome friction. That is true for capital investment, competition, operating procedures, product development and production, distribution channels and customer support and service, plus marketing and legal or regulatory tasks. 


Virtually every business activity therefore involves some element of overcoming friction, which is the resistance to desired change or outcomes. 


Just as energy conversion is somewhat inefficient, producing unwanted heat in addition to the desired energy output, so friction prevents the full and complete application of resources to any process. 


Friction is akin to the Second Law of Thermodynamics, which states that disorder (entropy) increases over time. In other words, order proceeds to disorder. Much organizational effort therefore must be devoted to preventing entropy (decay). 
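Stated formally (again, the standard textbook form rather than anything specific to the analogy), the entropy of an isolated system never decreases:

```latex
\Delta S \;\ge\; 0 \qquad \text{for any spontaneous process in an isolated system}
```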


That is why creating a more frictionless business is an unstated objective of virtually all organizational activities.


Newtonian Technology Trends Post-Covid

Twenty years ago, futurist John Naisbitt wrote High Tech/High Touch, an examination of technology and a follow-on to his 1982 book Megatrends. It was Megatrends which predicted that people immersed in technology would be driven to seek human contact. 


High Tech/High Touch essentially concluded that the trend remains intact, shown in the prominence of both consumer technology markets and products, services and markets that offer escape from technology.


As the Covid-19 pandemic wears on, increasing our reliance on technology and restricting human contact, Naisbitt’s observations still hold. The more we are forced to use technology, the more actual “high touch” will matter. 

source: Megatrends


It is almost Newtonian: for every action there is an equal and opposite reaction. Kept indoors, people show significantly greater demand for outdoor activities. Forced not to travel, people will want to travel. Required to interact virtually, people will want face-to-face encounters. 


The conventional wisdom that “everything has changed” suggests disruptive change in work and living habits that are permanent. That likely will prove to be a one-sided analysis. 


Post-pandemic behavior might be more unexpected than is commonly suspected, for several reasons. First, linear extrapolation from the present nearly always proves wrong. Non-linear change is more likely, an argument the “everything has changed” view also suggests.


But non-linearity cuts both ways. We might well see non-linear regression to the mean, as well as accelerated change of behavior. 


Many trends that already were underway before the pandemic will be accelerated to an extent, though not nearly so much as many seem to believe. Naisbitt’s observations suggest why: to the extent we continue to work remotely more often, we also are going to want face-to-face contact. Unable to travel freely, humans will want to do so again. 


Zoom is not a perfect, or even nearly perfect, substitute for face-to-face interactions. People will want to get away from their screens, to the extent they are forced to rely on them. 


The Newtonian reaction to high tech will be high touch. 

Why the Broadband "Problem" Cannot be Permanently "Solved"

 So long as we keep changing the definition of “broadband,” we are likely “never” to see “improvement” in the number or percentage of homes or people able to buy the product, no matter how much investment is made in facilities. 

When we change definitions of minimum speed, for example, we automatically increase the number or percentage of locations or people that cannot buy the product. Colloquially, that is known as “moving the goalposts.” Put another way, our understanding of “broadband” changes over time. 


The classic definition of broadband was any service running at speeds of at least 1.5 Mbps. In the U.S. market the official definition of “broadband” now is 25 Mbps. But most consumers buy service at speeds an order of magnitude higher than the minimum definition. Yesterday’s power user is today’s light user. 


source: Openvault
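A minimal sketch of that “moving the goalposts” effect, using entirely hypothetical location speeds rather than any FCC or Openvault data, shows how raising the threshold mechanically grows the “unserved” count even when no network changes at all:

```python
# Hypothetical best available speeds (Mbps) at ten locations; made-up
# values for illustration only.
location_speeds_mbps = [1.5, 3, 10, 25, 50, 100, 100, 300, 600, 940]

def unserved_count(speeds, threshold_mbps):
    """Count locations whose best available speed falls below the definition."""
    return sum(1 for s in speeds if s < threshold_mbps)

for threshold in (1.5, 25, 100):
    print(threshold, unserved_count(location_speeds_mbps, threshold))
# Raising the definition from 1.5 to 25 to 100 Mbps grows the "unserved"
# total from 0 to 3 to 5 locations, with no change to the networks themselves.
```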


And though new platforms might help, the continuing evolution of definitions toward higher minimum speeds will remain a challenge for any market or country with lots of rural or thinly populated areas. In the United States, six percent of the land mass is where most of the people live. 


How we define the market also affects our analysis of the amount of competition in the consumer broadband market. The common observation in the U.S. market, for example, is that minimum service at 25 Mbps is unavailable to “millions” of people. 


Of course, that finding requires a big assumption, namely that all satellite and mobile services are excluded from the analysis. Two U.S. satellite suppliers sell broadband access across virtually the entire continental land mass, while mobile speeds already exceeded the minimum threshold in 2019 and early 2020. 


If any and all services supplying 25 Mbps or faster speeds are considered, it might be very difficult to find any U.S. locations unserved by at least two providers. 


The point is that definitions and assumptions matter. By continually increasing the speed used as the definition of “broadband,” we will almost arbitrarily keep moving the goal line on who has it, where it is available and how many competitors can sell it. 


Ignore for the moment consumer choice, which has shown that most consumers buy services in the middle of the range: not the most costly or least costly; not the fastest or slowest offerings. 


As “typical, average or median speeds” keep getting higher, our definitions will keep being adjusted upward. But at a time when satellite and mobile minimum and average speeds often already exceed the minimum definitions, and when most fixed network consumers buy services an order of magnitude above the “minimum” threshold, it is hard to “close the digital divide.”


There likely will always be some statistical gaps. Whether there is--or will be--a serious “problem” is more debatable.


Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...