Wednesday, June 5, 2019

30-Minute Video Tutorial on 5G Spectrum

What Happens When Mobile Spectrum Capacity is 100X Greater?

Spectrum scarcity long has been a key assumption of mobile or fixed wireless service provider business models. But scarcity is diminishing. This year, some 1550 MHz of new mobile spectrum has been released by the Federal Communications Commission. Another 3400 MHz is going to be auctioned in 2019.

That will increase the amount of mobile licensed spectrum by 10 times on a physical basis, and perhaps as much as 100 times once all the other network innovations are added (spectrum aggregation, spectrum sharing, dynamic sharing, small cells, beam forming).

In addition to incremental allocations in low-band (600 MHz, 700 MHz) and mid-band (AWS) spectrum, the Federal Communications Commission now is releasing huge amounts of new spectrum--both licensed and unlicensed--in the millimeter wave regions.

And the impact those allocations will have on mobile operator capacity is unprecedented. In part, that new capacity is required simply to support ever-growing mobile data consumption. But the almost-universal belief is that boosting capacity so much will create new use cases, apps, services and revenue sources.

Consider that the leading four U.S. mobile providers have operated with 100 MHz to 180 MHz of spectrum assets in the 4G era. But all the allocated mobile spectrum has totaled less than 1,000 MHz.

But the latest FCC auctions of 28-GHz (auction 101) spectrum represented an additional 850 MHz of new spectrum. The latest auction of 24-GHz (auction 102) spectrum added another 700 MHz of capacity.

The next auction (auction 103) of 37 GHz, 39 GHz and 47 GHz spectrum will release 3400 MHz of new spectrum, in addition to the 1550 MHz released in auctions 101 and 102.

If Verizon, AT&T and T-Mobile US won just 300 MHz each of new spectrum on a national basis in auctions 101 and 102, that would double to triple the total amount of licensed spectrum each has to work with. And auction 103 is coming.
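The spectrum arithmetic can be sketched numerically. This is a rough illustration using the figures cited above; the 150 MHz holding and 300 MHz win are hypothetical values for a single provider, and actual holdings vary by provider and market.

```python
# Rough spectrum arithmetic using figures cited in the text (all values in MHz).
auction_101_28ghz = 850   # 28-GHz auction (auction 101)
auction_102_24ghz = 700   # 24-GHz auction (auction 102)
auction_103_mmwave = 3400 # 37 GHz, 39 GHz and 47 GHz (auction 103)

new_mmwave = auction_101_28ghz + auction_102_24ghz + auction_103_mmwave
print(new_mmwave)  # 4950 MHz of new millimeter wave spectrum

# Hypothetical single provider: a 150 MHz 4G-era holding plus a
# 300 MHz national win triples total licensed spectrum.
typical_4g_holding = 150
hypothetical_win = 300
print((typical_4g_holding + hypothetical_win) / typical_4g_holding)  # 3.0
```

Against the less-than-1,000 MHz of previously allocated mobile spectrum, the 4950 MHz of new millimeter wave spectrum shows why the physical increase is so dramatic.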

Here’s what spectrum holdings for major mobile networks and Dish Network looked like, before the latest millimeter wave auctions.  

Some of those use cases will develop based on ultra-low latency; others on ultra-high bandwidth.

In other instances, use cases grow not directly from 5G spectrum but from the ability to aggregate, share and dynamically allocate spectrum for private or enterprise networks as well as conventional mobile service.

The availability of so much more capacity means fixed wireless now might be feasible on terms and conditions that make it a workable substitute for fixed network access. That could change the fortunes of mobile and fixed network suppliers alike.


What Changes Most in Era of Mobile TV?

Looking back on entertainment video over four decades, the biggest changes have been the move to higher-definition images, shifts in screen aspect ratios and sizes, the growth of on-demand viewing and the explosion of content sources.

There have been efforts to change the nature of content (three-dimensional content being the most recent). Custom viewing angles and other interactive features have, from time to time, been seen as promising.

But the shift of video viewing to mobile devices is among the bigger changes.

The average U.S. adult will spend three hours, 43 minutes watching video on mobile devices in 2019, just above the 3:35 spent on TV. Of time spent on mobile, U.S. consumers will spend 2:55 on smartphones.

What remains unclear is precisely how the “mobile TV” business will develop. So far, the biggest change has simply been the viewing of content on small mobile screens, not a change in video features, interactivity or image formats.

A reasonable assumption, based on past developments, is that business model impact will happen around viewing time on the mobile screen. There has been some effort to create aspect ratios optimized for the vertical orientation of the phone, but many of us expect the big trend will simply be mobile consumption of standard content.

In that sense, usability of video streaming on mobile devices as well as TV screens is the biggest likely driver of changes in consumer spending. “I want what I want, when I want it” remains the single most important principle of the video entertainment business.

Fixed Wireless Might be the Only Way U.S. Telcos Compete with Cable

Despite much skepticism in many quarters, there is a simple reason why 5G fixed wireless will get serious attention from U.S. telcos: there is almost no other platform that viably could keep telcos within some distance of cable operator internet access offers.

The traditional answer has been to install fiber-to-the-home (FTTH) networks, but that business model now is rather sharply limited, and the risk of stranded assets explains the dilemma.

Clearly, cable TV dominates residential broadband. Cable has about 60 percent market share nationally and has been getting essentially all the net new additions for a decade.

The problem for any new telco FTTH supplier is that 60 percent of the potential customer locations (cable operator customers) will not generate any revenue. Immediately, the problem is building a business case when 60 percent of assets are stranded and producing zero revenue.

And though linear video is undergoing a transition to on-demand streaming formats, cable operators as an industry segment continue to have the greatest share of linear video accounts. So linear video helps the business case, in the near term, but is challenging over the next decade, as that business will dwindle.

Residential voice is declining for both telcos and cable companies and is not a driving revenue opportunity for either industry segment, and not a reason for building FTTH.


Ignoring for the moment business services and residential mobility, where cable operators also are taking market share, it is reasonable to say that upgraded fiber-to-the-home networks pose a difficult business case for telcos, in terms of payback. In most markets, the revenue drivers for a residential fixed network are internet access and entertainment video.


And there are many instances where those two services, supplied by a telco, face grueling odds. If cable has 60 percent share of internet access, with networks able to supply 1 Gbps now, and having a glide path to 10 Gbps, then a telco building FTTH is playing catch-up, and no more. How many observers are confident that a telco can ever gain more than 44 percent share in the U.S. residential market, at scale? Perhaps not many.


The number of third parties entering the consumer fixed network business, mostly leading with internet access, slowly is growing. And that means the residential market becomes, in some areas, a three-provider market, further limiting potential telco gains.


Video helps the business case in urban and suburban markets, but always has been a money-loser in rural areas, and in any case is a mature and declining revenue source.


Some might argue that, in the 5G and succeeding eras of mobility, the primary value of a fixed network is backhaul. That might be an overstatement, but is directionally correct. Even cable operators expect wireless backhaul to be an important opportunity.


That is why a wave of asset dispositions has happened, with operators selling fixed network assets. Most recently, CenturyLink has said it is open to selling the consumer business it operates.


The bottom line is that the market limits the potential for profitable telco investments in FTTH.


All of that sharply limits telco options in the next-generation network area. The business case for FTTH simply might not work, and yet abandoning the business also often is impossible (the assets cannot easily be sold).


All that explains the interest in 5G mobile and fixed wireless access. As mobile substitution for fixed service has cannibalized fixed voice, so the hope is that wireless internet access can become an effective substitute for fixed connections. For AT&T and Verizon, that is the business answer to next-generation fixed networks where the FTTH business case does not work.


For T-Mobile US, fixed wireless is a way to participate in the fixed network business it could not enter as a mobile-only service provider. For cable operators, fixed wireless is an obvious way to enter adjacent markets without full fixed network overbuilds.


Other markets might have more-favorable upside from FTTH. But it is hard to avoid the conclusion that, in a great many portions of the U.S. consumer fixed networks business, cable operators have market share that telcos will be very hard pressed to attack.


Telcos must boost consumer internet access speeds to stay within reach of cable offers and to maintain the value of their fixed networks. Doing so when FTTH does not offer a payback is the challenge 5G fixed wireless is meant to solve.

Tuesday, June 4, 2019

IT, Public Policy Initiatives Fail at a High Rate. IoT Will Not be Different

In either private information technology or public policy arenas, failure is a common outcome. That is worth remembering as we enter a period of experimentation around internet of things use cases, ranging from smart cities to industrial automation.

Half of enterprise software projects fail, it often is found. There is some evidence that enterprise project failure rates have improved. But maybe not in the internet of things area, where 75-percent failure rates are not unheard of.  

One study of World Bank projects found failure rates of 25 percent on operational grounds, but up to 50 percent on the ability to solve the stated problem.

“Historically, the majority of major projects in government have not delivered the anticipated benefits within original time and cost expectations,” says a report by the U.K. National Audit Office.

“We’ve been in business 11 years, we have 1,400 customers, and 80 percent of all the projects we’ve seen in 11 years are customers who come to us with an existing failed IoT project,” says Nick Earle, Eseye chairman and CEO.

Policy organizations like the World Bank judge success based on whether planned products are delivered through an efficient process; not whether policies solve the problems that warranted intervention in the first place, or whether the policies promoted development outcomes, says Matt Andrews of the Kennedy School at Harvard University.

In his paper Public Policy Failure: ‘How Often?’ and ‘What is Failure, Anyway’?, Andrews asks a hard question: “public policies absorb resources to address major social issues,” so “we should know if policies are proving bad social investments; routinely failing to solve focal problems at high costs.”

Public policy initiatives represented an estimated 16 percent of global gross domestic product in 2017, about $13 trillion, he notes. So a high rate of policy failure would mean that we are wasting these resources.

Between 2016 and 2018, for example, the World Bank reported that, of 400 projects, failure rates ranged between 25 percent and 50 percent. The failure rate is about 25 percent if the definition is simply processes being followed or output measures achieved.

The failure rate is about 50 percent if the definition asks whether the policy intervention solves the problem it was designed to solve, says Andrews. The former measure is what we might call “efficiency.” The latter measure is “effectiveness.”

Effectiveness is what we want: an outcome that actually solves the problem we set out to solve. Efficiency only measures whether we did things the right way, using the metrics we said we’d use.
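The efficiency-versus-effectiveness distinction can be made concrete with a toy tally. The project records below are hypothetical, not World Bank data; they simply mirror the roughly 25 percent and 50 percent failure rates cited.

```python
# Hypothetical project records: (followed_process, solved_problem).
# Illustrates efficiency (process followed) vs. effectiveness (problem solved).
projects = [
    (True, True),    # efficient and effective
    (True, False),   # delivered as planned, but the problem went unsolved
    (False, False),  # failed on both measures
    (True, True),
]

efficiency_failures = sum(1 for process, _ in projects if not process)
effectiveness_failures = sum(1 for _, solved in projects if not solved)

print(efficiency_failures / len(projects))     # 0.25 -> ~25% fail on process grounds
print(effectiveness_failures / len(projects))  # 0.5  -> ~50% fail to solve the problem
```

The gap between the two rates is the point: a project can pass every process audit and still leave the focal problem untouched.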

Others might ask even more provocative questions: are public projects destined to fail? Among many possible influences, consultants at PwC see three big buckets of issues:
• Methods and processes
• Stakeholder and leadership
• Complexity and uncertainty



The first bucket of issues relates to efficiency and procedures. The other two are fuzzy, political and hard to quantify, because they deal with stakeholder involvement, leadership and the complexity of the problems policies are intended to fix.

New Study, Sure to be Interpreted as Evidence That Quality Broadband Reduces Unemployment, Does Not Say That

Does high-quality broadband “cause” economic growth, job creation, lower unemployment, higher house prices or other positive outcomes? Actually, we do not know, even if we mostly assume those outcomes are produced by high-quality broadband. Consider one new study.

“We find that high speed broadband has significant effects on county-level unemployment rates,” say researchers Bento Lobo and Rafayet Alam of the University of Tennessee at Chattanooga, Department of Finance and Economics, and Brian Whitacre of Oklahoma State University, Department of Agricultural Economics.

“We also find measurable benefits to early adoption of high speed broadband,” they say. The issue is what “effects” actually means. “Causation” is not what the researchers mean. “Positive association between broadband speed and labor market outcomes does not indicate causality,” they say.

Even if the study will mostly be interpreted as providing proof that high-quality broadband leads to, produces or causes higher economic growth or lower unemployment, that is not what the researchers say.

In a separate new study, The Rewards of Municipal Broadband: An Econometric Analysis of the Labor Market, Phoenix Center Chief Economist Dr. George Ford and Phoenix Center Adjunct Fellow Professor R. Alan Seals (Auburn University) use data obtained from the U.S. Census Bureau’s American Community Survey to quantify the economic impact, if any, of the county-wide government-owned network (GON) in Chattanooga, Tenn. on labor market outcomes.

“Across a variety of empirical models, we find no payoffs in the labor market from the city’s broadband investments,” they conclude. “We find almost no statistically significant effects for a wide range of important labor market variables, with the possible exception of a reduction in labor force participation.”

The study looked at private-sector labor force participation, employment status, wages, information technology employment, self-employment, and business income, “all of which appear unaffected by the GON,” the researchers say.

That is not to deny correlation between broadband status and jobs, unemployment, productivity, rates of economic activity or growth, or home prices, which most studies might find. The problem is that lots of things are correlated without being causal. Sometimes even correlation is hard to find.

Areas of higher-value housing tend to have the fastest broadband. Those areas also tend to have high education attainment rates, lower unemployment, higher incomes and so forth.

Correlation is not causation. And indeed, in Tennessee, the researchers found correlations. “The data suggest that high speed counties were characterized by roughly one percent lower unemployment rates in 2016 than low speed counties on average.”

“It is unclear whether there is a linear relationship between broadband speeds and economic impact,” they also say.

That wording in their study on broadband speed and unemployment is important. It is one thing to note a correlation between the presence of broadband and economic metrics. It is quite difficult to determine a causal relationship, even if virtually everyone “thinks” broadband helps, in that regard.

They note that “unemployment rates are about 0.26 percentage points lower in counties with high speeds compared to counties with low speeds.” The obvious problem is that areas of higher economic activity and growth are almost certainly also going to have lower unemployment rates.

Also, places of high economic activity and growth, which also tend to produce higher incomes (not to mention population), are likely also associated with higher rates of broadband deployment and access speeds.

Again, correlation is not really the issue. Causation is the missing link.
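The confounding argument can be sketched with simulated data. The numbers below are purely illustrative, not the study's data: a third factor (think of it as local economic activity) drives both broadband speed and unemployment, so a strong correlation appears even though broadband never affects unemployment at all.

```python
# Simulated confounder: "activity" drives BOTH broadband speed (up)
# and unemployment (down); broadband never enters the unemployment
# equation, so any correlation between them is non-causal.
import random

random.seed(0)

activity = [random.gauss(0, 1) for _ in range(1000)]
broadband = [a + random.gauss(0, 0.5) for a in activity]
unemployment = [-a + random.gauss(0, 0.5) for a in activity]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Strongly negative, despite zero causal effect of broadband on unemployment.
print(round(pearson(broadband, unemployment), 2))
```

A naive reading of that correlation would conclude that faster broadband lowers unemployment; by construction, it does not.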

Unprecedented Increase in Mobile Spectrum Just Occurred

For those of you accustomed to slow change in almost any U.S. mobile or telecom statistic, the recently-concluded Federal Communications Commission auctions of 24-GHz and 28-GHz mobile spectrum are going to prove an abrupt change.

Those of you who follow spectrum holdings of the leading U.S. mobile service providers know how slowly such capacity metrics change. In the past, total spectrum available to any single mobile provider ranged from 100 MHz to perhaps 180 MHz. The recent auctions of millimeter wave spectrum are likely to radically change the amount of national spectrum holdings by Verizon, AT&T and T-Mobile US.

If Verizon, AT&T and T-Mobile US won just 300 MHz each of new spectrum on a national basis, that would double to triple the total amount of licensed spectrum each has to work with.

Here’s what spectrum holdings for major mobile networks and Dish Network looked like, before the latest millimeter wave auctions.  

Those bar graphs will look substantially different as millimeter wave holdings are added. Recent millimeter wave auctions have seen Verizon pick up substantial spectrum at 28 GHz. AT&T and T-Mobile US were quite active in the 24-GHz auctions.




Never before have the leading mobile service providers increased spectrum holdings as much as in the last two spectrum auctions. It represents an unprecedented increase in spectrum for AT&T, Verizon and T-Mobile US.
