Wednesday, June 5, 2019

AT&T FTTH Gains Seem Mostly Upgrades by Existing Customers

What impact is AT&T’s fiber-to-home deployment having on the company’s internet access accounts? Well, it is, as they say, “complicated.” For two decades or more, when a telco replaces copper access lines with optical fiber, it sees two types of changes.

Copper-based accounts go down, while optical accounts go up. Perhaps most of the change is simply existing customers switching from copper to fiber access services, for little net gain in total accounts.

That seems to be happening with AT&T FTTH accounts as well. According to MoffettNathanson, AT&T achieved a 57 percent growth rate for its FTTH services over the past year. But total broadband accounts have not grown at the same rate.

“Despite the dramatic growth at AT&T Fiber, AT&T’s broader IP broadband category has posted only modest subscriber gains over the past year,” the firm says.

The logical explanation is that existing customers have been upgrading to the new FTTH service, and dropping their former copper or fiber-to-node service. That also would make sense if AT&T conducted the sort of marketing campaigns cable operators do when introducing a new service: target existing customers first.
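
A simple back-of-the-envelope model shows how both things can be true at once. The subscriber base and the 90 percent upgrade share below are hypothetical assumptions, chosen only to illustrate the mechanics:

```python
# Back-of-the-envelope model of "upgrade vs. net add" dynamics.
# All subscriber figures are hypothetical, for illustration only.

ftth_start = 3.0     # million FTTH accounts a year ago (assumed)
ftth_growth = 0.57   # the 57 percent growth rate cited above

ftth_adds = ftth_start * ftth_growth       # new FTTH accounts gained
upgrade_share = 0.90                       # assume 90% are existing customers upgrading

copper_losses = ftth_adds * upgrade_share  # copper/FTTN accounts dropped in the upgrade
net_new = ftth_adds - copper_losses        # true net broadband growth

print(f"FTTH adds: {ftth_adds:.2f}M; net new broadband: {net_new:.2f}M")
# FTTH adds: 1.71M; net new broadband: 0.17M -- dramatic FTTH growth,
# but only a modest net subscriber gain, matching the pattern described above.
```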

Over time, as those conversions reach an end, AT&T is likely to switch to targeting non-subscribers. Only after a few years of that activity will we be able to assess how successful AT&T’s FTTH program has been at taking market share from cable operators.

Which Firms Most Value "Critical Thinking?"

“Critical thinking skills” seem to be valued much more than technical expertise by executives running firms that emphasize innovation and experimentation.

That stands to reason, actually. Firms that intend to innovate need people who “think outside the box,” as there often is no template to follow. Firms with other strategies, such as that of “fast follower,” only need to know enough to copy the leaders. Imitation, not thinking, is required.

Deloitte’s 2019 Industry 4.0 readiness report surveyed 612 technology, media, and telecom (TMT) respondents. Of that sample, Deloitte says 29 percent were “high innovators,” firms that place a high priority on innovation and embrace experimentation, giving their leaders the leeway to learn from failure.

“Strikingly, the number-one skill that high innovators say they’re working hardest to develop isn’t technical: It’s critical thinking skills,” Deloitte says. “For less-innovative companies, critical thinking comes in last of eight skills probed.”

“Human skills” may be just as crucial to success as technical ones, overall, however. While 65 percent of respondents to the Deloitte 2018 global human capital trends survey indicated technical skills will need to increase as AI is integrated into enterprises, 62 percent pinpointed the expanding need for complex problem-solving skills, followed closely by cognitive abilities, process skills, and social skills.

30-Minute Video Tutorial on 5G Spectrum

What Happens When Mobile Spectrum Capacity is 100X Greater?

Spectrum scarcity long has been a key assumption of mobile or fixed wireless service provider business models. But scarcity is diminishing. This year, some 1550 MHz of new mobile spectrum has been released by the Federal Communications Commission. Another 3400 MHz is going to be auctioned in 2019.

That will increase the amount of mobile licensed spectrum by 10 times on a physical basis, and perhaps as much as 100 times once all the other network innovations are added (spectrum aggregation, spectrum sharing, dynamic sharing, small cells, beam forming).

In addition to incremental allocations in low-band (600 MHz, 700 MHz) and mid-band (AWS) spectrum, the Federal Communications Commission now is releasing huge amounts of new spectrum--both licensed and unlicensed--in the millimeter wave regions.

And the impact those allocations will have on mobile operator capacity is unprecedented. In part, that new capacity is required simply to support ever-growing mobile data consumption. But the almost-universal belief is that boosting capacity so much will create new use cases, apps, services and revenue sources.

Consider that each of the leading four U.S. mobile providers has operated with 100 MHz to 180 MHz of spectrum assets in the 4G era, while all allocated mobile spectrum together has totaled less than 1,000 MHz.

But the latest FCC auction of 28-GHz (auction 101) spectrum represented an additional 850 MHz of new spectrum. The latest auction of 24-GHz (auction 102) spectrum added another 700 MHz of capacity.

The next auction (auction 103) of 37 GHz, 39 GHz and 47 GHz spectrum will release 3400 MHz of new spectrum, in addition to the 1550 MHz released in auctions 101 and 102.

If Verizon, AT&T and T-Mobile US won just 300 MHz each of new spectrum on a national basis in auctions 101 and 102, that would double to triple the total amount of licensed spectrum each has to work with. And auction 103 is coming.
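
The arithmetic is worth making explicit. A rough sketch, using the auction totals cited above and an assumed (not actual) 300 MHz national win per carrier:

```python
# Rough spectrum arithmetic using the auction totals cited above.
# The 300 MHz per-carrier win is an assumed scenario, not an auction result.

auction_101 = 850    # MHz of 28-GHz spectrum
auction_102 = 700    # MHz of 24-GHz spectrum
auction_103 = 3400   # MHz of 37, 39 and 47 GHz spectrum, still to come

total_new = auction_101 + auction_102 + auction_103
print(f"New millimeter wave spectrum: {total_new} MHz vs. <1,000 MHz allocated before")

assumed_win = 300  # MHz won nationally by one carrier in auctions 101 and 102
for legacy in (100, 180):  # the 4G-era per-carrier holdings cited above
    multiple = (legacy + assumed_win) / legacy
    print(f"{legacy} MHz legacy -> {multiple:.1f}x total holdings")
# 100 MHz -> 4.0x, 180 MHz -> 2.7x: per-carrier holdings at least
# double to triple, before auction 103 is even counted.
```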

Here’s what spectrum holdings for major mobile networks and Dish Network looked like, before the latest millimeter wave auctions.  

Some of the use cases will develop based on ultra-low latency; others on ultra-high bandwidth.

In other instances, use cases grow not directly from 5G spectrum but from the ability to aggregate, share and dynamically allocate spectrum for private or enterprise networks as well as conventional mobile service.

The availability of so much more capacity means fixed wireless now might be feasible on terms and conditions that make it a viable substitute for fixed network access. That could change the fortunes of mobile and fixed network suppliers alike.


What Changes Most in Era of Mobile TV?

Looking back on all the changes in entertainment video over four decades, the biggest have been the move to higher-definition images, shifts in screen aspect ratios and screen sizes, the growth of on-demand viewing and the explosion of content sources.

There have been efforts to change the nature of the content itself (three-dimensional content being the most recent). Custom viewing angles and other interactive features have, from time to time, been seen as promising.

But the shift of video viewing to mobile devices is among the bigger changes.

The average U.S. adult will spend three hours, 43 minutes using mobile devices in 2019, just above the 3:35 spent watching TV. Of time spent on mobile, U.S. consumers will spend 2:55 on smartphones.

What remains unclear is precisely how the “mobile TV” business will develop. So far, the biggest change has simply been the viewing of content on small mobile screens, not a change in video features, interactivity or image formats.

A reasonable assumption, based on past developments, is that business model impact will happen around viewing time on the mobile screen. There has been some effort to create aspect ratios optimized for the vertical orientation of the phone, but many of us expect the big trend will simply be mobile consumption of standard content.

In that sense, usability of video streaming on mobile devices as well as TV screens is the biggest likely driver of changes in consumer spending. “I want what I want, when I want it” remains the single most important principle of the video entertainment business.

Fixed Wireless Might be the Only Way U.S. Telcos Compete with Cable

Despite much skepticism in many quarters, there is a simple reason why 5G fixed wireless will get serious attention from U.S. telcos: there is almost no other platform that viably could keep telcos within some distance of cable operator internet access offers.

The traditional answer has been to install fiber-to-the-home networks, but that business model now is rather sharply limited, and the risk of stranded assets explains the dilemma.

Clearly, cable TV dominates residential broadband. Cable has about 60 percent market share nationally and has been getting essentially all the net new additions for a decade.

The problem for any new telco FTTH supplier is that 60 percent of the potential customer locations (cable operator customers) will not generate any revenue. Immediately, the problem is building a business case when 60 percent of assets are stranded and producing zero revenue.
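
A rough illustration of why stranding matters: when only 40 percent of homes passed ever subscribe, the network cost per paying customer balloons. The dollar figures below are assumptions for illustration, not reported industry costs:

```python
# Illustrative stranded-asset arithmetic for an FTTH overbuild.
# All dollar figures are hypothetical assumptions, not reported costs.

cost_per_home_passed = 800   # $ to build past one home (assumed)
cost_to_connect = 600        # $ to hook up one subscriber (assumed)
take_rate = 0.40             # 60% of homes (cable customers) yield no revenue

# The full pass-by cost must be recovered from paying subscribers alone:
cost_per_subscriber = cost_per_home_passed / take_rate + cost_to_connect
full_take_cost = cost_per_home_passed + cost_to_connect

print(f"At 40% take rate: ${cost_per_subscriber:,.0f} per subscriber")
print(f"At 100% take rate: ${full_take_cost:,.0f} per subscriber")
# $2,600 vs. $1,400: the stranded 60% nearly doubles the capital
# each paying customer must repay before the build earns a return.
```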

And though linear video is undergoing a transition to on-demand streaming formats, cable operators as an industry segment continue to have the greatest share of linear video accounts. So linear video helps the business case, in the near term, but is challenging over the next decade, as that business will dwindle.

Residential voice is declining for both telcos and cable companies and is not a driving revenue opportunity for either industry segment, and not a reason for building FTTH.


Ignoring for the moment business services and residential mobility, where cable operators also are taking market share, it is reasonable to say that upgrading to fiber-to-home networks poses a difficult payback challenge for telcos. In most markets, the revenue drivers for a residential fixed network are internet access and entertainment video.


And there are many instances where those two services, supplied by a telco, face grueling odds. If cable has 60 percent share of internet access, with networks able to supply 1 Gbps now and a glide path to 10 Gbps, then a telco building FTTH is playing catch-up, and no more. How many observers are confident that a telco can ever gain more than 40 percent share in the U.S. residential market, at scale? Perhaps not many.


The number of third parties entering the consumer fixed network business, mostly leading with internet access, slowly is growing. And that means the residential market becomes, in some areas, a three-provider market, further limiting potential telco gains.


Video helps the business case in urban and suburban markets, but always has been a money-loser in rural areas, and in any case is a mature and declining revenue source.


Some might argue that, in the 5G and succeeding eras of mobility, the primary value of a fixed network is backhaul. That might be an overstatement, but is directionally correct. Even cable operators expect wireless backhaul to be an important opportunity.


That is why a wave of asset dispositions has happened, with fixed network operators selling fixed network assets. Most recently, CenturyLink has said it is open to selling the consumer business it operates.


The bottom line is that the market limits the potential for profitable telco investments in FTTH.


All of that sharply limits telco options in the next-generation network area. The business case for FTTH simply might not work, and yet abandoning the business also often is impossible (the assets cannot easily be sold).


All that explains the interest in 5G mobile and fixed wireless access. As mobile substitution for fixed service has cannibalized fixed voice, so the hope is that wireless internet access can become an effective substitute for fixed connections. For AT&T and Verizon, that is the business answer to next-generation fixed networks where the FTTH business case does not work.


For T-Mobile US, fixed wireless is a way to participate in the fixed networks businesses it could not enter as a mobile-only service provider. For cable operators, fixed wireless is an obvious way to enter adjacent markets without full fixed network overbuilds.


Other markets might have more-favorable upside from FTTH. But it is hard to avoid the conclusion that, in a great many portions of the U.S. consumer fixed networks business, cable operators have market share that telcos will be very hard pressed to attack.


Telcos must boost consumer internet access speeds to stay within reach of cable offers and to maintain the value of their fixed networks. Doing so when FTTH does not offer a payback is the challenge 5G fixed wireless is meant to solve.

Tuesday, June 4, 2019

IT, Public Policy Initiatives Fail at a High Rate. IoT Will Not be Different

In either private information technology or public policy arenas, failure is a common outcome. That is worth remembering as we enter a period of experimentation around internet of things use cases, ranging from smart cities to industrial automation.

Half of enterprise software projects fail, it often is found. There is some evidence that enterprise project failure rates have improved. But maybe not in the internet of things area, where 75-percent failure rates are not unheard of.  

One study of World Bank projects found failure rates of 25 percent on operational grounds, but up to 50 percent on the ability to solve the stated problem.

“Historically, the majority of major projects in government have not delivered the anticipated benefits within original time and cost expectations,” says a report by the U.K. National Audit Office.

“We’ve been in business 11 years, we have 1,400 customers, and 80 percent of all the projects we’ve seen in 11 years are customers who come to us with an existing failed IoT project,” says Nick Earle, Eseye chairman and CEO.

Policy organizations like the World Bank judge success based on whether planned products are delivered through an efficient process; not whether policies solve the problems that warranted intervention in the first place, or whether the policies promoted development outcomes, says Matt Andrews of the Kennedy School at Harvard University.

In his paper Public Policy Failure: ‘How Often?’ and ‘What is Failure, Anyway’?, Andrews asks a hard question: “public policies absorb resources to address major social issues,” so “we should know if policies are proving bad social investments; routinely failing to solve focal problems at high costs.”

Public policy initiatives represented an estimated 16 percent of global gross domestic product in 2017, about $13 trillion, he notes. So a high rate of policy failure would mean that we are wasting these resources.

Between 2016 and 2018, for example, the World Bank reported that, of 400 projects, failure rates ranged between 25 percent and 50 percent. The failure rate is about 25 percent if the definition is simply processes being followed or output measures achieved.

The failure rate is about 50 percent if the definition asks whether the policy intervention solves the problem it was designed to solve, says Andrews. The former measure is what we might call “efficiency.” The latter measure is “effectiveness.”

Effectiveness is what we want: an outcome that actually solves the problem we were trying to solve. Efficiency only measures whether we did things the way we said we would, using the metrics we said we’d use.
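
The distinction is easy to make concrete. In this toy sketch, four invented projects are scored both ways, reproducing the 25-percent-versus-50-percent gap the World Bank figures show:

```python
# Toy illustration of the efficiency-vs.-effectiveness gap Andrews describes.
# The four projects and their scores are invented to match the 25%/50% range.

projects = [
    {"process_met": True,  "problem_solved": True},
    {"process_met": True,  "problem_solved": True},
    {"process_met": True,  "problem_solved": False},  # efficient but ineffective
    {"process_met": False, "problem_solved": False},
]

n = len(projects)
efficiency_fail = sum(not p["process_met"] for p in projects) / n
effectiveness_fail = sum(not p["problem_solved"] for p in projects) / n

print(f"Failure by process measures (efficiency): {efficiency_fail:.0%}")      # 25%
print(f"Failure by problem solved (effectiveness): {effectiveness_fail:.0%}")  # 50%
```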

Others might ask even more provocative questions: are public projects destined to fail? Among many possible influences, consultants at PwC see three big buckets of issues:
• Methods and processes
• Stakeholder and leadership
• Complexity and uncertainty



The first bucket of issues relates to efficiency and procedures. The other two are fuzzy, political and hard to quantify, because they deal with stakeholder involvement, leadership and the complexity of the problems policies are intended to fix.

More Computation, Not Data Center Energy Consumption is the Real Issue

Many observers raise key concerns about power consumption of data centers in the era of artificial intelligence.  According to a study by t...