Tuesday, November 24, 2020

93% of U.S. Lifeline Accounts Now are Mobile, Not Fixed

In the most recent reporting period for which data appears to be available, mobile subscriptions accounted for about 93 percent of support under the Lifeline program, which subsidizes basic communications for low-income customers and once supported mostly fixed network voice connections.


That is a dramatic shift, and it illustrates the changes in consumer demand that have occurred over the last few decades. The fact that low-income customers no longer wish to buy fixed network voice connections also explains why support for rural communications has shifted to broadband, not voice. 


Perhaps part of the shift in value, beyond mobility itself, is that the supported mobile plans bundle internet access with voice, something the supported fixed voice connections do not offer. 


Important General Purpose Technologies Can Easily Take 30 Years to Prove Value

Most predictions about “the future” turn out to be wrong, if not in substance, then surely in timing. Aside from that, few operating executives can afford to place big strategic bets on trends that might evolve over decades. They have a hard enough time making decisions with a five-year or 10-year horizon. 


Occasionally, though, big predictions turn out to be correct, though it might take three decades to find out. Though the internet now is pervasive, not all predictions about it were correct. Those misses are less surprising than the predictions that proved largely right despite being made before the “internet” or “web” actually existed. 


The impact of communications technology on economic and social life never was a primary concern for sociologist Daniel Bell. But his work on “post-industrial society” and “the information society” is the part of his output of most interest to professionals in the internet or connectivity businesses. 


But note some of his speculations from 1979 about the “merging of telephone, computers and television into a single...system that allows for transmission of data and interaction between persons or between computers.”


Though his terminology has not survived, he predicts a convergence of computers, TV and telephones into a single system for real-time content retrieval--he specifically uses the term “search”--and transactions, including what we now call “e-commerce.”


He predicted that information including “news, weather, financial information, classified ads and catalogs” would be “displayed on home television consoles.” Okay, the focus on TV screens turned out to be less significant than the use of personal computers (which, though already invented, were a hobbyist device until Apple’s 1977 commercialization of the Apple II), smartphones, tablets, smart watches and other ubiquitous screens.  


He called that “teletext.” 


He predicted that “facsimile systems” would be used to send documents and mail. He likely was not thinking of analog facsimile systems but of digital transmission of text content (email, PDFs, word processing documents and other images).


He probably did not think specifically of user-generated content including photos, images and full-motion video, though the use of multimedia was predicted. 


He also predicted the commercial use of “interactive online computer networks” that we experience today as apps and websites. 


He never used the words “internet,” “web,” “touchscreen,” “mobile,” “application,” “speech to text,” “broadband” or “mouse.” As the TV was the only widely available screen at the time, it made sense to assume it would be the ubiquitous display, low resolution as that device was before high-definition TV and 4K. 


Even the largely correct predictions might take 20 to 40 years to materialize. It is not about technology availability so much as actual commercial impact and value: the point at which significant deployment and value creation have happened.


It can take 10 years for any successful and important innovation to be adopted and used by half of households, for example. Business applications can take longer to reach substantial commercialization. Big and systemically important technology-driven innovations routinely take 30 years to reach fruition, some note.  


source: Medium 


Electricity, the steam engine, the internal combustion engine and transistors are often cited as general purpose technologies that create widespread economic change. Some might be tempted to tout the revenue upside from 5G edge computing and internet of things services in that category. We will not know for some time. 


Still, it is fair to note that even popular technologies and products take some time to reach ubiquity. 

source: MIT Technology Review 


But it would not be historically unusual if many touted 5G innovations did not achieve commercial success until the 6G era. That is worth keeping in mind with predictions about 5G, the internet of things and edge computing. 


We might well have much of the technology available, but not the developed, ubiquitous platforms, for quite some time to come.


Monday, November 23, 2020

The Downside of Multi-Purpose IP Networks

By now, virtually all observers agree that direct revenue generated by fixed networks will shift to supplying broadband access, while some of the strategic value of the fixed network shifts to support of mobile and fixed wireless networks. 


That raises a fairly big question. The whole rationale behind multi-purpose internet protocol networks is that they can carry any type of information or media and support any service. But the growing reliance on broadband revenue functionally pushes the networks back toward “single purpose” mode, at least in terms of what drives revenue. 


To be sure, the hope about tomorrow’s networks is that new use cases and revenue streams will develop. But those new sources will have to reach scale as the legacy revenue streams decay. For a big tier-one service provider, one might characterize the scale problem as “if any new service does not generate at least $1 billion annually in new revenue, it is too small to bother with.”


In some markets, broadband might already represent half of total revenue. Some predict that internet access will represent 64 percent of total revenue by about 2024, for example, led by mobile data. 

 source: Omdia 


Fixed network revenue, though, arguably has been dropping for two decades. 


And the bottom line for some tier-one service providers is that the consumer fixed networks business is fairly small, as a contributor to revenues or cash flow. In recent years, AT&T, for example, has generated about 15 percent of total cash flow from all consumer services on the fixed network. 


Half of total cash flow came from mobility services and 17 percent was earned from the Warner Media content business. About 17 percent of cash flow was generated from services supplied to business customers on the fixed network. 


Altogether, the fixed network generates about 32 percent of total cash flow for AT&T. So the bottom line is that any investment in FTTH could affect at most about 32 percent of total cash flow, where 5G affects at least half. 
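As a rough illustration, the approximate shares cited above can be combined to show where the roughly 32 percent fixed network figure comes from. The percentages below are the illustrative figures used in this post, not exact company-reported results.

```python
# Rough sketch of AT&T cash flow composition, using the approximate
# percentages cited above (illustrative, not exact reported figures).
cash_flow_share = {
    "mobility": 0.50,         # about half of total cash flow
    "warner_media": 0.17,     # content business
    "fixed_business": 0.17,   # fixed network, business customers
    "fixed_consumer": 0.15,   # fixed network, consumer services
}

fixed_total = cash_flow_share["fixed_business"] + cash_flow_share["fixed_consumer"]
print(f"Fixed network share of cash flow: {fixed_total:.0%}")              # ~32%
print(f"Mobility share of cash flow: {cash_flow_share['mobility']:.0%}")   # ~50%
```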


Likewise, in recent years Verizon earned 87 percent of its profits from mobility services, just 13 percent from all fixed network services. 


That raises other questions. How much upside does fiber to the home or fixed wireless have for tier-one service providers, mobile or fixed? How much does the financial return justify investing in broadband access, compared to other alternative investments?


Consider mobile services. Most revenue in the global telecom business now is generated by mobile services, as is nearly all the net revenue growth. So it makes sense to prioritize investment in mobile infrastructure, compared to fixed infrastructure, for retail customers. 


In the fixed networks business, though, what investment provides the biggest financial and revenue return? “Broadband” is the easy answer, and often the correct answer. But even there, the cost-benefit analysis must be conducted, as return on capital always matters. 


And it is not always clear that investment in gigabit fiber networks has a positive return on invested capital.  For that matter, it has not been so clear that fixed network investments in general have an adequate return on capital. 


This is the sort of big problem service providers have faced before, and successfully. More than two decades ago, the anticipated withering of the core voice business (long distance and access lines) might have seemed an existential crisis. But internet access, video subscriptions and mobility provided substitute new revenues. 


Now that consumer broadband is becoming saturated and voice and video subscriptions are declining, another big shift has to be made. 


The shift to multi-purpose IP networks enables access to apps and services using every media type. Ironically, that very capability is pushing revenue generation on fixed networks and mobile networks alike to “dumb pipe internet access.” 


Another way of putting matters is that although multi-purpose networks increasingly are valuable for application providers who can get to customers on those networks, the new networks might actually reduce addressable revenue for connectivity providers.


Saturday, November 21, 2020

How Much Upside from 20 Million New FTTH Lines?

How much impact will an additional 20 million U.S. fiber to the home lines deployed by telcos have on broadband market share? If past is prologue, telcos will find they get between 37 percent and 40 percent take rates for those FTTH facilities.


But that is almost certainly not going to mean a net gain of 37 percent to 40 percent. The reason is customer demand and displacement of existing copper-fed connections. 


Assume take rates for copper DSL services are no higher than 20 percent to 30 percent, and primarily are purchased by cost-sensitive customers. What percentage of those value-conscious customers are likely to upgrade? Some, but not all, assuming speeds ranging from 100 Mbps on the bottom to 1 Gbps on the top end. 


Instead, a good portion of the incremental new customers are likely to be former cable customers. All other things being equal, the key value is likely to be greater upstream bandwidth on the new FTTH facilities, compared to cable offers. 


All that noted, if new FTTH facilities are unlikely to exceed 40 percent take rates, and existing take rates are 20 percent to 30 percent, market share is likely to grow by 10 to 20 percentage points. 


That implies a net gain of no more than two to four million accounts, in those areas. True, the total take rates for FTTH might ultimately reach 40 percent, or eight million locations.  


But not all that is an incremental gain, as the telcos already serve four million to six million locations in the areas where the new FTTH goes. 
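A minimal sketch of that arithmetic, using the assumed take rates above (illustrative figures, not forecasts), shows how 20 million new FTTH passings translate into only a few million incremental accounts.

```python
# Illustrative FTTH take-rate arithmetic using the assumptions above (not a forecast).
new_ftth_passings = 20_000_000

# Assumed take rates on the same locations: existing copper DSL vs. eventual FTTH.
dsl_take_low, dsl_take_high = 0.20, 0.30
ftth_take = 0.40

# Incremental share gain is the gap between the FTTH take rate and the
# existing DSL take rate, i.e. 10 to 20 percentage points.
gain_low = (ftth_take - dsl_take_high) * new_ftth_passings    # 2 million
gain_high = (ftth_take - dsl_take_low) * new_ftth_passings    # 4 million

total_ftth_accounts = ftth_take * new_ftth_passings           # 8 million
existing_low = dsl_take_low * new_ftth_passings               # 4 million already served
existing_high = dsl_take_high * new_ftth_passings             # 6 million already served

print(f"Net new accounts: {gain_low/1e6:.0f} to {gain_high/1e6:.0f} million")
print(f"Total FTTH accounts at a 40% take rate: {total_ftth_accounts/1e6:.0f} million")
print(f"Locations already served on copper: {existing_low/1e6:.0f} to {existing_high/1e6:.0f} million")
```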


The growth will be quite welcome, to be sure, since telcos have been losing installed base for two decades. A gain of 10 to 20 percentage points of share will be important, but probably not a business model revolution, for AT&T and Verizon. Fixed networks overall contribute relatively little revenue, and even less profit. 


Some other providers who possess only fixed assets might find the financial upside more important, since, for them, fixed networks generate 100 percent of revenue.


Did the U.S. National Broadband Plan Succeed, or Not?

Did the U.S. National Broadband Plan fail or succeed? Some argue the plan failed. Others might argue it clearly has succeeded. So what is the truth of the matter? It actually is hard to say. 


There are but two quantifiable goals stated in the plan. 


The document says a goal, at the end of 10 years, is connecting 100 million U.S. homes with “affordable” access to actual download speeds of at least 100 megabits per second and actual upload speeds of at least 50 Mbps. 


Another goal was to provide gigabit per second access to anchor institutions such as schools, hospitals and government buildings. 


All the other goals are not quantifiable, except in “yes-no” fashion: did a recommended action actually happen within the plan time frame or not? As with many plans, the issue is targets,  frameworks and rule changes, rather than quantifiable outcomes. 


The plan was couched in terms of “goals” that are either hard to quantify, require the cooperation of many entities in the ecosystem or are not easy to define. Also, the plan itself says it is a “roadmap,” not a firm set of outcomes. 


The plan itself mostly deals with what the government can do, in response to a Congressional mandate to provide a “detailed strategy for achieving affordability and maximizing use of broadband to advance consumer welfare, civic participation, public safety and homeland security, community development, health care delivery, energy independence and efficiency, education, employee training, private sector investment, entrepreneurial activity, job creation and economic growth, and other national purposes.”


Some cite the portions of the plan described as “long term” goals, when making their evaluations of plan success. Also, keep in mind that the plan itself was only designed to facilitate commercial actions by others. The government’s role was limited to spectrum allocation and other policies that create incentives for other actors to fulfill. 


So what of the two numerical outcomes? Are 100 million U.S. homes presently buying “affordable” access at 100 Mbps downstream speeds? First off, “affordable” is not quantified and is a matter of interpretation. But are 100 million U.S. homes buying internet access at 100 Mbps?


According to measurements by Speedtest, the average U.S. consumer on a fixed network is getting access at between 124 Mbps and 166 Mbps. And Speedtest reports that 61 percent of all U.S. fixed network internet access services purchased by consumers offer 100 Mbps or higher speeds.


But there is a difference between supply and demand. The plan specified only demand, not supply. 


Current supply exceeds what the plan called for. But current demand is lower. Only six in 10 customers choose to buy a service operating at 100 Mbps or faster. So about 40 percent choose service operating at less than 100 Mbps or, some will note, perhaps cannot buy such service. 


Assume there are 139.4 million U.S. households. Assume fixed internet access is purchased by 80 percent of households. That implies a total of 111.5 million locations buying internet access. That seems too high. 


But assume only 126.7 million housing units actually are occupied. If 80 percent of occupied housing units buy fixed network broadband, that suggests there should be about 101 million subscriptions. That accords with other estimates. 


If 61 percent of those locations buy internet access at 100 Mbps or faster, then 61 million U.S. customers choose to buy service at 100 Mbps or faster. That, of course, is far less than the National Broadband Plan called for. 
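The household arithmetic above can be summarized in a short sketch; the inputs are the estimates cited in this post.

```python
# Household arithmetic from the estimates cited above.
occupied_housing_units = 126_700_000
fixed_broadband_buy_rate = 0.80        # share of occupied units buying fixed internet access
share_at_100mbps_or_faster = 0.61      # Speedtest figure cited above

subscriptions = occupied_housing_units * fixed_broadband_buy_rate
at_100mbps = subscriptions * share_at_100mbps_or_faster

print(f"Fixed broadband subscriptions: {subscriptions/1e6:.0f} million")   # about 101 million
print(f"Buying 100 Mbps or faster: {at_100mbps/1e6:.0f} million")          # about 61 million
```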


Provisioned speeds--those bought by customers--differ from available speeds, in other words. So should the plan have differentiated between available and provisioned speeds? We cannot say, at this point. So the evaluation of “did the plan achieve its goals” also is a matter of opinion, not “truth.”


Even in hard-to-serve rural areas, 60 percent of residents can buy internet access at speeds of at least 100 Mbps. That does not mean they do so. So what is the “truth” of the matter?


While it is difficult to measure speed, actual U.S. broadband speed is more than 100 Mbps, on average, according to Akamai in 2017. Upstream speeds vary by location, but are at or above plan goals in most cities, with performance varying by provider.   


But is access “affordable?” That is a matter of opinion, not fact. Still, prices have fallen significantly. 


“The most popular tier of broadband service in 2015 (BPI-Consumer Choice) is now priced 20.2 percent lower and offers 15.7 percent faster speeds in 2020 on an average subscriber-weighted basis,” says USTA. 


“The highest speed offerings in 2015 (BPI-Speed) are now priced 37.7 percent lower and offer 27.7 percent faster speeds in 2020 on an average subscriber-weighted basis,” USTA says.

 

“When inflation is considered, the real price of the most popular tier of broadband service has dropped 28.1 percent since 2015; and the real price of the highest speed broadband service has dropped 43.9 percent,” USTA notes. 


At the same time, cost per Mbps has dropped 37.9 percent for the most popular service and 56.1 percent for the highest speed service, says USTA. 
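The relationship between the nominal and inflation-adjusted declines can be illustrated with a small sketch; the roughly 11 percent cumulative 2015-2020 inflation figure used here is inferred from the USTA numbers above, not quoted from the report itself.

```python
# Sketch of the nominal-versus-real price relationship. The ~11 percent
# cumulative 2015-2020 inflation figure is inferred from the USTA numbers
# cited above, not taken from the report.
cumulative_inflation_2015_2020 = 0.11

def real_change(nominal_change: float, inflation: float) -> float:
    """Convert a nominal price change into an inflation-adjusted ('real') change."""
    return (1 + nominal_change) / (1 + inflation) - 1

print(f"Popular tier: {real_change(-0.202, cumulative_inflation_2015_2020):.1%}")   # about -28%
print(f"Highest speed: {real_change(-0.377, cumulative_inflation_2015_2020):.1%}")  # about -44%
```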


Beyond that, an outcome not specified was “Can 100 million U.S. homes purchase--if they choose--internet access at 1 Gbps downstream rates?” The answer to that unasked question also unquestionably is “yes.” Looking solely at cable operators, that portion of the industry alone has 72 million actual accounts. Not all are consumer accounts. 


The cable industry says cable operators alone pass 80 percent of U.S. homes with networks selling gigabit per second internet access, while about 90 percent of U.S. homes can buy high-speed access, though at rates less than 1 Gbps. 


Keep in mind that U.S. telcos have about 33 million internet access accounts, but cable operators have about 72 million accounts, or 70 percent of the installed base. 


So how about gigabit service for anchor institutions? Consider rural areas, where 59 percent of schools, for example, have broadband at gigabit per second rates or higher. Beyond that, it is hard to see what percentage of schools, hospitals and other anchor institutions presently have gigabit connections. In perhaps 80 percent of communities, that is possible. 


Still, the truth of the matter--whether the plan succeeded or not--is clouded by opinions. 


“What is truth?” occupies much of traditional philosophy, and still manages to maintain its relevance, even in the communications business. 


Truth is that which accords with reality, Wikipedia notes. Another way of saying this is that truth accords with facts, there being a difference between what is factual and what is merely opinion. A related key concept is that there is a difference between a fact and a value. 


And there often is more “value” or “opinion” than “truth” in most parts of the industry. Consider almost any claims made by marketing staffs, industry lobbyists or the policy advocates who oppose them, public officials or customers. 


The adage that “you are entitled to your own opinions but not your own facts” encapsulates the issue. “Facts” are open to interpretation. Is any country, state, province, city or service provider “doing well” in terms of its deployment of advanced information or communications connectivity? 


That is consequential, since the oldest school of thought on the question, dating at least to the time of Aristotle, asserts that truth is that which corresponds with facts. 


How successful are we at value generation and social or educational benefit? And what is the basis for such evaluations? Quite often, we do not agree on the facts. If truth is that which accords with the facts, then contention is inevitable. 


There are more modern systems as well. In the mid-19th century the focus shifted away from individual propositions and towards a system of logically interrelated components, or a web of belief.


Postmodernism--especially in its radical deconstructionist variants--essentially abandons the notion that truth is absolute. Most radical postmodernists do not distinguish acceptance as true from being true; they claim that the social negotiations among influential people “construct” the truth.


The deconstructed view essentially permits a definition of “truth” which is merely “opinion,” albeit opinion ratified by its acceptance. 


In the early 20th century a school of pragmatists used what we might call a scientific framework, suggesting that what is true is that which works. One obvious issue is that truth becomes relative, since trial, error and interpretation are required to determine what “works.” 


The point is that the difference between fact and value is not as clear as you might think, in non-mathematical endeavors especially. By extension, it is not as easy to determine “truth” from “falsehood,” either. The social constructionists argue that it is simply a matter of the imposition of power. 


As in Lewis Carroll’s book Through the Looking Glass, “truth” is subjective. 


"When I use a word," Humpty Dumpty said, in rather a scornful tone, "it means just what I choose it to mean—neither more nor less."


"The question is," said Alice, "whether you can make words mean so many different things."


"The question is," said Humpty Dumpty, "which is to be master—that's all." 


Friday, November 20, 2020

U.S. Telcos Gain Net New Broadband Accounts in 3Q 2020

It might be too early to say whether this is a trend, but U.S. telcos actually gained net broadband accounts in the third quarter of 2020, gaining about 14 percent of total net new additions in the quarter, while cable TV operators got the rest, about 86 percent. 


That might not sound like a big deal, but in most quarters over the last decade, telcos collectively have lost broadband accounts. There were a couple of exceptions, but the net positive growth is quite a change. 


Net Broadband Additions, U.S. Telcos, Third Quarter 2020

Provider              Subscribers (end 3Q 2020)   Net Adds (3Q 2020)
AT&T                  15,375,000                  174,000
Verizon                7,069,000                  110,000
CenturyLink/Lumen^     4,563,000                  (75,000)
Frontier               3,119,000                  (23,000)
Windstream             1,102,300                   12,900
Consolidated             792,211                    1,008
TDS                      487,700                    8,200
Cincinnati Bell          434,500                    2,500

source: Leichtman Research
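A quick sketch using the net-addition figures in the table shows the collective telco gain and how it squares with the roughly 14 percent share cited above; the implied market-wide and cable totals are back-of-the-envelope inferences, not Leichtman figures.

```python
# Net broadband additions for the listed telcos, 3Q 2020 (from the table above).
net_adds = {
    "AT&T": 174_000,
    "Verizon": 110_000,
    "CenturyLink/Lumen": -75_000,
    "Frontier": -23_000,
    "Windstream": 12_900,
    "Consolidated": 1_008,
    "TDS": 8_200,
    "Cincinnati Bell": 2_500,
}

telco_total = sum(net_adds.values())   # roughly 210,000 net additions
telco_share = 0.14                     # telco share of total net additions, cited above

# Back-of-the-envelope inference of total market and cable net additions.
implied_market_total = telco_total / telco_share
implied_cable_total = implied_market_total - telco_total

print(f"Telco net additions: {telco_total:,}")
print(f"Implied market net additions: {implied_market_total:,.0f}")
print(f"Implied cable net additions: {implied_cable_total:,.0f}")
```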


Thursday, November 19, 2020

FCC Reallocates 45 MHz for Wi-Fi

The Federal Communications Commission has moved to make 45 MHz of spectrum in the 5.9 GHz band (5.850-5.925 GHz) available for unlicensed uses such as Wi-Fi, indoors and outdoors. 


The new band plan designates the lower 45 megahertz (5.850-5.895 GHz) for unlicensed uses and the upper 30 megahertz (5.895-5.925 GHz) for enhanced automobile safety using Cellular Vehicle-to-Everything (C-V2X) technology.
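As a simple illustration, the band plan described above can be summarized as a small data structure; this is just a restatement of the allocation, not any formal FCC specification.

```python
# Illustrative representation of the FCC's new 5.9 GHz band plan described above.
band_plan_5_9ghz = [
    # (start GHz, end GHz, width MHz, designated use)
    (5.850, 5.895, 45, "unlicensed uses such as Wi-Fi, indoors and outdoors"),
    (5.895, 5.925, 30, "Cellular Vehicle-to-Everything (C-V2X) automobile safety"),
]

total_mhz = sum(width for _, _, width, _ in band_plan_5_9ghz)
print(f"Total 5.9 GHz band covered: {total_mhz} MHz (5.850-5.925 GHz)")   # 75 MHz
```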


The 45 MHz reallocated for Wi-Fi formerly was assigned to Dedicated Short-Range Communications (DSRC) services more than 20 years ago, but DSRC has not been meaningfully deployed, and the spectrum has largely been unused for decades, the FCC notes. 


The FCC will propose technical rules for outdoor unlicensed operations across the United States using the new unlicensed spectrum.


One reason the FCC prefers not to make spectrum allocations which mandate use for particular purposes is precisely what happened with DSRC. Sometimes new proposed uses do not develop, and the assigned spectrum lies fallow. The preference these days is for general purpose assignments that are not application specific. 


In a sense, that also parallels the movement of all communications networks, especially public networks, away from application-specific use and towards general-purpose or multi-service modes of operation.


What Declining Industry Can Afford to Alienate Half its Customers?

Some people believe the new trend of major U.S. newspapers declining to make endorsements in presidential races is an abdication of their “p...