Terms such as "digital redlining" imply that U.S. internet service providers upgrade neighborhoods able to pay for higher-speed internet access while underinvesting in poorer neighborhoods. At some level, it is hard to argue with that point of view, at least when it comes to gigabit internet access.
Google itself pioneered the tactic of building where demand is demonstrated, deploying Google Fiber first in neighborhoods (typically higher-income areas) where potential customers showed the most interest. Other gigabit service providers have used customer deposits to gauge demand for the same reason.
And local regulatory officials now seem to agree that "universal service" (building a gigabit network past every home and business) is desirable in some cases, but not absolutely mandatory in all cases. The thinking is that allowing new internet service providers or facilities to be built wherever possible is a better outcome than requiring ubiquity and getting nothing.
Also, higher-speed facilities often are not available everywhere within a single market or city. CenturyLink does sell gigabit internet access in Denver, just not everywhere in the metro area. That is not necessarily "redlining," but more likely reflects the capital available to invest, expectations about financial return, customer density or some other combination of business issues that discourages investment in new access facilities.
The economics of communications networks also are clear: density and cost per location are inversely related. In mobile networks, roughly 10 percent of cell sites typically support 50 percent of usage, and about 30 percent of sites carry about 80 percent of traffic. That has been true since at least the 3G era.
In fixed networks, cost and density likewise are inversely related, so population density has a direct bearing on network costs. In the U.S. market, network unavailability is concentrated in the last couple of percent of locations.
With cable operators already holding at least a 70 percent share of the installed base of internet access customers, any new investment in faster facilities faces a tough challenge. Any new fiber-to-home network, for example, essentially is playing catch-up to a cable operator, as roughly 80 percent of U.S. households already are reached by gigabit-speed cable networks.
And cable's share has grown, up from perhaps 67 percent in 2017.
That noted, internet speeds do vary by geography: speeds in urban areas frequently are higher than in rural areas. But the argument that large numbers of U.S. households are underserved often is correct, depending on what standard one wishes to apply and how one defines the supplier market.
Some claim 42 million U.S. residents are unable to buy broadband internet access, defined as a minimum speed of 25 Mbps downstream. That actually is incorrect.
Virtually every household in the continental United States is able to buy 25 Mbps or faster service from at least two different satellite providers. But those who claim "42 million" people cannot buy broadband simply ignore those choices, focusing only on the claimed availability of 25 Mbps service from fixed network providers.
There are other estimates that also vary wildly. Roughly 10 percent of U.S. households are in rural areas, the places where it is most expensive to install fast fixed network internet access facilities, and where the greatest speed gaps, compared to urban areas, almost certainly continue to exist.
In its own work with TV white spaces, Microsoft has targeted perhaps two million people, or roughly a million households, that have no fixed network internet access. That assumes two people living in a typical household, which is below the U.S. average of roughly 2.3 to 2.5 persons per household.
Recall that the definition of broadband is a minimum of 25 Mbps downstream. Microsoft has argued that 20 million people (about 10 million homes), or perhaps eight percent of the population (perhaps four percent of homes), cannot get such speeds from any fixed network service provider.
Microsoft also has cited figures suggesting 25 million people cannot buy broadband, presumably using the 25 Mbps minimum standard, with most of those people living in rural areas.
That conflicts with data from Openvault that suggests 95 percent of the U.S. population can buy internet access at a minimum of 25 Mbps, while 91 percent to 92 percent can buy service at a minimum of 100 Mbps.
Using the U.S. average of 2.5 persons per household, the 25-million-person figure suggests a universe of about 10 million U.S. homes unable to purchase internet access at 25 Mbps from a fixed network supplier in 2018. What is not so clear is the percentage of households or persons who can do so using a mobile network.
None of that explains slow speeds in urban areas, though. There the issue is more likely to be high construction costs where underground construction is necessary, along with demand expectations that are lower than in suburban areas. That is true whether electrical lines or communications networks are being considered.
But at least one Microsoft analysis suggests that about half of all U.S. households are not using 25 Mbps access. The claim is that 162.8 million people are "not using the internet at broadband speeds." That seems clearly to contradict data gathered by firms such as Ookla and Opensignal suggesting that average U.S. speeds are in the triple digits.
In 2018, the average U.S. broadband speed was 94 Mbps, according to the NCTA. That same year, Ookla reported the average U.S. speed was 96 Mbps.
It is not quite clear how the Microsoft data was generated, though one blog post suggested it was based on an analysis of “anonymized data that we collect as part of our ongoing work to improve the performance and security of our software and services.”
The claim of 162.8 million people "not using the internet at broadband speeds" (probably using 25 Mbps as the definition) equates to about 65 million households, using the 2.5 persons per household average. That does not seem to match other data, including the statistics Microsoft itself cites.
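The household arithmetic behind these comparisons can be laid out explicitly. The sketch below is a back-of-the-envelope calculation, not part of any cited analysis: it uses the 2.5 persons-per-household average from the text, and assumes a total of roughly 128 million U.S. households in 2018, an outside approximation introduced here for the comparison.

```python
# Back-of-the-envelope conversion of people-based broadband claims into
# household counts. The people figures are those cited in the text; the
# total-household count (~128 million in 2018) is an approximation.

PERSONS_PER_HOUSEHOLD = 2.5     # U.S. average used in the text
TOTAL_US_HOUSEHOLDS_M = 128.0   # approximate 2018 total, in millions (assumption)

def people_to_households(people_millions: float) -> float:
    """Convert a count of people (in millions) into households (in millions)."""
    return people_millions / PERSONS_PER_HOUSEHOLD

# ~25 million people said to be unable to buy 25 Mbps fixed service
print(people_to_households(25.0))            # -> 10.0 million households

# 162.8 million people "not using the internet at broadband speeds"
not_using = people_to_households(162.8)      # -> ~65 million households
print(not_using)
print(not_using / TOTAL_US_HOUSEHOLDS_M)     # -> ~0.51, about half of all households
```

By that arithmetic, the "not using" claim implies roughly half of all U.S. households, which is precisely why it sits so uneasily beside the availability figures cited above.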
What might explain the divergence, though it remains difficult to verify, is if the measured applications and services include apps running on smartphones as well as on PCs and other devices connected to fixed networks. That would explain the number of users, while usage on mobile networks might account for large numbers of sessions where downstream speeds of 25 Mbps were not recorded, or perhaps the upstream speed definition (a minimum of 3 Mbps) was the issue.
Even then, average 4G downstream speeds in 2018 were in excess of 40 Mbps, so that explanation also is a bit difficult.
Perhaps there are other ways to make sense of the data. There is a difference between users (people) and households; between usage and availability; between usage by device (mobile, PC, tablet, gaming device, sensor); and between application bandwidth and network bandwidth.
Perhaps the issue is application performance across a wide range of devices, including mobile phones and untethered devices using Wi-Fi, which would reduce average experienced speeds compared to "delivered access speed."
Methodology does matter. So do the costs and benefits of broadband capital investment under competitive conditions, in areas with high construction costs or low demand for advanced services, especially when newer platforms with better economics are being commercialized.
Telecommunications is a business like any other: investments are made in the expectation of profit. Where a sustainable business case does not exist, high-cost area subsidies or universal service support step in.
The point is that every human activity has a business and revenue model: it can be product sales, advertising, memberships, subscriptions, tax support, fees, donations or inheritances. Children have a “parents support me” revenue model, supported in turn by any of the aforementioned revenue models.
But every sustainable activity has a revenue model, direct or indirect. The whole global communications business now operates on very different principles from the monopoly business that prevailed before the 1980s. We still have a "universal service" low end, but we increasingly rely on end user demand to drive the high end.
Our notions of the low end also change, moving higher over time. We once defined "broadband" as any data rate of 1.544 Mbps or higher. These days we might use functional definitions of 25 Mbps or 30 Mbps. Recall that 30 Mbps, in 2020, was called "superfast" as a goal for U.K. fixed network broadband.
Few of us would consider 30 Mbps "superfast" any longer. Some might say the new "superfast" is gigabit-per-second speeds. But that illustrates the change in real-world communications over just a decade: what was a goal in 2010 now is far surpassed.
What some call “redlining” is simply a response to huge changes in the internet access and communications business. “Maximum” is a moving target that responds to customer demand. “Minimums” tend to be set by government regulators in search of universal service.
As independent internet service providers cherry-pick service areas where they believe demand for gigabit-per-second internet access is greatest, so do incumbents.
Similar choices are made by providers of metro business services, builders of subsea connectivity networks, and suppliers of low earth orbit satellite constellations and fixed wireless networks. They build first, or pick customer segments, where they think demand is greatest.