Thursday, August 22, 2019

City of New York Internet Access: Is it Worse than National Averages?

“Lies, damned lies and statistics” is the quip popularized by Samuel Clemens, the author better known as Mark Twain, about statistics and their relationship to “truth.” It is worth keeping in mind.

Consider statistics cited by the City of New York about residents who do not buy internet access.

As many as 917,239 New York City households, or 29 percent of all households, “are without broadband internet access,” a report by the city of New York indicates. The phrase is meant to indicate that this percentage of households do not buy fixed network internet access, not to describe the availability of internet access to those households. 

Either meaning would be surprising to many, especially when compared with other coastal cities such as Seattle, where the lowest percentage of homes buying fixed network internet access is 93 percent, and the “average” buy rate is about 96 percent.

Keep in mind that nationally, U.S. household purchasing of fixed network internet access is about 77 percent, according to the Pew Research Center. If that seems low, consider that between 17 percent and 20 percent of U.S. households are “mobile-only” for internet access. In that context, the 71 percent buy rate implied for New York City is roughly in line with national figures.

Actually, the New York figures arguably are close to U.S. national averages. According to the Pew Research Center, 17 percent of U.S. households are mobile only for internet access, including 23 percent of black households and 25 percent of Hispanic households. 

The New York study claims 30 percent of Hispanic and black New Yorkers do not buy fixed network internet access.

According to Pew, some 26 percent of U.S. households with household income of less than $30,000 are mobile-only and do not buy fixed network internet access.

The New York City study says “44 percent of New Yorkers in poverty” do not buy fixed network internet access. Two caveats: persons are not households, and fixed network internet access is purchased “by the household.” 

Also, keep in mind that New York uses a different definition of poverty, setting the benchmark higher at $33,600 per household. The New York definition is about 12 percent higher than the U.S. federal government definition. 

In other words, New York buying rates for lower-income households are roughly in line with national averages, when considering the different definitions. The Pew data is based on households. 

The New York City study sometimes uses “persons” as the unit of analysis. But converting 44 percent of persons, using a metric of 2.4 persons per household, yields a non-buy rate of about 18 percent for homes in poverty. Again, that is in line with U.S. national averages.

Likewise, about 26 percent of U.S. households headed by people with a high school diploma or less are mobile-only for internet access and do not buy fixed network access, according to Pew.

The New York City study says 33 percent of New Yorkers who are high school graduates do not buy fixed network internet access. That might represent about 14 percent of households. 

Of households headed by someone with less than a high school degree, about 41 percent of people do not purchase fixed network internet access. That might represent about 17 percent of households. 

If the populations represented by high school graduates and less-than-high-school persons are exclusive of each other, then possibly 31 percent of New York City households headed by someone with a high school degree or less do not buy fixed network internet access.
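The persons-to-households conversion used here can be sketched in a few lines. This is an illustrative back-of-envelope calculation assuming the 2.4 persons-per-household figure; the rounding is for illustration, not taken from the city report.

```python
# Back-of-envelope conversion of person-level non-buy rates to rough
# household-level rates, assuming 2.4 persons per household.
PERSONS_PER_HOUSEHOLD = 2.4

def household_rate(person_rate_pct: float) -> float:
    """Approximate household non-buy rate implied by a person-level rate."""
    return person_rate_pct / PERSONS_PER_HOUSEHOLD

poverty = household_rate(44)   # "44 percent of New Yorkers in poverty"
hs_grad = household_rate(33)   # high school graduates
less_hs = household_rate(41)   # less than a high school degree

print(round(poverty))                  # about 18 percent of households
print(round(hs_grad), round(less_hs)) # about 14 and 17 percent
print(round(hs_grad + less_hs))       # about 31 percent, if the groups are exclusive
```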

The point is that people and households living in New York City decline to buy fixed network internet access at rates that are roughly in line with U.S. national averages.

Sunday, August 18, 2019

5G Coverage Actually Will Not be a Problem, Even for Verizon

Many discussions of 5G spectrum seem to center on where all the capacity (or coverage) will come from. The confusion is understandable. If every area where 5G is made available has to support the sorts of speeds millimeter wave small cells will support, there are serious questions. Verizon is a case in point.

Most agree coverage is better supplied by low-band or mid-band assets. But some mobile operators--Verizon among them--might have relatively little low-band or mid-band spectrum to deploy, at the moment.

Of course, more mid-band spectrum is coming, so that is unlikely to be a long-term issue. The immediate problem is how to support the early 5G rollout if mid-band or low-band assets are scarce.

The answer lies in market demand. All hype aside, 5G really is needed at some locations because 4G capacity is going to prove inadequate in a couple of years.

But 5G capacity will not have to be evenly supplied. In fact, it will be highly uneven, or unequal, as has been the case for 3G and 4G. It is likely that, in terms of coverage, 5G speeds will resemble good 4G, and will not rely very much on the more-exotic millimeter wave assets.

Rather, low-band and mid-band frequencies will be more common for coverage purposes in rural areas.


There are several reasons. Spectrum used for 5G coverage will tend to be in lower frequency ranges simply because lower frequencies--while not best for capacity--are best for covering distances. Millimeter wave is optimal for bandwidth, but not for distance covered.

In some cases, 5G devices actually will use a combination of new 5G spectrum and 4G spectrum, depending on where each connection takes place. In dense urban areas connections might often use new millimeter wave capacity. In rural areas 5G devices often will default to 4G. 

The point is the demand for mobile capacity is quite unevenly distributed. T-Mobile has noted in the past that 20 percent of 3G cells carry 60 percent of all 3G traffic. About half of all cell sites support 95 percent of 3G traffic. 

That pattern was true of 4G networks and undoubtedly will be true of 5G networks as well. T-Mobile also has noted that, for any typical user, half of all data traffic is consumed from just one cell tower. About 80 percent of any typical user’s data consumption happens on just three cell sites. 


Studies conducted by Amdocs have suggested that 20 percent of mobile network locations support about 80 percent of network data demand. 
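Taken together, the cited T-Mobile figures imply a sharply skewed distribution. A quick sketch of what they imply for per-site traffic intensity (illustrative arithmetic on the cited percentages only):

```python
# Traffic concentration implied by the cited 3G figures:
# the top 20 percent of cells carry 60 percent of traffic,
# and the top 50 percent carry 95 percent.
top20_sites, top20_traffic = 0.20, 0.60
top50_sites, top50_traffic = 0.50, 0.95

# The bottom half of all sites carries the remainder
bottom50_traffic = 1.0 - top50_traffic               # about 5 percent

# Traffic per site, relative to a uniform distribution
busy_intensity = top20_traffic / top20_sites              # about 3x average
quiet_intensity = bottom50_traffic / (1.0 - top50_sites)  # about 0.1x average

print(round(bottom50_traffic, 2), round(busy_intensity, 1), round(quiet_intensity, 1))
```

In other words, the busiest fifth of sites runs at roughly 30 times the per-site load of the quietest half.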

Data consumption might also be related somewhat directly to revenue generation, if one assumes that where a customer uses most of his or her data is where the value of the subscription lies. Though coverage is “nice” where one does not need to use the network, it is a “must” where any particular user uses mobile networks daily. 

Some estimates suggest that as much as half the total revenue (value or usage) happens when users communicate on about 10 percent of total cell sites. Perhaps 80 percent of usage or revenue is generated on about 30 percent of cell sites.

It is worth noting that the half of all cell sites generating about 10 percent of service provider revenue are in rural areas. There simply are not that many people in rural areas, and lots of people mean lots of usage. 



So part of the answer to the question of “where will all the 5G spectrum come from?” requires understanding that demand is highly unequally distributed. Rural areas might have half of all cell sites, but support only 10 percent of data demand. 

Urban sites might be only about 10 percent of total sites, but support as much as half of all data demand, simply because that is where most of the people are. 
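A rough sketch of the per-site demand gap those shares imply, using the approximate percentages above (the exact figures will vary by operator):

```python
# Per-site demand implied by the rural/urban split sketched above:
# rural areas hold half of all sites but only about 10 percent of demand;
# urban areas hold about 10 percent of sites but half of demand.
rural_sites, rural_demand = 0.50, 0.10
urban_sites, urban_demand = 0.10, 0.50

rural_per_site = rural_demand / rural_sites   # about 0.2x the network average
urban_per_site = urban_demand / urban_sites   # about 5x the network average

# A typical urban site carries roughly 25 times the demand of a rural site
print(round(urban_per_site / rural_per_site))
```

That 25-to-1 gap is why dense millimeter wave deployments can be confined to a small fraction of total sites.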

Spectrum resources will come from new low band capacity, repurposing legacy capacity, new mid-band capacity, millimeter wave assets, spectrum sharing, spectrum aggregation, offload mechanisms and use of smaller cells. 

But the supply is highly unequal: lots of new capacity in urban areas, some new supply in suburban areas and relatively light capacity needs in rural areas. The new 5G networks will not have to support urban small cell capacities “everywhere” across the whole network.

Friday, August 16, 2019

"5G or 4G" Will Not Matter for Most Users, in Practice

It already is tough to determine what “5G speed” actually means, as early 5G often is based as much on 4G as 5G.

And the problem of assessing 5G speed is going to get worse. We will have to compare 5G primarily using low-band assets, which will have good coverage but restrained speeds; 5G using millimeter wave, which will have extraordinary speed but limited coverage; 5G using mid-band that offers a mix of coverage and capacity, but might also be using 4G representing nearly half of total performance; and 5G using a mix of frequencies and spectrum aggregation.

Beyond all that, 5G devices might be connecting to two or more radio sites at once, further complicating our understanding of which network (5G, 4G, unlicensed or licensed) is being used, at a moment in time.

It soon will only be clear that a particular 5G device, on a particular network, at a specific location, works well, or does not work so well. The actual mix of networks (5G, 4G, licensed and unlicensed; cell locations used simultaneously) might vary quite a lot.

Speed and cost measurements on a cross-country basis--both fixed and mobile--have been contingent. Choices have to be made about what and how to measure (which plans, across all countries, at what speeds, price points, uptake volumes, including promotions and other buying behavior). 

Then adjustments might have to be made based on household sizes (to get per-user metrics); geography (relatively more urban or rural; large or small country) or pricing power differentials between countries. 

All of that will become more complicated in the 5G era, when virtually any spectrum can be used to support 5G services, with clear and distinctive coverage and capacity profiles, depending on which frequencies are used, and in what mix. 

5G can be used in a legacy-free manner, though perhaps rarely, using only “new” millimeter and mid-band frequencies. It might use a combination of new and legacy frequencies (high, mid and low band assets). 

5G might use spectrum within the low bands (new and legacy), or combine low and mid-band assets. Perhaps the most-common approach will be a mix of spectrum bands. 


Both 4G and 5G spectrum also can be used to support a 5G device, further complicating matters. 

That perhaps already is clear in South Korea, where mid-band frequencies support 5G but where, in many cases, a combination of mid-band and 4G spectrum actually supports usage, although 28 GHz also is authorized and will be used, at some point. 


Some recent tests have used devices able to access 1.5 Gbps of 5G bandwidth using SK Telecom’s 3.5 GHz spectrum, plus 1.15 Gbps of 4G bandwidth at 1.8 GHz, 2.1 GHz and 2.8 GHz frequencies. 
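Those figures illustrate how much "early 5G" throughput actually rides on 4G. The arithmetic, using only the numbers cited above:

```python
# Peak aggregated throughput in the cited SK Telecom test:
# 1.5 Gbps of 5G (3.5 GHz) plus 1.15 Gbps of carrier-aggregated 4G.
nr_gbps = 1.5
lte_gbps = 1.15

total_gbps = nr_gbps + lte_gbps    # about 2.65 Gbps peak
lte_share = lte_gbps / total_gbps  # 4G supplies roughly 43 percent of the peak

print(total_gbps, round(lte_share * 100))
```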

The point is that 5G access is going to be quite heterogeneous. There will be many ways of supplying 5G access, and performance will vary based on how the access is supplied. Even when 4G spectrum is not used (dynamic spectrum sharing, spectrum aggregation), 5G capacity will vary based on which bands of spectrum are used, and especially when millimeter wave or mid-band spectrum is available. 

Low-band 5G will be faster than 4G, but less so than when mid-band and high-band assets are used. 

But many early 5G deployments will aggregate 4G with 5G. In other cases 5G might be aggregated with unlicensed spectrum. In still other cases, access on 5G handsets might default entirely to 4G in rural areas.

And 4G will keep getting faster, closing the gap with 5G using the coverage frequencies (low-band and mid-band). So even when a 5G device defaults to 4G, the speed experience might not vary too much from 5G. 

The point is that interpreting 5G speeds is going to become highly contingent. Stand-alone 5G is going to be different than non-stand-alone (using 4G). 5G experience will hinge on which frequency bands are used, and what types of spectrum aggregation are possible at specific locations. 

Australia NBN Seems to be Breaking Business Models

Telecom service provider wholesale policies can make or break business models. Back in the early years of the 21st century, U.S. regulators briefly relied on robust wholesale discounts--below actual cost, in many instances--for network service customers, in hopes of stimulating more competition in telecom services markets. 

The policies allowed competitors to buy and use complete services--provisioned lines--to support competitive voice services. The framework allowed firms such as the then-independent AT&T and MCI to grow their retail phone services businesses. 

It all collapsed when the Federal Communications Commission changed its rules and allowed network services suppliers to negotiate market-rate prices. 

Australia seems to be suffering from a related problem with its National Broadband Network, as there is but one supplier of wholesale network services, leading Telstra to experience a 40-percent drop in profits, year over year.

The country’s largest service provider (which holds 41.5 percent of telco industry market share) saw its profits plummet 40 percent in the 2018-2019 financial year, largely because of NBN wholesale tariffs. 

Telstra’s largest rival, Optus, similarly saw profits fall by 32 percent in the same period. 

That is one clear danger for all telecom regimes relying on a single wholesale network services supplier. The brief U.S. policy reliance on wholesale service supply--seen as a stepping stone to facilities investment--rather quickly was replaced by the alternative of facilities-based competition, which essentially worked because cable TV operators were able to use their own networks to compete in voice and internet access services. 

The 1996 Telecom Act's focus had two goals: to open the local exchange market to competition (by stimulating facilities-based investment) and to promote expanded competition within the long-distance marketplace.

In retrospect, the focus on voice services--at a time when the internet was emerging--seems misplaced. A fundamental change in policy aiming to change voice market share was unveiled precisely at the point that voice services began a long-term decline. 

The primary goal was to provide residential customers with choice and innovation in their local voice telephone service. After nearly seven years, though choice increased for urban customers, investment by the incumbent and competitive carriers was virtually nonexistent.

The problem is compounded by the decline of every legacy revenue stream the wholesale infrastructure is supposed to enable, with declining average revenue per user now a global trend. 

Under such conditions, wholesale prices “need” to be reduced, as retail value is less, so retail price needs to decline. Whether that is possible, and to what extent, is the issue for the NBN.

Thursday, August 15, 2019

Perhaps 4% of U.S. Households Buy Gigabit Internet Access

A new study of U.S. consumer internet access behavior might shed some light on typical speed tiers purchased by customers. The monthly weighted average data consumed by subscribers in the second quarter of 2019 was 271 gigabytes, according to OpenVault.

The median monthly weighted average usage in the second quarter of 2019 (half use more, half use less) was 144.5 GB, up nearly 32 percent from 109.6 GB in the second quarter of 2018. 

About 43 percent of customers using less than 250 GB per month use services operating between 100 Mbps and 150 Mbps, according to OpenVault. 

About 15 percent buy services running between 200 Mbps and 300 Mbps. Another 19 percent purchase services running between 50 Mbps and 75 Mbps. 

Altogether, some 77 percent of U.S. households buy internet access running between 50 Mbps and 300 Mbps. 

About four percent buy gigabit services. 
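The tier shares cited from OpenVault can be tallied directly (shares as cited; they need not sum to 100 because some tiers are omitted):

```python
# Share of U.S. households by purchased speed tier, per the cited
# OpenVault figures (percent).
tiers = {
    "50-75 Mbps": 19,
    "100-150 Mbps": 43,
    "200-300 Mbps": 15,
}
gigabit = 4

midrange = sum(tiers.values())  # households buying 50 Mbps to 300 Mbps
print(midrange, gigabit)        # 77 and 4 percent
```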


Wednesday, August 14, 2019

Value Too Low, Price Too High for Linear Video Subscriptions

Almost by definition, consumers who choose not to buy a popular product such as linear subscription video have objections based on value, price, or both. “Nearly half of American cord cutters/nevers said price was the main reason they cut the cord or never connected, the highest rate among all seven countries we reviewed,” say analysts at S&P Global.


On the other hand, over-the-air television is seen as a viable way to satisfy video entertainment needs by just 10 percent of poll respondents who choose not to buy a linear subscription service. 



Is U.S. Consumer Internet Access Market Near Saturation?

If roughly 100 million locations buy fixed network broadband service, and if six percent of those accounts are purchased by business customers, then consumer broadband might be bought by about 94 million U.S. households, out of a base of perhaps 122 million to 135 million total homes. Using the 122 million figure, that implies an adoption rate of about 77 percent. 

Mobile-only households might represent as much as 20 percent of U.S. homes, or perhaps 24.4 million households. Taking the 94 million U.S. homes buying fixed network broadband, and adding the mobile-only households, perhaps 117 million U.S. homes buy either mobile or fixed broadband, excluding satellite broadband customers, which might represent an additional two million accounts. 
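The adoption arithmetic above can be checked in a few lines, using the 122-million low-end household estimate; all inputs are the rounded figures cited above.

```python
# Rough U.S. fixed broadband adoption arithmetic.
fixed_locations = 100.0  # million fixed broadband accounts
business_share = 0.06    # share bought by business customers
households = 122.0       # million U.S. homes (low end of the cited range)

consumer_fixed = fixed_locations * (1 - business_share)  # about 94 million
fixed_adoption = consumer_fixed / households             # about 77 percent
mobile_only = households * 0.20                          # about 24.4 million

print(round(consumer_fixed), round(fixed_adoption * 100), round(mobile_only, 1))
```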

If so, then about 119 million U.S. households buy internet access from telcos, cable companies or satellite internet firms. But one also must add customers of wireless ISPs, serving perhaps four million customers. Adding those, one reaches a subscriber base of about 123 million homes.

In other words, we are very nearly at the point where every household that wants to buy internet access already does so.

Telco Execs Say AI Already Used to Support Service, Reduce Costs

Artificial intelligence is a bit like broader elements of computing, in that all networks and business processes now rely on such capabilities, even when few executives or workers actually think much about it. 

The highest current use of AI among service providers worldwide is in service quality management (17 percent) and operational cost savings (16 percent), according to Ericsson. 


Some 53 percent of service provider executives surveyed expect that their networks will use AI before the end of 2020. 

About 55 percent of respondents believe the benefits will be evident within a year or two. The majority of service providers are at the stage of testing AI, with 48 percent focusing on AI to reduce capital expenditure. A further 41 percent are focusing on using AI for optimizing network performance, and 35 percent for new revenue streams.

The responses are based on data from 165 senior executives from 132 mobile communications service providers globally.

Are Communications Services Commodities?

Is communications a commodity? The answer matters, as commodity products tend to be priced at levels just barely above the marginal cost to provide the product. 

One can see this clearly in voice pricing, text messaging and even internet access (easier to explain in terms of cost per bit, but even absolute pricing levels have declined).

In fact, telecom product prices have a tendency to drop towards zero.

That, in turn, poses key questions for business models, especially relating to the cost of producing the products sold. The core problem with pricing at marginal cost (the cost to produce the next unit), or close to it, is that recovery of sunk costs does not happen. 


As the point of depreciation is to recover the cost of replacing an asset, communications service providers must recover the cost of replacing the underlying networks. In essence, that means pricing, longer term, at levels that allow recovery of the cost of rebuilding the network. 

Strictly speaking, pricing at marginal cost does not allow that recovery of sunk network investments.  

In fact, one of the biggest long-term trends in the communications business is the tendency for connectivity services to constantly drop towards “zero” levels. 

That is arguably most true in the capacity supplier portions of the business (bandwidth), and for the cost of discrete computing operations, the cost of storage or the cost of many applications.

In large part, marginal cost pricing is at work. Products that are "services," and perishable, are particularly important settings for such pricing. Airline seats and hotel room stays provide clear examples.

Seats or rooms not sold are highly "perishable." They cannot ever be sold once a flight leaves or a day passes. So it can be a rational practice to monetize those assets at almost any positive price.
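A tiny illustration of the logic, with entirely hypothetical numbers (the fares and costs below are invented for the example):

```python
# Why selling a perishable unit at any price above marginal cost
# beats letting it expire unsold. All numbers are hypothetical.
marginal_cost = 15.0  # cost of serving one more passenger
full_cost = 180.0     # average cost including sunk capacity
offer_price = 49.0    # deeply discounted last-minute fare

# Cash contribution toward sunk costs from making the sale
contribution = offer_price - marginal_cost

print(contribution > 0)          # selling beats a perished, unsold seat
print(offer_price >= full_cost)  # but the full (sunk) cost is not recovered
```

The sale is rational for the seller in the moment, even though a market full of such prices never recovers the capital invested.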

Whether marginal cost pricing is “good” for traditional telecom services suppliers is a good question, as the marginal cost of supplying one more megabyte of Internet access, voice or text messaging might well be very close to zero.

Such “near zero pricing” is pretty much what we see with major VoIP services such as Skype. Whether the traditional telecom business can survive such pricing is a big question.

That is hard to square with the capital intensity of building any big network, which mandates a cost quite a lot higher than “zero.”

Surplus is part of the reason for pricing at marginal cost, though. That was why it once made sense for mobile service providers to offer reduced cost, or then eventually unlimited calling “off peak,” when the network was largely unused. 

Surplus capacity caused by declining demand also applies to text messaging, where people are using alternatives. If there is plenty of capacity, offering lower prices to “fill up the pipe” makes sense. And even if most consumers do not actually use those resources, they are presented with higher-value propositions. 

Video entertainment and internet access are the next products to watch. Video is more complicated, as it is an “up the stack” application, not a connectivity service. Retail pricing has to include the cost of content rights, which have not historically varied based on demand, but on supply issues.  

Linear video already has passed its peak, while streaming alternatives are in the growth phase.

Internet access, meanwhile, is approaching saturation. That suggests more price pressure on linear video and internet access, as less demand means stranded supply, and therefore incentives to cut prices to boost sales volume.

Marketing practices also play a big part, as the economics of usage on a digital network can be quite different than on an analog network. And some competitors might have assets they can leverage in new ways.

In 1998, AT&T revolutionized the industry with its “Digital One Rate” plan, which eliminated roaming and long-distance charges, effectively erasing the difference between “extra cost” long distance and flat-fee local calling.

Digital One Rate did not offer unlimited calling at first, but that came soon afterwards. In the near term, lots of people figured out they could use their mobiles to make all “long distance” calls, using their local lines for inbound and local calling only.

With unlimited calling, it became possible to consider abandoning landline service entirely.

At least in part, the growth of mobile subscriptions from 44 million in 1996 to 182 million by the end of 2004 is a result of the higher value of mobile services, based in part on “all distance” calling.

Mobile revenue increased by more than 750 percent, from 10.2 billion dollars in 1993 to more than 88 billion dollars in 2003.


During this same time period, long distance revenue fell by 67 percent to 4.3 billion dollars, down from 13.0 billion dollars.
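The percentage changes follow directly from the revenue figures cited; a quick check of the arithmetic:

```python
# Percentage changes implied by the cited revenue figures (billions of dollars).
mobile_1993, mobile_2003 = 10.2, 88.0
long_distance_start, long_distance_end = 13.0, 4.3

mobile_growth = (mobile_2003 - mobile_1993) / mobile_1993 * 100
ld_decline = (long_distance_start - long_distance_end) / long_distance_start * 100

print(round(mobile_growth), round(ld_decline))  # about 763 and 67 percent
```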

The point is that connectivity prices and some application (voice, messaging) prices have had a tendency to drop closer to zero over time. Moore’s Law plays a part. Open source also allows lower costs, and therefore more-competitive prices.

That is why the question of whether communications products are commodities does matter. Commodity prices drop, over time, to just above marginal cost. And that implies pressure on business models.

Yes, Follow the Data. Even if it Does Not Fit Your Agenda

When people argue we need to “follow the science” that should be true in all cases, not only in cases where the data fits one’s political pr...