Tuesday, December 24, 2019

How Much Does Internet Access Actually Cost?

In a non-scientific study of consumer fixed network internet access bills, the Wall Street Journal found monthly bills for service at about 100 Mbps clustered between $45 and $70 a month. The caveat is that the sample is heavily weighted toward just one internet service provider, Charter Communications, whose subscriber base skews rural, in addition to some big-city systems. 

Notably light in the survey were customers of Comcast’s services. Comcast is the biggest U.S. supplier of fixed network internet access connections. 


I would not infer too much except when it comes to Charter customer experiences, where the typical price paid for internet access tends to run in the $45 to $80 a month range. 

Much also depends on whether service is purchased a la carte or as part of a bundle. A la carte prices virtually always are higher than bundled prices. It is possible that 60 percent to 75 percent of U.S. households buy a bundled fixed network service.

The Journal collected and analyzed information from more than 3,300 bills from homes in all 50 states, mostly supplied by Billshark, a company that helps customers negotiate better rates with their cable and telecommunications providers. 

Among the unknowns is how customers on triple-play packages, or bill analyzers working on their behalf, decide to allocate a bundle’s cost across the video, internet access and voice components, as triple-play pricing details typically are not itemized on a bill. 

That means any evaluator has to make assumptions about what the “price” of each component might be. One method might be to take the posted a la carte rates for each individual service and then discount each by an equal amount. Others might try to weight the prices based on some assessment of value. 

Any method involves making assumptions that cannot be independently verified. Some of us would apply a near-zero (there are taxes and fees, even if the value of the service is deemed to be “nothing”) or actual zero value to voice, and then allocate total bundle price only to the internet access and video services. In principle, that could boost internet access and video prices inside a bundle by $20 to $25 per service. 


The other obvious issue is whether the analyzed or assumed prices include only the actual service, or whether equipment rentals, fees and taxes are included. Obviously, ISPs want to advertise only the lower figure; customers are likely to use the “what I pay” total, which always is higher. 

“In some cases, the final cost (of a bundle) is as much as 45 percent over the advertised rate,” said Courtney Rudd, GlobalData analyst. “For example, Xfinity’s $40 ‘Starter Internet plus Basic’ TV bundle jumps to $58 per month once the additional $18 in equipment costs are added. Prices can also vary based on location.”

As has been the case in the broader telecom industry, actual prices and profit margins are, in large part, determined by the allocation of costs and overhead to various services. 

Video entertainment arguably has the highest cost of goods, so I would allocate perhaps 46 percent of the price in a bundle to video, perhaps 32 percent to internet access and maybe 21 percent to voice (as accountants might tally the numbers).

Consumers buying triple play bundles might allocate near-zero value to voice, however, even if offering voice has non-zero cost to connectivity providers. That obviously would affect the perceived price of internet access and video services. So one way to tally the price is to say that even if the value of voice in a bundle is zero, it has a cost, namely the attributed fees and taxes a consumer has to pay, even when not using the voice line. 

I’d estimate that cost as about three percent to five percent of the bundle price. So maybe 54 percent of the bundle price is for video, 40 percent for internet access and five percent for voice. 

So that might imply a retail price of video in a triple-play bundle of about $82 a month, internet access at about $61 a month and voice at perhaps $5.
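That allocation method can be sketched in a few lines of Python. The $152 bundle price and the exact dollar outputs are illustrative assumptions, not carrier data; only the percentage weights come from the discussion above:

```python
# Hypothetical triple-play allocation. The bundle price is an assumption
# chosen for illustration; the 54/40/5 weights are from the text above
# (they sum to 0.99 because each share is rounded).
bundle_price = 152.00  # assumed total monthly triple-play bill

shares = {"video": 0.54, "internet": 0.40, "voice": 0.05}

allocated = {service: round(bundle_price * share, 2)
             for service, share in shares.items()}

print(allocated)  # roughly $82 video, $61 internet access, $7.60 voice
```

Any different assumed bundle price or weighting shifts the per-service figures, which is exactly why independently unverifiable allocation choices drive the reported “price” of internet access in a bundle.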

After Merger with Sprint, T-Mobile US Might Want Merger with Comcast

At least some of us have long believed the “best” set of merger outcomes would be for Sprint and T-Mobile US to be merged with (pick one combination) Comcast and Charter Communications, allowing four suppliers to sustain themselves in a market where “mobile only” seems unsustainable in the consumer market. 

Now there are indications that is what T-Mobile US might be thinking, assuming its merger with Sprint is approved. 

The long-term sustainability argument has been that integrated service providers serving the U.S. consumer market would have to own both fixed and mobile network assets. 

The proposed T-Mobile US merger with Sprint creates a big mobile-only company. That has therefore seemed the first of at least two mergers: create a big mobile-only entity, then merge again with Comcast or Charter, Charter being the more likely candidate. 

The long-term sustainable structure for the U.S. consumer connectivity business has seemed to be four firms, including assets clustered around AT&T, Verizon, Comcast and Charter Communications, each with both mobile and fixed network facilities ownership. 

For that matter, the long-term and sustainable market might have all four of the leading contenders involved in some way with content assets, though Comcast and AT&T are likely to continue to be the only two with significant content production assets and revenues. 

Such cable-mobile combinations have a rather lengthy history, with Cox Communications and Cablevision Systems Corp. having tried a go-it-alone regional approach, with several of the biggest U.S. cable firms having partnered with Sprint decades ago for spectrum purchases and other possible uses of that spectrum. 

Cablevision Systems, in fact, was considering a non-mobile wireless service about 20 years ago, essentially modeled on Japan’s Personal Handy-phone System, supporting voice communications at pedestrian speeds, but not full mobility at high speed. 

Cablevision’s thinking was that its hybrid fiber coax network would allow it to build such a system without the cost of acquiring licensed spectrum and building a full mobile network. That same sort of approach is used by Comcast to support its voice on Wi-Fi network, using hotspots with public as well as private access.

Such strategic scenarios also illustrate why the entry of Dish Network into the mobile market also is the first of at least two transactions. Dish has to replace its declining satellite video business with another big key driver, which it hopes will be mobility. 

But that only creates yet another mobile-only company, and that is unsustainable in the U.S. market, some would argue. So the eventual path will be a combination of Dish assets with another entity, especially if, eventually, every leading internet service provider owns its own mobile and fixed networks, save one. 

If the Sprint merger with T-Mobile US is approved, we might well see that as only the first of two big transactions. The second will merge the mobile-only assets of new T-Mobile with the fixed assets of a cable company.

Monday, December 23, 2019

Overprovisioning is as Dangerous as Underprovisioning Capacity

Historically, the way connectivity providers have protected themselves from growing end user data demand is by overprovisioning: building networks that supply bandwidth in excess of present demand.

In the past, that has meant building voice networks to support the peak calling hour of the peak day levels of consumption, for example. That also means most of the time, networks have spare capacity. 

A historical example is call volume, or data consumption, by time of day and day of week. That pattern holds for business and consumer calling as well as for call centers. 

Both business and consumer calling volume drops on weekends and evenings, for example. Traffic to call centers peaks between noon and 3 p.m. on weekdays. 
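Busy-hour dimensioning of that sort classically used the Erlang B formula, which gives the blocking probability for a given offered traffic load and trunk count. A minimal sketch in Python, with illustrative traffic figures not drawn from any operator:

```python
def erlang_b(offered_erlangs: float, trunks: int) -> float:
    """Blocking probability via the stable iterative Erlang B recursion."""
    b = 1.0
    for m in range(1, trunks + 1):
        b = (offered_erlangs * b) / (m + offered_erlangs * b)
    return b

def trunks_needed(offered_erlangs: float, max_blocking: float = 0.01) -> int:
    """Smallest trunk count keeping busy-hour blocking below the target."""
    m = 1
    while erlang_b(offered_erlangs, m) > max_blocking:
        m += 1
    return m

# Dimension for the peak hour: 100 Erlangs of busy-hour traffic at a
# one percent blocking target needs on the order of 115 to 120 trunks,
# most of which sit idle off-peak -- the spare capacity described above.
print(trunks_needed(100, 0.01))
```

The point of the sketch is that capacity is sized to the peak, so off-peak utilization is low by design.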


Communications networks, however, no longer are dimensioned for voice. They are built to handle data traffic, and video traffic most of all. That still leaves service providers with the question of how much overprovisioning is required, and what that will cost, compared to revenues to be earned. 

In principle, service providers tend to overprovision in relationship to the pace of demand growth and the cost of investing in that capacity growth. It is one thing to argue that demand will grow sharply; it is quite another matter to argue that demand levels a decade from now must be provisioned today. 

That noted, there is a difference between the amount of data consumed and the speed of an access connection. The two are related, but not identical, and customers often get them confused. 

But they do tend to understand the difference between a usage allowance (“how much can I use?”) and the speed of a connection (“how fast is the connection?”). 

Service providers must overprovision, it is true. But they tend to do so in stair step fashion. Mobile network operators build a next-generation network about every decade. Fixed network operators, depending on the platform, likewise increase capacity in stair step fashion, because doing so involves changes in network elements, physical media and protocols. 


The point is that capacity increases are matched to demand needs; the pace of technology evolution; the ability to sustain the business model; revenue, operations cost and profit margin considerations. 

While some might argue, as a rule, that service providers must “deploy fiber to the home,” or build 5G or make some other stair step move, such investments always are calibrated against the business case.


It is one thing to overprovision as needed to support the business model. It might impair the business model to overprovision too much, stranding assets. Some of you will recall the vast over-expansion of wide area network transport capacity in the 1998 to 2002 period. That provides a concrete example of the danger of excessive overprovisioning. 

Yes, some service providers have business models allowing them to create gigabit per second internet access networks today, even when typical customers buy service at 100 Mbps to 200 Mbps, and use even less. 

That does not necessarily mean every access provider, on every platform, in every market, can--or should--do so. 

Sunday, December 22, 2019

Why Multi-Access Edge Computing Opportunity is Possibly Fleeting

The problem with big lucrative new markets is that they attract lots of competition. "Your margin is my opportunity," Amazon CEO Jeff Bezos notably has quipped.

So even if internet of things and multi-access edge computing are legitimate areas of potential new revenue for telcos and other internet service providers, they also are highly interesting to all sorts of other powerful potential competitors. And they are moving.

Walmart, for example, plans to build its own edge computing centers. Furthermore, Walmart expects to make that capability available to third parties. Also, Walmart expects to make its warehouse and shipping capabilities available to third-party sellers, using a model perhaps similar to the way Amazon supports third-party retailers. 


Walmart further says it will use its large supercenter store locations to fulfill online orders for Walmart’s groceries and other items.


The point is that the market opportunity for multi-access edge computing now is being challenged by retailers, Amazon and eventually others who plan to deliver edge computing as a service themselves, perhaps foreclosing much of the potential role for connectivity service providers to become suppliers of edge computing as a service. 

“All the cloud providers see themselves as extensions of edge computing,” says James Staten, Forrester Research principal analyst. “They’re trying to become the edge gateways. One of the biggest ways they’re gaining attention--and every enterprise should pay attention to this--is that they want edge computing as a service.”


At this point, it also is possible to flip that argument. Cloud computing might not be an "extension" of edge computing. Edge computing might be an extension of cloud.


In other words, to the extent the next generation of computing as a service is "at the edge," then the cloud kings have all the incentive they need to master edge computing themselves, simply to protect their existing businesses, instead of possibly losing that new business to others in the ecosystem.

The challenges for telcos at the edge are reminiscent of their experiences in the data center or colocation business. In the former case, telcos generally have added little value. In the latter case, telco facilities have not proven to be the logical neutral host platforms.

It is possible that distributed telco central offices will become convenient locations for some types of edge computing facilities. Those will probably develop around a relative handful of important use cases requiring ultra-low latency and high bandwidth in outdoor settings.

The reason "outdoor" becomes important is that any on-site, indoor use cases arguably will work using on-premises edge computing platforms and fixed networks for access. The specific value of 5G access might be relatively lower for indoor use cases, compared to outdoor and untethered use cases.

If you think about it, that reflects the current value of mobile access networks: most valuable for untethered, outdoor, mobile instances, least useful indoors, where fixed network access using Wi-Fi arguably is a better substitute.

The larger point is that edge computing value will come as it supports important applications delivered "as a service." The physical details of colocation, rack space, cooling, power and security will be a business, but not as big a business as applications based on edge computing.

It remains unclear whether telco execs really believe they can master roles beyond access and rack space rental. Early moves by others in the ecosystem suggest why the multi-access edge computing (ISPs as the suppliers) opportunity will be challenging.

If the biggest customers for computing as a service increasingly become the biggest suppliers, opportunities for others in the ecosystem are limited.


Saturday, December 21, 2019

Are Consumers Rational About Internet Access Choices?

There is a big difference between supply and demand in the fixed network internet access or any other business. Some assume that supply is itself sufficient to create demand. That clearly is not the case. 

Ofcom estimates that 30 percent of consumers who could buy a fiber-to-home service actually do so. AT&T believes it will hit 50 percent take rates for its fiber-to-home service after about three years of marketing. 

In South Korea, FTTH take rates seem to be only about 10 percent, for example, though network coverage is about 99 percent. In Japan and New Zealand, take rates have reached the mid-40-percent range, and network coverage might be about 90 percent. But in France and the United Kingdom, FTTH adoption is in the low-20-percent range. 

Perhaps 10 percent of Australians buy the fastest service available, whether that speed is 100 Mbps or a gigabit. 

In other words, though some assume that what really matters is the supply of access connections--seen in calls for better coverage and higher speeds, or lower prices--consumer behavior suggests the demand is not quite what some believe. 

Even when it is made available, FTTH does not seem to lead to a massive consumer rush to buy. Quite the contrary, in fact, seems to be the case. Nor does the supply of gigabit-per-second connections necessarily lead to better results (higher take rates).

In the U.S. market, though perhaps 80 percent of potential customers can buy a gigabit-per-second service, only about two percent seem to do so. The reason seems to be that most consumers find their needs are satisfied with service in the hundreds of megabits per second range. 


About 51 percent of U.S. fixed network internet access customers now buy service at 100 Mbps or higher, according to OpenVault. 

Some 35 percent buy service rated at 100 Mbps to 150 Mbps, OpenVault says. About 27 percent buy service running between 50 Mbps and 75 Mbps. 

It is one thing for policy advocates to call for extensive, rapid, nearly universal supply of really-fast internet access services. It is quite another matter for any internet service provider to risk stranded assets on a major scale simply to satisfy such calls. It is, in fact, quite dangerous to overinvest, as consumer behavior shows that most people do not want to buy the headline speed services. 

Some might essentially argue that consumers do not know any better. Others might argue consumers are capable of making rational decisions. And the rational decision might always begin with how much speed, or usage allowance, actually satisfies a need. 

Price also matters. Generally speaking, the trend in the consumer internet access market has been to supply faster speeds for about the same price.

Consumer budgets, in other words, are essentially fixed: they cannot and will not spend more than a certain amount for any and all communication services, while expectations about service levels creep up year by year. 

So critics will always be able to complain. No rational ISP will push its investments beyond the point where it believes it has a sustainable revenue model. And that model has to be based on reality, which is that real customers do not want the headline speeds advocates often push for.

Friday, December 20, 2019

U.S. Mobile Operator Spectrum Positions, Strategies Changing Fast

U.S. mobile service provider spectrum holdings, spectrum prices and spectrum strategies are changing very rapidly, in part because of the proposed T-Mobile US merger with Sprint, the emergence of Dish Network as a new provider, and very-active spectrum auctions.

Though scale is a clear advantage of the proposed T-Mobile US merger with Sprint, spectrum acquisition is key. A merged Sprint plus T-Mobile US would have huge spectrum assets, compared to all other leading mobile providers.

Verizon’s relative spectrum paucity also is clear. Verizon has more customers to support on its network than does T-Mobile US or Sprint, for example, and arguably the most to gain from using small cells to intensify its spectrum reuse. 

Verizon also has the most incentive to use new millimeter wave spectrum, as that represents the greatest immediate source of new mobile spectrum. 


Though Verizon in 2018 had about the same amount of spectrum as T-Mobile US, it had twice the number of subscribers, and three times the subscribers of Sprint, which had almost twice the spectrum. 



Spectrum prices also might be a bit hard to evaluate using the traditional dollars per MHz per potential user, in part because the amounts of spectrum are so much greater for millimeter wave auctions. Where low-band spectrum with much more limited capacity once sold for prices above $1 per MHz-POP, millimeter wave spectrum appears so far to be selling for $0.01 per MHz-POP, and should cost even less, on a MHz-POP basis, as frequency increases.
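The MHz-POP metric itself is simple arithmetic: license price divided by the product of bandwidth and population covered. A quick Python sketch, using made-up license figures chosen only to match the price points above:

```python
def price_per_mhz_pop(license_price: float, bandwidth_mhz: float,
                      pops: float) -> float:
    """Spectrum price normalized per MHz of bandwidth, per person covered."""
    return license_price / (bandwidth_mhz * pops)

POPS = 100_000_000  # assumed population covered by both hypothetical licenses

# Illustrative: a 10 MHz low-band license at $1 billion versus a 400 MHz
# millimeter wave license at $400 million over the same population.
low_band = price_per_mhz_pop(1_000_000_000, 10, POPS)  # $1.00 per MHz-POP
mmwave = price_per_mhz_pop(400_000_000, 400, POPS)     # $0.01 per MHz-POP
print(low_band, mmwave)
```

The normalization makes clear why headline auction totals can rise even as the per-MHz-POP price collapses: the bandwidth in the denominator grows by orders of magnitude.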

The reason is that higher frequencies feature much-greater capacity (orders of magnitude more MHz per POP). As with any business or consumer budget, there is only so much money to spend on any particular product. 

As consumers now pay between $40 and $80 for internet access for hundreds of megabits per second, where they once paid the same amounts for a few megabits per second, so too mobile service providers can only afford to pay so much for new blocks of spectrum.

So prices will fall. 


Rural Internet Access Will Always be Tough

Reality generally is more complicated than many would prefer, and supplying internet access to highly-isolated locations provides a clear example, even if some casually misunderstand the enormity of the problem.

The big problem is that, at least for traditional fixed networks (fixed wireless is the exception), there literally is no business case for a communications network. Without subsidies, such networks cannot be sustained.

Assuming a standard fixed network investment cost, no positive business case can be produced over a 20-year period, the Federal Communications Commission has suggested. 

That is why satellite access traditionally has been the only feasible option for highly-isolated locations, with subsidized access the only way most networks can survive in rural areas. 

There is a direct relationship between housing density and network cost. Most of the coverage problem lies in the last two percent of housing locations. Consider many U.S. states where rural population density ranges between 50 and 60 locations per square mile, and ignore the vast western regions east of the Pacific coast range and west of the Mississippi River, where population density can easily range in the low single digits per square mile.


Assume 55 locations per square mile and two fixed network suppliers in each area. Even if buying is at 100 percent, two equally skilled competitors might expect to split the market, so each provider, theoretically, gets about 27 accounts per square mile.

At average revenue of perhaps $75 a month, that means total revenue of about $2,025 a month, per square mile, or $24,300 per year, for all of one provider’s customers in a square mile.

The network reaching all homes in that square mile might cost an average of $23,500 per home, or about $1.3 million.

At 50 percent adoption, that works out to roughly $47,000 per account in a square mile, against revenue of $900 per account, per year. Over 10 years, revenue per account amounts to $9,000.
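The square-mile arithmetic above can be reproduced directly. The inputs below are the assumptions used in the text, not measured costs:

```python
# Assumptions from the discussion above, not measured data
homes_per_sq_mile = 55
cost_per_home_passed = 23_500.0  # build cost per home, for one provider
arpu_monthly = 75.0              # average revenue per account, per month
adoption = 0.5                   # 50 percent of homes buy from this provider

network_cost = homes_per_sq_mile * cost_per_home_passed  # ~$1.29 million
accounts = homes_per_sq_mile * adoption                  # 27.5 accounts
cost_per_account = network_cost / accounts               # $47,000
ten_year_revenue = arpu_monthly * 12 * 10                # $9,000 per account

print(round(cost_per_account), round(ten_year_revenue))  # 47000 9000
```

Ten years of revenue recovers less than a fifth of the per-account network cost under these assumptions, which is the whole argument in two numbers.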

The business case does not exist, without subsidies. 

It remains unclear whether new low earth orbit satellite constellations can attack the cost problem aggressively. What does seem realistic is that revenue expectations will remain in the $100 per month range for internet access.

AI Will Improve Productivity, But That is Not the Biggest Possible Change

Many would note that the internet impact on content media has been profound, boosting social and online media at the expense of linear form...