Tuesday, April 3, 2018

Can Independent ISPs Get 50% Market Share?

Can independent internet service providers (public or private) actually get 50-percent market share when competing against telcos and cable companies? Ting Internet believes so, but results from other firms suggest the level of competitor pricing really does matter.

It always is difficult to quantify take rates for gigabit internet access services, as virtually no internet service provider ever releases such figures. That has been the case since faster internet access services, priced at a market premium, began appearing. The reason for the reporting reticence is that take rates for the fastest, most expensive tier of service tend to be minimal.

Still, ISPs do tout some figures. Mediacom, for example, claims that between 10 percent and 20 percent of its new accounts are buying gigabit services costing between $125 a month and $140 a month. Again, it is hard to quantify what that means.

The actual number or percentage of accounts that change providers every year (churn) in the fixed network internet access business arguably varies between markets, depending on the strength of competing offers (where fiber to home is sold, and where gigabit cable services also are sold, for example).

Churn arguably is highest where a cable operator offering speeds in the hundreds of megabits per second competes with a telco only able to sell slower digital subscriber line service. AT&T and Verizon, for example, tend to see low churn rates, while other independent telcos with less fiber to home tend to see higher churn rates.

Much obviously depends on pricing levels. In markets where a gigabit ISP sells 1,000-Mbps service at prices that match the current or legacy prices for slower service (perhaps 100 Mbps), take rates can climb dramatically.

What is harder to model are markets where a clear price premium exists. It will matter when a reasonably-fast standard offer costs $50 a month and the gigabit offer costs more than $125 a month.

Presumably, in such markets, demand will be anchored by business demand, higher-income households and multi-user households.

Perhaps the most-optimistic provider to make public predictions is Ting Internet, which tends to argue it will get adoption as high as 50 percent in any new market it launches, within five years or so. Initial take rates when first marketed appear to be about 20 percent.

In its fourth quarter 2017 report, Tucows reported Ting Internet take rates of 30 percent in areas where it is able to market its gigabit service.

Ting prices its gigabit service at $90 a month. At that price, it is lower than what Comcast and other cable companies charge, but higher than the $70 a month some other ISPs offer. The point is that Ting is pricing at a significant premium to “standard” offers that deliver less value (in terms of speed), but not pricing as high as cable companies or telcos do.

We are likely to see much more of this sort of independent ISP competition in the fixed market, not to mention 5G-based gigabit offers from mobile suppliers.

At least in principle, more than 100 Colorado communities could see some form of municipal broadband network created, as voters in those communities have approved such moves.

Longmont, Colo. already has built out a portion of its planned gigabit internet access network, aided by that city’s ownership of a municipal power utility, meaning Longmont owns rights of way, distribution facilities, rolling stock and other assets helpful to creating a city-wide internet access network.

In Centennial, Colo., private internet service provider Ting Internet will piggyback on a new government network to be built by the city of Centennial itself.

Also, it sometimes is difficult to ascertain precisely what take rates are, since many independent ISPs challenging cable or telco suppliers seem to count “units sold” rather than “households served.”

That matters when an ISP sells two or more products, and then calculates adoption rates as “units sold divided by households passed.”

In other words, penetration is measured in terms of revenue-generating units, not “locations” or “households.” Each unit sold (voice, video or internet access) is counted against the base of locations. So a single location buying three services counts as much as three other homes buying just one service.


Customer “penetration” by household therefore is different from penetration measured as a function of units sold. The distinction matters because determining the magnitude of stranded assets hinges on how many locations passed actually generate revenue.

Assume that, on average, a typical household buys 66 percent of the total suite of services (two of three triple play services or three of five services, for example).

The difference is significant. Measuring “penetration” by units sold, penetration appears to be as high as 76 percent to 87 percent. Measured as a function of homes generating revenue, penetration could be as low as nine percent, or as high as 44 percent, with a “typical” value being something between 20 percent and 25 percent of homes passed.

Penetration: Units Sold or Homes Buying Service?

                             Morristown  Chattanooga  Bristol  Cedar Falls  Longmont
Homes passed                     14,500      140,000   16,800       15,000     4,000
Units sold (subscribers)          5,600       70,000   12,700       13,000       500
Units sold / homes passed           39%          50%      76%          87%       13%
Services sold                         3            3        5            3         2
Units per HH (66% of suite)           2            2        3            2         1
Homes served                      2,828       35,354    3,848        6,566       379
Penetration (homes served)          20%          25%      23%          44%        9%
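The arithmetic behind the table can be sketched in a few lines of Python. This is an illustrative recomputation, not anyone's official model: the homes-passed, units-sold and services-sold figures, and the 0.66 services-per-household factor, come from the discussion above.

```python
# Recompute "penetration by units sold" versus "penetration by homes served"
# for the five municipal-network markets discussed above.

SERVICES_PER_HH = 0.66  # assumption from the text: a typical home buys 66% of the suite

markets = {
    # name: (homes_passed, units_sold, services_sold)
    "Morristown":  (14_500,   5_600, 3),
    "Chattanooga": (140_000, 70_000, 3),
    "Bristol":     (16_800,  12_700, 5),
    "Cedar Falls": (15_000,  13_000, 3),
    "Longmont":    (4_000,      500, 2),
}

results = {}
for name, (passed, units, services) in markets.items():
    units_rate = units / passed  # "penetration" measured as units sold over homes passed
    # A home buying 66% of a suite of N services generates services * 0.66 units,
    # so dividing units sold by that figure estimates homes actually served.
    homes_served = round(units / (services * SERVICES_PER_HH))
    homes_rate = homes_served / passed  # penetration by homes generating revenue
    results[name] = (units_rate, homes_served, homes_rate)
    print(f"{name:12s} units {units_rate:4.0%}  homes served {homes_served:6d}  penetration {homes_rate:4.0%}")
```

Running the sketch reproduces the table: Bristol looks like 76 percent penetration by units sold, but only about 23 percent of homes passed generate revenue.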

It might be worth pointing out that all these communities (Morristown, Chattanooga, Bristol, Cedar Falls and Longmont) have municipally-owned utility companies, and might therefore represent a sort of best case for retail operations serving consumers.

That seems consistent with other evidence. In markets where a telco and a cable operator are competent, as is the attacking ISP (municipal or private), market share might take a structure of 40-40-20 or so, possibly 50-30-20 in areas where the telco does not have the ability to invest in faster broadband and the cable operator has the largest share.

Beyond the actual cost of the network, and the business role chosen by the municipality, details of revenue generation (homes that generate revenue as a percentage of total; number of services offered) are fundamental.

Beyond that are the other operating and marketing costs, overhead and need for repaying borrowed funds and making interest payments, on the part of the retail service provider.

One might argue that most other communities, without the advantages that ownership of an electric utility provides, will often find the lower risk of a shared public-private approach more appealing.

Also, some ISPs might find the availability of some amount of wholesale or shared infrastructure makes a meaningful difference in a business model.

One might suggest there are a couple of potential practical implications. Efforts by incumbent ISPs to raise retail prices in the same way that video entertainment prices have grown (far faster than the rate of overall inflation) will increase the odds that new competitors enter a market, as the higher prices make doing so more attractive.

In at least some cases, the new competitors will be firms such as Verizon, which now has announced it will essentially overbuild AT&T and Comcast networks in Sacramento, Calif.

Though it is not easy, more competitive ISPs are likely to enter more markets, as lower-cost access platforms evolve, helped in some cases by municipal facilities support.

Where that happens, it is conceivable that the incumbents will see a new limitation on their market share, dipping from possibly 50-percent share to a maximum of perhaps 40 percent each, on a long-term basis, assuming the new competitor is not eventually bought out by one of the incumbents.

Monday, April 2, 2018

If the Internet is Changing, Backward-Looking Regulation Maybe Not So Smart

It is a truism that regulators never can keep up with technology, which is a reasonable argument for caution where it comes to regulating just about anything central to the internet, which keeps changing.

The logical implication might be that the historic light-touch treatment of anything related to computing likely should be the presumption, even if consumer protection or antitrust remedies might from time to time be deemed necessary.

The reason for such caution is that computing and the internet itself continue to change. And if there is anything history shows, it is that the leaders in any single era rarely--if ever--are the leaders in the succeeding era. So “punishing” a leader in one era, when an era is about to evolve, does not make too much long-term sense.

We might today believe it is nonsensical to consider the web browser a source of competitive advantage, as though the browser provides some sort of business moat. Nevertheless, not so long ago, the browser was considered a source of anti-competitive advantage.

Not so long ago, ability to offer phone services using Class 5 switches was considered to be a major pro-competitive step. Before that, long distance voice was the profit driver for the telecom industry. Things change, and change quite a lot.

Leadership in the mainframe computing era was not the same as leadership in the mini-computer or personal computing eras. And none of those leaders were key in the early internet age.

Someday, Google, Facebook and others might not be the leaders in some future evolution of the internet ecosystem. The point is that regulation is sometimes pointless when it is backward looking.

The internet has grown through multiple eras, any observer can note. Where the internet was a low-bandwidth method for researchers to communicate, it then became a consumer transaction and information platform, before becoming a social media vehicle, and now a content delivery mechanism. Next, the internet is on the cusp of possibly becoming a major platform for enterprise private networks.

With the coming internet of things evolution, autonomous vehicles and sensor networks of many types could represent the next big wave of development for the internet, as the internet has defined the most-recent eras of computing.



Saturday, March 31, 2018

5G is Like the Tail on a Dog

5G is to networking as telecom is to the internet. That is to say, 5G is part of a larger shift of networking as "telecom" has become a tail on the internet dog.

Specifically, 5G is part of a larger transformation of global public networks involving much lower latency, much more virtualization and new roles for data centers. Those features, in turn, are required to lower the cost of running networks as well as create the foundation for new categories of services that will drive incremental revenue at scale.

Extremely low latency, high connection density, high reliability and gigabit speeds are driving the design of the whole new architecture, with implications for access, cloud computing and virtualization.

Virtualization is a key change, with separation of control and signaling functions from delivery of end user traffic becoming key. Lower cost is among the expected outcomes. Greater flexibility also is anticipated.

What might be relatively unexpected is that virtualization will increase the ability to create new virtual networks on a wider scale. That could have key implications for new suppliers, including suppliers that only want to create large virtual WANs to support internal business requirements.

The analogy here is the shift of wide area networking from “mostly public” to “mostly private.” In other words, on many routes, enterprises carry most of the traffic, but only to support their own internal operations. Major app providers, in other words, have vertically integrated WAN services.

That has business model implications both for the enterprises and suppliers of WAN services.

Also, to support new ultra-low-latency apps, data center functions will emerge at the edge of the network. In that sense, 5G and edge computing are parts of a larger trend, the creation of next-generation networks that are both virtualized and built around latency performance.

That stringent latency performance will be needed to support emerging new applications in the connected car, internet of things and real-time application areas. Bandwidth will matter, but latency arguably is the key new capability.   

Friday, March 30, 2018

The Prejudice Against Bigness

It would be reasonable enough to argue that there is an almost-instinctive distrust of "bigness" in business, as such bigness is presumed to be responsible for destroying the fortunes of small and local businesses.

It must also be said that it is consumers who propel the rise of "big businesses," since it is consumers who prefer the products, prices or other attributes of bigger businesses, over those of smaller suppliers.

We might be tempted to decry bigness, but consumer choice is what produces bigness. And that means bigness is not always and inevitably bad.

Sometimes, in fact, bigness might be a survival requirement. Consider some foundations of business in the internet era. Among the key trends are a few that define internet market dynamics:

* Pricing and margin pressure
* New competitors from outside the existing value chain
* Winner take all market share dynamics
* Disintermediation of distributors in the value chain

In case you somehow missed the trend, the internet leads to lower retail prices and lower supplier profits, as price transparency increases and distributors are removed from the value chain. 

Faced with lower retail prices, suppliers can respond in a few ways. They can sell more units at the lower prices, which requires more scale. Firms can sell different products, which means moving into new or different parts of the value chain. And firms can take on additional roles within value chains, increasing the ways they make revenue. 

The point is that all that tends to imply firms get bigger. 

Also, the internet allows firms outside the existing industries, possibly operating with different value and revenue models, to enter existing businesses and disrupt them. That creates additional pressure on sales volume, prices and profits. 

Market disruptors also frequently rely on internet capabilities to remove cost from the value chain by eliminating steps and roles in the distribution process, also after changing costs in the production part of the value chain. 

The "winner take all" nature of many internet-reliant businesses and industries also contributes to bigness. 

So we need to be careful not to confuse bigness, which might well be a requirement for firm or industry survival, with actual anti-competitive behavior on the part of big firms. Abuse always is a possibility when power exists in a market. But abuse is not automatic, and hopefully not a routine outcome.

Nor should we automatically penalize firms for delivering higher quality products at lower prices. That is the outcome we are supposed to gain from competition. 

But sometimes competition can only be fostered by big firms. Break up the big firms and you also possibly destroy ability to innovate and create even more consumer benefits. 

Bigness is not inherently bad. 


Antitrust The Wrong Solution for the Wrong Problem

At the risk of oversimplifying, the apparently-growing sense that “something has to be done” about the size of today’s firms (financial, retail, telecom, internet apps and so forth) is likely ill considered.

As profit is wrung out of all value chains affected by the internet, firm revenue and profits fall, if not to zero, then always in that direction. There are only a few long-term solutions for such margin compression: additional scale in existing businesses, and a move into new businesses, elsewhere in the value chain.

Both strategies require that firms get bigger. So attacking "bigness" also means attacking chances for firm survival.

The demand for scale is--virtually all agree--a byproduct and necessity in an era of price transparency, lower protections from market entry by “outsiders,” falling prices and profits in virtually all incumbent businesses and markets.

We cannot easily repeal the economic impact of the internet, even if we wanted to do so. And make no mistake, the internet will reshape nearly every industry in certain ways, the cumulative effect of which is to wipe out profit margins. That, in turn, drives the need for scale.

Also, economic eras change. And that is why, in retrospect, lots of antitrust action seems not to work. It ultimately did not matter what we did about the size of the oil or steel or auto industries, as they were destined to lose their place as the economic engines anyhow.

The big revision of U.S. telecommunications law (the Telecom Act of 1996) aimed to introduce competition for voice services, precisely at the point that voice was shifting to mobile delivery and the rest of telecom was shifting to the internet.

Some of us question the long term value of antitrust action applied to Microsoft (the internet era displaced the PC era anyhow).

And, for such reasons, some of us are quite skeptical about the value of antitrust action against AT&T, Amazon, Facebook, Google or Apple, especially when the types of action we see are instances of vertical integration, which does not lessen competition in existing markets.

The big problem is that we humans are always “fighting the last war,” seeing dangers that already are destined to pass. Consider the U.S. Department of Justice blocking of two big mergers in the health insurance industry, and the blocking of mergers in the retail pharmacy industry, because of concerns about excessive concentration of power that seem, in retrospect, unnecessary.

Walgreens abandoned its effort to buy Rite Aid in 2017, after the Federal Trade Commission said it would review the proposed transaction on competitive grounds.

That was a horizontal merger. Now, however, a big possible wave of vertical transactions might happen, as the pharmacy benefits and insurance industries face the new threat of Amazon entry into the ecosystem.

That already is raising calls for antitrust action to block vertical deals such as insurer Cigna merging with Express Scripts. Pharmacy CVS is merging with insurer Aetna.

Other deals might happen as well, including Walmart and Humana.

The point is that, in the internet era, scale becomes a necessity, to deal with lower revenues and profits, loss of value and business model disruption.

The point is that firms and industries need to change in the internet era, and gaining scale, plus entering new parts of the ecosystem, are often necessary survival and relevance actions.

We may generally believe “bigness is bad,” but bigness might well be necessary when profits, margins, value and roles are compressed.


What is “big” in a legacy context might well be “small” in a broader internet context, where the definition of “market we are in” changes, or has to change.

Some might say antitrust concerns considered “reasonable” in a horizontal context actually make no sense in a “vertically changing” industry or market. That might especially be true when whole markets are shrinking, disappearing or are threatened with that eventuality, as value shifts.

That is why some of us view the U.S. Department of Justice antitrust case against the vertical AT&T merger with Time Warner unfortunate and misplaced. The industry is going to have to change, vertically, to survive, as AT&T’s major competitors already are doing.

Vertical integration already is happening in many parts of the internet ecosystem, meaning the answer to the question “who are my competitors?” changes.



Thursday, March 29, 2018

5G Will Accelerate Mobile and Wireless Substitution for Internet Access

Up to this point, 4G has been a primary method of internet access for a substantial, but still minority, share of usage in developed markets, though it arguably has been a primary or exclusive form of access in developing markets.

But the 5G era is likely to accelerate those trends in a major way. For the first time, mobile or 5G-based fixed wireless networks will offer speeds and retail prices as good as--or better than--fixed alternatives, with better latency performance.

Many of us would not be at all surprised if wireless substitution as much as doubled over the first decade of 5G commercial service. In some developed markets, that could mean that wireless access takes a 20 percent share of the residential internet access market. In other markets, wireless internet access could reach 30 percent or higher.


You might be tempted to think that it mostly is lower-income people who use their mobile devices as the sole means of internet access. In many countries, substantial numbers of people--rich and poor--do so.

While it is true that, in the United Kingdom, relatively few people actually are “mobile only” for internet access (2.5 percent of the richest consumer households; perhaps seven percent of the poorest households), in Canada, 20 percent of the richest households are mobile only for internet access, while about 30 percent of the poorest households are mobile only.

Households in rural areas also are more likely to rely on mobile networks for their internet access requirements as well.

Single-person households and households with younger consumers also are more likely to be mobile only for internet access.

Less than a tenth of people in France and the UK were mobile-only, but in Turkey the figure was more than three times higher.

In Latin America, for example, Deloitte Brazil believes that over a third of all homes in Brazil were mobile data only. And in China, a fifth of the online user base (rather than households) were mobile-only as of 2016.

In Tokyo, where fiber optic connections are widely available, hundreds of thousands of homes (or about five percent) were relying only on mobile in 2017.


Better mobile networks, especially 4G LTE, have made mobile internet user experience more like fixed access.

Deloitte Global predicts that 20 percent of North Americans with internet access will get all of their home data access from mobile networks in 2018.

Deloitte Global further predicts that a mixture of mobile and fixed wireless access technologies could lead to 30 percent to 40 percent of the population relying on wireless for data at home by 2022, an increase from 10 percent in 2013.
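Those two endpoints imply a fairly brisk compound growth rate. A quick sketch of the arithmetic (my own back-of-envelope calculation from Deloitte's published figures, not a Deloitte model):

```python
# Implied compound annual growth rate behind the projection that wireless-at-home
# share rises from 10 percent (2013) to 30-40 percent (2022).

def implied_cagr(start_share: float, end_share: float, years: int) -> float:
    """Constant annual growth rate that turns start_share into end_share over `years`."""
    return (end_share / start_share) ** (1 / years) - 1

years = 2022 - 2013
low = implied_cagr(0.10, 0.30, years)   # roughly 13 percent a year
high = implied_cagr(0.10, 0.40, years)  # roughly 17 percent a year
print(f"implied growth: {low:.1%} to {high:.1%} per year")
```

In other words, the Deloitte scenario assumes wireless-at-home share compounding at something like 13 percent to 17 percent a year for nearly a decade.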


What is unclear is how wireless access might change over the next five years. Deloitte argues that, by 2022, wireless home internet solutions will grow both at the low end of the market (homes using relatively little data) and portions of the market that otherwise might have purchased a fixed connection.


Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...