Tuesday, July 29, 2014

Are Usage Caps Inherently Unfair?

Attitudes about usage-based pricing of Internet access seem always to be contentious, because significant revenues for ecosystem participants are involved. That isn’t to say there are no important consumer protection issues.

Transparency is always difficult, since most consumers cannot determine with precision the bandwidth implications of their usage patterns and preferences.

That is why some argue it is unfair and non-transparent for ISPs to sell today’s usage-based plans--with overage charges--especially since many consumers do not use much data and buy the wrong plans, or because some consumers might routinely find themselves paying overage charges.

Overage charges might not be a big issue today. But some fear usage caps could become a bigger issue as user behavior changes, and more consumers start to bump up against the caps, no matter how those usage plans are constructed.

Transparency is a genuine problem. Most consumers can only tell how much data they use after a few billing cycles.

And, as with many other products, predictable cost is preferred. Most mobile consumers buy bigger data plans than they believe they will actually use, to avoid overage charges.

Unlimited usage plans provide that assurance, but few ISPs believe they can offer such plans indefinitely, as usage continues to climb. Usage caps are a compromise between fully metered plans, which consumers dislike, and unlimited plans, which ISPs at some point will not be able to afford to offer.

The issue for high speed access services is that application usage profiles have changed. Even heavy use of dial-up access, in an app environment optimized for limited bandwidth, did not cause much strain on access networks.

That is not the case for today’s visual apps, especially when full-motion video is commonplace.

The new problem is that “heavy usage” really does have much more serious capital investment implications.

Usage plans can be unfair if users cannot estimate, even within an order or two of magnitude, roughly how much data they use.

But most fixed network Internet service providers tell the Government Accountability Office that only one percent to two percent of users exceeded their data caps. That essentially is an argument that metered usage plans generally work well, and are not unfair.

On the other hand, lack of transparency can lead to overage charges that, in some businesses, have been a significant source of revenue for some providers. One thinks back to video rental providers such as Blockbuster Video, which earned a sizable proportion of its revenue from late charges.

That, in principle, is similar to “overage charges” incurred by Internet access customers who exceed their usage caps. To be sure, most ISPs have gotten better at warning users when their usage is approaching a usage cap.

And, for most users, caps do not seem to be a major issue.

The GAO report on usage-based pricing, to be released in November 2014, apparently notes that users typically do not understand how much data they actually use. The implication is that usage caps are at the very least non-transparent, and might be unfair, since people buy usage plans far larger than they actually require.

Sandvine data suggest that many consumers, especially those who watch lots of Internet-delivered television or video, consume an average of 212 gigabytes of data a month, which is close to many existing data allowances, the GAO will report.
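As a rough illustration of why transparency is hard, consider a back-of-envelope estimate of monthly consumption from video viewing. A minimal sketch, in which the per-hour figures are assumptions about typical 2014-era streaming bitrates, not numbers from the GAO or Sandvine:

```python
# Back-of-envelope estimate of monthly data usage from video streaming.
# The per-hour figures are rough assumptions, not measured values.
GB_PER_HOUR = {
    "sd_video": 0.7,  # roughly a 1.5 Mbps standard-definition stream
    "hd_video": 2.3,  # roughly a 5 Mbps high-definition stream
}

def monthly_usage_gb(hours_per_day, kind, days=30):
    """Estimated monthly consumption, in gigabytes."""
    return hours_per_day * GB_PER_HOUR[kind] * days

# A household streaming three hours of HD video a day lands near the
# 212 GB average the GAO cites:
print(round(monthly_usage_gb(3, "hd_video")))  # ~207 GB
```

Few consumers do this sort of arithmetic, which is the transparency problem in a nutshell.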

On the other hand, consumers do seem to understand that they use much more data at home than they do on their mobiles.

Consumers in eight focus groups reportedly expressed few serious concerns about usage-based pricing of mobile Internet plans, but had "strong negative reactions" to such pricing by fixed network ISPs. That suggests consumers do have a high-level understanding of the volume of their data usage.

The GAO's preliminary observations stated that usage-based pricing could limit innovation or creation of data-heavy apps because some consumers may restrict their Internet use to save money.

Some, including many ISPs, would argue that there is a simple, consumer-transparent way to deal with such issues. TV, radio and other content made available to consumers routinely has relied on advertising to defray cost.

Providers of cable, satellite and telco TV do the same thing, in essence, bundling the network bandwidth costs into the retail price of entertainment video services. In principle, that could be applied for bandwidth-intensive video entertainment as well.

Predictably, content app providers oppose that notion, as it could raise their costs of doing business, if app providers wind up subsidizing bandwidth consumption of their customers.

ISPs argue they cannot indefinitely increase network capacity without raising prices, and that therefore some way for consumers to correlate usage and price is necessary.

If Internet apps were not loosely coupled from the networks that deliver them, there would not be a problem: suppliers would simply embed network costs into the retail cost of their products. So the problem likely will become more manageable over time. But usage plans are unlikely to stop being an issue. Revenue models are at stake.

Is Windstream REIT the Beginning of a Trend?

Windstream is spinning off its fiber and copper networks and other real estate into a new real estate investment trust, and will then lease use of the assets to support its communications business.

Basically, creation of the REIT allows Windstream to deleverage.

Windstream expects to retire approximately $3.2 billion of debt as part of the transaction, resulting in the company deleveraging to 3.3 times “debt to adjusted operating income before depreciation and amortization” immediately at closing.
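The arithmetic involved is straightforward. A minimal sketch, in which only the $3.2 billion debt retirement and the 3.3x post-closing ratio come from Windstream's announcement; the other figures are hypothetical placeholders:

```python
# Illustrative "debt to adjusted OIBDA" leverage arithmetic. Only the
# $3.2 billion retirement and the 3.3x post-closing ratio come from
# Windstream; the OIBDA figure below is a hypothetical placeholder.
debt_retired = 3.2  # $ billions, per the announcement
oibda = 2.0         # hypothetical adjusted OIBDA, $ billions

# Work backward to the pre-transaction debt implied by a 3.3x result:
debt_before = 3.3 * oibda + debt_retired

print(f"leverage before: {debt_before / oibda:.1f}x")                   # 4.9x
print(f"leverage after:  {(debt_before - debt_retired) / oibda:.1f}x")  # 3.3x
```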

So the new speculation is whether other larger service providers, such as Comcast, AT&T or Verizon Communications, not to mention likely candidates Frontier Communications and CenturyLink, might also take such a route.

In part, the value comes from shifting leverage to the REIT operations, thus reducing leverage for the retail operating part of the business. Also, in part, the value comes from tax benefits that should lead to higher cash flow available to the non-REIT company that remains.

The tricky part is that the core network becomes something like a master limited partnership. Are executives comfortable with that arrangement?

Such moves to spin off network assets into separate REITs are not precisely the same as creating structurally separated “network” companies. Those sorts of proposals would turn the network into a wholesale supplier of connectivity to all who wished to pay the network company for services.

U.S. telcos and cable companies traditionally have been loath to do so. A REIT presumably would be able to avoid wholesale obligations beyond those that applied under the existing arrangements.

But there would be at least some uncertainty.

Still, any chance to significantly deleverage would be helpful to most service providers looking to boost cash flow and their ability to invest in strategic growth opportunities, something Windstream touts as a key benefit of its REIT move.

By One Classic Test, U.S. Telecom Markets Remain Unstable

Most big markets eventually take a rather stable shape where a few firms at the top are difficult to dislodge.

Some call that the rule of three or four. Others think telecom markets could be stable, long term, with just three leading providers. The reasons are very simple.

In most cases, an industry structure becomes stable when three firms dominate a market, and when the market share pattern reaches a ratio of approximately 4:2:1.

A stable competitive market never has more than three significant competitors, the largest of which has no more than four times the market share of the smallest, according to the rule of three.

In other words, the market share of each contestant is half that of the next-largest provider, according to Bruce Henderson, founder of the Boston Consulting Group (BCG).
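Those rules are mechanical enough to state in code. A minimal sketch, in which the thresholds are Henderson's and the function itself is illustrative:

```python
# Stability tests from the "rule of three," as described above.
def is_stable(shares):
    """True if market shares fit the rule-of-three pattern: no more than
    three significant competitors, with the largest holding no more than
    four times the share of the third-largest."""
    top = sorted(shares, reverse=True)[:3]
    # Per the rule, a firm below one quarter of the leader's share
    # cannot be an effective competitor.
    significant = [s for s in shares if s >= top[0] / 4]
    return len(significant) <= 3 and top[0] <= 4 * top[-1]

print(is_stable([40, 20, 10]))          # True: the classic 4:2:1 pattern
print(is_stable([30, 28, 15, 12, 10]))  # False: too many viable contestants
```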

Those ratios also have been seen in a wide variety of industries tracked by the Marketing Science Institute and Profit Impact of Market Strategies (PIMS) database.

By those rules, most mobile markets, globally, are unstable. Using those ratios, the U.S. fixed network business also is unstable. So are some other related markets.

Many have noted the concentration of smartphone profits by just two suppliers--Apple and Samsung--for example.

A couple of additional ratios also seem to be important. Under certain conditions, competitors can reach a point where destabilizing the market is viewed as dangerous.

A ratio of 2:1 in market share between any two competitors seems to be the equilibrium point at which it is neither practical nor advantageous for either competitor to increase or decrease share, Henderson has argued.

That would seem to explain why marketing attacks in stable markets are not designed to upset market share, but only to hold existing share.

Any competitor with less than one quarter the share of the largest competitor cannot be an effective competitor. In a market where the largest provider has 30 percent share, that implies an attacker has to gain at least 7.5 percent share to remain viable.

There are some very important strategic and tactical implications. Virtually any market that does not yet have the “rule of three” pattern and the 4:2:1 market share structure is going to be unstable, unless government-imposed restrictions on competition prevent the market structure from changing.

But when markets are allowed to consolidate, growth is key. All competitors who survive must grow faster than the market average.

All except the two largest-share competitors will either be losers, eventually eliminated, or be marginal cash traps, reporting profits periodically and reinvesting forever.

Anything less than 30 percent of the relevant market, or at least half the share of the leader, is a high-risk position, long term.

Firm strategy therefore also is clear: aim to take at least the number-two spot, or cash out of the position quickly if the number-two market position cannot be gained.

Definition of the relevant market and its boundaries becomes a major strategic evaluation. In other words, knowing “who else is in our business?” is necessary before any firm can assess its market share position and challenges.

Shifts in market share at equivalent prices for equivalent products depend upon the relative willingness of each competitor to invest at rates higher than the sum of market growth rates and the inflation rate. In other words, if markets are growing at two percent, and the inflation rate is two percent, then a leading contestant has to invest at rates greater than four percent annually.
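Stated as code, the threshold is trivial; this simply restates the rule above:

```python
# Minimum annual investment growth needed to hold share, per the rule above.
def required_investment_rate(market_growth, inflation):
    return market_growth + inflation

# The worked example from the text: 2% growth plus 2% inflation.
print(required_investment_rate(0.02, 0.02))  # 0.04, i.e. 4% annually
```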

Anyone who is not willing to do so loses share. If everyone is willing to do so, then prices and margins will be forced down by overcapacity until someone stops investing.

The faster the industry growth, the faster the shakeout occurs, Henderson has argued. There also is one rule that applies directly to the U.S. mobile market: near equality in share of the two market leaders tends to produce a shakeout of everyone else.

That is the case for Verizon Wireless and AT&T Mobility, and underpins the argument advanced by Sprint and T-Mobile US that they cannot prosper, long term, unless they merge.

The market leader controls the initiative. And though equity holders do not like the practice, cutting prices to maintain share is the “right” strategy. Any market-leading firm that chooses to maintain near-term operating profit while losing share will not survive.

If the market leader, under attack, prices to hold share, there is no way to disrupt the market, unless the company with number-one share runs out of money to maintain market share, Henderson has argued.

One might argue about why such patterns are seen, but market share seems directly related, typically, both to cost and profitability. In other words, cost and profit margin are functions of market share.

Under most circumstances, enterprises that have achieved a high share of the markets they serve are considerably more profitable than their smaller-share rivals, according to the Marketing Science Institute and Profit Impact of Market Strategies (PIMS) database.

The ratio 4:2:1, representing market shares in telecom markets, is key. That pattern suggests the market is stable. At the moment, it is hard to identify too many telecom markets that fit the pattern well.

And that tells you what you need to know about market disruption: it is possible, because markets are unstable.

Monday, July 28, 2014

Nashville to Get AT&T Gigabit Services

AT&T, which is evaluating 1-Gbps networks in as many as 21 metropolitan markets, has definitely decided to build a gigabit network in Nashville.

The AT&T "GigaPower" network will provide services over an all-fiber network featuring symmetrical upload and download speeds of up to 1 gigabit per second.

It is likely AT&T already has concluded that at least some neighborhoods in those 21 metro areas will provide a positive business case for gigabit Internet access, and simply wants expedited permitting and other agreements from cities of the sort Google Fiber has pioneered. 

There are several important associated changes. AT&T historically has been a proponent of "fiber to neighborhood," rather than "fiber to home" for its next generation networks. AT&T now will be shifting to a "fiber to home" access architecture where gigabit services are offered. 

The other huge change is retail pricing and packaging. Few U.S. consumers had appetite for gigabit access when it cost $300 a month. But lots of consumers have concluded that a gigabit per second service costing $70 to $80 a month is attractive. 

Cambodia Government Wants to Nationalize Mobile Infrastructure

A new Ministry of Posts and Telecommunications draft law in Cambodia proposes to nationalize telecom infrastructure such as cabling networks and towers. 

If the draft law is approved, telecom companies will have to rely on government-controlled infrastructure providers, after first divesting their networks.

The law also apparently would require all operating companies to return their operating licenses as well. 

You can imagine what such a law, if approved by the National Assembly, is likely to do to the telecom business in Cambodia. 

Saturday, July 26, 2014

U.S. Broadband Faster, More Available Than in Europe, Study Finds

On some measures, U.S. consumers have access to, and use, faster Internet access services than consumers in Europe, in large part because U.S. policies have encouraged investment, while European policies historically have been more focused on wholesale access to encourage competition.

It also would be fair to note that most communities have access to at least two facilities-based providers, a fact that arguably has encouraged investment in upgraded facilities.

Google Fiber’s entry also has had a direct impact, encouraging other Internet service providers to drop their prices to $70 to $80 a month for gigabit access, and to invest in such facilities.

Some would argue Google Fiber’s decision not to allow wholesale access, and thus to reap the full benefits of its capital and operating investments, is one example of such incentives at work. Google Fiber cannot be compelled to sell wholesale access, especially at low rates, to other competitors.

A far greater percentage of U.S. households has access to Internet service at 25 Mbps or faster, the study argues.

On a national basis, 82 percent of U.S. consumers can buy access at 25 Mbps or faster, compared to 54 percent of Europeans.

In rural areas, 48 percent of U.S. rural consumers have access to 25 Mbps or faster services, compared to 12 percent in Europe, according to a study by Christopher Yoo, University of Pennsylvania law school professor.

The study also found that the United States had 23 percent fiber-to-premises coverage, compared to 12 percent in Europe.

The United States also has 86 percent coverage of Long-Term Evolution (4G LTE), compared to 27 percent LTE coverage in Europe.

U.S. download speeds during peak times (weekday evenings) averaged 15 Mbps, below the European average of 19 Mbps, however.

During peak hours, U.S. actual download speeds were 96 percent of what was advertised, compared to Europe, where consumers received only 74 percent of advertised download speeds.

U.S. consumer experience in the areas of latency and packet loss also was better than in Europe.

U.S. broadband “stand-alone” prices were cheaper than European broadband for all speed tiers below 12 Mbps.

U.S. broadband was more expensive for higher speed tiers. The caveat is that most U.S. consumers do not appear to pay “stand-alone” prices for fixed network broadband, typically buying bundles that in essence discount prices.

Consider that 97 percent of AT&T customers bundle their video subscription service with other AT&T services.  Cable providers have 75 percent or more of their subscribers on a bundle of video and broadband, AT&T notes.

Standard mobile coverage is available in 99.5 percent of U.S. households and 99.4 percent of European households. Standard fixed coverage is available in 95.8 percent of U.S. households and 95.5 percent of European households, the study found.

Mobile broadband coverage at 3G speeds also falls within quite similar ranges, covering 98.5 percent of U.S. households and 96.3 percent of European homes.

Yoo credits a regulatory “light touch” for higher U.S. investment in broadband and expanded access to high-speed Internet in the United States, compared to Europe.

The University of Pennsylvania Law School study also showed that Europe’s treatment of broadband as a public utility, which some net neutrality advocates are pushing for in the United States, has hindered Internet access growth there.

The study notes that, in Europe, where telecom service revenues have fallen by more than 12 percent since 2008, the financial return on investing in next-generation networks is less promising, since those facilities must be leased to competitors, allowing them to avoid building their own networks.

Those rules also reduce the “scarcity value” of new networks, though.

“The empirical evidence thus confirms that the United States is faring better than Europe in the broadband race and provides a strong endorsement of the regulatory approach taken so far by the US,” the study concludes.

The differences in regulatory regimes also contributed to $562 of broadband investment per household in the United States, versus $244 per household in Europe, where regulators treat broadband as a public utility and promote service-based competition in which new players lease existing facilities at wholesale cost.

U.S. policy has emphasized facilities-based competition by firms that can build new facilities, and then reap any rewards, without enabling competitors.

U.S. policy also shows the importance of competition between cable TV and telcos. Although many advocates regard telco fiber to the home as the primary platform for faster networks, the data suggest otherwise.

In Europe, DOCSIS 3 (39 percent coverage as of 2012) and VDSL (25 percent) both contribute more to fast network coverage than does FTTP at 12 percent.

In terms of actual subscriptions, the distribution skews even more heavily toward cable connections (DOCSIS 3), with 57 percent of subscribers, followed by FTTP at 26 percent and VDSL at 15 percent.

Even if one were to focus exclusively on FTTP coverage, the data clearly give the edge to the U.S. market. As of the end of 2011, FTTP service was available in 17 percent of U.S. households and 10 percent of European households. By the end of 2012, FTTP service increased to 23 percent of U.S. households and 12 percent of European households, the study found.

But mobile Internet access now is more important than ever. As of the end of 2011, Long Term Evolution networks covered 68 percent of the U.S. population and eight percent of European households.

By the end of 2012, LTE coverage increased to 86 percent of the U.S. population and 27 percent of European households. Note the difference in data collection, though: European data is “by household,” while U.S. data is by “person.” That understates European coverage figures, to the extent that households have more than a single occupant.

On the other hand, average download speeds at peak periods are higher in Europe, compared to the United States.  

Verizon Wireless Applies Congestion Management for Customers on Unlimited Usage Plans

About 22 percent of Verizon Wireless customers are on unlimited usage plans. Of those customers, about five percent are deemed by Verizon Wireless to be "heavy users," consuming 4.7 gigabytes of data a month or more.

So Verizon now manages data connection speeds for heavy users on unlimited Long Term Evolution 4G plans at specific cell sites experiencing congestion or high demand, but only while those users are being served from the congested, or potentially congested, cell sites, and only while the congestion lasts.

The speed management practices are applied to the heavy users only while congestion exists at a given cell site, and there is no throttling of speeds after such events.
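The policy reduces to a three-part test. A minimal sketch of that decision rule, in which the 4.7 GB threshold and the conditions come from Verizon's description, while the function and its inputs are illustrative:

```python
# Sketch of the deprioritization rule Verizon describes: manage a user's
# speed only if (a) the user is on an unlimited plan, (b) recent usage
# crosses the heavy-user threshold, and (c) the serving cell site is
# congested right now. The 4.7 GB figure is Verizon's; the rest is assumed.
HEAVY_USER_GB = 4.7  # monthly usage marking the top ~5% of unlimited users

def should_deprioritize(unlimited_plan, monthly_usage_gb, cell_congested):
    return (unlimited_plan
            and monthly_usage_gb >= HEAVY_USER_GB
            and cell_congested)

print(should_deprioritize(True, 6.0, True))    # True: managed during congestion
print(should_deprioritize(True, 6.0, False))   # False: no throttling afterward
print(should_deprioritize(False, 12.0, True))  # False: metered plans unaffected
```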

Verizon customers on metered plans will not be affected, Verizon says. 

"Admission control" traditionally has been a key means for preserving quality of experience on networks facing high demand at peak hours. 

Internet access services traditionally have not used such mechanisms, and simply degrade under load. 

Observers might disagree about whether admission control is the best way to preserve quality of experience at times of high network demand. But it does work.