Friday, March 13, 2015

Is U.S. High Speed Access Market Competitive, or Not? Data Conflicts

The U.S. Federal Communications Commission has released the text of its order that federally preempts Tennessee and North Carolina law restricting municipal broadband service.

The FCC says it acted under provisions of section 706 of the Telecommunications Act of 1996 that authorize the FCC to adopt policies promoting broadband infrastructure investment and competition.

Just how much competition exists in the high speed access market, and how much is feasible, are matters about which reasonable people can, and do, disagree. The majority FCC view obviously is that too little competition exists, while others would argue competition is reasonably robust, and is increasing.

Others might argue there is a danger that competition could decrease, long term, based substantially on the worsening profitability of such services. That is not to argue that every provider is so challenged.

On the other hand, the most-important suppliers--cable TV firms and the largest telcos--do face growing challenges to the basic business model.

Consider mobile Internet access pricing. From 2010 to 2013, U.S. mobile data pricing (per unit sold) declined by only single digits year over year.

But in the first nine months of 2014, data pricing dropped by 77 percent, according to industry analyst Chetan Sharma.

In the fixed business, Google Fiber has changed consumer expectations about market level speed and pricing, creating an expectation that a symmetrical gigabit service costs $70 a month. Other suppliers essentially now are working around the Google Fiber price leadership.

Consider what Sonic.net is doing. It now sells gigabit connections, with voice service, for $40 a month, undercutting even Google Fiber's $70-a-month gigabit pricing.

CenturyLink now is selling a gigabit service for about $110 a month, guaranteed for a year.

A 100-Mbps service costs $70 a month, with the price guaranteed for a year.

A 40-Mbps service costs $30 a month, guaranteed for a year. All those prices are for stand-alone service, with no phone service.

A year ago, a CenturyLink offer of 40 Mbps would have cost more than $77 a month.
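A quick per-Mbps comparison of those tiers (a rough illustration, using only the advertised stand-alone CenturyLink prices cited above) shows how steeply unit prices fall as speeds rise:

    # Rough price-per-Mbps comparison of the CenturyLink tiers cited above
    # (advertised stand-alone prices; an illustration, not a market survey).
    tiers = {
        "Gigabit": (110, 1000),   # $110 a month, 1,000 Mbps
        "100 Mbps": (70, 100),
        "40 Mbps": (30, 40),
    }

    for name, (dollars, mbps) in tiers.items():
        print(f"{name}: ${dollars / mbps:.2f} per Mbps per month")

    # Gigabit works out to about $0.11 per Mbps, versus $0.70 for the
    # 100 Mbps tier and $0.75 for the 40 Mbps tier.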

The point is that some claim there is not enough ISP competition to create consumer benefit. Others would argue the price data suggests competition already is robust, leading to substantially lower prices offered by the suppliers in a position to sell to most households, and radically lower prices in some markets.

Without Abandoning Price Attack, Free Mobile Plans to Boost Profit

It long has been obvious that Iliad’s Free Mobile attack on French mobile market pricing would eventually have to pivot a bit to ensure sustainable profit margins. Iliad might now, after attaining market share of about 15 percent, be preparing to do so.

The company now says it aims to boost operating profit by 10 percent in 2015. Earnings before interest, taxes, depreciation and amortization will grow more than 10 percent in 2015 after rising 6.6 percent to 1.3 billion euros ($1.4 billion) last year, Iliad said.
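A back-of-the-envelope check of those figures (a sketch using only the numbers Iliad cites):

    # Back-of-the-envelope check of the Iliad EBITDA figures cited above.
    ebitda_2014 = 1.3e9                  # euros, after rising 6.6 percent
    ebitda_2013 = ebitda_2014 / 1.066    # implied base: about 1.22 billion euros
    ebitda_2015 = ebitda_2014 * 1.10     # "more than 10 percent" growth target

    print(f"2013: ~{ebitda_2013 / 1e9:.2f} billion euros")
    print(f"2015 target: more than {ebitda_2015 / 1e9:.2f} billion euros")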

Oddly enough, significant mobile account additions hurt profit margins, as mobile accounts represent lower gross revenue and profit than the fixed line accounts.

Iliad might be banking on a shift of Free Mobile accounts from basic to enhanced service plans, specifically those supporting mobile Internet access.

That doesn't mean Iliad is going to abandon its price assault in the French market, only that it believes it can encourage a higher percentage of accounts to pay for mobile Internet access.

Is DSL Progress at a Limit?

With increasing stress on some service provider business models related to fiber to the home investments, the logical question is how much more can be done to prolong the life of the copper access infrastructure.

For 25 years, the answer has been "lots more." And some believe the progress is not yet at a limit.

The ability to deliver higher speeds over digital subscriber line or hybrid fiber coax, for example, translates into lower retail costs, since the full replacement of the access network is avoided.

In that regard, it is worth noting that digital subscriber line has surprised even its would-be supporters. There was a time when the chief technology officer of a tier one global infrastructure supplier could say, privately, that “DSL won’t work.”

But smart people threw effort at the problem, and DSL did work. The big tradeoff has been distance versus bandwidth, so shorter access loops mean higher speeds are possible.

The other persistent limitations include line noise and the availability of spare wire pairs to bond.

Oddly enough, in many cases the abandonment of voice services by a majority of former users means there are more available copper pairs to be bonded for the remaining potential customers.

Roughly speaking, when half of former voice customers drop service, half of the plant's copper pairs are freed, meaning each remaining potential customer could, in principle, be served by two bonded pairs instead of one.
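A stylized example makes the arithmetic concrete (assuming, purely for illustration, one legacy pair per home passed and a 50 percent abandonment rate):

    # Stylized pair-bonding arithmetic: one legacy copper pair per home passed
    # and 50 percent voice abandonment are illustrative assumptions.
    homes_passed = 100
    pairs_in_plant = homes_passed            # one pair per home, by assumption
    remaining_customers = homes_passed // 2  # half abandon fixed voice

    pairs_per_customer = pairs_in_plant / remaining_customers
    print(pairs_per_customer)  # 2.0: each remaining customer can bond two pairs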

Within some distance parameters, it is believed possible to push beyond 10 Gbps. The issue is the distance at which this is possible, how well the existing physical plant corresponds to ideal laboratory conditions, and the service provider’s ability to pull fiber deep enough into a serving area to support short access loops.

At least as Alcatel-Lucent’s Bell Laboratories sees matters, progress is not at a limit. That is a fundamental change from some thinking in the early 1990s.

In principle, and with the caveat that distance matters, speeds ratcheting to 40 Gbps are conceivable.

The lesson here is clear. When we say “something cannot be done,” that is a conditional statement. We mean “something cannot be done today, by us, at a cost that allows commercial operations and business models.”

Conditions can change.

There are other implications. When one tries to measure “quality,” one has to pick quantifiable metrics. Among the issues is whether the metrics one chooses represent quality as viewed by the supplier, or quality as seen by the customer.

In the end, the only metrics that matter are the customer’s metrics for quality. So even if we believe fiber to the home is "better," and represents "better quality," that might only partially match customer perceptions.

The old adage that "customers don't care how you provide a service, only that you do" is germane. All marketing hype aside, tests tend to show that beyond about 10 Mbps, any single user is unable to perceive an advantage compared to access at higher speeds.

That threshold will grow over time, as it has since people began using the Internet widely, but the principle remains: end user experience increasingly is not dictated by the bandwidth of the local access loop.

Nevertheless, supplier choices do matter, because end user assessment of quality includes a “price” evaluation, not simply a “performance” perception.

That matters since all supplier costs inevitably are reflected in end user retail prices. If faster DSL allows lower retail prices, that is a "better" choice for many ISPs than ripping out all access network copper and substituting fiber to the home.

To a startling degree, headline speeds are a marketing necessity, but not necessarily an end user perceivable value. Beyond a certain point, headline speed increases yield no discernible end user advantage. 

That might not matter. ISPs will keep pushing headline speeds, because marketing often dictates the ability to sell faster service, even if such enhancements do not lead to higher end user quality of experience.

Bell Laboratories technology comparison:

Technology                        Frequency   Maximum aggregate speed        Maximum distance
VDSL2                             17 MHz      150 Mbps                       400 meters
G.fast phase 1                    106 MHz     700 Mbps                       100 meters
G.fast phase 2                    212 MHz     1.25 Gbps                      70 meters
Bell Labs XG-FAST                 350 MHz     2 Gbps (1 Gbps symmetrical)    70 meters
Bell Labs XG-FAST with bonding    500 MHz     10 Gbps (two pairs)            30 meters
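Treating the Bell Labs comparison above as a lookup table, here is a minimal sketch of how one might reason about which copper technology a given loop length supports. The rated distances are laboratory figures, so treat the speeds as upper bounds:

    # The Bell Labs comparison above, encoded as (rated distance in meters,
    # maximum aggregate speed in Mbps). These are laboratory figures; real
    # plant conditions rarely match them.
    TECHNOLOGIES = [
        ("VDSL2", 400, 150),
        ("G.fast phase 1", 100, 700),
        ("G.fast phase 2", 70, 1250),
        ("XG-FAST", 70, 2000),
        ("XG-FAST with bonding", 30, 10000),
    ]

    def fastest_copper_option(loop_meters):
        """Return the fastest technology whose rated distance covers the loop."""
        feasible = [t for t in TECHNOLOGIES if t[1] >= loop_meters]
        return max(feasible, key=lambda t: t[2]) if feasible else None

    # A 90-meter loop qualifies for G.fast phase 1 (700 Mbps) but not the
    # shorter-reach phase 2 or XG-FAST profiles; a 500-meter loop is beyond
    # even VDSL2's rated reach in this comparison.
    print(fastest_copper_option(90))
    print(fastest_copper_option(500))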

Thursday, March 12, 2015

FCC Releases Text of Net Neutrality Order

The Federal Communications Commission has published its Open Internet order. It is 400 pages long, so no, I haven't read it yet! (Update: I'm trying to read it carefully, and it is tough going. The actual text of the order is not so daunting; the footnotes and the "how we got here" justifications are what make it slow reading.)

It does seem clear that the rule is only the beginning of the process, as the use of terms such as "reasonable" means a continuing, case-by-case review of discrete actions. (In fact, now that I'm plowing through it, it reminds me of another book-length law recently passed, where much of the actual rulemaking happens during implementation.)

The Commission also says it will launch a separate proceeding to bring mobile data roaming obligations into conformity with the new rules, and possibly apply common carrier obligations to data roaming.

Interconnection between Internet domains, long a matter of voluntary agreements between the networks, now will be governed by common carrier rules. That is likely to have unexpected consequences.

In fact, much of the actual implementation lies ahead, as the "reasonableness" test means rulings will have to be made case by case.

Phablets Gain Share at Expense of Tablets

In the wake of the first ever year-over-year decline in global tablet shipments in the fourth quarter of 2014, there seems little question but that “phablets” (smartphones with larger screens) have become a substitute product.

Tablet shipments are expected to reach 234.5 million units in 2015, a modest year-over-year increase of two percent from 2014, according to International Data Corporation, which now also has scaled back its five year forecast for the product category.

Android will remain the platform leader, with close to two thirds of the market over the course of the forecast.

Apple iOS share of the market is expected to decline in 2015, reaching levels below those of the past three years.

Windows, despite modest adoption to date, is expected to gain significant share over the course of the forecast, growing from five percent in 2014 to 14 percent in 2019.

What, in Telecommunications, has Changed Over the Last Decade?

What has changed in telecommunications over the last decade? For Ofcom, the U.K. communications regulator, quite a lot.

Since 2005, broadband adoption has increased 2.5 times, from 31 percent to 78 percent. High speed access at a minimum of 30 Mbps now is available to 78 percent of locations, while adoption has grown to 27 percent.

Mobile broadband availability has increased significantly, with 3G coverage increasing from 82 percent to 99 percent of premises, and 4G services available to 73 percent of premises.

More significantly, mobile broadband adoption now is 67 percent.

Purchasing of bundled services has more than doubled, from 29 percent to 63 percent.

About 44 percent of high speed access connections now are supplied by retailers using the wholesale approach, up from 17 percent in 2005.

Supplier consolidation also has been significant, including the formation of EE from Orange and T-Mobile, and acquisitions of smaller broadband ISPs (O2, Tiscali, AOL, Be, Easynet).

The qualitative changes are just as significant.

A decade ago, the key issue was how to create a wholesale fixed network structure.

Now, Ofcom says, “a strategic review focussed on the market structure in fixed telecommunications risks being overly backward looking.” That means mobile and over the top apps and services must be a fundamental part of the examination.

“Increasingly, digital communications encompasses a combination of fixed, wireless and mobile connectivity, and communications services provided over these networks,” Ofcom says.

“For this review to be genuinely strategic, it needs to take account of digital communications infrastructure and competition more broadly,” Ofcom says.

The context includes increasing convergence between fixed and mobile communications, associated developments in wireless networks, and the ever increasing importance of “over the top” services, Ofcom notes.

Put another way, that means analysis has to include fixed, mobile and untethered modes, plus over the top services that compete directly or substantially with carrier-offered services.

Policy, as a corollary, will likewise take into account the full range of access and services supply to ensure “competition, investment and innovation.” Ofcom suggests that, in addition to creating a climate for investment while protecting consumers, it also will look at instances where deregulation is possible because competition (especially in voice and messaging) will discipline the market.

Wednesday, March 11, 2015

Lost Viewers Only the Start of Problems for TV Networks

In the third and fourth quarters of 2014, perhaps 40 percent of cable TV network ratings declines were caused by consumers who watched over the top subscription video services instead of the linear channels, according to the Cabletelevision Advertising Bureau.

Total TV viewing fell 10 percent year over year in the third quarter and nine percent, year over year, in the fourth quarter, according to Todd Juenger, Sanford C. Bernstein analyst.

In the first quarter (through February), linear video network viewing was down about 12 percent.

“We believe the U.S. television industry is entering a period of prolonged structural decline,” said Juenger.

Should those rates of decline continue, at least some channels will face pressure to reconsider their business models. That will be a tough challenge. Linear distribution has one big advantage: it creates huge potential audiences for advertising.

A la carte distribution will not carry such premiums. Most networks will find they simply cannot replace advertising revenue lost in any switch to over the top distribution with substitute subscriber fees.

The rough comparison is network earnings of five cents to seven cents per viewer, rather than 30 cents, on a much smaller base of units.

The cumulative revenue shift would be catastrophic for most programming networks. Analysts at Needham and Company have estimated that half of U.S. linear video ecosystem revenue would evaporate in any full shift to completely unbundled content access.

In other words, $70 billion in revenue would disappear. As a direct result, fewer than 20 channels would survive in an a la carte world where consumers are required to bear 100 percent of the cost of the content in the form of subscription or other fees, and advertising essentially disappears.

In 2012, the TV ecosystem generated total revenue of approximately $150 billion: about $77 billion from advertising and $74 billion from subscription fees paid to cable, telco and satellite distributors.

TBS, which historically was the second cable channel to be created, and the first ad-supported cable channel, generates about $1.5 billion of revenue from subscription fees plus $2 billion from advertising revenue, according to Needham.

Needham estimates that TBS charges distributors about $1.20 a month. Consumers might theoretically pay about $1.60 at retail, in a bundle. Nobody knows what TBS might cost as a stand-alone streamed channel, but something in the range of $5 to $10 is probable.

The reason is that TBS distribution costs would need to be covered, at the same time TBS loses much of its advertising and distributor revenue.
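A rough illustration of that math, using Needham's TBS figures plus an assumed bundled-subscriber count and a la carte take rate (both hypothetical):

    # Rough a la carte math using the Needham TBS figures cited above. The
    # bundled-home count and the 15 percent take rate are hypothetical.
    bundled_homes = 100e6       # assumed homes receiving TBS in a bundle
    affiliate_rev = 1.5e9       # dollars per year, per Needham
    ad_rev = 2.0e9

    per_home_month = (affiliate_rev + ad_rev) / bundled_homes / 12
    print(f"Bundled: about ${per_home_month:.2f} per home per month")  # ~$2.92

    # If only 15 percent of homes opted in a la carte and advertising largely
    # vanished, the price needed just to hold revenue flat:
    a_la_carte_homes = bundled_homes * 0.15
    needed = (affiliate_rev + ad_rev) / a_la_carte_homes / 12
    print(f"A la carte: about ${needed:.2f} per subscriber per month")  # ~$19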

In a full a la carte regime, where channels are purchased as part of over the top Internet subscriptions, both revenue streams would be severely disrupted.

Networks would lose most of their subscribers and most of their ad revenue. The present TV ecosystem generally splits revenue 50-50 between content and distribution partners. The content provider generally earns 80 percent of all advertising revenue.

“Consumption of network and cable content is taking place in ways that allow viewers to circumvent high monthly cable bills, avoid watching commercials, or both. Every single one of these changes represents a move to a revenue model that is less profitable than the one currently enjoyed by TV networks,” said analyst Gary Brode. “It is only a matter of time before the revenue and profitability of the networks begins to fall.”

DIY and Licensed GenAI Patterns Will Continue

As always with software, firms are going to opt for a mix of "do it yourself" owned technology and licensed third party offerings....