Friday, December 12, 2014

Can European Telcos Return to Growth in 2016?

Service provider revenue in Europe has fallen consistently since 2009. That is not completely unusual. In five of six years since 2009, service provider revenues also have fallen in Japan.

So will the European telecommunications business return to growth by 2016? Analysts at IDATE say that will happen, although the impact of what appears to be a threat of recession in Europe might make that a daunting prospect.

The impact of the Great Recession beginning in 2008 is easy enough to describe. According to TeleGeography Research, revenue growth slipped from about seven percent annually to one percent in 2009, returning to about three percent globally in 2011.

So a new recession in Europe could well wipe out revenue gains expected to be in the low single digits in European markets between 2015 and 2016.

Earlier in 2014, economists expected European Union economic growth of between 1.5 percent and 1.75 percent. But growth rates have dipped lower for most of 2014, and might go negative in the fourth quarter.

Telecommunications service revenues were four percent lower in 2013 than in 2012, but the revenue decline should slow to perhaps -1.8 percent for 2014, with 2015 a transition year, IDATE expects. In 2016, European Union telecom revenue growth is projected to reach one percent for the year.

But a new recession could dash those hopes.   

In the EU28, mobile average revenue per user will have lost some 25 percent of its value between 2008 and the end of 2014. “Hence, in spite of the fact that mobile penetration stood at 129 percent of population in Europe in 2013 and will keep growing, operators earn less revenues year after year,” IDATE says.

Fixed high speed access average revenue per user was EUR 24.9 per month in 2008, declining steadily to EUR 22.3 per month in 2014. Fixed network voice revenue has been falling for about 14 years.
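That decline works out to a steady erosion. A minimal sketch of the implied compound annual rate, assuming a constant rate over the six years from 2008 to 2014:

```python
# Implied compound annual decline in fixed high speed access ARPU (EUR/month)
arpu_2008 = 24.9
arpu_2014 = 22.3
years = 6  # 2008 -> 2014

cagr = (arpu_2014 / arpu_2008) ** (1 / years) - 1
print(f"Implied annual ARPU change: {cagr:.1%}")  # roughly -1.8% per year
```

Notably, that implied annual rate of ARPU erosion is in the same range as the overall revenue declines IDATE reports.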

The IDATE study, sponsored by ETNO, the organization representing European telecom service providers, notes that 2014 revenue will fall about four percent, following a 2.9 percent decline in 2013.

Even mobile Internet access revenue, a growth category in most markets, dipped 1.8 percent in 2014, though that was a better performance than the 4.5 percent decline in 2013.

Mobile remains the dominant form of Internet access, with the number of subscriptions approaching the 800 million mark in Europe. Mobile Internet access subscribers grew 0.9 percent in 2013 and likely will grow slightly by the end of 2014 as well.

Fixed broadband subscriptions grew from 157.7 million in 2012 to 163.8 million in 2013 and are expected to stand at 170 million by the end of 2014.

According to IDATE, at the end of 2014, for the first time, fixed broadband subscriptions will outnumber traditional circuit-switched fixed lines.

The big takeaway, though, is that the IDATE forecast will bump up against a possible recession in 2015 that could depress revenues. All that is required is for a recession to shave one percent to two percent off service provider revenue growth rates.
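The arithmetic is simple enough to sketch. Taking the forecast one percent growth for 2016 and an assumed recession drag of one to two percentage points (the drag figures are illustrative, not IDATE's):

```python
# Forecast 2016 EU telecom revenue growth vs. a hypothetical recession drag
forecast_growth = 0.01          # IDATE: about one percent growth in 2016
recession_drag = (0.01, 0.02)   # assumed one to two percentage points

for drag in recession_drag:
    net = forecast_growth - drag
    print(f"Drag of {drag:.0%} -> net growth of {net:+.0%}")
# A drag at either end of the assumed range leaves growth flat or negative.
```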

Thursday, December 11, 2014

FCC Boosts Connect America Fund Minimum Internet Access Speed to 10 Mbps

The U.S. Federal Communications Commission has voted to change the definition of “broadband” speeds required for receipt of Connect America funding from 4 Mbps downstream and 1 Mbps upstream to a minimum of 10 Mbps downstream and 1 Mbps upstream.

The change boosts minimum downstream speeds that originally were set in 2011. The Commission’s thinking is that the new standard better reflects minimum speeds received by
99 percent of U.S. residents living in urban areas.

Most locations can buy service at far higher speeds, however. In fact, 86 percent of U.S. households could buy service at speeds of 25 Mbps to 50 Mbps at the end of 2013.

The new rules affect eligibility for CAF monies of about $1.8 billion a year, for the purpose of providing high speed access to five million rural residents.

Is Linear Video Business Finally at a "Crossroads?"

Is the U.S. multi-channel subscription video business finally “at a crossroads?” A new survey of video consumers suggests the rate of change might be accelerating.

On the other hand, the survey might also suggest the big immediate danger exists across the board, in every age category between 18 and 59.

In fact, respondents 18 to 34 appear to buy linear subscription video at higher rates than consumers 35 to 59, a finding that might suggest demand has weakened across the entire age range between 18 and 59.

Some might argue the greatest danger can be found among younger viewers 18 to 34, as the one-year drop in buying was highest for those respondents, while buying was consistent for age groups 35 to 59: either flat or slightly higher.

Cable TV (or telco TV or satellite TV) subscriptions among viewers 18 to 24, and those 25 to 34, dropped six percent from 2013 to 2014, according to PwC.

But consider that subscription rates for consumers in older demographics were even lower than among those 18 to 34.

Respondents 18 to 24 had the highest purchase rates, at 71 percent of that age cohort. Among those 25 to 34, the take rate was just 67 percent.

Among respondents 35 to 49, an even lower 64 percent of respondents say they buy linear video subscription services. Among users 50 to 59, the take rate was 67 percent.

The cable subscription figures are steadier for older demographics, though.

Looking ahead, 91 percent of viewers said they see themselves subscribing to cable in one year.  
Some 61 percent of respondents said they would likely subscribe in five years, and only 42 percent thought they would be buying in a decade.

So far, the survey found, use of on-demand, over the top streaming services remains largely ancillary to purchasing of linear video services.

Some 65 percent of linear video subscribers 18 to 24 used Netflix in 2014. About 71 percent of linear TV subscribers 25 to 34 had Netflix in 2014, up from 51 percent in 2013.

Of respondents 35 to 49, 66 percent of linear video buyers also bought Netflix. Of those 50 to 59, 58 percent bought Netflix, the PwC study reports.

The PwC survey, if it accurately reflects the actual population of consumers as a whole, might be interpreted as suggesting demand for linear video is actually a bit stronger among consumers 18 to 34 than among consumers 35 and older.

What is not so clear is “what” is at risk. The common perception is that it is the “linear” format or “bundling” that is endangered. That might not be the case for all programming formats.

Sports is the best example of linear delivery that remains most valuable, and least susceptible to disruption by non-real-time alternatives. Movies and TV series clearly are most at risk from on-demand delivery.

But bundling might not actually be endangered quite so much. Some light users will prefer full a la carte access. But most multi-user households and sports fans likely will find that a full a la carte approach actually costs more than what they currently pay for their video subscriptions.
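A quick sketch illustrates the point; every number here is a hypothetical assumption chosen only to show the mechanics, not actual market pricing:

```python
# Hypothetical comparison of a bundle vs. full a la carte video pricing.
# All prices below are illustrative assumptions, not actual market data.
bundle_price = 80.0          # assumed monthly bundle price
per_channel_price = 2.0      # assumed a la carte price per channel

def a_la_carte_cost(channels_watched: int) -> float:
    """Monthly cost if every watched channel is bought individually."""
    return channels_watched * per_channel_price

light_user = a_la_carte_cost(15)             # a few channels: cheaper a la carte
multi_user_household = a_la_carte_cost(50)   # many channels: pricier a la carte

print(light_user < bundle_price)             # True
print(multi_user_household > bundle_price)   # True
```

Under these assumptions the light viewer saves by unbundling, while the multi-user household pays more, which is the crossover the argument above turns on.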

As always is the case, price and value will matter. Much depends on how content owners decide to allow retail sales of their content, and what steps distributors might take in response.

Should content owners conclude they prefer to fully unbundle access, without requiring prior purchase of a linear video subscription, major instability could erupt in the video ecosystem.

On the other hand, if content owners stick with the current system, where a consumer generally must first buy a linear subscription to get on-demand access to some or all of the content available from any single programming channel, there would be less channel conflict, but likely also less development of a new distribution system and consumer choice.

Nor is it clear that retail prices would be lower, in all cases, or profit margins higher, in all cases, for content owners. Today, programming networks other than the TV broadcast channels rely on a wholesale role.

Few channels sell directly to end users, with the exception of local broadcast stations, as a class of suppliers.

A shift to a retail role would entail lots of new cost, especially for marketing and promotion. The new business models would be highly dependent on how those new retail selling costs changed gross revenue, profit margins and operating cost for the content owners.

Nor is it completely possible to predict the reaction of their current distributors (cable, satellite and telco TV providers). Faced with a major move to over the top access on the part of the programmers, the distributors would react.  

The point is that, even with the whole industry in a process of incremental movement, one might argue the crossroads actually remains in the future.

Not Even Apple Pay Will Quickly Solve "Mobile Payments" Problems

Some of us have argued that it could well take a decade or more before mobile payments become a routine part of the consumer experience when paying for merchandise at a retail location. And even then, “routine” use might be a reality for only about half of all consumers.


After 20 years, the percentage of U.S. households using automatic bill paying is still only about 50 percent. Likewise, after 20 years, use of debit cards by U.S. households is only about 50 percent.


It took about a decade for use of automated teller machines to reach usage by about half of U.S. households.


So history is the reason it is reasonable to predict that mobile payments will not be a mass market reality for some time.


Some might argue the problem is that big companies cannot innovate. That might be a problem, but it is only a small part of the adoption process.


The bigger problem is that major changes in end user behavior have to happen, and before that can become a reality, it often is necessary to spend quite significant sums to create the infrastructure enabling the behavior change.


In the case of mobile payments, that involves creating a critical mass of devices, payment apps and processes, merchant terminals and retail brands. Beyond that, the developing market would have to come to a practical consensus about standards, interfaces and methods.


Also, with huge amounts of revenue at stake, it will take some time to sort through rival business interests and approaches that pit credit card issuers against retailers, for example.


All of that ensures a lengthy period of confusion before scale is possible. And until scale is possible, progress will be limited.

In consumer financial services, decades can pass before a significant percentage of consumers use an innovation. In fact, a decade to reach 10 percent or 20 percent adoption is not unusual, in the consumer financial services space.
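The “decades” timescale falls out of even optimistic compounding. A sketch under an assumed starting adoption of five percent of households and 15 percent annual relative growth (both figures are illustrative, not drawn from the adoption histories cited above):

```python
# Years for adoption to compound from a small base to half of households.
start_share = 0.05    # assumed initial adoption (5% of households)
annual_growth = 0.15  # assumed relative growth per year (15%)
target = 0.50         # "mass market" threshold: half of households

years = 0
share = start_share
while share < target:
    share *= 1 + annual_growth
    years += 1

print(years)  # 17 -- even steady 15% growth takes the better part of two decades
```

The point is not the specific numbers, but that sustained double-digit growth from a small base still takes well over a decade to reach half of households.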


Gigabit Access Improves User Experience, But Only So Much

The growing embrace of faster Internet access in the U.S. market, and the growing supply of faster connections, despite all criticisms, have many drivers--on both the demand and supply side. But it is changes on the supply side that arguably are most important.

In the past, Internet service providers rightly have suggested there was not much demand for faster access (50 Mbps, 100 Mbps, 300 Mbps or gigabit Internet access), but that was largely because of retail prices.

It would have been more accurate to say there was, in the past, not much demand for Internet access costing $100 to $300 a month, when “good enough” services could be purchased for $40 to $60 a month.

All that is changing. When Google launched Google Fiber, offering a symmetrical gigabit access service for $70 a month, the key innovation might arguably have been the price.

True, Google Fiber offers very fast access (two orders of magnitude faster than typical offers, and an order of magnitude faster than the 300-Mbps services that had been available in some communities).

But the key innovation arguably was the price. A gigabit for $70 a month is not such a leap from the $50 a month level of “standard” Internet access.

Also, $70 a month is a huge change from the $300-a-month price of prior 300 Mbps offers, or $350 monthly prices for a few gigabit services that had been available.
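Expressed as price per megabit per second, using the offer figures cited above, the shift is dramatic:

```python
# Price per Mbps for the offers cited above (monthly price / downstream Mbps)
offers = {
    "Google Fiber gigabit": (70, 1000),
    "Prior 300-Mbps tiers": (300, 300),
    "Prior gigabit tiers": (350, 1000),
}

for name, (price, mbps) in offers.items():
    print(f"{name}: ${price / mbps:.2f} per Mbps")
# Google Fiber gigabit: $0.07 per Mbps
# Prior 300-Mbps tiers: $1.00 per Mbps
# Prior gigabit tiers: $0.35 per Mbps
```

On a per-Mbps basis, the $70 gigabit undercuts the prior 300-Mbps tiers by more than an order of magnitude.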

So major changes on the supply side--dramatically lower prices and dramatically faster speeds--are spurring demand.  

In part, that is because the underlying technology is getting better.

“I joined AT&T in 2008 and I remember around 2012 looking at some charts and the cost of speed hadn’t really had a breakthrough, because 80 percent of your deployment in broadband is labor based,” said John Donovan, AT&T senior executive vice president for architecture, technology and operations.

“And then all of a sudden you have vectoring in small form factor stuff and all of a sudden a little bit of an investment by our supply chain a few standard things and we start to take a 25 meg on a copper pair and then we move it to 45 and then 75 and then 100 which is on the drawing board,” said Donovan.

The point is that the underlying technology used by cable TV operators and telcos has been continually improved, providing better performance at prices useful for commercial deployment.

Operating practices also are becoming more efficient. Google Fiber has been able to work with local governments to streamline permitting processes and other make-ready work in ways that can lower costs to activate a new Internet access network using fixed media.

Google Fiber also pioneered a new way of building networks, getting users to indicate interest before construction starts, and building neighborhood by neighborhood, instead of everywhere in a local area.

That changes gigabit network economics. As has been true for nearly a couple of decades in the U.S. market, for example, competitive suppliers have been able to “cherry pick” operations, building only enough network to reach willing customers, without the need to invest capital in networks and elements that “reach everyone.”

That makes a big difference in business models. A network upgrade that might not have made sense if applied across a whole metro network might well make sense in some parts of a city, where there is demand.

Also, every new supplier of Internet access goes through a learning curve, generally operating inefficiently at first, but improving as experience is accumulated.

“And then we are getting better at the deployment side of the business as well,” said Donovan. “So our average technicians and our best technicians are converging.”

But there is an important related issue. Customers who get gigabit service often cannot perceive a difference in experience as great as they might have expected.

“When we do the installs, we often have to stay and show them on a speed test [that the connection] is getting a gig,” said Donovan. That illustrates a problem we are going to see more often, namely that gigabit access can only improve end user experience so much.

Access speed only improves experience so much because it is only one part of the combined ecosystem, and can affect only part of user experience. Remove the local access bottleneck on one end and all the other elements become visible.

Wednesday, December 10, 2014

BT Acquisition of O2 Seen as Imminent

BT is expected to buy O2, getting back into the mobile business in a big way, “before the end of 2014,” U.K. financial site “This is Money” reports. That would give BT 24 million mobile customers, create the ability to provide a quadruple-play bundle, and likely trigger a reshuffling of assets in the U.K. communications market as well.

“Virtually all operators today believe that their future lies in a quad play, a supposedly new buying behavior expected to consolidate in future: In the operators’ view, customers will buy their fixed telephony, broadband, mobile, and TV services all from the same company and stay with the same provider for longer,” says Dario Talmesio, Ovum analyst.

“Mobile businesses are looking to move into broadband because their revenues are in decline,” many would note. While revenues for the broadband industry have grown by four percent to five percent a year over the last two years, mobile revenue continues to decline by three percent a year, according to media consultancy Enders Analysis.

Still, a successful acquisition of O2 by BT would put BT into contention near the top of the U.K. mobile market.

EE, the joint venture between Orange and Deutsche Telekom, has 33 percent share of mobile revenues in the U.K. market.

O2 and Vodafone each have 26 percent market share, by reported service revenue, in the United Kingdom.

Hutchison Whampoa’s Three has 12 percent market share and Virgin Mobile has three percent market share.

Growing interest in quadruple play retail offers is driven, fundamentally, by the contraction of mobile revenue, the shrinking of fixed network voice revenue and the degree of competition in the U.K. market.

Simply stated, contestants look to sell more products to a perhaps-smaller number of customers, boosting gross revenue and slicing customer churn by selling four products instead of one or two.

It isn’t rocket science.

Tuesday, December 9, 2014

Despite Gigabit Hype, Internet Access Speed is Not the U.K. or U.S. Experience Bottleneck

Bragging rights and therefore marketing advantages are believed to accrue to Internet service providers with the highest perceived Internet access speeds.

Some--if not most--of that marketing hype apparently is misplaced, a new study by Ofcom, the U.K. communications regulator, might suggest.

The study found that “access speed” matters substantially at downstream speeds of 5 Mbps and lower. In other words, “speed matters” for user experience when overall access speed is low.

For downstream speeds of 5 Mbps to 10 Mbps, the downstream speed matters somewhat.

But at 10 Mbps or faster speeds, the actual downstream speed has negligible to no impact on
end user experience. Since the average downstream speed in the United Kingdom now is about 23 Mbps, higher speeds--whatever the perceived marketing advantages--have scant impact on end user application experience. Some 85 percent of U.K. fixed network Internet access customers have service at 10 Mbps or faster.
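Ofcom's finding amounts to a simple tiered relationship between downstream speed and its impact on experience. A sketch encoding those tiers (the tier labels are paraphrases of the findings above, not Ofcom's own wording):

```python
def speed_impact(downstream_mbps: float) -> str:
    """Rough experience-impact tiers from the Ofcom findings described above."""
    if downstream_mbps <= 5:
        return "substantial"   # at low speeds, access speed matters a great deal
    if downstream_mbps <= 10:
        return "moderate"      # in the 5-10 Mbps range, it matters somewhat
    return "negligible"        # above 10 Mbps, other factors dominate

print(speed_impact(3))   # substantial
print(speed_impact(8))   # moderate
print(speed_impact(23))  # negligible -- the U.K. average downstream speed
```

The U.K. average of about 23 Mbps falls squarely in the top tier, which is why further speed increases have scant impact on application experience.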

Investing too much in high speed access is as much of a business issue as investing too little. The important insight is that perception now matters most in the United Kingdom and United States, not the actual threshold required to provide a reasonable end user experience of Internet applications such as web browsing and streaming video.

Average access speeds in the United States are 10 Mbps, according to Akamai, or 32 Mbps, according to Ookla. Another study shows that average Internet access speeds in the United Kingdom and United States are, in fact, equivalent.

The quality of the upstream path and in-home network have some impact, at all speed ranges, but at a dramatically lower level as speeds climb above 10 Mbps.  

One finding was surprising. The Ofcom tests of end-to-end user experience suggest that web browsing is significantly affected by upstream and downstream access speeds, the home network and the Internet service provider’s network interconnection policies.

Both the upstream and downstream speeds affect user experience of streaming video, while voice experience is, relatively speaking, barely affected.

Those are important findings. The quality of the broadband experience is not solely dependent on access speed. In-home wiring (including Wi-Fi performance) and peering arrangements between Internet service providers can also be important.

“Indeed, for connections with a download speed greater than 10 Mbps, access speed appears to become less significant than these other factors,” Ofcom says.

At connection speeds above the range of 5 Mbps to 10 Mbps, though, the relationship breaks down and broadband connection speed is no longer an important determinant of performance, Ofcom says.

The important observation is that elements of the end-to-end value chain--other than access speed--now are becoming greater bottlenecks.

Will Generative AI Follow Development Path of the Internet?

In many ways, the development of the internet provides a model for understanding how artificial intelligence will develop and create value. ...