Wednesday, June 4, 2014

How Telcos Are Like Apple

Nobody would argue that a large tier-one telco can move as fast, and be as agile, as most device or application providers. There are many reasons, including the time it takes to build or upgrade physical facilities requiring construction work.

But a “monopoly”-fostered culture plays a role as well. Still, tier-one service providers are more agile than some suspect, and have faced key changes in revenue model that are similar to the challenges Apple faces in migrating from personal computers to music players to phones and tablets.

There are other important implications. Telecommunications services are heavily regulated, compared to device and application markets, further imposing limits to agility. Apple, Google, Amazon or Facebook can jump instantly into new markets and abandon older markets if they choose. That is not possible, or easy, for a tier-one communications provider.

Apple doesn’t have to ask permission to change chip suppliers or operating systems. Telcos face opposition when they want to shift from time division multiplex protocols to Internet Protocol.

On the other hand, telcos already  have gone through several changes of core revenue model, and face continuing pressure to migrate business models of substantial size, with no assurance of success.

Two key observations might be made in that regard. Telcos are more agile than commonly supposed, when it really counts. Also, the telco revenue model is uncertain, as uncertain as those Apple, Google, Facebook and Amazon face.

But communications networks also are viewed as “essential” national resources--and affected directly by essentially political constraints--in ways that device and app providers are not.

There is a simple way to illustrate the difference. It is unlikely many would advocate huge intervention efforts were any single device or application supplier to face bankruptcy. It is not so clear the major suppliers of communications connectivity would be viewed so casually.

Prior to 1984, when the former monopoly AT&T was broken up as part of an antitrust settlement, the U.S. Defense Department had objected to the breakup on national security grounds.

In similar ways, no major potential failure by AT&T or Verizon would be viewed as casually as would a major device or application provider bankruptcy, for example.

So far, telcos have proven about as agile as Apple in replacing former legacy products with new revenue generators. Where Apple has moved through PCs and music players to smartphones, major telcos have displaced international and long distance calling with mobile voice, and then mobile voice with mobile Internet access.

At the same time, the major telcos have replaced fixed line voice with high speed access and have added entertainment video as key revenue sources.

There is no question even those transitions will have to be followed by other transitions. So agility is more characteristic of large telco operations than often is assumed to be the case.

On the other hand, telcos face danger in virtually every key revenue source, no different in principle from the need app and device suppliers face to “constantly innovate” in products and services.

The point is that tier-one telcos are not the slow-moving dinosaurs observers sometimes assume they are. On the other hand, neither are they so secure in their markets that the risk of major failure ever can be dismissed.

Just as device and app providers have to innovate constantly, so do tier-one telcos.

And even if firms such as Google and Microsoft have learned the hard way that even they cannot make any choices they wish, tier-one service providers face fundamentally greater constraints.  

Governments have direct influence over tier-one telco pricing, products, terms and conditions of sale and even permissible market share.

The point is that “regulating” big telcos is trickier than in the past. No less than device and app providers, telcos constantly face maturation of former core products, and continually must create new products to sell.

So regulators must avoid actions that allow re-monopolizing the market. At the same time, regulators must allow telcos enough freedom to create brand-new products with sizable scale, to replace current revenues destined to shrink or disappear.

Telcos have proven agile enough to navigate the transitions. Regulators also must be agile enough to allow the innovation to continue. It isn’t easy. It is necessary.

Tuesday, June 3, 2014

AT&T Project VIP "Ahead of Schedule" But Is Capex Ahead of Schedule?

Project VIP, AT&T’s program to upgrade its network infrastructure, is “ahead of schedule,” AT&T says. 

Does that mean AT&T also has spent more than planned, less than planned, or is AT&T shifting capital to new projects? It is hard to say.

Some might note that what AT&T touts is Long Term Evolution and fixed network access to business locations, not consumer upgrades.

AT&T Project VIP progress is said to include a 4G LTE network that now covers nearly 290 million people, and a Project VIP broadband build “expected to take fiber to more than 400,000 new business customer locations by the end of the second quarter.”

Notably absent is any statement of progress in the consumer access area.

Some might point to estimates that AT&T’s capital investment was cut in May, primarily in the fixed network.

Others might note that AT&T capital investment also was light in the fourth quarter of 2013.

AT&T's fourth-quarter capital investment of $5.38 billion came in below Wall Street's estimate of $5.62 billion and Raymond James' $5.66 billion estimate.

The downside was driven by weak mobile capital investment representing a double-digit percentage decline year-over-year and high-single-digit percentage decline sequentially.

On the other hand, fixed network investment was also below expectations, yet grew year-over-year.

It isn’t yet clear whether AT&T is slowing temporarily because it really is “ahead of schedule,” or whether there might be an actual slowdown planned.

AT&T forecast 2014 capex of about $21 billion, up from its prior $20 billion. The Raymond James 2014 capex estimate for AT&T increased to $20.9 billion from $20.4 billion, while the Street currently predicts investment of $20.3 billion.

On a sequential basis, first-quarter 2014 estimated capex declined 16 percent. Raymond James had projected first-half 2014 spending would increase by about three percent over first-half 2013. Any slowdown in May might, or might not, be reflected in those half-year forecasts.

It is possible AT&T wound up spending more than planned earlier in 2014. But it might be possible that priorities have shifted a bit, with AT&T spending more on Project Agile, AT&T's effort to streamline its operations.

That could shift some spending to software, consulting and integration expenses, rather than network hardware, in 2014.

The larger question is what could be happening globally.

With global communications service provider (CSP) capital expenditures forecast to total more than US$2.1 trillion from 2014 to 2019, a level that implies only slowly growing overall spending, Ovum expects careful spending on fixed network infrastructure.

By 2019, global capex is projected to reach US$367 billion, with fixed network investment constrained by tougher revenue growth in that segment. Total network infrastructure spending in 2019 might be perhaps $50 billion higher than 2014 levels.

Keep in mind that the United States, China and Japan alone account for 45 percent of total global capital investment.

Other forecasts call for slightly declining or flat overall capex in the U.S. market between about 2012 and 2015, down from 2008 levels, according to Statista.

Infonetics forecasts capital investment growth of four percent in 2014.

Other firms have been even more bullish. Gartner has forecast global sales of network equipment to carriers rising six percent in 2014 to $85.4 billion, up from three percent growth in 2013.

Asia, excluding Japan, was predicted to grow seven percent, and Europe and North America six percent, according to Gartner researchers.

Dell'Oro analysts predicted 2014 capital investment growth of three percent, compared with two percent spending growth in 2013.

Whatever other service providers might be doing, actual AT&T investment plans might have slipped from original targets, shifted to different projects or might simply be ahead of quarterly or half-year targets.

It is likely too early to know for certain which explanation ultimately will prove correct.

T-Mobile to Support Wi-Fi Calling on iOS 8 iPhones

T-Mobile US says it will enable Wi-Fi calling from iPhones running iOS 8. That will mean at least 90 percent of all T-Mobile US smartphones are able to support such calling, according to Mike Sievert, T-Mobile US chief marketing officer.

“Already today, T-Mobile provides Wi-Fi Calling capabilities to far more customers than any other major U.S. wireless provider,” said Sievert. “Already, there are 17 million Wi-Fi calling-enabled customer devices on our network.”

About five million T-Mobile US customers use Wi-Fi Calling during any given month.

In what sort of market would a contestant aggressively support Wi-Fi calling features that are functional substitutes for the core functionality of a mobile phone, including both voice and messaging?

There is a tactical answer and a strategic answer.

Tactically, underdog competitors often try to attack and disrupt existing markets to gain share. Emphasizing Wi-Fi calling, even at the risk of losing voice revenue, is one way T-Mobile US, the smallest U.S. major mobile service provider, can create distinctiveness and add value.

T-Mobile US gains more than it loses by deliberately sacrificing some voice revenue, to gain subscriber accounts.

On a net basis, T-Mobile US gains by adding subscribers, whatever small losses it sustains on lower voice calling volume and revenue.

Strategically, all contestants in the market will earn less revenue from voice over time, as mobile Internet access--not voice or messaging--emerges as the core revenue stream.

That is one reason why Verizon Wireless and AT&T Mobility make unlimited voice and messaging a feature of an access service, at a flat rate and bundled with a subscription.

The variable part of service for a smartphone account then becomes use of mobile Internet access services.

Strategically, contestants begin to discount and merchandise features that once were core revenue drivers when those features cease to be core revenue drivers.

In the U.S. market, data revenue generated 51 percent of mobile revenues in the fourth quarter of 2013, up from about four percent a decade earlier, according to Chetan Sharma.

Others might say the shift is less pronounced, if text messaging revenues are separated from Internet access revenue. Still, the trend is clear enough.

In Europe, where the trend is most pronounced, a study by the GSMA found that mobile average revenue per user (ARPU) across the 27 European Union (EU27) countries fell by 20 percent between 2007 and 2010, caused primarily by ongoing declines in the average per-minute price for voice calls, which dropped from EUR0.16 to EUR0.14 in the EU27 mobile markets over the period.

The point is that when service providers start to discount and merchandise any particular feature or service, that feature or service no longer is viewed as driving significant future revenue, even when those sources have been crucial in the past.

source: GSMA

Monday, June 2, 2014

Most Mobile Markets are "Too Concentrated" With 3 Major Providers

Mobile markets will, over time, tend to feature three leading service providers, history suggests. Looking at the biggest 36 mobile markets globally, Chetan Sharma found that the average Herfindahl-Hirschman Index (HHI)--the measure of market concentration also used by the U.S. Justice Department--for the typical market is 0.344.

Developed markets have an HHI of 0.327. The U.S. market HHI is 0.25, between “heavily concentrated” and “moderately concentrated” markets. The U.K. market is the notable exception.

The Justice Department will generally investigate any merger of firms in a market where the HHI exceeds 0.10, and will very likely challenge any merger if the HHI is greater than 0.18.

Some would argue that any deal in a market with an HHI over 0.23 will be heavily scrutinized and most likely rejected.

Sharma found 30 of the 36 markets over that level. The U.S. market has an HHI of about 0.25.
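
For readers who want to see how the index works, the HHI is simply the sum of squared market shares. The minimal Python sketch below uses hypothetical round-number shares loosely resembling a four-carrier market--not Sharma's actual data--to show how such a market lands near the 0.25 level cited above, and how merging the two smallest players pushes the index higher.

```python
# Illustrative HHI calculation: the index is the sum of squared market shares
# (on a 0-1 scale). Shares below are hypothetical round numbers, not Sharma's data.

def hhi(shares):
    """Herfindahl-Hirschman Index for market shares expressed as fractions."""
    return sum(s ** 2 for s in shares)

# A rough four-carrier split loosely resembling the U.S. mobile market.
four_carrier_market = [0.34, 0.32, 0.16, 0.14]
print(round(hhi(four_carrier_market), 3))   # 0.263 -- near the ~0.25 cited above

# Merging the two smallest carriers raises concentration further,
# well past the 0.18 level that typically draws an antitrust challenge.
post_merger_market = [0.34, 0.32, 0.16 + 0.14]
print(round(hhi(post_merger_market), 3))    # 0.308
```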

Supporters of a Sprint deal to acquire T-Mobile US will have to convince antitrust authorities that--despite the HHI--the U.S. market will continue to be dynamic, with the entry of at least two additional national providers, including Dish Network and a consortium of U.S. cable operators led by Comcast.

FCC "Broadband" Definition Simply Tracks Actual Performance

That the Federal Communications Commission, which last updated the definition of “broadband” access in 2010, is proposing to do so again should come as no surprise, given rates of speed increase that essentially double speeds about every five years, for typical users.

The Federal Communications Commission reported in February 2013 that the average high speed access speed tier subscribed to by U.S. consumers increased from 14.3 megabits per second (Mbps) to 15.6 Mbps. That was based on 2012 data, so actual 2014 speeds are undoubtedly higher.

In 2010, the FCC defined broadband as 4 Mbps down and 1 Mbps up, at which point 14 million to 24 million people (perhaps 5.6 million to 9.6 million household locations) were not able to buy broadband Internet access.

Using the five-year doubling rule, and a 2010 rate of 4 Mbps, the 2015 minimum should be about 8 Mbps. By 2020, using the same rule of thumb, the minimum speed for any “broadband” connection should be about 16 Mbps, at least for fixed network connections.
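
As a quick back-of-the-envelope illustration of that rule of thumb--a rough sketch only, not an FCC methodology--the snippet below starts from the 2010 definition of 4 Mbps downstream and doubles it every five years.

```python
# Projecting the minimum "broadband" definition under the five-year doubling
# rule of thumb discussed above. Back-of-the-envelope illustration only.

def projected_minimum_mbps(base_mbps, base_year, target_year, doubling_years=5):
    """Speed that doubles every `doubling_years` years, starting from base_mbps."""
    periods = (target_year - base_year) / doubling_years
    return base_mbps * (2 ** periods)

BASE_MBPS, BASE_YEAR = 4, 2010   # the FCC's 2010 downstream definition

for year in (2015, 2020):
    print(year, round(projected_minimum_mbps(BASE_MBPS, BASE_YEAR, year)), "Mbps")
# 2015 -> 8 Mbps, 2020 -> 16 Mbps, matching the figures above
```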

With the important caveat that availability is different from purchasing behavior, about 71 percent or more of U.S. households now buy a fixed network high speed access service, according to IHS.

About 86.1 million U.S. households at the end of the first half of 2013 had broadband Internet access, translating into a 70.2 percent penetration of all American households. Penetration will reach a projected 71.3 percent at the close of 2013, up from 69.6 percent in 2012, IHS estimated at the time.

Broadband Internet will hit 74.1 percent of households by 2017, equivalent to some 94.7 million homes in the United States, IHS also predicts.

A related issue is the typical speeds made available for purchase, and the actual tiers of service bought by households.

According to the most-recent FCC report, some 55 percent of fixed broadband subscribers receive download speeds greater than 10 Mbps (up from 48 percent in the first half of 2012 and 46 percent in 2011).

The FCC report also shows that consumers are getting faster service, in part because Internet service providers are serving up faster speeds on existing tiers, and in part because consumers are upgrading to faster tiers of service.

The biggest changes are occurring at the low end of the market, where 46 percent of consumers buying access at less than 1 Mbps in 2011 had upgraded in 2012.

In the year period between 2011 and 2012, the average subscribed speed reached 15.6 Mbps,  an average annualized speed increase of about 20 percent.

In fact, observers tend to underestimate rates of speed increase for Internet high speed access services. Top-end "headline" access speeds have grown as much as 50 percent annually since about 1983.

source: Broadband Trends

Percent Change of April 2012 Panelists Subscribed to Higher Tier in September 2012
source: FCC

How Big is the Market for Dish Network Streaming Service?

With the caveat that competitors always have an interest in "spinning" a new competitive offer, especially when those offers threaten to undermine an incumbent's business, Time Warner Cable CEO Jeff Bewkes says the audience for a new Dish Network streaming TV service could be two million to five million customers.

“They limited the volume that would go through this over-the-top thing to two million subs with one distributor, five million overall,” Bewkes said.

In other words, the Dish Network streaming service, which aims to provide a streaming linear video service at far lower cost than currently is available, is a niche, Bewkes argues.

Dish Network is thought to be creating a service offering 20 to 30 linear channels, sold for perhaps $20 to $30 a month.

Specifically, Bewkes argues the offer will appeal to single-viewer households, not families. That is a rational thought, and likely resembles Dish Network thinking about the core audience.

What makes the offer unusual is the focus on linear TV channels, rather than pre-recorded material such as movies or archived TV series.

Verizon, also developing and selling over-the-top video, seems to prefer the Netflix-Prime-Hulu model anchored in pre-recorded content, not live streaming.

Many likely believe Time Warner Cable vastly underestimates the potential audience for a streaming linear video service. We should know soon enough.

Can Google Achieve a Satellite Internet Cost Breakthrough?

Google plans to spend more than $1 billion on a fleet of new satellites to provide Internet access to underserved regions and people, launching perhaps 180 new satellites, likely using a middle earth orbit (MEO) network architecture.

What is not clear is the revenue model. Google might be envisioning a wholesale revenue model, which means it must find “on the ground” retail partners. Though a retail model using the same infrastructure is conceivable, that would be a challenging business model.

Greg Wyler, founder of satellite-communications startup O3b Networks, is running the project for Google. O3b's former chief technology officer also has joined Google. That might suggest a wholesale model, necessitating access partners, is at least a part of the effort.

To the extent that middle mile facilities are a key impediment in many emerging Internet markets, the new satellite capability could provide a solution for backhaul to the core of the Internet.

As with any network, there are trade-offs. Generally speaking, satellite will provide less bandwidth than optical facilities, but with the upside of faster deployment at lower capital investment.

Of course, a satellite solution can combine backhaul and access, serving end users directly, or providing backhaul to retail providers. It isn’t yet clear what Google will want to do with its new fleet of satellites.  

Separately, Google also is trying to buy Skybox Imaging, which would provide both satellite and command capabilities, as well as new mapping capabilities, arguably at lower cost per satellite.

Google also bought Titan Aerospace, a manufacturer of drones, and has been investigating balloon-delivered Internet access.

As has been the case with Google’s earlier investments in municipal Wi-Fi and free airport Wi-Fi, investments in mobile spectrum and even O3b itself, Google investigates numerous access methods, and might eventually commercialize some of them.

Google will provide Wi-Fi access at Starbucks locations, replacing AT&T, and also has commercialized Google Fiber.

Whether Google will succeed at hitting its hoped-for cost targets with the new satellite initiative is unclear. But skeptics doubted Google could turn Google Fiber into a commercial venture as well.

The issue, as always, is whether Google can find some new breakthrough on the capital investment, operating cost or business model fronts, compared to all others who have investigated or who use the same MEO architecture.

To be sure, the cost of supplying satellite bandwidth arguably has fallen over the past couple of decades. By some estimates the latest generations of satellites will feature capacity about 300 percent higher than current generation satellites, and perhaps 33 percent lower cost.
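
Taken together, and treating "300 percent higher capacity" as roughly four times current capacity, those two rough estimates imply a large drop in cost per unit of delivered capacity. The sketch below simply works through that arithmetic; the percentages are the rough estimates cited above, not vendor figures.

```python
# Implied cost per unit of capacity if a next-generation satellite offers
# roughly 4x the capacity (300 percent higher) at about 33 percent lower cost.
# Simple arithmetic on the rough estimates cited above, not vendor data.

capacity_multiple = 4.0       # 300 percent higher capacity than current generation
cost_multiple = 1.0 - 0.33    # 33 percent lower satellite cost

ratio = cost_multiple / capacity_multiple
print(f"Cost per unit of capacity: {ratio:.2f}x the current level")
print(f"Roughly a {1 - ratio:.0%} reduction per delivered bit")
# ~0.17x, or roughly an 83 percent reduction per delivered bit
```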

Those with long memories also will recall earlier end-user-focused efforts to create low earth orbit or middle earth networks in the past. To be sure, costs have gotten lower over the past few decades, with developments both in launch costs and the satellites themselves.

Google will of course also have to secure rights to use satellite spectrum.

Some will argue that the key innovations will not come in the areas of bandwidth or cost per megabyte of delivered capacity, but instead in business model innovations. Still, Google and others are creating new competition for legacy providers of Internet access, including now even O3b, one of the newest contenders.

Google will gain by relying on “new technology.” But some will question the magnitude of performance and cost advantages to be gained by using the latest platforms.

As with Google Fiber, which used both traditional suppliers of access and transport gear as well as some home-grown customer premises equipment, the trick will be to innovate in the operations model.

O3b launched an initial fleet of about a dozen satellites, investing about $1.3 billion in that effort. Google plans to launch an order of magnitude more satellites than did O3b.

Even assuming some cost advantages, scale alone will be a cost issue. Even with savings of perhaps 30 percent, Google plans a fleet that is 10 times bigger than what O3b deployed.
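
One crude way to see why scale matters: extrapolating O3b's reported budget of about $1.3 billion for roughly a dozen satellites to a fleet 10 times larger, even after the roughly 30 percent savings assumed above, produces a figure many times the "more than $1 billion" reported for the Google project. The sketch below is only that back-of-the-envelope extrapolation, not an estimate of Google's actual costs.

```python
# Back-of-the-envelope extrapolation from O3b's reported figures, not an
# estimate of Google's actual budget or per-satellite costs.

o3b_budget_billion = 1.3       # reported O3b financing for ~12 satellites
fleet_multiple = 10            # Google's planned fleet is roughly 10x larger
assumed_savings = 0.30         # the rough per-satellite cost advantage assumed above

implied_cost_billion = o3b_budget_billion * fleet_multiple * (1 - assumed_savings)
print(f"Implied fleet cost: about ${implied_cost_billion:.1f} billion")
# ~$9.1 billion, versus the "more than $1 billion" reported for the project
```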

How much bandwidth per customer such an approach can yield always is an issue with satellite delivered services. Retail price per gigabyte also can be an issue.

And then there are the business model issues. O3b provides both “middle mile” wholesale transport to third parties and direct to end users. That means some amount of channel conflict is inherent in the business model. Google might face the same dilemma.

How much does O3b cost compared to other available access options, after O3b backhaul is priced in? It is not easy to say. Google will face the same fundamental pricing issues.

O3b itself only says the “real answer depends” on the partner business model and assets. That will be true for Google as well, even if Google will attempt to innovate on the operating cost model, as it has with Google Fiber.

O3b argues it will “compete strongly where high bandwidth, low latency services are required such as IP trunking and mobile backhaul,” as part of a total end user delivery system. Obviously, much depends on the cost profile of the actual access services partner.

In other words, O3b is an efficient supplier of backhaul. It is not yet so clear how efficient O3b will be in an “end-user direct” mode. Google will face the same fundamental concerns.

But Google likely will try to trim its capital costs by using different suppliers than O3b, including using “captive” or “in-house” supply where possible.

O3b, on the other hand, is using existing suppliers of earth station equipment and conventional launch methods for its middle earth orbit (MEO) approach, and therefore is more limited in terms of cost containment.

O3b relies on several established satellite equipment manufacturers, including ViaSat for gateway and enterprise customer earth stations (7.3m and 4.5m antennas).

For “smaller customers” (antennas of 1.8m and 2.4m, probably most suitable for a wholesale customer), General Dynamics earth stations will be used with a range of modems and hub management systems from Comtech, ViaSat and Gilat.

O3b uses 1.1-meter and 2.2-meter earth stations for its maritime service, produced by Orbit.

Transmission systems are provided by CPI, Comtech, Norsat and NJRC.

O3b has said US$1.3 billion in initial financing covers the cost of building and launching the first 12 satellites and running the business up to full commercial deployment. Google of course plans a fleet 10 times bigger.

Marketing and operating costs would mount from there, especially if the business leans towards retail, rather than wholesale.

Google will try to make choices that lower breakeven thresholds. It will use Kymeta Corp. antennas that have no moving parts and are controlled by software, which reduces manufacturing and maintenance costs.

Some argue that could reduce costs substantially. If Google Fiber provides any useful guidance, Google might be able to achieve a cost breakthrough.
