Wednesday, April 29, 2015

No Consolidation Among Top-4 Mobile Operators is Possible; One Has to Fail

Even if one argues a four-provider U.S. mobile market is not sustainable, U.S. regulators have ruled out any consolidation among the top four suppliers. That means the market cannot consolidate to three strong providers by means of mergers among the contestants.

That leaves only one solution: one of the firms, or perhaps even a couple, would have to be so weakened that the top ranks shrink naturally to two or three providers, with a distant number four unable to keep up. Failure, in other words, is the only way the U.S. mobile market is going to consolidate.

That appears not to be the case in the video market.

Even 20 years ago it would have been possible to predict that, ultimately, both DirecTV and Dish Network would cease to be independent operating entities, having been acquired by, or merged with, another firm. The logical acquisition candidates always have been Verizon and AT&T.

Now that AT&T has made the move to acquire DirecTV, half the prediction seems likely to be fulfilled. The issue now is what happens with Dish Network. Verizon likely has little interest, as Verizon thinks linear video is going to decline rather quickly.

That makes Dish Network’s exit options tougher. It never has seemed likely any cable TV operator would see the logic of acquiring Dish Network or DirecTV. If Verizon isn’t interested, the pool of buyers gets very thin, one might argue.

Likewise, some have argued that, long term, the U.S. mobile market simply cannot support four major national suppliers. But it is hard to see, at the moment, how that consolidation would happen.

In fact, the only scenario that would immediately and clearly reduce the number of suppliers is the one development regulators will not presently support: Sprint merging with T-Mobile US, or either AT&T or Verizon buying T-Mobile US or Sprint.

In other words, none of the four leading national providers will be allowed to merge. That doesn’t mean there will not be acquisitions; there simply won’t be any mergers of the top four firms.

“I have always said on consolidation, it’s not a matter of if, it’s when and how, and now I’m going to add, and who. Because I think, as we think ahead, you need to think, I still reiterate, that in five years we will think it comical that we thought about the industry structure as the four major wireless carriers,” said T-Mobile US CEO John Legere. “So I think you need to think about the cable industry and players like us as not competitors but potential partners and alternatives for each other in the future.”

That would not necessarily reduce the number of providers from four to three, as a firm such as Comcast would still remain in the market. But a Comcast acquisition of either T-Mobile US or Sprint would give the acquired firm the heft to secure the number-three spot in the market on a long-term basis.

On the other hand, it always is possible that Dish Network might also try to acquire T-Mobile US, to transform itself. That likewise would not immediately reduce the number of leading mobile providers.

So, like it or not, no consolidation of the U.S. mobile market is possible by means of any mergers among the top four providers.

Instead, one of the firms would have to be weakened so much that it essentially drops from contention. Weakened sufficiently, the number-four provider might well be acquired by a firm that has a different business model, and essentially does not compete directly with the leading three providers.

How Much Must Small Cell and Backhaul Costs Drop?

Ask yourself how much you would be willing to pay for an Internet connection supporting a $100 appliance (a router, for example). Presumably, the answer is at the low end of the range: something like the price of a consumer-grade Internet access connection.

But put yourself in the shoes of an executive of an Internet service provider looking at big networks of small cell access points, perhaps with per-site costs in excess of $3,000, just for equipment, and not including site rent costs.

Consider a single macrocell, where backhaul could cost $24,000 a month, supporting perhaps 160 Mbps to 500 Mbps of capacity per site. How much capacity might a small cell require?

On the assumption a small cell only makes sense in a high-traffic area, the answer might be that the small cell requires as much capacity as a macrocell, in some instances. In other cases, perhaps a small cell only has to support a fraction of total macrocell bandwidth.

The issue is that traditional access using T-1 equivalents does not work. Small cells are going to require Ethernet bandwidth, between perhaps 100 Mbps and a gigabit.

As a rough rule of thumb, assume a small cell requires cost parameters about an order of magnitude less than a macrocell, with backhaul costs likewise scaled down by about an order of magnitude. That implies a maximum small cell backhaul cost of about $2,400 a month.

Greg Weiner, Vertix co-founder, argues that small cell sites would have to drop to about $100 to $250 per location to drive mass adoption. Again, that suggests a cost drop of roughly an order of magnitude, or more, is needed.
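To put those figures together, here is a minimal back-of-the-envelope sketch in Python. It simply combines the numbers cited above (the $24,000 macrocell backhaul cost, the 160 Mbps to 500 Mbps capacity range, the $3,000 per-site equipment figure and the $100 to $250 target), all treated as illustrative assumptions rather than vendor or operator data.

```python
# Back-of-the-envelope small cell economics, using the figures cited above.
# All inputs are illustrative assumptions, not vendor or operator data.

MACROCELL_BACKHAUL_COST = 24_000      # dollars per month, per the macrocell example above
MACROCELL_CAPACITY_MBPS = (160, 500)  # rough capacity range per macrocell site

# Implied macrocell backhaul cost per Mbps per month
cost_per_mbps = [MACROCELL_BACKHAUL_COST / mbps for mbps in MACROCELL_CAPACITY_MBPS]
print(f"Macrocell backhaul: ${cost_per_mbps[1]:.0f} to ${cost_per_mbps[0]:.0f} per Mbps per month")

# Rule of thumb: scale small cell cost parameters down by about an order of magnitude
SCALE_FACTOR = 10
small_cell_backhaul_budget = MACROCELL_BACKHAUL_COST / SCALE_FACTOR
print(f"Implied small cell backhaul budget: ${small_cell_backhaul_budget:,.0f} per month")

# Equipment cost: today's per-site figure versus the adoption target cited above
SMALL_CELL_EQUIPMENT_TODAY = 3_000    # dollars per site, equipment only
TARGET_PER_SITE = (100, 250)          # dollars per location, per the argument above
required_drop = [SMALL_CELL_EQUIPMENT_TODAY / t for t in TARGET_PER_SITE]
print(f"Required equipment cost reduction: {required_drop[1]:.0f}x to {required_drop[0]:.0f}x")
```

Even on these rough numbers, the implied backhaul budget lands near the $2,400-a-month figure above, while the equipment target implies a somewhat steeper drop than a single order of magnitude.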

On the other hand, the cost of a business-grade gigabit connection is dropping dramatically in some markets. Connections supplied by cable TV companies with dense fiber deployments (think Comcast) are one reason for thinking backhaul costs will drop to levels enabling mass small cell networks.

The One Disruptive Implication of Google Fi

Fi, the new mobile service from Google, has been criticized in some quarters for not being disruptive enough. With the caveat that many changes could come as the service develops, observers have argued that the retail pricing isn’t really all that unusual or especially affordable, and that the selection of handsets is minimal.

All of that might be true, for the moment. But there is one huge shift in retail packaging and pricing that Fi does represent, and it is odd that it comes from a firm such as Google, which might be presumed to favor unmetered or unlimited usage.

Generally speaking, telecom service providers have preferred metered usage based on consumption. Generally speaking, Internet app providers prefer unlimited and unmetered usage plans.

The reason is simple: access providers make more money when users pay for what they consume, while app providers arguably make more money when usage fees place no constraints on the amount of app usage.

The reason “buckets of usage” exist is that Internet service providers, mobile service providers and fixed network service providers are trying slowly to wean customers off the notion that Internet access literally is “unlimited.”

Conversely, for those services whose demand is plummeting (voice and messaging), service providers now offer “unlimited” plans in place of the buckets or usage-based billing that once were the industry norm.

Again, the reasons are simple. With demand dropping, the network has plenty of spare capacity to support “unlimited usage,” especially when end user demand is highly predictable. Also, because of the high predictability of usage, and low bandwidth impact, unlimited plans can safely be offered because few customers will actually use very much of the resource.

Fi might be the first significant change in thinking about what sorts of plans are most “consumer friendly.”

Up to this point, the “consumer friendly” approach has been considered the uncapped or unlimited consumption approach. Now, Fi is arguing that it is the metered approach that is most consumer friendly. That is a huge shift.
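To see concretely why metering can favor some users, consider a minimal sketch in Python comparing a hypothetical pay-per-gigabyte plan against a hypothetical flat-rate “unlimited” plan. All of the prices below are illustrative assumptions, not Fi’s actual rate card.

```python
# Illustrative comparison of metered versus flat-rate "unlimited" mobile data pricing.
# All prices are hypothetical assumptions chosen only to show the mechanics,
# not Google Fi's actual rate card.

BASE_FEE = 20.0            # hypothetical flat monthly fee for talk, text and network access
PER_GB = 10.0              # hypothetical metered price per gigabyte of data used
UNLIMITED_PRICE = 80.0     # hypothetical price of an "unlimited" data plan

def metered_bill(gb_used: float) -> float:
    """Under metering, the customer pays only for the data actually consumed."""
    return BASE_FEE + PER_GB * gb_used

for gb in (0.5, 1, 2, 3, 5, 8):
    metered = metered_bill(gb)
    cheaper = "metered" if metered < UNLIMITED_PRICE else "unlimited"
    print(f"{gb:>4} GB: metered ${metered:6.2f} vs. unlimited ${UNLIMITED_PRICE:.2f} -> {cheaper} is cheaper")
```

Under assumptions like these, light users come out ahead with metering, while heavy users still prefer the flat rate; the shift Fi represents is in which of those outcomes gets framed as “consumer friendly.”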

In fact, most ISPs would prefer a metered approach for Internet access, even as they argue they benefit by offering “unlimited” use of voice and text messaging.

So, oddly enough, the real innovation Fi represents is a possible sea change in thinking about what form of retail packaging and pricing is most consumer friendly.

Is Facebook Next Regulatory Target in Europe?

Is Facebook the next target for European Union antitrust authorities? Possibly. 

Ostensibly, Facebook only wants a single, unified framework. Facebook VP Richard Allan warns that multiple, overlapping investigations of Facebook, instead of a single EU policy, would lead to higher costs and a decrease in the rate at which new features are developed.

Facebook’s concern partly is a “balkanization” of rules. National regulators in a number of countries, including Belgium and the Netherlands, appear to be initiating multiple, overlapping investigations of Facebook.

“In effect, this would mark a return to national regulation,” said Allan. “If it is allowed to stand, complying with EU law will no longer be enough; businesses will instead have to comply with 28 independently shifting national variants.”

The unstated concern is simply that antitrust investigations seem to be proliferating. The other unstated issue is growing regulatory scrutiny of the sort Google now faces.

Cablevision, Hulu Sign "First Ever" Deal

Cablevision Systems Corporation will offer the Hulu subscription streaming service to Optimum customers, becoming the first linear video provider to do so. The announcement contained no details about pricing or packaging, but it might be reasonable to assume the cost will be about $8 a month, the current price of a paid Hulu subscription, with the tweak that the content will be viewable directly from within Cablevision’s linear TV channel lineup.

In other words, Hulu likely will be offered in the same way a linear channel such as HBO is sold and packaged.

Some might say the deal adds only incremental value for each party. Consumers already can buy Hulu directly, and can watch on TVs, PCs, tablets or smartphones. That likely will not change as Cablevision becomes a distributor.

But that’s where the incremental value might be created. Hulu gets a big distributor with a customer base and the ability to promote the service. Cablevision gets a new network nobody else offers as part of the linear package.

The move arguably adds a bit more momentum to the “over the top” trend.

The bigger impact plausibly could come if Netflix strikes a similar deal with another major cable TV company (in terms of subscriber share), for the simple reason that Hulu has relatively small market share among OTT providers, while Cablevision likewise has a limited footprint.

So far, only a few smaller cable TV providers (Suddenlink, Grande Communications, RCN and Atlantic Broadband) have signed deals to integrate Netflix into their channel lineups.

Tuesday, April 28, 2015

T-Mobile US Adds 1.8 Million Accounts in 1Q 2015

T-Mobile US reported revenue growth of 13.1 percent in the first quarter of 2015, on the strength of robust net account additions.

T-Mobile US had 1.8 million total net account additions in the quarter, marking two straight years of adding at least a million net new customers every quarter. Still, that was a slowdown from the 2.4 million net adds in the same quarter of 2014.

Some 1.1 million net adds were branded postpaid accounts, one million of those being phone accounts.

At the same time, branded postpaid phone churn was 1.30 percent.

For the full year, T-Mobile US now expects branded postpaid net additions of three million to 3.5 million.

T-Mobile US branded prepaid net customer additions were 73,000 in the first quarter of 2015, with branded prepaid churn of 4.62 percent.

For full-year 2015, T-Mobile US expects adjusted EBITDA in the range of $6.8 billion to $7.2 billion, unchanged from previous guidance.

T-Mobile US revenue rose to $7.8 billion from $6.88 billion a year earlier. But heavy promotions resulted in a first-quarter loss of nine cents per share.

At some point, some of us have noted, T-Mobile US would have to switch from rapid subscriber growth to actual profits. If the rate of subscriber growth continues to fall, then that time is coming.

Sprint might still continue with a focus on subscriber growth rather than profits, but a shift by either firm, or both, would signal that the fierce marketing war driven by those two firms is about to wane.

Impact of 25-Mbps "Broadband" Definition

Conspiracy thinkers might see in recent Federal Communications Commission decisions a pattern of manipulating statistics on U.S. Internet adoption, even if each single decision has a logic of its own.

Beyond a reduced ability to track progress over time, and some obvious marketing and strategic issues for entire classes of providers, it might be difficult to determine the impact, aside from the universal service funding implications.

That might be considered odd, since the presumed reason for changing the definition was to spur faster investment in higher-speed networks.

Consider the decision, in January 2015, to define “broadband” as a minimum of 25 Mbps downstream. Since access speeds keep increasing, seemingly at an accelerating pace, it makes sense to review definitions periodically, at least for the purpose of determining performance levels relevant for government universal service purposes.

Oddly, the announcement about the speed redefinition began with the statement that “broadband deployment in the United States--especially in rural areas--is failing to keep pace.” The irony is that the new definitions make the “problem” bigger.

“The 4 Mbps/1 Mbps standard set in 2010 is dated and inadequate for evaluating whether advanced broadband is being deployed to all Americans in a timely way,” the FCC said.

“Using this updated service benchmark, the 2015 report finds that 55 million Americans--17 percent of the population--lack access to advanced broadband,” the FCC said.

Cynics might point out that redefining an issue to increase the number of incidents will logically create a bigger defined problem. Those who believe a higher definition will spur faster investment might note that the new definition almost immediately means recipients of universal service funds will have to build networks delivering a minimum of 25 Mbps, not 4 Mbps, to receive those funds.

The impact on the broader market is unclear, in terms of investment incentives. Many commercial providers, especially those in the fixed network segment, long ago passed the 4 Mbps standard and arguably will not be much affected, as the competitive benchmark in the market now comes from the Google Fiber gigabit challenge, not the “minimum” 25 Mbps definition.

So changing the definition likely has little impact on many major commercial providers, in terms of investment incentives. 

But there are some odd implications, with high sector impact. Some Internet service providers have been redefined out of existence. 

That includes most satellite and fixed wireless ISPs, plus many rural telcos. Selling a 15-Mbps access service no longer qualifies as “high speed” or “broadband.”

And some of those service providers cannot increase speeds to 25 Mbps across the board, for reasons related to lack of spectrum. If speeds higher than 25 Mbps are the future, those contestants face long-term issues related to their ability to compete.

Some might see other important implications as well. Even most Long Term Evolution networks do not consistently deliver 25 Mbps, though they easily meet the 4 Mbps definition. So, by raising the definition of “broadband” to 25 Mbps, the FCC effectively eliminated the entire U.S. mobile industry from contention.

Again, the result is a suddenly “bigger” problem, even if few really believe the actual state of high speed access has gotten significantly worse over the last year.

Also, the new definition meant Comcast, had it acquired Time Warner Cable, would have had at least 57 percent market share of “broadband” connections in the United States. By historical standards, that was far beyond the 30-percent share maximum that has governed antitrust thinking in the communications and video entertainment businesses.

So some might see ulterior motives. Others might see an understandable, less sinister logic, though one still apt to create problems rather than solve them.

Students of organizational “mission creep,” or organizational development, will note that organizations never declare victory and disband when the original problem they sought to fix has been vanquished. Instead, they search for new missions.

Cynics might argue something similar was at work: definitional changes that dramatically affected the status and scope of a defined problem, with a bigger implied need for action to solve the problem.

Others might note, though, that fuzziness about Internet access definitions has been growing for some time. Classically, “broadband” was defined as any speed at or above 1.5 Mbps. That definition now is archaic, and has been replaced by a subjective notion that, in essence, “broadband is what we say it is.”

That is perhaps unavoidable in a market where access speeds, since the earliest days of dial-up access, have been increasing nearly at the rate Moore’s Law would suggest.
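As a rough illustration of that pace, the sketch below assumes speeds double every two years (a Moore’s-Law-like interval, chosen only for illustration) starting from the classic 1.5 Mbps threshold; within about a decade the old definition is an order of magnitude behind.

```python
# Rough illustration: how quickly a "broadband" definition ages if access speeds
# grow at a Moore's-Law-like pace. Both the starting speed and the doubling
# period are assumptions chosen only for illustration.

START_SPEED_MBPS = 1.5       # the classic "broadband" threshold cited above
DOUBLING_PERIOD_YEARS = 2    # assumed doubling interval

speed = START_SPEED_MBPS
for year in range(0, 21, DOUBLING_PERIOD_YEARS):
    print(f"Year {year:>2}: {speed:>7.1f} Mbps")
    speed *= 2
```

On those assumptions, a 1.5 Mbps benchmark set at year zero is overtaken by 25-Mbps-class speeds within roughly a decade, which is why any fixed numerical definition tends to look dated quickly.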

There arguably is no "conspiracy." But real issues are created. Comparing performance or progress over time will be tougher, since the definitions have changed. 

Many ISPs no longer can say they sell high speed access or broadband. In many cases, there are physical limits to progress, such as lack of access to additional spectrum. What policy changes can, or should, flow from that situation is unclear.

