Monday, May 7, 2018

Is Sprint Plus T-Mobile US the Right Merger?

Arguments can be made about the impact--positive or negative--of a successful Sprint and T-Mobile US merger on competitive dynamics in the U.S. mobile market. Additional arguments can be made about the sustainable structure of the mobile market, or whether the market will change in the future, in any case.

Whichever view one takes, a further question remains: is this the right merger for each firm?

Some might argue that competition actually is increased, as a merged entity will have the financial ability to match promotions by AT&T and Verizon; make investments in 5G; and enjoy other benefits of scale.

Others might argue that price competition likely will decrease. That is precisely why equity analysts generally favor consolidation in the mobile market: “less price competition” is the expected upside from the merger.

Yet others might argue that even if the number of suppliers decreases, that is the inevitable future, in any case, as Sprint and T-Mobile US are too small to challenge the other leaders if they remain independent entities.

Yet others might argue that additional competitors are coming, though not facilities-based providers, so the market might not ultimately reduce from four to three leaders.

A different class of arguments might be made about the minimum conditions for some amount of sustainable competition in the mobile market. Some believe the U.S. market cannot sustain four facilities-based suppliers. That is an empirical matter and will eventually be tested further, as new competitors enter the market.

The point is that many observers consider a three-leader market sustainable, while believing a four-provider market cannot be sustained.

Yet others might argue that the market itself is changing, as access services and content and app providers actually are starting to merge vertically. The Comcast acquisition of NBCUniversal was an early step. AT&T’s acquisition of Time Warner would be another illustration.

Hypothetically, other entities (tier-one app providers, device or platform providers) might eventually assume major roles in the access business as well. So, eventually, how we define “the relevant market” could be quite different.

Regulators reviewing this particular transaction, though, cannot base determinations about market concentration on those future developments; they will have to make decisions based on the market as it now stands.

It undoubtedly will be argued that the number of facilities-based national providers matters more than the number of mobile virtual network operators (MVNOs), because only a facilities-based supplier has the owner economics to attack pricing levels. Others will argue that the total number of potential tier-one retailers does suffice.

Yet others might argue that the relevant boost in competition will not come in the mobile arena at all, but in the fixed network realm, as 4G and 5G networks are used to displace fixed network services. That might be the case, though the relevant definition of “market” would be broader than “mobile only,” in that case. The challenge is that future competition is not the same as competition already existing in the market.

Up to this point, T-Mobile US has had no interest in that possibility, though obviously it will say it does have such interest, if only to boost its merger chances.

Some will argue the merger boosts investment in 5G. Much of that argument rests on the assumption that 5G will cost significantly more than 4G, something Verizon and AT&T already seem to dispute. And both Sprint and T-Mobile US have touted the speed with which they already are independently moving to 5G networks.

It is reasonable that a merged Sprint and T-Mobile US could spend less than if each firm built separately, of course. In that sense (infrastructure deployment), a merger might bring more 5G competition, faster.

The more-immediate problem is that the merger will clearly increase market concentration as measured by the traditional metrics antitrust officials use, such as the Herfindahl-Hirschman Index (HHI).
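To illustrate (and only to illustrate: the market shares below are hypothetical, not actual 2018 figures), consider the HHI arithmetic. Under the 2010 Horizontal Merger Guidelines, a market with an HHI above 2,500 is “highly concentrated,” and a merger raising the index by more than 200 points is presumed likely to enhance market power.

```python
# Herfindahl-Hirschman Index sketch. Market shares here are
# hypothetical, chosen only to show the mechanics of a 4-to-3 merger.
def hhi(shares):
    """HHI is the sum of squared market shares, in percentage points."""
    return sum(s ** 2 for s in shares.values())

pre = {"Verizon": 35, "AT&T": 33, "T-Mobile US": 18, "Sprint": 14}
post = {"Verizon": 35, "AT&T": 33, "New T-Mobile": 18 + 14}

print(hhi(pre))              # 2834 -- already "highly concentrated" (> 2500)
print(hhi(post))             # 3338
print(hhi(post) - hhi(pre))  # 504 -- well above the 200-point screen
```

On almost any plausible set of real shares, a merger of the third- and fourth-place carriers produces an increase of that magnitude, which is why regulators cannot easily wave it through.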

Business strategy is the other big question. If, as even supporters might argue, the market is changing in ways that favor vertical integration of access and other assets (content, platform, device), a big horizontal merger is no more than an interim step. The next consolidation would have to involve the new entity merging with one or more firms in the app, platform or device areas.

Some would argue the better outcome (from a strategy and competition standpoint) would be mergers by Sprint and T-Mobile US separately with “up the stack” partners that would position each firm for the future market (beyond simple mobile connectivity).

In other words, either asset, paired with a tier-one app, platform or device supplier, would make more long-term sense. Sprint combined with T-Mobile US would be a bigger supplier of mobile access, but with none of the fixed network, content, platform or app assets that would position the firm to compete in the coming market.

That is a judgment call, but it is the logical conclusion if one argues that a “mobile-only” connectivity position is not sustainable; that a “connectivity-only” strategy likewise is unsustainable; and that new revenue sources beyond connectivity must be found.

Both Sprint and T-Mobile US need to merge, in other words. Just not with each other.

Sunday, May 6, 2018

How Many U.S. Households Really Do Not Have Multiple Suppliers of Broadband Internet Access?

One often hears it said that “only 50 percent of U.S. households have a choice of more than one broadband provider.” That is not true unless you accept all the qualifications, such as eliminating wireless and mobile access platforms and using definitions of “broadband” that, while technically accurate, can still mislead.

In other words, if a user has 20 Mbps access--below the FCC’s 25 Mbps “broadband” threshold--is that user’s experience substantially different from what 25 Mbps would provide?


What people mean when saying such things is that some percentage of homes has only one provider of fixed network internet access at particular speeds. Beyond that, virtually every U.S. home has a choice of two satellite providers, plus mobile access from three to four suppliers and, in rural areas, quite often a fixed wireless supplier.


In fact, though we might argue that the speeds and prices are not optimal, most U.S. households probably have five to seven choices for internet access, at a variety of speeds, from a variety of platforms.


To make the argument that half of U.S. households only have one broadband provider, one has to eliminate available mobile options, satellite providers and service below certain thresholds, be that 4 Mbps, 10 Mbps, 25 Mbps or any other number.


In 2016, some 92 percent of the population had access to both fixed terrestrial services at 25 Mbps/3 Mbps and mobile LTE at speeds of 5 Mbps/1 Mbps, the FCC said in 2018, using 2016 data.


That almost certainly means access and speeds have gotten better over the last couple of years.


In rural areas, 68.6 percent of Americans have access to both services (fixed at 25 Mbps, mobile at 5 Mbps or better), as opposed to 97.9 percent of Americans in urban areas.


With respect to fixed 25 Mbps/3 Mbps and 10 Mbps/3 Mbps LTE services, 85.3 percent of all Americans have access to such services, including 61 percent in evaluated rural areas and 89.8 percent in evaluated urban areas, the FCC says.


At year-end 2016, 92.3 percent of all Americans had access to fixed terrestrial broadband at speeds of 25 Mbps/3 Mbps, up from 89.4 percent in 2014 and 81.2 percent in 2012.


It was true that, in 2016, perhaps 24 million Americans still lacked fixed terrestrial broadband at speeds of 25 Mbps/3 Mbps from at least one provider. But that is not the same as saying those households do not have internet access at 10 Mbps or faster, from multiple non-wired suppliers.

The point is, everything hinges on the definition of “broadband” and the range of suppliers who sell it in any given market.

Most of the “one provider” markets, as you would guess, are in rural areas.

Gates and Hastings were Right: Near-Zero Pricing Matters

The most-startling strategic assumption ever made by Bill Gates was his belief that horrendously-expensive computing hardware would eventually be so low cost that he could build his own business on software for ubiquitous devices.

How startling was the assumption? Consider that, in constant dollar terms, the computing power of an Apple iPad 2 would have cost between US$100 million and $10 billion in 1975, when Microsoft was founded.


The point is that the assumption by Gates that computing operations would become so cheap was an astounding leap. But my guess is that Gates understood Moore’s Law in a way that the rest of us did not.

Reed Hastings, Netflix founder, apparently made a similar decision. For Bill Gates, the insight that nearly-free computing would become reality meant he should build his business on the software those computers would run.

Reed Hastings came to the same conclusion as he looked at bandwidth trends in terms both of capacity and prices. At a time when dial-up modems were running at 56 kbps, Hastings extrapolated from Moore's Law to understand where bandwidth would be in the future, not where it was “right now.”

“We took out our spreadsheets and we figured we’d get 14 megabits per second to the home by 2012, which turns out is about what we will get,” says Reed Hastings, Netflix CEO. “If you drag it out to 2021, we will all have a gigabit to the home." So far, internet access speeds have increased at just about those rates.
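A minimal sketch of that sort of extrapolation, assuming a 56 kbps dial-up baseline in 1997 and speed doubling every 21 months (the doubling interval is my assumption, picked to roughly fit the quoted milestones):

```python
# Extrapolate home access speed, Moore's-Law style.
# Assumptions: 56 kbps baseline in 1997, doubling every 21 months.
BASE_KBPS = 56
BASE_YEAR = 1997
DOUBLING_MONTHS = 21

def projected_mbps(year):
    """Projected downstream speed in Mbps for a given year."""
    months = (year - BASE_YEAR) * 12
    return BASE_KBPS * 2 ** (months / DOUBLING_MONTHS) / 1000

print(f"2012: ~{projected_mbps(2012):.0f} Mbps")  # ~21 Mbps (Hastings figured ~14)
print(f"2021: ~{projected_mbps(2021):.0f} Mbps")  # ~750 Mbps (near "a gigabit to the home")
```

The exact doubling period matters less than the compounding itself: any interval in that neighborhood turns dial-up-era bandwidth into hundreds of megabits within two decades.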

As frightening as it might be for executives and shareholders in the telecommunications industry, a bedrock assumption of mine about dynamics in the industry is that, over time, retail prices for connectivity services also will trend towards zero.

“Near-zero pricing” does not mean absolute zero (free), but only prices so low there is no practical constraint to using the services, just as prices of computing appliances trend lower over time, without reaching actual “zero.”


Communications capacity might not be driven directly by Moore’s Law, but it is affected, as chipsets power the optical transmitters, receivers, antenna arrays, switches, routers and all other active elements used by communications networks.

Also, at least some internet access providers--especially Comcast--have been increasing internet access bandwidth in recent years almost directly in line with what Moore’s Law would predict.

If that is the case, the long-term trend should be that speed doubles about every 18 months. Service providers have choices about what to do, but generally try to hold prices level while doubling the speed (much as PC manufacturers have done).

In that case, what changes is cost-per-bit, rather than posted price. But an argument can be made that actual retail prices have dropped as well. According to the U.S. Bureau of Labor Statistics, prices for internet services and electronic information providers were 22 percent lower in 2018 than in 2000.
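The cost-per-bit arithmetic is simple to sketch. Assuming a flat $50 monthly price and a 25 Mbps starting speed (both figures are assumptions for illustration), four doublings cut the cost per Mbps by a factor of 16:

```python
# Illustrative only: posted price held flat while speed doubles
# every 18 months, roughly the upgrade pattern described above.
price = 50.0   # dollars per month, held constant (assumed)
speed = 25.0   # Mbps at generation 0 (assumed)

for gen in range(5):  # five 18-month cycles
    print(f"Gen {gen}: {speed:>6.0f} Mbps -> ${price / speed:.3f} per Mbps")
    speed *= 2
# Cost per Mbps falls from $2.000 to $0.125 while the bill never changes.
```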

Those retail price declines are not always obvious, for any number of reasons. Disguised discounting happens when customers buy service bundles. In such cases, posted prices for stand-alone services are one thing; the effective prices people pay are something else.

Also, it matters which packages people actually buy, not simply what suppliers advertise. Customers do have the ability to buy faster or slower services, with varying prices. So even when posted prices rise, most people do not buy those tiers of service.

And promotional pricing also plays a role. It is quite routine to find discounted prices offered for as much as a year.

The point is that, taking into account all discounting methods and buyer habits, what people pay for internet access has arguably declined, even when they are buying more usage.


To be sure, the near-zero pricing trend applies most directly to the cost per bit, rather than the effective retail price. But the point is that use of internet bandwidth keeps moving towards the point where using the resource is not a constraint on user behavior.

And that is the sense in which near-zero pricing matters: it does not constrain the use of computing hardware or communications networks for internet access.

Gates and Hastings have built big businesses on the assumption that Moore’s Law changes the realm of possibility. For communications services providers, there are lessons.

As Gates rightly assumed big businesses could be built on a widespread base of computers, and Hastings assumed a big streaming business could be based on low-cost and plentiful bandwidth, so service providers have to assume their future fortunes likewise hinge on owning assets in the app, device or platform roles within the ecosystem, not simply connectivity services.

Near-zero pricing matters.

Friday, May 4, 2018

Near-Zero Pricing Forces Continue to Operate in Much of Telecom Business

One of the biggest long-term trends in the communications business is the tendency for connectivity services to constantly drop towards “zero” levels. That is arguably most true of the capacity parts of the business (bandwidth), the cost of discrete computing operations, the cost of storage and the cost of many applications.

One can see this clearly in voice pricing, text messaging and even internet access (easier to explain in terms of cost per bit, but even absolute pricing levels have declined).

The reason it often does not seem as though prices have declined is that the value keeps increasing, as retail prices drop or remain the same.

In large part, marginal cost pricing is at work. Products that are "services," and perishable, are particularly important settings for such pricing. Airline seats and hotel room stays provide clear examples.

Seats or rooms not sold are highly perishable: they can never be sold once a flight leaves or a day passes. So it can be a rational practice to monetize those assets at almost any positive price.

Whether marginal cost pricing is “good” for traditional telecom service suppliers is a fair question, as the marginal cost of supplying one more megabyte of internet access, voice or text messaging might well be very close to zero.

Such “near zero pricing” is pretty much what we see with major VoIP services such as Skype. Whether the traditional telecom business can survive such pricing is a big question.

That is hard to square with the capital intensity of building any big network, which mandates a cost quite a lot higher than “zero.”

In principle, marginal cost pricing assumes that a seller recoups the cost of selling the incremental units in the short term and recovers sunk cost eventually. The growing question is how to eventually recover all the capital invested in next generation networks.
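A stylized sketch of that tension, with numbers that are assumptions rather than actual carrier economics: the marginal cost of one more gigabyte is close to zero, but the revenue needed to recover capex and opex is not.

```python
# All figures are assumed, for illustration only.
capex = 1_000_000_000       # network build cost, dollars
amortization_years = 20     # recovery period for sunk capital
annual_opex = 150_000_000   # yearly operating cost
subscribers = 2_000_000

marginal_cost_per_gb = 0.001  # near-zero incremental cost (assumed)

annual_cost = capex / amortization_years + annual_opex
breakeven_monthly_arpu = annual_cost / subscribers / 12

print(f"Marginal cost per GB:    ${marginal_cost_per_gb:.3f}")
print(f"Break-even monthly ARPU: ${breakeven_monthly_arpu:.2f}")  # $8.33
# Competition that pushes prices toward marginal cost cannot,
# by itself, recover the sunk capital.
```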


On the other hand, we also must contend with product life cycles. As we have seen, in developed markets people use voice services less, so there is surplus capacity, which means it makes sense to allow people unlimited use of those network resources.

That was why it once made sense for mobile service providers to offer reduced-cost, and eventually unlimited, calling “off peak.”

Surplus capacity caused by declining demand also applies to text messaging, where people are using alternatives. If there is plenty of capacity, offering lower prices to “fill up the pipe” makes sense. And even if most consumers do not actually use those resources, they are presented with higher-value propositions.

Video entertainment and internet access are the next products to watch. Video is more complicated, as it is an “up the stack” application, not a connectivity service. Retail pricing has to include the cost of content rights, which have not historically varied based on demand, but on supply issues.  

Linear video already has passed its peak, while streaming alternatives are in the growth phase.

Internet access, meanwhile, is approaching saturation. That suggests more price pressure on linear video and internet access, as less demand means stranded supply, and therefore incentives to cut prices to boost sales volume.

Marketing practices also play a big part, as the economics of usage on a digital network can be quite different than on an analog network. And some competitors might have assets they can leverage in new ways.

In 1998, AT&T revolutionized the industry with its “Digital One Rate” plan, which eliminated roaming and long-distance charges, effectively eliminating the difference between “extra cost” long distance and flat-fee local calling.

Digital One Rate did not offer unlimited calling at first, but that came soon afterwards. In the near term, lots of people figured out they could use their mobiles to make all “long distance” calls, using their local lines for inbound and local calling only.

With unlimited calling, it became possible to consider abandoning landline service entirely.

At least in part, the growth of mobile subscriptions from 44 million in 1996 to 182 million by the end of 2004 is a result of the higher value of mobile services, based in part on “all distance” calling.

Mobile revenue increased by more than 750 percent, from 10.2 billion dollars in 1993 to more than 88 billion dollars in 2003.


During this same time period, long distance revenue fell by 67 percent to 4.3 billion dollars, down from 13.0 billion dollars.

The point is that connectivity prices and some application (voice, messaging) prices have had a tendency to drop closer to zero over time. Moore’s Law plays a part. Open source also allows lower costs, and therefore more-competitive prices.

Optical fiber and microwave play a part in boosting capacity and lowering unit prices. Internet protocol also helps (lower network interface costs).

Competition has had a larger impact. Regulatory cost reductions have been key in some markets.

What Revenue Sources Drive the Next 50% Growth?

As a rule, I expect that any given communications service provider will have to replace about 50 percent of current revenue about every decade. Among the best examples (because we have the data) is the change in composition of U.S. telecom revenues between 1997 and 2007.

Back in 1997, nearly half of total revenue was earned from “toll” services (international and domestic long distance voice). Profits also were disproportionately driven by long distance services.

A decade later, toll service had dropped to 18 percent of total revenue, while mobile services had risen to about half of total revenues, up from about 16 percent of total.


A similar trend can be noted for European Union mobile revenues between 2010 and 2018, a period of less than a decade, but still a time when voice revenue dropped from about 80 billion euros to about 45 billion euros, while messaging dropped from about 19 billion euros to perhaps 10 billion euros and mobile internet access grew from about 18 billion euros to perhaps 42 billion euros.

The point is that revenue sources changed at least 50 percent over eight years. So the big question now, in developed markets, is what will replace mobile internet access, voice and messaging revenues over the next decade, when half of current revenues, it must be assumed, will disappear.
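The implied erosion rate is easy to work out. If half of current revenue must be replaced over a decade, legacy revenue is declining at roughly seven percent a year, compounded; the EU voice figures just cited imply about the same rate.

```python
# Implied compound annual decline when a given share of revenue
# survives over a given period.
def annual_decline(survival_share, years):
    return 1 - survival_share ** (1 / years)

print(f"{annual_decline(0.5, 10):.1%}")     # 6.7% a year: half gone in a decade
print(f"{annual_decline(45 / 80, 8):.1%}")  # 6.9%: EU mobile voice, 2010-2018
```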

That is why--for better or worse--internet of things, video entertainment, application, platform and other ultra-low-latency services are so important. If there are other candidates for revenue replacement, it is hard to say what they might be.

How Much Would You Pay for Ad-Free Facebook?

With word that Facebook is indeed looking at the possibility of offering subscription access, the question obviously arises: what price would be revenue neutral, assuming Facebook only needs to replace current ad revenue?


Facebook average revenue per user overall (globally) was a bit more than $6 a quarter in the fourth quarter of 2017. But U.S. revenue was in the $26.76 per user, per quarter range. So it will matter where the paid subscribers come from.


There is, for that reason, no simple answer. Replacing a U.S. free user might involve replacing $9 a month in lost ad revenue, but just $3 in Europe, 83 cents in Asia or 67 cents per month in Africa.

The “simple” answer is to charge different amounts in each market. And that is where matters get very tricky. How much additional cost is required to create a new marketing and fulfillment mechanism for paid subscriptions in any market? Does that cost scale linearly with living costs in each market, or is there some universal cost of computing infrastructure and marketing that applies globally?


That cost will hinge on volume, but assume, reasonably, that the incremental cost of selling a subscription amounts to 25 percent higher costs. Then the retail price of a U.S. Facebook subscription would be perhaps $11.25 a month.
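A minimal sketch of that break-even arithmetic, using the regional ad revenue figures cited above and the assumed 25 percent cost markup:

```python
# Approximate monthly ad revenue per user, from the Q4 2017
# figures cited above (the cost markup is an assumption).
monthly_ad_arpu = {
    "U.S.": 9.00,    # ~$26.76 per quarter, about $9 a month
    "Europe": 3.00,
    "Asia": 0.83,
    "Africa": 0.67,
}
COST_MARKUP = 0.25   # assumed incremental cost of selling subscriptions

for region, arpu in monthly_ad_arpu.items():
    print(f"{region}: ${arpu * (1 + COST_MARKUP):.2f} per month")
# U.S.: $11.25 -- the figure cited in the text. Europe: $3.75;
# Asia: $1.04; Africa: $0.84.
```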


Compare that rate with consumer willingness to pay for other apps. Even the most-desired mobile apps seem to be priced at just $3, total, all in, with no recurring costs. Netflix, by contrast, costs about $11 a month in the U.S. market; a Facebook subscription would land at a similar level if fulfillment and marketing costs were just $2 per subscriber, per month.


Facebook arguably could sell at a price point closer to Netflix’s than to a mobile app’s (a low, one-time cost). If marketing costs are higher than $2 per month, retail prices could reach far higher levels.


It will be a tricky exercise. Prices much above $10 to $11 a month might face significant user resistance. Ask yourself whether you would pay $30 a month for an ad-free version of Facebook, for example. Most of us likely would refuse to do so.


So is “more privacy” worth $11 a month to perhaps $15 a month on Facebook?

Thursday, May 3, 2018

Cost is Being Ripped Out of the Video Ecosystem, Leaving Less Opportunity for Cisco CPE

There are many reasons why Scientific-Atlanta and General Instrument do not exist anymore, why those firms initially were gobbled up by the likes of Cisco and Motorola, and why the set-top box business has changed.

The changes also explain why Cisco now has sold off those former Scientific-Atlanta assets.

Initially, the set-top box business was seen as a way equipment suppliers could get into the customer premises equipment (CPE) end of the video distribution business. But changes in the consumer video subscription business now mean there is less profit in the value chain.

That means taking cost out of the business. So open source has become a more-important trend, and with open source, less opportunity for set-top suppliers. Moves by the U.S. Federal Communications Commission to open up the set-top box to third party competition also play a role.

Comcast’s X1 initiative, which has Comcast supplying its own decoders, is part of that firm’s effort to create additional value that third-party approaches do not offer. In other words, the biggest U.S. cable set-top customer now has become a competitor.

Changes are even more advanced in the mobile business, where the smartphone itself effectively becomes an access device, with no need for third-party gear. In large part, that is possible because internet protocol has become the next-generation network of choice, globally.

There are implications. In the mobile arena, the phone itself displaces the fixed network modem, the fixed network set-top, the fixed network Wi-Fi router. In the mobile business, as in the fixed business, CPE costs are being ripped out of the value chain.
