Sunday, October 4, 2015

Will Cloud Computing Prices Keep Dropping to Zero, or Close to It?

Amazon Web Services has cut prices about 50 times. So will it keep doing so? Most would say “yes.” Will other suppliers such as Google and Microsoft follow suit? Most would also say “yes.”

In most industries, “ruinous” levels of competition often are said to represent a “race to zero” in terms of retail pricing, with negative implications for firm or industry sustainability.

But AWS has chosen such a strategy deliberately, making continual price cuts the foundation of its business model.

“How can that be?” is a reasonable question for any outside observer. How can a market leader in cloud computing price its core services at nearly zero, in either consumer markets (free computing, free storage, free apps) or business markets (cloud computing, storage, apps or platform)?

After all, the costs of big data centers--and of the software, hardware, real estate and energy required to run them--are substantial.

The business advantages of huge scale are part of the answer. Firms such as Amazon and Google count on the fact that only a few providers, with enormous scale, can afford to compete in such a market.

So gaining scale, then lowering prices, feeds a virtuous cycle where additional customers, buying more services, allow the supplier to gain even more scale and drop prices even more, attracting yet more customers.
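The arithmetic behind that virtuous cycle can be sketched with illustrative numbers (these are hypothetical, not actual AWS figures): a large fixed cost spread over ever more customers drives average cost per customer down, leaving room for further price cuts.

```python
# Illustrative scale economics -- hypothetical numbers, not actual AWS figures.
FIXED_COST = 1_000_000_000   # annual data center / platform cost, shared by all customers
VARIABLE_COST = 20           # marginal cost to serve one additional customer

def average_cost(customers: int) -> float:
    """Average cost per customer at a given scale."""
    return FIXED_COST / customers + VARIABLE_COST

for n in (1_000_000, 10_000_000, 100_000_000):
    print(f"{n:>11,} customers -> ${average_cost(n):,.2f} per customer")
```

Each tenfold increase in customers collapses the fixed-cost share of average cost, which is why only a few providers with enormous scale can compete at such prices.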

With sufficient scale, “scope” also becomes relevant: AWS and other leading cloud computing suppliers can sell additional services and features to the customers they already have aggregated.

So even if a “race to zero” has generally been considered dangerous and unsustainable in big existing markets, it is the foundation of strategy in many new digital markets--and in some emerging physical markets--as well.

It is hard to compete with a competitor that gives away what you sell. That, in fact, is precisely the logic often driving business strategy in the Internet realm.

That strategy is at work with voice over IP, instant messaging, online streaming video and audio, Internet access, search and most “print” content. Many would agree, but note that these all are non-tangible, digital products. That notion is correct.

In most “physical product” areas, the Internet has led to reduced prices, or less friction, but surely not to “near zero” levels.

That, of course, is not really the issue. The issue is a competitor’s ability to destroy enough gross revenue--and, strategically, enough profit margin--to break the market leader’s business model.

This is a rational strategy for some new contestants because they actually have other revenue models that are enhanced when an existing supporting market is “destroyed.”

In a real sense, Apple gains business advantage when content prices are very low. That helps it sell devices enabling content consumption. Facebook and Google gain when each additional Internet user is added, since they make money on advertising.

Prices for physical good distribution do not have to reach “near zero,” only “near zero profit,” for whole markets to be disrupted.

An attacker able to create a positive and sustainable business case in a market that is perhaps smaller (in terms of overall revenue) still wins if the attacker emerges as a market leader in the reshaped market.

One example: many observers would say that the chief profit stream for Costco, the membership warehouse retailer, is membership fees, not merchandise sales. Likewise, the business model for most movie theaters is concessions, not admission tickets.

That is one sense of the term “zero billion dollar market.”

The strategy is inherent in business models used by many leading application, device or service providers.

The difference is that the trend is extending beyond businesses that are inherently “digital.” Some see shared vehicle businesses as disrupting the automobile market on a permanent basis. Shared accommodations businesses have potential to disrupt the commercial lodging business.

Without a doubt, we will see spreading efforts to replicate such sharing models in most parts of the economy where ownership is the dominant retail model.

Suppliers of cloud computing, especially infrastructure as a service (IaaS) but even the biggest segment--software as a service--also must directly confront pricing strategies that deliberately aim to reach near-zero levels.

There are several analogies you might apply: Moore’s Law, marginal cost pricing or experience curves, for example. Some might say that same logic is embedded in much of the economics of the Internet as well.

The notion is that, over time, performance vastly improves while retail price either remains the same or also shrinks, not just on a per-bit or per-instance basis, but absolutely, adjusted for inflation or not.
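The experience-curve analogy can be made concrete. In the classic formulation, each doubling of cumulative volume leaves unit cost at a fixed fraction (the "progress ratio") of its prior level; the parameters below are illustrative, not drawn from any vendor's actual data.

```python
import math

# Classic experience curve: unit_cost = first_unit_cost * volume ** b,
# where b = log2(progress_ratio). An 80% curve means each doubling of
# cumulative volume leaves unit cost at 80% of its prior level.
# All numbers here are illustrative, not actual industry data.

def unit_cost(volume: float, first_unit_cost: float = 100.0,
              progress_ratio: float = 0.80) -> float:
    b = math.log2(progress_ratio)   # negative exponent: cost falls with volume
    return first_unit_cost * volume ** b

for v in (1, 2, 4, 8, 16):
    print(f"cumulative volume {v:>2}: unit cost {unit_cost(v):6.2f}")
```

Run at an 80 percent progress ratio, sixteen units of cumulative volume cut unit cost by nearly 60 percent, which is the kind of trajectory chip and bandwidth suppliers have long had to build businesses around.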

Suppliers of network bandwidth and computer chips long have had to create or recraft businesses built on such assumptions.

The obvious business implications are stark. Many firms, in a growing range of industries, face competitors who literally base their business models on marginal cost pricing, near zero pricing or actual “free” prices.

Those competitors can do so because widespread use of the “near zero” or “zero” priced function allows them to make money another way. For Amazon, that other way is retailing all manner of products; for Google and Facebook, it is advertising; for Apple, it is device sales.

In all those cases, the direct revenue contribution for one input--while important--is less important than ubiquity or huge scale as it relates to the primary revenue model.

“Zero” levels of pricing are a fundamental reality in a growing range of industries. How successfully the legacy providers can adapt always is the issue. In many cases, the answer is “we won’t be able to do so.”

Some would say that is an example of creative destruction. But it is destruction, nevertheless.

Saturday, October 3, 2015

India DoT Recommends Regulation of VoIP, OTT Messaging

Over-the-top (OTT) services such as WhatsApp, Skype and Viber need to be regulated in some way, according to the Telecom Regulatory Authority of India (TRAI). That might mean services such as WhatsApp, Viber or Skype are required to charge users retail prices much closer to, or equal to, those of mobile service providers.

The India Department of Telecommunications (DoT) has recommended domestic voice over internet protocol (VoIP) calls offered by WhatsApp, Skype and Viber be regulated in line with voice calls offered by telecom operators.

Voice calls offered by mobile operators are estimated to be 12.5 times more expensive (at retail) than those through OTT services. In the case of text messages, the difference is 16 times, a DoT report argues.

For a one-minute phone call, a customer is charged about 50 paise, while a one-minute call made through the Internet costs four paise, according to TRAI. The disparity in text messaging costs is even wider, where a single mobile network text message might cost 16 times what an OTT message costs the end user.
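Those figures can be checked directly from the per-minute prices TRAI cites; for texting, the report gives only the multiple, not absolute prices, so only the ratio appears below.

```python
# Per-minute voice prices cited by TRAI, in paise.
MOBILE_VOICE = 50   # one-minute call on a mobile operator's network
OTT_VOICE = 4       # one-minute VoIP call made over the Internet

voice_ratio = MOBILE_VOICE / OTT_VOICE
print(f"Voice: mobile is {voice_ratio:.1f}x the OTT price")  # 12.5x

# For text messages, the DoT report cites only the multiple.
SMS_RATIO = 16
print(f"Text: a mobile SMS costs {SMS_RATIO}x an OTT message")
```

The 50-paise versus 4-paise voice prices reproduce the 12.5x multiple the DoT report argues for.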


"Bring Your Own Access" as a Major Paradigm

It seems inevitable that greater reliance on a mix of spectrum, networks and licensing regimes will be a hallmark of all next-generation networks from this point forward. You might call this an example of "bring your own access" in the communications-related businesses.

That is a new approach, historically. Telcos, TV and radio broadcasters, cable TV companies and others have essentially supplied the access function as part of their core services.

Wi-Fi was the first major break in the pattern. In that case, the end user supplies his or her own access, in the sense of paying for the access connection and the local small cell transmitter function. 

For the first time, the app is fully separated from the access, in terms of who pays for the access connection and features.

That is important for suppliers as well as end users. Aside from the dramatic impact on capital and operating cost, the shift to "bring your own access" also changes traditional thinking about how access is supplied, even in cases where the service provider still supplies the access.

We are moving from communications using “only my owned resources” to a heterogeneous world where the access function routinely uses a mix of resources (both “my assets” and “any other available assets”).

That has implications for bandwidth, capital cost, operating cost, network design and protocols as well as business models.

Consider the implications for business models. Up to this point, almost all big commercial wireless industries have been built on the use of licensed spectrum that also has been highly regulated in terms of what protocols can be used in each frequency band and often even what applications are lawful in such bands.

Wi-Fi has been the first shift away from that pattern. Having shown the role license-exempt spectrum can play in supporting many industries, including those using licensed spectrum, new work is being done to increase the amount of license-exempt spectrum available for communications uses.

There are business model implications. To the extent spectrum remains a relatively scarce commodity, licensing creates moats around some business models.

Freeing up more license-exempt spectrum creates new ways for businesses to be built on communications spectrum at vastly lower cost.

The best example so far has been Wi-Fi “anything” compared to use of the same apps on networks using licensed spectrum.

There are capital cost advantages as well, since the cost of deploying license-exempt apps and devices does not have to incorporate the cost of spectrum licenses. And such devices and apps can build upon the fixed network infrastructure already in place to support new untethered and mobile apps and services.

To note only the most obvious implications, greater availability of license-exempt spectrum will allow many more types of service and app providers to build businesses based on local small cell transmitters that are affordable, funded with incremental capital, and self-installed and activated by end users as needed, with the actual transmission infrastructure supplied by the end user, not the service provider.

That is one concrete example of a “bring your own access” approach to building a business, a service or application.

If a service can be built on the assumption that the customers supply their own Internet access as well as the untethered transmitting network, and that such functions are largely supplied by the end users themselves, clear operating cost advantages also are possible.

The app or service provider does not have to build, operate and maintain the access network or the transmitting cell sites.

Think of the analogy of “over the top,” where an app can be used independently from the method of access.

That has business model implications for all sorts of firms and industries.

Former Incumbent Telcos Still the Highest-Cost Provider in Most Markets

“In order to be successful, we must change our cost structure so we can fuel our growth and operate more efficiently,” Sprint spokesman Dave Tovar said of the most recently announced cuts at Sprint, said to entail reductions of about $2.5 billion, or roughly seven percent of Sprint’s annual operating costs.

Some problems don't change: former incumbent telecom service providers likely still are the highest-cost providers in most communication markets. So long as competition remains robust, that is going to represent an on-going challenge.


Similar cost-cutting initiatives have occurred rather routinely in the telecommunications industry since deregulation and competition have ramped up, beginning in limited ways in the 1980s, and reaching full-blown proportions in the wake of the Telecommunications Act of 1996.


Perhaps an equally important rationale is the change in the industry business model since the advent of the Internet, which has turned a growing number of paid-for services into free or very low cost features.


Those pressures are not going to abate. The cost structure of the telecommunications industry was set during the monopoly era, when everything was “cost plus.” In fact, the more money telcos invested in their networks, the more money they made.


The new competitive market, as everybody in the business knows, enjoys no such luxury.


As any executive in the cable TV business can attest, cable’s cost structure is lower than that of the telcos the cable industry generally competes against.


It is not yet clear whether Google Fiber has lower operating costs than a typical cable company, but it is possible. Without a doubt, every independent wireless Internet service provider, every gigabit fixed network provider, every satellite Internet access provider, every communications specialist or niche services provider and every Wi-Fi hotspot network provider likewise has lower costs.

In other words, the former incumbent telco normally is the high cost provider in any market in which it competes. That means the cuts and streamlining will continue.

That particularly is true given the slim profits firms now are wringing from their fixed network operations. For Verizon, the fixed network supplied about 33 percent of revenue, but 21 percent of earnings, in 2013.

Deducting capital investment in the fixed network, Verizon earned just 11 percent from its fixed network, before interest, taxes and amortization (assuming depreciation basically represents the bulk of additional capital investment).

AT&T generates about the same percentage of revenue from its fixed network, about 33 percent. AT&T’s fixed network represents about 21 percent of earnings, and about 11 percent of earnings if capital investment is subtracted.


Friday, October 2, 2015

A Spectrum-Based New Strategy for Sprint?

A time-tested strategy in markets dominated by a few large providers is to attack a niche in the market. The issue is whether Sprint might be able to leverage its present spectrum assets to compete in a more-specialized way, as hard as that is to do when Sprint has been slugging it out as one of four national providers of service to all consumers and businesses.

The notion is that instead of slugging it out as a network that works “everywhere,” Sprint becomes a network that works where most people live. That is a gamble in a market where “coverage” and “speed” have been key marketing platforms.

Sprint would have to be willing to reposition as a specialist, however. It would have to break with the notion that Sprint has the best coverage map, and instead argue that it is the best choice for consumers living in urban areas, who want to upgrade their phones as soon as possible, especially if they are die-hard Apple brand believers, and who are willing to lease instead of own phones.

There is some new language required to capture that positioning in a simple, elegant way. But that approach would build on spectrum assets Sprint already has, much as T-Mobile US now seems to be talking about the value of a denser network.

Observers often note that 600 MHz, 700 MHz and 800 MHz spectrum is “more valuable” because it propagates further, outdoors and indoors. Sprint and T-Mobile have more spectrum in the 2 GHz range, however.

That means less propagation distance, but more bandwidth (it’s the physics). And in urban areas, coverage arguably is not nearly the problem that bandwidth happens to be. That’s an example of turning a weakness into a strength.

The strategy is not without risks. Niches have proven difficult to sustain.

Up to this point, smaller providers have specialized as prepaid providers or have served language-based and other affinity group markets. A few have tried to create brands around music, youth, sports or children’s content. Some have focused on “older” users.

BlackBerry was successful for quite some time with a business email niche. Nextel gained traction as a business-focused brand with key strength in construction and several other markets.

Few of those niches have proven enduring over time.

The issue is what T-Mobile US and Sprint can do to close the gap with AT&T and Verizon. So far, T-Mobile US, and more recently, Sprint, have turned up the heat by increasing value and competing on price. Both seem to have reversed long-term patterns of subscriber losses.

Where they go from here is the big question. The danger is that, at some point, the “compete on value and lower price” squeezes profit margins so much that the attack is not sustainable.

That might still work. At some point, it is conceivable that the market could reach a stable market share structure where none of the four leading providers has an absolute need to disrupt the market any longer.

A more likely outcome is that neither Sprint nor T-Mobile US remain independent companies long enough to test that thesis, but that new owners might do so.

Indeed, some might well argue that, ultimately, Sprint and T-Mobile US are destined to become parts of other firms that need a mobile product as part of their service bundles.

In the interim, both firms have to operate today as going concerns. A change in strategy, built around spectrum attributes, might be conceivable.

3 Cable TV Companies Might Own 75% to 80% of the U.S. "Broadband" Market

The entire premise behind regulating the few large former monopoly telecom providers differently from everybody else in the same markets is that they continue to wield near-monopoly power in the market.

Most observers would agree that, in most local markets, a cable TV provider and a telco dominate triple-play markets, even if there is significant share held by third parties in some instances.

The policy issue is whether it makes sense to continue to operate as though one of those contestants (large telcos) is so fundamentally powerful, compared to cable TV, that such suppliers must be regulated more severely than all the other contestants.

To be sure, there has been some leveling of regulatory framework. Net neutrality rules apply equally to cable TV, telco, mobile and other ISPs.

But some observers would challenge the notion that the framework still makes sense. Some providers have market power, to be sure. But that is true in every established market.

And fixed network markets arguably are past prime, and clearly shrinking, in any case. Voice and linear video are shrinking markets, while high speed access, clearly the strategic service, now is dominated by cable TV providers.

Assuming pending mergers are approved, just three companies--all cable TV operators--would have what some would call “an effective broadband monopoly” across the vast majority of the United States, using the FCC’s definition of broadband as a minimum of 25 Mbps.

Comcast, Charter and Altice would control 75 percent to 80 percent or more of 25 Mbps-plus subscribers.  

While that should slowly drop as AT&T and Google Fiber add new markets, it will be a slow process.

Third-party ISPs also will enter the high speed access market, but total market share for such providers is expected to be quite low, on a national basis.

Where it comes to determining what happens to the market, it will continue to be the case that what the top half dozen companies do affects most potential customers.

The mobile segment, which in the past might have been considered separately from the fixed network business, is going to change as well. Much as the long distance market used to be separate from the local access business, the mobile segment will cease, in any serious way, to be a separate market.

Most of the long distance business essentially was absorbed by AT&T and Verizon Communications. A smaller portion was retained by Sprint, but that business has dwindled to near-insignificance.

Eventually, the “independent” portion of the mobile business, represented primarily by T-Mobile US and Sprint, seems destined to be acquired by other providers, likely cable TV interests. There remains some possibility that app or device suppliers eventually could enter the market, but it seems unlikely they would do so by acquiring either T-Mobile US or Sprint outright.

Thursday, October 1, 2015

AT&T Testing Wireless Local Loop

As part of its argument for approval of its acquisition of DirecTV, AT&T said it would be able to serve as many as 13 million rural locations with new fixed wireless local loop connections, and AT&T seems to be working on the economics now.

The service was said to be capable of downstream speeds between 15 Mbps and 25 Mbps.

AT&T says it is currently testing fixed wireless local loop technology in select areas of the country, including sites in Alabama, Georgia, Kansas and Virginia, and is seeing speeds of around 15 to 25 Mbps, according to Fierce Wireless.

If the program reaches close to such numbers, AT&T will become the biggest fixed wireless provider in the United States.

Also, should the deployment be as large as expected, some fixed wireless suppliers will see a surge of business that dwarfs anything they have ever seen, especially in the U.S. market.

The question some of us have is whether TV white spaces could be a potential platform, even if AT&T has said it is looking at using some of its fourth generation Long Term Evolution network and spectrum.

Nor is it outside the realm of possibility that other new platforms and middle mile partners could emerge.

Both Google, with its Project Loon balloon-based access, as well as Facebook’s unmanned aerial vehicle program, envision operating as wholesale backhaul platforms, with mobile service providers as the last mile access providers.

None of those middle mile backhaul platforms obviates the need for an access link. But such middle mile platforms could help with the overall business model, which AT&T has said is untested.

Will AI Fuel a Huge "Services into Products" Shift?

As content streaming has disrupted music, is disrupting video and television, so might AI potentially disrupt industry leaders ranging from ...