Monday, February 8, 2016

Better to be "Lucky" Than "Good" When Stimulating Competition

“Better to be lucky than good (or smart),” an adage suggests. Some might say that also applies to the main thrust of communications policy in the U.S. and, perhaps, other markets: encouraging competition.

One might argue that the Telecommunications Act of 1996, the most sweeping revision of the U.S. communications framework since 1934, failed, and yet competition has grown. The Internet is the reason.

Where the Act envisioned unbundling and voice competition, we now are in an era where Internet-based apps and services, not voice, are at the forefront.

In other words, “competition in voice services” essentially was the problem to be solved, just as the rise of the Internet was about to make all that strategically unimportant.


U.S. policy further has emphasized facilities-based competition, initially on the assumption that unbundling and wholesale access would create the basis for sustainable investment in new platforms.

The apparent model was the experience with long distance service, where competition initially took the form of unbundling, then led to facilities investment.

Perhaps nobody could have foreseen the rise of the Internet and the collapse of the independent long distance market, which once was the cash cow for the whole industry, but now exists largely as a “feature,” not a product.

So far, what has happened is that cable TV networks were upgraded and repurposed to function as full-service telecom networks, and that much demand has shifted to mobile networks.

That is the "luck" part of policy. To the extent competitive policies have worked, it has been because of unforseen developments, not the direct result of direct policy intervention.

“Local competition in the U.S., it turns out, was not the result of new entrants constructing new plant, but from the repurposing of the embedded cable television plant and the migration of many households to the exclusive use of mobile wireless services,” argue George Ford, Phoenix Center for Advanced Legal & Economic Public Policy Studies chief economist, and Larry Spiwak, Phoenix Center president.

Unbundling has arguably not led to much new facilities investment outside cable TV and mobility, with one possibly important new exception: Google Fiber, which has spurred a rethinking of what is sustainable for many smaller Internet service providers.

Still, it is noteworthy that local access no longer is considered a “natural monopoly” in the U.S. market.

But the larger issue is what to do next.

How Does Network Density Affect Old Debate About Use of Wi-Fi as a Rival Access Platform?

Time and Moore’s Law can change the strategic context within which decisions about business strategy are made.

Many note how the cost to start an app company has declined by an order of magnitude or more since the turn of the century, a direct reflection of Moore’s Law improvements as well as the commercialization of cloud computing platforms. Since 1992, for example, the cost of computing or storage has declined by five orders of magnitude.

That means the information technology cost to create a startup is easily an order of magnitude or even two orders of magnitude less than it was in 2000. Granted, IT costs are only part of the cost of creating a company and building its products.
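Those orders-of-magnitude claims are easy to sanity check. A back-of-envelope sketch in Python, assuming the canonical Moore’s Law doubling every 18 months:

```python
import math

def doublings(years: float, period_years: float = 1.5) -> float:
    """Number of doublings over a span, at one doubling every 18 months."""
    return years / period_years

# 1992 to 2016: 24 years of Moore's Law-style improvement.
n = doublings(2016 - 1992)
multiplier = 2 ** n
print(f"{n:.0f} doublings -> {multiplier:,.0f}x, "
      f"about {math.log10(multiplier):.1f} orders of magnitude")
# prints: 16 doublings -> 65,536x, about 4.8 orders of magnitude
```

That is close to the five orders of magnitude cited for computing and storage since 1992, and the same arithmetic over 2000 to 2016 yields a bit more than three orders of magnitude, consistent with startup IT costs falling by at least one to two orders.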

The point is that, for a variety of reasons, app development has been democratized by cloud computing, Moore’s Law and widespread consumer use of cloud-based apps.

In a similar way, thinking about how to use mobile networks and Wi-Fi networks to support mobile services has changed, and will change more as new spectrum is released to support Internet usage, and as access methods broaden.

Broadly speaking, we have been debating for decades whether Wi-Fi can be an effective substitute for mobile network access, and under what conditions.

But Moore’s Law, new spectrum and new access methods continue to change the context. Put simply, all the trends point to denser wireless access networks, whether mobile or untethered (Wi-Fi).

Denser networks mean the value of each platform also changes. And both mobile and Wi-Fi networks are becoming denser.

So even if denser Wi-Fi means a better ability to satisfy a wider range of connectivity needs, mobile networks also are becoming more dense, potentially obviating the value of using Wi-Fi, from an end user perspective.

Also, as it becomes easier for mobile network bandwidth to be bonded with Wi-Fi, the value of both networks will tend to rise, in tandem. In other words, better Wi-Fi means better mobile access. But it might not be the case that better mobile networks mean one-for-one increases in the value of Wi-Fi.

Higher frequencies for mobile access (millimeter wave, for example) mean mobile networks will be denser.

That very density, plus the bandwidth gains (higher-frequency bands carry more bits because they offer more Hertz of channel bandwidth), means Wi-Fi is helpful, but less useful as a complete alternative access platform.
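The intuition can be made concrete with Shannon’s capacity formula, C = B log2(1 + SNR): for a given signal-to-noise ratio, capacity scales linearly with channel bandwidth in Hertz. A minimal sketch, with illustrative channel widths and an assumed 20 dB SNR (the specific numbers are assumptions, not measurements):

```python
import math

def shannon_capacity_mbps(bandwidth_mhz: float, snr_db: float) -> float:
    """Shannon capacity C = B * log2(1 + SNR), returned in Mbps."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_mhz * math.log2(1 + snr_linear)

# Illustrative channel widths; actual allocations vary by band and operator.
for label, mhz in [("20 MHz LTE-style carrier", 20),
                   ("100 MHz mid-band block", 100),
                   ("800 MHz millimeter wave block", 800)]:
    print(f"{label}: ~{shannon_capacity_mbps(mhz, snr_db=20):,.0f} Mbps")
```

The point is simply that wider channels, which are far easier to find at higher frequencies, translate directly into more bits per second.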

It remains unclear how network densification affects the value of Wi-Fi as a rival access platform, compared to mobile. Arguably, mobile operators will be able to monetize Wi-Fi more easily than Wi-Fi providers can monetize mobile.



Bandwidth costs have declined by two orders of magnitude since 1999, as well. Some bandwidth costs have dropped even more. Comcast, for example, has been doubling bandwidth every 18 months.

That is as fast as Moore's Law might suggest is possible, a pace most might not believe achievable in the capital-intensive and labor-intensive access network business.

As crazy as it seems, Comcast, now the biggest supplier in the U.S. Internet access market, has increased capacity precisely at the rate one would expect if access bandwidth operated according to Moore’s Law.
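For a sense of what that cadence compounds to, a quick sketch (the 1999 start date and strict 18-month period are simplifying assumptions):

```python
# Capacity growth implied by one doubling every 18 months, 1999 to 2016.
n_doublings = (2016 - 1999) / 1.5
print(f"{n_doublings:.1f} doublings -> roughly {2 ** n_doublings:,.0f}x capacity")
# prints: 11.3 doublings -> roughly 2,580x capacity
```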

And that, also, means consumers of apps, as well as suppliers of apps, can reach more people than ever, more affordably than ever.

TRAI Rules "No Free Basics" in India

In a move that comes as no surprise, the Telecom Regulatory Authority of India has ruled that programs such as Facebook’s “Free Basics” are covered by rules related to non-discriminatory tariffs.

For that reason, TRAI has banned such programs in India. Simply put, Internet service providers cannot offer free access to packages of applications curated by the ISP.

In doing so, TRAI has framed this aspect of the network neutrality decision as a matter of common carrier tariffs, rather than as a matter of content freedom.

“No service provider shall enter into any arrangement, agreement or contract, by whatever name called, with any person, natural or legal, that has the effect of discriminatory tariffs for data services being offered or charged to the consumer on the basis of content,” the TRAI decision says.

As has been the case elsewhere, though, a distinction is made between managed services and “Internet” apps. “This regulation shall not apply to tariffs for data services over closed electronic communications networks,” the decision states.

As has been the case in other countries, Indian regulators frame their concerns about “no charge” access to some applications as a tariff fairness issue.

In essence, the argument is that programs such as Free Basics create favored packages of content assets. In other settings, as TRAI essentially notes, that would not be an issue. TV broadcasters, radio broadcasters and others have editorial discretion where it comes to the content they wish to broadcast or publish.

In this case, TRAI essentially deems the “level playing field” more important than other values, including the obvious benefit of allowing more people of low income to use mobile Internet apps.

Lady Gaga Owns National Anthem



Totally, totally nailed it. 

Sunday, February 7, 2016

50

Thanks for the ride, Peyton

Friday, February 5, 2016

"3 or 4" for Mobile, "2 or 3" for Fixed Networks Now Are the Key Numbers

In most mobile markets, the key numbers are "three or four," representing the number of sustainable operators. In some fixed markets, the relevant numbers might now be "two or three," likewise representing the number of viable facilities-based competitors.

Technologists, service provider executives and analysts have endlessly debated the “cost” of deploying high-capacity networks of several types for a few decades. Over those decades, the cost parameters have changed.


Fiber to home and digital subscriber line hardware have gotten more affordable. Cable TV DOCSIS platforms have vastly improved. Now there are other potential options.


Internet access by balloon, unmanned aerial vehicles, fifth generation mobile networks, fixed wireless (TV white spaces, for example), Wi-Fi hotspots, municipal networks and new constellations of low earth orbit satellites are some of the reasons to argue that Internet access business models are liable to be redefined over the next decade.
On the other hand, other key business model inputs have not changed very much. The network cost (outside plant) still appears to hover around $1,500 per location, including construction and hardware.


Active elements still cost about $280 per location.


Activated drops and customer premises equipment costs remain substantial. CPE alone might represent costs of $455 or so per active customer location. Installing an actual drop for a customer can cost $300.


Taken all together, the cost of an FTTH install in a medium-sized U.S. city, for example, might be about $1,780. The cost to connect a paying customer adds another $755.


So it still can take $2,535 in network-related costs to serve a customer. Marketing costs have to be added on top, as well. New attackers likely can figure out ways to spend less than the $850 to $2,000 tier-one competitors typically spend (service discounts plus out-of-pocket costs) to acquire a new account.






But business models are even more sensitive to take rates, in turn driven in large part by the number of locations served by a competitor that cannot be dislodged.


In other words, in markets where two other competitors--both accomplished--continue to hold about 60 to 66 percent customer share, per-customer costs are substantially higher than per-location costs.


If per-location fixed cost is $2,535, and take rates are 33 percent, then network cost per paying customer is $7,605. After drop costs and CPE, per-customer cost (without marketing) is $8,360.


If per-month revenue is $100, it takes more than seven years to recover network costs. Few, if any, private firms would undertake such an endeavor, given those obstacles.


The business model works better in markets where just one serious competitor operates (a cable operator, for example), allowing the attacker to contemplate take rates more along the lines of 50 percent. That reduces per-customer network cost to $5,070, with total connection cost of $5,825.


That variance exists because, in some markets, competitors might be very vulnerable to a challenge. The incumbents might not be able to afford to reinvest in their own networks.


In other cases they might prefer taking a big share loss to reinvesting at the level required to blunt the attack (some companies focus on major urban markets and are willing to lose rural or low-density markets).


Higher market share (50 percent rather than 33 percent) reduces the break-even point on network investment by about two years. That still is a tough hurdle, though, with more than five years required just to recover installed network costs.


Boosting average revenue per account therefore is a key strategy for quickening the payback. If per-account revenue is $150, instead of $100, break even times shrink. At 33 percent take rates, and $150 monthly revenue per account, break even on network costs comes in less than five years.


At 50-percent take rates, and $150 monthly revenue per account, break even on the network investment can come in a bit more than three years. That is a workable business model for many firms.
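A minimal sketch of that arithmetic in Python, reproducing the scenarios above; it scales the full $2,535 per-location figure by take rate, then adds drop and CPE costs, and ignores marketing, churn, operating expense and the time value of money:

```python
PER_LOCATION_COST = 2535   # per-location fixed cost, as in the figures above
DROP_AND_CPE = 755         # drop install plus CPE for each paying customer

def payback_months(take_rate: float, monthly_revenue: float) -> float:
    """Months to recover per-customer cost, following the arithmetic above."""
    per_customer = PER_LOCATION_COST / take_rate + DROP_AND_CPE
    return per_customer / monthly_revenue

for take, arpu in [(1 / 3, 100), (0.50, 100), (1 / 3, 150), (0.50, 150)]:
    months = payback_months(take, arpu)
    print(f"take rate {take:.0%}, ${arpu}/month -> "
          f"{months:.0f} months (~{months / 12:.1f} years)")
```

Run as-is, that prints roughly seven years at 33 percent take and $100 per month, shrinking to about three years at 50 percent take and $150 per month, matching the scenarios described above.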


One new approach, in that regard, is to strand fewer assets, building only in some neighborhoods, for example. That also lowers overall capital investment, since a smaller network is built.


A 2016 DISCUS Project white paper on high speed access network business models reviewed models where more than one physical network operated, asking the key question: how many such networks are sustainable?


The conclusion, as you would expect, is that few such networks are sustainable in any given region or market: “two or three.” The important number is “three.” Under some conditions, three facilities-based competitors might be sustainable, where today the number is “two.”


Where the contestants are private firms, sustainable operations are possible in dense urban areas, for example, or where one existing incumbent cannot, or will not, upgrade to match the attack.

The point is that, as hard as the business model might yet be, the business case for new gigabit networks--even in markets where telcos and cable TV already operate--might well be improving to the point where, in some markets, three facilities-based competitors can sustain themselves.

Thursday, February 4, 2016

Gigabit Access Business Models are Ripe for Innovation

Business models for gigabit Internet access networks are in a very fertile phase right now.

Old assumptions about what is possible are being redefined as new networks are launched or considered, first by Google Fiber, and now by many independent ISPs and a growing number of private efforts supported by local government assets.

Before the process plays out substantially, we also are likely to discover that incumbents and attackers alike have redefined business models in substantial ways. Surviving incumbents will have taken out more costs than they believed possible.

Attackers will have established that low operating costs make the difference in the business model. And, along the way, new revenue streams are likely to emerge as crucial inputs.

It might finally be possible to monetize mobility services using Wi-Fi hotspots supported by such networks, for example.

One of the crucial elements of a 2013 DISCUS Project white paper on high speed access network business models was its evaluation of possible business models in which the key variables were where and what to bundle and provide wholesale.

Another more-recent paper reviewed models where more than one physical network operated, asking the key question: how many such networks are sustainable?

The conclusion, as you would expect, is that few such networks are sustainable in any given region or market: “two or three.”

Where the contestants are private firms, sustainable operations are possible in dense urban areas.

The study reached no conclusions about public access networks built and owned by local government units.

In recent days, there have been some new innovations. Among the more interesting developments is the Google Fiber “fiberhoods” approach, where no construction occurs until a minimum number of customers sign up.
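The mechanics are easy to model: no neighborhood gets built until pre-registrations clear a threshold. A hypothetical sketch (the neighborhood data and the 40 percent threshold are illustrative assumptions, not Google Fiber’s actual criteria):

```python
SIGNUP_THRESHOLD = 0.40  # illustrative; actual thresholds vary

# name: (homes passed, pre-registrations) -- hypothetical data
fiberhoods = {
    "Hilldale": (1200, 560),
    "Riverview": (900, 310),
    "Old Town": (1500, 630),
}

for name, (homes, signups) in fiberhoods.items():
    rate = signups / homes
    decision = "build" if rate >= SIGNUP_THRESHOLD else "defer"
    print(f"{name}: {rate:.0%} pre-registered -> {decision}")
```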

The fact that regulators allow such approaches is important: service providers can invest only where there is enough demand to justify new gigabit connections.

The new cable TV DOCSIS 3.1 platform also is important, as it allows delivery of gigabit services across the entire customer base, without a retrofit of the physical plant.  

For new housing developments, bundling network cost with home purchase prices in greenfield developments might be possible in some cases.

But most potential fiber-to-home connections are going to be retrofits, where that approach is not possible.  

So it is noteworthy that a growing number of smaller Internet service providers and local governments are actively exploring, or launching, gigabit access networks, generally privately operated and funded, but with governments contributing some assets.

Just how important such efforts ultimately will be cannot yet be predicted. Still, it is almost shocking that networks as extensive as those built by Google Fiber, or as small as the smaller-town networks being built by small ISPs, are suggesting the business case for gigabit networks can work, even in markets where telcos and cable TV already operate.

Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...