Thursday, April 16, 2015

CenturyLink Adds Another Gigabit Community

It can be argued that for a firm that provides fixed telecom network services, owns no mobile assets, has a normal mix of consumer and business accounts, and does not intend to sell its assets and exit the business, high speed access holds the key to survival, and to any hope of prosperity.

CenturyLink might now be among the best examples of that strategic imperative. The company has been aggressively launching symmetrical gigabit services across its service territory, most recently adding La Crosse, Wisc., to the list of communities able to buy the service.

CenturyLink lit its first gigabit network in Omaha in 2013, and in 2014 announced that neighborhoods in 16 cities would get gigabit networks.

Residential customers can purchase 1 Gbps service for a monthly recurring charge of $79.95, with a 12-month term commitment, when bundled with additional qualifying CenturyLink services. Stand-alone prices are likely to fall in the range of $130 to $150 a month.

What If No Business Case for Gigabit Metro Networks Exists?

One has to wonder why a concentrated 20-mile downtown fiber network apparently was not financially interesting to a single U.S. fiber specialist, Internet service provider or competitive local exchange carrier, leading San Leandro in 2011 to create and build its own fiber backbone serving the downtown area.

The likely answer is that commercial suppliers could not create a business case. And that might be the story for such networks: they might wind up being built by municipalities or anchor tenants precisely because no single commercial service supplier can earn a financial return.

A federal grant of about $2 million, for example, was used to build 7.5 miles of the initial 18-mile core network, which itself was financed by OSIsoft CEO Patrick Kennedy as the anchor tenant.

San Leandro Dark Fiber LLC, the firm created by Kennedy to build the network, invested $3 million to pull fiber strands through existing conduit.

San Leandro is sandwiched between Oakland to the north and Hayward to the south. The suburb of perhaps 85,000 people features a number of industrial (food processing) operations, several corporate anchor firms (JanSport, The North Face, Ghirardelli, OSIsoft, Otis Spunkmeyer), a Coca-Cola plant, a Maxwell House coffee roasting plant and five shopping centers.

But the story here might just be that no commercial provider could create a viable business plan for the whole network. In San Leandro, an anchor tenant was motivated to create gigabit connectivity because such connectivity is essential for its own business.

Net Neutrality Founded on Bad Science

Analyst Martin Geddes has been arguing for “science-based” telecom policy. U.S. network neutrality, he argues, fails in that regard.

Discussing the Federal Communications Commission’s new rules, Geddes spares no words. “Regrettably, they have proceeded to issue rules without having their science in order first,” Geddes says. “As a result they have set themselves up to fail. My fear is that other countries may attempt to copy their approach, at a high cost to the global public.”

Consider the apparently easy issue of “no blocking of lawful packets.” Most people agree lawful packets should not be blocked (some governments disagree). But is it “blocking” when a specific Internet service provider does not interconnect with some other Internet domain?

“How will the FCC differentiate between ‘blocking’ and ‘places our ISP service doesn't happen to route to’?”

Geddes says there are issues of business practice as well. “Why can't an ISP offer ‘100 percent guaranteed Netflix-free!’ service at a lower price to users who don't want to carry the cost of their neighbors' online video habit?”

“A basic freedom of (non-)association is being lost here,” Geddes notes. “To this foreigner, ‘no blocking’ is a competition issue for the FTC and antitrust law, not the FCC (and the FTC agrees, by the way).”

Similar problems exist with “no throttling” policies.

“Broadband is a stochastic system whose properties are entirely emergent (and potentially non-deterministic under load),” Geddes says.

How will a regulator distinguish between "throttling" and mere "unfortunate statistical coincidences leading to bad performance"?
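To make the question concrete, here is a minimal sketch, in Python with invented sample numbers, of what a naive “throttling detector” might look like: compare repeated throughput measurements toward a suspect destination against a control destination over the same access line, and run a significance test. Note what it cannot do.

    # A sketch only, not a regulatory method; the throughput samples
    # (Mbps) below are invented for illustration.
    from scipy.stats import mannwhitneyu

    control = [94, 91, 88, 95, 90, 87, 93, 89, 92, 86]  # hypothetical
    suspect = [71, 88, 65, 90, 60, 85, 58, 83, 62, 80]  # hypothetical

    # Test whether the "suspect" distribution is systematically lower.
    stat, p = mannwhitneyu(suspect, control, alternative="less")
    print(f"p-value: {p:.4f}")

    # Even a significant result only shows the distributions differ.
    # Without an agreed reference performance level, it cannot say
    # whether the cause is throttling, congestion or routing.

That ambiguity is precisely Geddes’ point.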

And fairness is an issue. “Why should someone who merely demands more resources be given them?” Geddes rhetorically asks. “Where's the fairness in that!”

What's the metric used to determine if "throttling" has taken place? User behavior matters.

Optimizing networks for “speed” produces better results for large file downloads than for interactive apps, for example.
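A bit of back-of-the-envelope arithmetic, using assumed round numbers rather than measurements, shows why raw bandwidth helps bulk transfers far more than latency-bound tasks:

    # Bulk download: time scales inversely with access speed.
    file_size_mb = 1000  # a 1 GB download
    for mbps in (100, 1000):
        print(f"{mbps} Mbps: {file_size_mb * 8 / mbps:.0f} s")
    # 100 Mbps: 80 s; 1000 Mbps: 8 s.

    # Interactive task: 20 sequential 50 ms round trips are
    # latency-bound and take 1 s at any access speed.
    print(f"interactive: {20 * 0.05:.1f} s regardless of bandwidth")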

What are the proposed metrics for performance and methods for measuring traffic management? What's the reference performance level for the service? Without these, "no throttling" is meaningless and unenforceable, Geddes notes.

The real issue is whether the service performance is good enough to deliver the quality of experience outcome(s) that the user seeks. And that’s a problem. By definition, “best effort” is just that: best effort.

The other problem is that such an approach necessarily prevents the creation of classes of service that users benefit from, and might well desire to buy.

Traffic scheduling (packet “prioritization”) is a good thing, even if it violates the rules, in other words.
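Such scheduling is hardly exotic: IP already carries a Differentiated Services field for exactly this purpose. As a minimal, Linux-oriented sketch (the address and port are placeholders), an application can mark latency-sensitive packets with the Expedited Forwarding code point; whether the network honors the marking is a policy and business question:

    # Mark UDP packets with DSCP "Expedited Forwarding" (46), the
    # class typically used for latency-sensitive traffic such as voice.
    # Whether routers honor the marking depends on network policy.
    import socket

    DSCP_EF = 46
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # The DS field occupies the upper six bits of the old TOS byte.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP_EF << 2)
    sock.sendto(b"low-latency payload", ("192.0.2.10", 5004))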

Net neutrality, in his view, “undermines rational market pricing for quality.”

We already have “paid priority,” he notes. “All CDNs offer de facto priority by placing content closer to the user, so it can out-compete the control loops of content further away. Paid peering is perfectly normal.”

“If you tried to make spectrum policy rules that broke the laws of physics, you'd be ignored by informed people,” Geddes says. “Broadband is similarly constrained by ‘laws of mathematics.’ Why don't we try making rules that fit within those, for a change?”

“We need a new regulatory approach, grounded in the science of network performance, that directly constrains market power,” Geddes argues.

Missing Really Big Trends Is Par for the Course

Nothing better illustrates our inability to foresee the future than the unexpected, non-linear growth and exponential impact of mobile services and the Internet.

Who would have predicted, in 1990, that voice communications would be available to nearly everyone on the planet before 2010, and used by most people by 2015?

Likewise, with all the talk about the digital divide, we sometimes miss how fast Internet access is spreading, and how fast access speeds are growing. In Nigeria, 59 percent of all mobile users already use the mobile Internet.

Mobile data traffic in Sub-Saharan Africa is predicted to grow around 20 times between the end of 2013 and the end of 2019. Globally, mobile data traffic will grow 10-fold during the same period, according to Ericsson estimates.  
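Those multiples imply steep but comprehensible compound annual growth rates over the six-year span; a quick calculation makes the point:

    # Implied compound annual growth for the Ericsson projections:
    # 20x in Sub-Saharan Africa, 10x globally, over six years.
    for label, multiple in (("Sub-Saharan Africa", 20), ("Global", 10)):
        print(f"{label}: {multiple ** (1 / 6) - 1:.0%} per year")
    # Sub-Saharan Africa: ~65% per year; Global: ~47% per year.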

In fact, by about 2019, 75 percent of users in the region will have mobile Internet devices that are video capable. That is a radical departure in a region where most devices today are feature phones.

In 2002, roughly 10 percent of people in Tanzania, Uganda, Kenya and Ghana owned a mobile phone. Now, adoption ranges from 65 percent to 83 percent. In just a little over a decade, mobile adoption grew roughly sixfold to eightfold.

Today, mobile phone ownership in South Africa is as common as in the United States (89 percent adoption).

At the same time, fixed network penetration in the seven countries surveyed (Ghana, Kenya, Nigeria, Senegal, South Africa, Tanzania and Uganda) is close to zero.

About two percent of respondents surveyed across these nations say they have a working landline telephone in their house.

It is fair to say that as recently as 1980, few would have predicted most of the world’s people would be connected to communications networks by about 2015, solving a major development and social problem that had seemed nearly intractable.

Back then, the only feasible solution was deemed to be the traditional fixed telephone network. Mobility changed all that.

That precedent is one reason why many believe the problem of supplying Internet access to the world’s people likewise will be solved, within a few decades if not within a single decade, by the use of non-tethered, mobile or other spectrum-based networks.

Netflix Looks Past HBO

Netflix once argued there was a race going on between HBO and Netflix, each racing to become more like the other. Perhaps the race is over. Maybe the race no longer matters, as both are favored brands with distinct content.

Maybe an inflection point is coming, and the real challenge for Netflix is not “catching HBO” but displacing linear TV.

That shift in emphasis, confirming that the competition is not HBO, arguably represents an intensification of the “Internet TV” trend.

There are times when a firm succeeds with one goal, and needs to switch to bigger goals. Netflix seems to be at such a place.

India Inches Towards Banning "Zero Rating"

Though action has not yet been taken, it is starting to look as if Indian Internet regulators will eventually put an end to the “zero rated” apps that have proven to be effective ways of introducing non-Internet users to the benefits of using the Internet.

So here we have an issue of “good things” in conflict. One is the notion that innovation is promoted when every app has an “equal chance” of being discovered and used (even if, in practice, that rarely is true, or possible).

The other good thing is the ability to provide people access to useful apps without those people having to pay.

And it appears that, eventually, one or the other of those good things will not be lawful.

Should such a framework remain in place for a long time, though, more new apps are going to move away from “Internet” delivery. Some apps work better when quality of service measures are available. And some apps might have life-threatening consequences if low latency or adequate bandwidth cannot be guaranteed.

Such apps will move away from the public Internet and into “walled gardens.” That might be useful, in some instances. Medical apps, driverless cars and other automated processes arguably would benefit from higher performance guarantees than can lawfully be provided using the consumer Internet.

Wednesday, April 15, 2015

Google is Almost at a Watershed Moment

If history and precedent matter, then Google has nearly reached a watershed moment.

The evidence comes from what happened to Microsoft when it faced its own antitrust troubles in the late 1990s and early 2000s.

Microsoft ruled computing in the mid-1990s. By 2010, it was an also-ran. That happened even though Microsoft escaped being broken up and avoided crippling fines.

Google now faces the same potential problems (forced divestiture and huge fines). Some day, when the case has run its course, Google is likely to be fined a relatively small amount, and will have avoided any danger of a forced breakup of the company.

But Google’s momentum will have been halted. And if precedent serves as a guide, Google never again will “lead” computing markets.

Will We Break the Traditional Computing Era Leadership Paradigm?

What are the odds that the next Google, Meta or Amazon--big new leaders of new markets--will be one of the leaders of the present market,  b...