Monday, October 12, 2015

Can Regulators Actually Detect Network Neutrality Infractions? Study Suggests Answer is "No"

Is it possible to detect and measure network neutrality infractions?

It is not a rhetorical question, according to Ofcom, the U.K. communications regulator. After commissioning a study on whether it is possible to create meaningful network neutrality rules, Ofcom has produced a report suggesting it is not--at present--possible to determine whether impermissible traffic shaping actually has occurred.

The obvious question: how can regulators trying to enforce network neutrality rules ascertain that the rules are respected, when measurement is not possible?

The study in question was conducted by Predictable Network Solutions for Ofcom.

The study found that none of the existing methods and tools actually succeed at detecting most forms of traffic management.

The reason for the ineffectiveness of these tools is the complexity of the Internet: when there’s a delay between two endpoints, the tools are unable to pinpoint the cause of the delay.
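The measurement problem is easy to illustrate. Below is a toy simulation (all numbers and distributions invented for illustration) in which one path is slow because of congestion and another because of deliberate shaping; an end-to-end probe sees roughly the same average delay on both and cannot attribute the cause.

```python
import random

def sample_delays(base_ms, jitter_ms, shaping_ms, n, seed):
    """Simulate one-way delays: fixed propagation delay, plus random
    queueing jitter, plus an optional fixed penalty added by a shaper."""
    rng = random.Random(seed)
    return [base_ms + rng.expovariate(1.0 / jitter_ms) + shaping_ms
            for _ in range(n)]

def mean(xs):
    return sum(xs) / len(xs)

# Path A: no shaping, but heavily congested (mean queueing jitter 15 ms).
congested = sample_delays(base_ms=20, jitter_ms=15, shaping_ms=0, n=5000, seed=1)

# Path B: lightly loaded (mean jitter 5 ms), but a shaper adds 10 ms.
shaped = sample_delays(base_ms=20, jitter_ms=5, shaping_ms=10, n=5000, seed=2)

# Both paths average roughly 35 ms end to end; mean delay alone cannot
# say which path is shaped and which is merely congested.
print(f"congested: {mean(congested):.1f} ms, shaped: {mean(shaped):.1f} ms")
```

The real tools the study examined use more sophisticated statistics than a simple mean, but the underlying attribution problem is the same.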

That poses a key problem for network neutrality: there is no way, at present, to determine if an infraction has happened.

Though it always is dangerous to infer too much from any single study on a subject as complicated as traffic management, the finding is potentially highly significant.

The findings cast a shadow on net neutrality regulations, which are an attempt to ban behavior that generally can’t be detected, let alone measured, says Richard Bennett, a communications consultant.

And even if the behavior could be measured, it’s not always anti-consumer.

“If I was on a business Skype call from my house while the kids were watching multiple video streams, I would like it to be differentially treated because I would like my voice call not to fail,” says Neil Davies, Predictable Network Solutions principal. “Differential management itself is not necessarily against the end user’s interest. It could be very much for it.”

Differentiated traffic “obviously has a value to the end user,” and could “potentially garner a price premium,” Davies says. That, alas, is the argument many supporters of QoS would make.

Users should be able to choose which traffic gets priority, Davies says. Some would argue ISPs likewise should be able to offer such capabilities to their customers.

Though there will be debate about the findings, the PNS study suggests network neutrality rules are unenforceable, because application-specific discrimination isn’t detectable by any known tool, whether it’s NetPolice, NANO, DiffProbe, Glasnost, ShaperProbe, or ChkDiff.

QoS for Consumer Mobile Apps--Despite Net Neutrality--Is Possible, Even Likely in the Future

Like it or not, some important consumer applications actually benefit from, and under conditions of congestion, might require, quality of service (packet prioritization) mechanisms. That is true whether “best effort only” is the mandated regulatory regime, or not.

And even where “best effort only” is the law, consumer services increasingly might take advantage of quality of service mechanisms, based on Wi-Fi capabilities.

The Wireless Broadband Alliance, which created the Passpoint standard, also has promulgated quality of service mechanisms. Wi-Fi Certified WMM added quality of service (QoS) functionality to Wi‑Fi networks.

With WMM, introduced in 2004, network administrators and residential users can assign higher priority to real-time traffic such as voice and video, while assigning other data traffic to either best-effort or background priority levels.

Introduced in 2012, WMM-Admission Control further improves the performance of Wi‑Fi networks for real-time data such as voice and video by preventing oversubscription of bandwidth.

Prioritization of traffic includes categories for voice, video, best effort data, and background data, managing access based on those categories.
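As a rough sketch of how those four categories work in practice, the mapping below follows the standard 802.1D user-priority-to-access-category table used by WMM; the function name itself is hypothetical.

```python
# Standard WMM mapping of 802.1D user priorities (0-7) to the four
# access categories: voice and video frames win channel access contention
# more often than best-effort or background traffic.
WMM_ACCESS_CATEGORY = {
    1: "AC_BK", 2: "AC_BK",   # background (bulk transfers, backups)
    0: "AC_BE", 3: "AC_BE",   # best effort (ordinary data)
    4: "AC_VI", 5: "AC_VI",   # video
    6: "AC_VO", 7: "AC_VO",   # voice
}

def access_category(user_priority: int) -> str:
    """Return the WMM access category for an 802.1D user priority."""
    return WMM_ACCESS_CATEGORY[user_priority]

print(access_category(6))  # voice traffic gets the highest priority
```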

The business implications are that as more large hotspot networks deploy Passpoint, consumer Internet connections will be supported by quality of service mechanisms provided by the Wi-Fi network, not the access network.

And that makes the QoS lawful.

And many large hotspot network operators presently believe carrier-grade hotspots will represent 57 percent of all their locations, with carrier-grade hotspots accounting for 90 percent of locations by 2020.

Among operators with hotspot networks in place, 57 percent have a timeline in place to deploy a next generation hotspot (Passpoint) standard network, a survey conducted for the Wireless Broadband Alliance finds.

Some 61.5 percent of respondents already have NGH or plan to deploy it over the coming year, while a further 29.5 percent will roll it out in 2017 or 2018.

The dominant business driver is the need to enhance or guarantee customer experience for revenue streams such as TV everywhere or enterprise services.

Improving customer experience to reduce churn and boost average revenue per account or user was seen as the primary advantage by 28 percent of respondents.

Seamless access from hotspot to hotspot or hotspot to mobile also was a key concern.

Respondents tend to believe they will be able to generate revenue from location‑based services (69 percent), roaming (68 percent) and Wi‑Fi analytics (66 percent).

Compared to the 2014 survey findings, there is far less emphasis on Wi‑Fi offload, and more on Wi‑Fi first mobility, Wi‑Fi calling and support for entertainment video.

Many consumer services--especially those for which consumers are paying a fee--benefit from QoS mechanisms. Despite network neutrality rules, support for such apps likely is coming. All the technology tools are there to do so on big Wi-Fi hotspot networks.

Project Loon for U.S. Internet Access Market?

Will Google’s Project Loon, providing Internet access services from balloons, be a meaningful access platform across the United States? The notion might seem fanciful.

But Google already is saying it expects deployment across the United States, not simply across the Southern Hemisphere.

Google has “almost perfected” its Loon balloon technology, with the first deal with operators set to be announced “hopefully very soon,” says Wael Fakharany, Google regional business lead.

“The operators control the distribution, marketing, OSS, BSS, CRM – the customer relationship is with the telcos. We are just the infrastructure provider,” he said. “There is a viable commercial business model and is based on skin-in-the-game, sharing costs and revenue with operators for completely untouched potential.”

Telefonica, Telstra and Vodafone are among the mobile operators to have tested the Project Loon platform so far.

Fakharany said Project Loon commercial operations are expected not only in the Southern Hemisphere, where its initial tests have taken place, but also in the Northern Hemisphere, including, notably, the United States.

“The idea right now, which we are very, very excited about, is that as we enter 2016 it’s all about scalability,” said Fakharany. “It’s all about marketing this as fast as possible not only in rural Africa, but rural India, parts of the US.”

That latter clause might be the most-significant portion of the statement. While Google Fiber continues to slowly add metro areas to its footprint, many would note that Google Fiber will take years and billions in new capital to build a business big enough to challenge the largest telcos and cable TV operators.

Project Loon will accelerate the number of households able to buy Internet access from Google, in less-dense areas beyond the Google Fiber footprint.

Hotspot Operators Expect Heavy Deployment of "Carrier Grade" Wi-Fi

With the caveat that executives sometimes are wrong about how much investment they will make, and where, hotspot network operators presently believe carrier-grade hotspots will represent 57 percent of all their locations, with carrier-grade hotspots accounting for 90 percent of locations by 2020.

Among operators with hotspot networks in place, 57 percent have a timeline in place to deploy a next generation hotspot (Passpoint) standard network.

Some 61.5 percent of respondents already have NGH or plan to deploy it over the coming year, while a further 29.5 percent will roll it out in 2017 or 2018.

The dominant business driver is the need to enhance or guarantee customer experience for revenue streams such as TV everywhere or enterprise services.

Improving customer experience to reduce churn and boost average revenue per account or user was seen as the primary advantage by 28 percent of respondents.

Seamless access from hotspot to hotspot or hotspot to mobile also was a key concern.

Respondents tend to believe they will be able to generate revenue from location‑based services (69 percent), roaming (68 percent) and Wi‑Fi analytics (66 percent).

Compared to the 2014 survey findings, there is far less emphasis on Wi‑Fi offload, and more on Wi‑Fi first mobility, Wi‑Fi calling and support for entertainment video.

Is Internet Access Business Sustainable?

Is the Internet access business sustainable--able to earn a return exceeding its capital investment--at the moment, or over the longer term? It’s a key question, and at least some analysts think the answer is “no.” 
Others disagree.

“We haven’t actually got a sustainable system at the moment,” Dr. Neil Davies, Predictable Network Solutions principal, says.

It’s a “crisis for the world’s telecom industry, in that they are not being able to construct the returns on investment they need for the capital,” he notes. That’s one view.

Others suggest the access business is stable and profitable, at least relatively recently.


Comcast, supposedly the greatest cable monopolist, averaged just a 4.5 percent return on invested capital for the five-year period from 2007 to 2012. The Time Warner Cable five-year average is -1.3 percent, some would note.


The problem, in essence, is that the switched telephone network, because of its design, actually enabled control of quality in ways that Internet Protocol architectures actually prevent.


For those of us who are more “business types,” the reasons for those conclusions are the domain of “bit doctors” who understand statistical multiplexing and its implications for large networks. But if I understand the argument correctly, the problem is that the IP architecture effectively removes the ability to control quality, on the part of any single domain within the broader network of networks.


Simply, no single domain owns or controls all the other elements that affect quality. So quality itself cannot actually be guaranteed. And if capital investment to protect or enable quality becomes nonlinear, with unknown results, then costs cannot be determined, for quality of service of any expected level.


It then becomes difficult to set retail prices at levels that recover, with certainty, the cost of investments. As the protocols are statistical, so profit and loss become statistical.


One salient implication might therefore be that no single domain actually can be certain that its own investments in network quality actually will have the desired results.


In the switched telephone network, there was a clear cost identifiable for every connection, because all the resources along that path were associated with, and reserved for, that data stream. That is not actually possible with an IP network.


In the past, bits flowed along a fixed circuit, with one key advantage. A service provider could derive the actual cost of doing so, and price accordingly.


Packet networks are based on virtual and statistical processes at every turn, essentially. That means there are contention and congestion mechanisms happening “all over the place.”
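The “statistical” point can be made concrete with the textbook M/M/1 queueing formula, under which the mean time a packet spends at a link is 1 / (μ − λ), blowing up nonlinearly as the link approaches saturation. The link parameters below are invented for illustration.

```python
def mm1_mean_delay(arrival_rate, service_rate):
    """Mean time in system for an M/M/1 queue: T = 1 / (mu - lambda).
    Rates are in packets per second; valid only below 100% utilization."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable at or above 100% utilization")
    return 1.0 / (service_rate - arrival_rate)

# A link serving 1,000 packets/sec: delay is sharply nonlinear in load,
# which is why cost-of-quality is so hard to pin down on packet networks.
for load in (0.5, 0.9, 0.99):
    t_ms = mm1_mean_delay(load * 1000, 1000) * 1000
    print(f"{load:.0%} utilization -> {t_ms:.0f} ms mean delay")
```

Going from 50 percent to 99 percent utilization multiplies mean delay fifty-fold, so the capital needed to hold a quality target depends statistically on offered load, not linearly on capacity purchased.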


In practice, this means no broadband provider can ever guarantee the quality and performance of the end-to-end transmission chain.

The corollary is that access providers might not be able to determine the cost of doing so, either, at least when levels of quality are part of the offer, and when there are performance guarantees, with financial penalties.

Saturday, October 10, 2015

5G Changes Everything

If you look at all the capabilities the coming fifth generation (5G) mobile network will have to support, and 5G relationship to the core network, you’d have to conclude that 5G is the “network of everything.” It will have to support low, medium and high use cases for bandwidth, latency, mobility, battery life and reliability.

The proposed 5G network will have to be location and context aware; flexible; efficient; secure; energy efficient; software optimized and therefore virtualized.

The new network will share spectrum and networks.

In other words, 5G will change the whole network, not simply air interfaces.


Mobile App Revenues to Double, Access Revenues to Fall 9%, Between 2013 and 2020

Global aggregated statistics, though useful, can hide significant regional or ecosystem differences, it always is fair to note.

Nice upward-sloping bar charts are compelling at a high level, but can obscure other trends. In the earlier parts of a lifecycle, such bar charts give the illusion of solidity. Only later, when growth tops out, and decline begins, do we see the expected full product lifecycle curve.

Also, many other trends often occur below the “headline” numbers. Consider the relative revenue shares in the mobile ecosystem, for example. As virtually all the participants are aware, shares of overall ecosystem revenue are shifting.

In 2013, mobile service providers claimed 59 percent of ecosystem revenue, app providers 10 percent of total revenue. By 2020, the GSMA estimates, service providers will earn half of ecosystem revenues, while app providers earn 20 percent of ecosystem revenue.

The global market still is growing, as many potential customers remain to be won. But, as tends to be the case, the incremental new users represent less revenue per account. And, inevitably, the market will saturate, rather sooner than many expect.

The key ecosystem change: app share of revenue doubles while access revenue declines nine percent.
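As a sanity check, the arithmetic below (using the GSMA share estimates cited above, with the 2013 ecosystem total normalized to 1.0) shows the two headline claims are mutually consistent: if access revenue falls nine percent while its share slips from 59 percent to 50 percent, implied app revenue slightly more than doubles.

```python
# Shares from the GSMA estimates cited above; totals normalized to 2013 = 1.0.
access_share_2013, access_share_2020 = 0.59, 0.50
app_share_2013, app_share_2020 = 0.10, 0.20
access_decline = 0.09  # access revenue falls 9% in absolute terms

# If access falls 9% while its share drops from 59% to 50%,
# the total ecosystem must grow by this factor:
total_growth = (1 - access_decline) * access_share_2013 / access_share_2020

# Implied multiple on absolute app revenue over the same period:
app_multiple = app_share_2020 * total_growth / app_share_2013
print(f"ecosystem grows {total_growth:.2f}x; app revenue grows {app_multiple:.2f}x")
```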



Friday, October 9, 2015

There is no Net Neutrality for Wi-Fi, Which Increasingly is Going Carrier Grade

Perhaps oddly, given all the attention to network neutrality as applied to Internet access providers (mobile and fixed), it is Wi-Fi where the balance between “best effort” and “assured access” or carrier grade paradigms will be most important.

At least according to Rethink Research, there will be more “carrier grade” Wi-Fi hotspots than “best effort” hotspots by 2017. That means a majority of hotspots will use prioritization mechanisms.

So with or without network neutrality rules, hotspot providers will be able to groom traffic and take other measures to provide consistent user experience.


O3b Says it is the Fastest-Growing Satellite Constellation Ever

O3b Networks says it has become the fastest-growing satellite constellation in history, selling more capacity in its inaugural year of operation than any other satellite operator with global operations.

O3b also says it has sold capacity equivalent to nearly 10 percent of the contracted capacity of the three largest Fixed Satellite Service (FSS) operators combined.

O3b says it now provides transport to 40 customers in 31 countries, supporting mobile operators expanding 3G and 4G/LTE services to rural populations; ISPs providing true broadband on isolated island chains; cruise lines bringing guests and crew high-speed broadband and mobile connections; and oil and gas companies and governments reducing costs and improving crew welfare.

The O3b Networks Medium Earth Orbit (MEO) satellite constellation greatly reduces latency and helps the constellation provide extremely high throughput.

Other contestants likewise are lining up to supply services using even lower orbits, despite some skepticism from providers of geosynchronous service, as one would expect, since MEO and LEO constellations pose new competition (and some would say better user experience).

Australia Orders 9.4% Lower Wholesale Access Prices for Copper Network

The Australian Competition and Consumer Commission has ordered a 9.4 percent decrease in wholesale access prices on the Telstra copper access network, beginning on Nov. 1, 2015, and continuing until June 30, 2019.

That decrease, despite fewer customers on the network, incorporates other changing input prices.

Downward pressures come largely from lower expenditures, falling cost of capital and impact of migration of users to the National Broadband Network.

These more than offset upward pressures from a shrinking fixed line market due to consumers moving away from fixed line services and to mobile services, ACCC Chairman Rod Sims said.

The ACCC noted, however, that fewer customers on the copper network will mean higher costs to serve the remaining customers.

That is a generic issue for many fixed network operators. Sowmyanarayan Sampath, Verizon Communications SVP of transformation, says Verizon’s copper-based revenue is declining eight percent to 10 percent a year.

At that rate, most of the revenue stream disappears within a decade. The business will have become unprofitable long before then. Some might argue the business already is unprofitable, if allocated overhead is included. Others would argue the business is slightly profitable, but getting worse as more customers churn off the network.
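A back-of-the-envelope check, assuming the midpoint nine percent annual decline compounds:

```python
# Midpoint of the eight-to-10 percent annual decline cited above,
# assumed (as a modeling choice) to compound year over year.
rate = 0.09
remaining = (1 - rate) ** 10
print(f"after 10 years: {remaining:.0%} of today's copper revenue remains")
```

On that math, roughly 60 percent of the revenue is gone within a decade, against a largely fixed cost base, which is the heart of the profitability problem.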

Verizon might have operations spanning 150 countries, but its revenue is highly concentrated in the U.S. mobile business. By 2016, the mobile business is likely to account for 85 percent of Verizon earnings (EBITDA).

In 2014, mobile contributed 70 percent of revenue, so mobile is generating an increasing share of earnings.

Comcast Continues to Grow Bandwidth at Moore's Law Rates

As crazy as it seems, U.S. Internet service provider Comcast, now the biggest supplier in that market, has doubled the capacity of its network every 18 months.


In other words, Comcast has increased capacity precisely at the rate one would expect if access bandwidth operated according to Moore’s Law.


U.S. telcos have generally not been able to increase speed at such rates. That, in large part, might account for Comcast’s leadership of the Internet access market.


That said, across the whole market, access bandwidth has grown at rates very close to what one would expect if Internet access were governed by Moore’s Law.
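For scale, a doubling every 18 months compounds quickly, as this quick sketch shows (the function name is illustrative):

```python
def capacity_multiple(months, doubling_period_months=18):
    """Capacity growth under a Moore's-Law-style doubling schedule."""
    return 2 ** (months / doubling_period_months)

# Doubling every 18 months: 4x in 3 years, roughly 1,000x in 15 years.
for years in (3, 9, 15):
    print(f"{years} years -> {capacity_multiple(years * 12):,.0f}x capacity")
```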



Structural Separation or Facilities-Based Competition?

Creating more fixed network services competition--leading to consumer benefits--is never easy. High fixed costs, heavy capital investment and a plethora of competing delivery platforms are ever-present realities.

Where policymakers believe there is little practical opportunity for rival facilities-based networks to emerge, structural separation remains one of the few potential avenues for change.

Structural separation--breaking an incumbent telco into a wholesale unit and a retail unit--has in the past been a way policymakers attempt to create competition and foster investment in the Internet access market at the same time.

That is the actual policy in Australia, New Zealand and Singapore.

The argument has been that multiple facilities-based approaches are inefficient and a waste of capital. That might often be the case, especially in regions where there is no facilities-based cable TV industry that already offers a facilities-based alternative to incumbent telcos.

That is a tough matter, politically speaking. Few incumbent service providers ever have been willing to submit to such separation policies.

SingTel was willing to do so in order to obtain freedom to grow internationally, in new lines of business. Telstra, after much struggle, agreed to surrender its monopoly in exchange for assets and freedom in the mobile arena.

The former Telecom New Zealand simply seems to have been motivated by a belief that it would do better if separate retail and wholesale companies (Chorus becoming the wholesale company) were created.

On the other hand, some would argue that more interesting amounts of competition and innovation come when competition takes the form of facilities-based rivalry. But that largely hinges on pre-existing and substantial investment by cable TV operators.

The reason is simple: when every retail provider uses the same network, the amount of innovation and pricing is limited. Compare that to a situation where two to three access networks exist, and the managers of each network look for all sorts of ways to create distinctiveness.

As many executives would say, an entity relying on wholesale access cannot control its costs.

The new wrinkle, at least in U.S. markets, is the emergence of third-party Internet service providers such as Google Fiber and other independent ISPs. That emergence, in turn, appears partly fueled by a change in local government thinking and policy.

For decades, the objective, in substantial part, had been the ability to wring revenues from access provider operations, in the form of franchise and other fees.

Today, the thinking seems more focused on creating infrastructure that supports economic development (which, in turn, leads to higher tax revenue).

The change means municipalities generally are willing to forsake franchise fee revenue to gain state-of-the-art Internet access facilities, and also are willing to substantially improve the speed and efficiency of other key rules such as issuance of permits.

Where policymakers believe it will be possible to encourage facilities-based investment and competition, what happens at the municipal level might well be far more important than what happens at the level of national policy.

National policy still matters, and can be decisive in situations where “only one network” is the expected outcome. How much fiber-to-home progress is possible might hang in the balance.



Thursday, October 8, 2015

AT&T Voice over Wi-Fi: Feature, Not a Service

A waiver granted by the U.S. Federal Communications Commission to AT&T now allows the firm to offer Wi-Fi calling. The business model is not yet fully visible, as the service has not yet been fully launched.

The context is that Apple’s iPhone, since iOS 8, has supported voice calling using Wi-Fi connections. But full value also requires that the service interwork seamlessly with carrier voice. Until now, that has not been possible.

The new feature illustrates the challenge of voice business models. At least so far, voice calling using Wi-Fi is mostly a capability, not a direct revenue driver for AT&T.

As with use of Wi-Fi for Internet access, the feature might be most useful, in the U.S. mobile market, for callers in areas where indoor mobile signal is weak and Wi-Fi signals are strong.

Though some tier-one service providers have launched their own voice over IP services, they arguably have gained little traction, compared to the third party app and service providers.

In a broad sense, the notion that access providers could compete successfully with over the top providers in voice has proven incorrect.

VoIP has mostly shrunk the retail revenue opportunity for voice, and shifted demand to third parties, and away from carriers.

Eventually, the feature might have greater indirect revenue implications, however.

As one or more cable TV operators enter the mobile market, they are expected to lean on their own hotspot networks and Wi-Fi for network infrastructure.

In that case, voice over Wi-Fi will help the overall business model, offloading demand from mobile to the fixed network.

That could have direct financial implications. To the extent that cable TV companies rely on wholesale access provided by other mobile operators, offload to Wi-Fi will mean lower payments to the wholesale services provider.
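A minimal sketch, with all numbers invented, of why offload reduces wholesale payments for a cable operator using MVNO-style per-gigabyte wholesale access:

```python
def monthly_wholesale_cost(usage_gb, rate_per_gb, offload_fraction):
    """Wholesale payment for the traffic that stays on the host cellular
    network; traffic offloaded to Wi-Fi incurs no per-GB wholesale fee."""
    return usage_gb * (1 - offload_fraction) * rate_per_gb

# Hypothetical: a 6 GB/month subscriber at a $5/GB wholesale rate.
for offload in (0.0, 0.5, 0.8):
    cost = monthly_wholesale_cost(6.0, 5.0, offload)
    print(f"{offload:.0%} offloaded to Wi-Fi -> ${cost:.2f}/sub/month wholesale")
```

The rates and usage figures are placeholders; the point is only that wholesale cost scales linearly with the traffic that cannot be offloaded.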

Bonding of Mobile and Wi-Fi Spectrum is a Land Grab

Wi-Fi interests worry about interference issues as new protocols are developed for bonding mobile and Wi-Fi resources. “Playing nice” always is a legitimate matter for users of shared spectrum.

As always, there are commercial advantages and interests at stake as well. “It’s a land grab,” said Roger Entner, Recon Analytics principal. “Are the cable guys blocking or are the mobile operators responding to future spectrum shortages?”

Maybe some element of each is at work.  

Cable operators see their huge networks of public hotspots as an asset to be monetized. Dense networks of hotspots can support a mobile business plan. Those same networks can drive wholesale capacity businesses as well.

As mobile and Wi-Fi bonding becomes possible, and assuming interference and access rules are respected, the wholesale opportunity arguably diminishes.

Mobile operators have their own incentives. Wi-Fi offload already is an essential part of network operations. Wi-Fi bonding would make the process more seamless, and might even create some new revenue opportunities.

Among the available strategies for dealing with emerging new competition is to get regulatory bodies involved. Keeping innovations from being deployed, if nothing else, allows more time for some contestants to get their commercial offers ready for mass deployment.

“Wait for standards” is one argument sometimes made, as part of that strategy. But competitors often want to seize business advantage now, rather than waiting.

Every technology standard has commercial implications. Every change in network capabilities has potential business model impact, both within and between industry segments or value chain participants.

There are, and will be, many legitimate technology issues to be addressed as various new forms of spectrum sharing are developed and deployed. There will be lots of sparring about the “right framework” and “right policy.”

But contestants are not unmindful of their commercial interests. It is a land grab.

Wednesday, October 7, 2015

Google Wants Faster Mobile Web

Google’s core business model is enhanced when everybody uses the Internet, and when the Internet can be experienced “faster.” Most of what Google has done in the access area relates directly to those two interests.

Google now is launching a new open source content initiative intended to speed up performance of the mobile web.

Accelerated Mobile Pages aims to dramatically improve the performance of the mobile web, allowing rich content to load instantaneously.

The other objective is allowing the same code to work across multiple platforms and devices so that content can appear everywhere in an instant, no matter what type of phone, tablet or mobile device you’re using.

The project relies on AMP HTML, a new open framework built entirely out of existing web technologies, which allows websites to build light-weight web pages.

Over time, Google anticipates that other Google products, such as Google News, also will integrate AMP HTML pages. Nearly 30 publishers globally have agreed to support the framework.
