Tuesday, October 13, 2015

"Interconnected Era" Arrives, says Equinix

Direct connections between private Internet Protocol domains lead to six nines levels of availability, with 15 percent fewer network incidents and outages, according to studies produced for Equinix.
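
Availability figures like "six nines" translate directly into allowed downtime per year. A minimal sketch of that arithmetic (the function and constant names are ours, for illustration):

```python
# Downtime arithmetic for availability targets. The 99.9999 percent
# ("six nines") figure comes from the text; names are illustrative.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def annual_downtime_seconds(availability: float) -> float:
    """Expected downtime per year for a given availability fraction."""
    return (1.0 - availability) * SECONDS_PER_YEAR

six_nines = annual_downtime_seconds(0.999999)   # about 31.6 seconds a year
three_nines = annual_downtime_seconds(0.999)    # about 8.8 hours a year
```

The gap between three nines and six nines, roughly eight hours versus half a minute of downtime a year, is why the claim is notable.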

Increased direct interconnection of domains contributed to a 42 percent average reduction in latency and 40 percent reduction in bandwidth costs, Equinix says.

“We are now in the interconnected era,” Equinix says. “This period is dominated by the need for a level of interconnection that delivers instant collaboration between and within dense industry ecosystems consisting of partners, employees, customers and data sources.”

The number of interconnected enterprises is set to more than double from 38 percent to 84 percent globally by 2017, Equinix argues.

There are a couple of reasons for that growth. Some 75 percent of enterprise employees reside in locations other than the corporate headquarters, while 82 percent of enterprises report a multi-cloud strategy. Hence the need to create connections with very low latency, enabled by domain-to-domain direct connections.

Dell Merger with EMC Points to Cloud Future

The merger of Dell and EMC, at $67 billion, is the biggest-ever merger in the information technology industry.

Whatever else the deal might mean, it shows the evolution of computing from the personal computer era to the era of cloud and mobile computing.

In the past, most computing happened locally, whether on mainframes, minicomputers or PCs. These days, most computing happens remotely, in data centers. So one might logically argue that the Dell merger with EMC is an effort to transition to the next computing era.

It is virtually impossible to compare the volume of local computing to remote--or cloud--computing. It is easier to quantify the amount of computing-related traffic.

According to Cisco, perhaps 85 percent of applications traffic volume now is directly produced by cloud computing.


Likewise, the installed base of computing devices has shifted dramatically to smartphones, devices that rely on cloud or remote computing for most of their value.

By 2017, 87 percent of the global smart connected device market will be tablets and smartphones, with PCs (both desktop and laptop) being 13 percent of the market.



Cloud-based apps now matter because nearly all apps--enterprise or consumer--now are consumed that way.

In 2015, 83 percent of all mobile apps used by U.S. consumers are cloud apps, according to Cisco. By 2019 cloud apps will represent 90 percent of all mobile apps.

The trend often will appear less advanced in the enterprise computing segment, as cloud computing, by volume of operations or enterprise spending, rarely is more than a small percentage of total spending.

By 2018, cloud data centers should represent as much as 76 percent of all data center workloads, according to Cisco.


Overall, most U.S. businesses use cloud computing to some extent, with most enterprises using a hybrid approach (both internal and external cloud operations).

Of the 37 percent of U.S. small businesses that already buy cloud-based services, most buy software as a service. Marketing, e-commerce, sales, collaboration and productivity are among the top six functions being cloud sourced, although back office often is the function most likely to be moved to the cloud.

The core market might be about three million firms, largely in the services segment, followed by retail and healthcare. There are perhaps nine million additional small or home office based businesses (owner operated or with a maximum of four employees).

By definition, software as a service is consumed without any need to buy cloud infrastructure, and the largest segment of the cloud services market is SaaS, representing 81 percent of all cloud spending.

In other words, a direct purchase of cloud capability represents about 19 percent of cloud purchases by business users.


Monday, October 12, 2015

Arguably Different Regulatory Treatment of Video Providers Now an Issue in Tempe, Scottsdale, Ariz.

One characteristic of newly-competitive markets is the entry of suppliers not historically in the deregulated industry. That also tends to mean that legacy regulations, applied differently to companies in different industries, become new issues.

The unfairness is most obvious when companies selling the same services, to the same customers, are treated differently.

It is not immediately clear which clauses of the video services license Scottsdale, Ariz. wants to grant Google Fiber are objectionable to Cox Communications, which has sued to block the award, arguing that it violates state and federal law.

Presumably, the objection is that the new license imposes fewer requirements than Cox and other licensees must comply with.

Google Fiber already had gotten a video franchise from Tempe, Ariz. that includes waivers of some fees or city codes, but Cox Communications has challenged that award as well.

The lawsuit against the city of Tempe claims the city has enacted regulations specifically to benefit Google Fiber high-speed internet.

“As set forth below, the City has violated Federal law in a manner that directly harms Cox by establishing a discriminatory regulatory framework,” the lawsuit says. “The City’s regulatory framework imposes substantial statutory and regulatory obligations on providers of video services that the City deems to be cable operators (such as Cox).”

“The City exempts from such rules and obligations providers of video services that the City deems not to be cable operators (such as Google Fiber),” the lawsuit argues. “Legally, however, Google Fiber’s proposed video offering is indistinguishable from Cox’s cable service offering.”

Disparate treatment of fundamentally similar firms from historically different industries that now actually offer the same services has been an issue for decades.

How much that is the case in Scottsdale and Tempe, Ariz. is not clear. Irrespective of the particulars of those cases, though, like treatment of like firms competing in the same markets remains the underlying issue.

At a basic level, that is a factor in disputes over the lawfulness of municipal broadband, regulation of voice services, video franchise rules and other obligations levied on “carriers of last resort.”

The rules increasingly are nonsensical, to the extent such rules are applied differently to companies selling the same products, to the same customers.  

Can Regulators Actually Detect Network Neutrality Infractions? Study Suggests Answer is "No"

Is it possible to detect and measure network neutrality infractions?

It is not a rhetorical question, according to Ofcom, the U.K. communications regulator. After commissioning a study on whether it is possible to create meaningful network neutrality rules, Ofcom has produced a report suggesting it is not--at present--actually possible to determine whether impermissible traffic shaping actually has occurred.

The obvious implication: how can engineers trying to enforce network neutrality rules actually ascertain that the rules are respected, when measurement is not possible?

Those are questions raised by a study conducted by Predictable Network Solutions for Ofcom, the U.K. communications regulator.

The study found that none of the existing methods and tools actually succeeds at detecting most forms of traffic management.

The reason for the ineffectiveness of these tools is the complexity of the Internet: when there is a delay between two endpoints, the tools are unable to pinpoint its cause.

That poses a key problem for network neutrality: there is no way, at present, to determine if an infraction has happened.

Though it always is dangerous to infer too much from any single study, on a subject as complicated as traffic management, that is potentially highly significant.

The findings cast a shadow on net neutrality regulations, which are an attempt to ban behavior that generally can’t be detected, let alone measured, says Richard Bennett, a communications consultant.

And even if the behavior could be measured, it’s not always anti-consumer.

“If I was on a business Skype call from my house while the kids were watching multiple video streams, I would like it to be differentially treated because I would like my voice call not to fail,” says Neil Davies, Predictable Network Solutions principal. “Differential management itself is not necessarily against the end user’s interest. It could be very much for it.”

Differentiated traffic “obviously has a value to the end user,” and could “potentially garner a price premium,” Davies says. That, alas, is the argument many supporters of QoS would make.

Users should be able to choose which traffic gets priority, Davies says. Some would argue ISPs likewise should be able to offer such capabilities to their customers.

Though there will be debate about the findings, the PNS study suggests network neutrality rules are unenforceable, because application-specific discrimination isn’t detectable by any known tool, whether it’s NetPolice, NANO, DiffProbe, Glasnost, ShaperProbe, or ChkDiff.

QoS for Consumer Mobile Apps--Despite Net Neutrality--Is Possible, Even Likely in the Future

Like it or not, some important consumer applications actually benefit from, and under conditions of congestion, might require, quality of service (packet prioritization) mechanisms. That is true whether “best effort only” is the mandated regulatory regime, or not.

And even where “best effort only” is the law, consumer services increasingly might take advantage of quality of service mechanisms, based on Wi-Fi capabilities.

The Wireless Broadband Alliance, which created the Passpoint standard, also has promulgated quality of service mechanisms. Wi-Fi Certified WMM added quality of service (QoS) functionality to Wi‑Fi networks.

With WMM, introduced in 2004, network administrators and residential users can assign higher priority to real-time traffic such as voice and video, while assigning other data traffic to either best-effort or background priority levels.

Introduced in 2012, WMM-Admission Control further improves the performance of Wi‑Fi networks for real-time data such as voice and video by preventing oversubscription of bandwidth.

Prioritization of traffic includes categories for voice, video, best effort data, and background data, managing access based on those categories.
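
The four WMM access categories map onto the eight IEEE 802.1D user priorities. A minimal sketch of that mapping (the function and table names are ours; the priority-to-category assignments follow the Wi-Fi Alliance WMM mapping):

```python
# WMM access categories and the 802.1D user priorities (0-7) assigned
# to each, highest-priority category first. Names are illustrative.
WMM_ACCESS_CATEGORIES = {
    "AC_VO": (6, 7),  # voice: highest priority
    "AC_VI": (4, 5),  # video
    "AC_BE": (0, 3),  # best-effort data (the default)
    "AC_BK": (1, 2),  # background data: lowest priority
}

def access_category(user_priority: int) -> str:
    """Return the WMM access category for an 802.1D user priority."""
    for category, priorities in WMM_ACCESS_CATEGORIES.items():
        if user_priority in priorities:
            return category
    raise ValueError(f"invalid 802.1D user priority: {user_priority}")
```

A voice packet tagged with priority 6 or 7 thus contends for the channel ahead of video, which in turn gets ahead of ordinary and background data.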

The business implications are that as more large hotspot networks deploy Passpoint, consumer Internet connections will be supported by quality of service mechanisms provided by the Wi-Fi network, not the access network.

And that makes the QoS lawful.

And many large hotspot network operators presently believe carrier-grade hotspots represent 57 percent of all their locations, and will account for 90 percent of locations by 2020.

Among operators with hotspot networks in place, 57 percent have a timeline in place to deploy a next generation hotspot (Passpoint) standard network, a survey conducted for the Wireless Broadband Alliance finds.

Some 61.5 percent of respondents already have NGH or plan to deploy it over the coming year, while a further 29.5 percent will roll it out in 2017 or 2018.

The dominant business driver is the need to enhance or guarantee customer experience for revenue streams such as TV everywhere or enterprise services.

Improving customer experience to reduce churn and boost average revenue per account or user was seen as the primary advantage by 28 percent of respondents.

Seamless access from hotspot to hotspot or hotspot to mobile also was a key concern.

Respondents tend to believe they will be able to generate revenue from location‑based services (69 percent), roaming (68 percent) and Wi‑Fi analytics (66 percent).

Compared to the 2014 survey findings, there is far less emphasis on Wi‑Fi offload, and more on Wi‑Fi first mobility, Wi‑Fi calling and support for entertainment video.

Many consumer services--especially those for which consumers are paying a fee--benefit from QoS mechanisms. Despite network neutrality rules, support for such apps likely is coming. All the technology tools are there to do so on big Wi-Fi hotspot networks.

Project Loon for U.S. Internet Access Market?

Will Google’s Project Loon, providing Internet access services from balloons, be a meaningful access platform across the United States? That might seem as fanciful as the notion of using balloon-based access.

But Google already is saying it expects deployment across the United States, not simply across the Southern Hemisphere.

Google has “almost perfected” its Loon balloon technology, with the first deal with operators set to be announced “hopefully very soon,” said Wael Fakharany, Google regional business lead.

“The operators control the distribution, marketing, OSS, BSS, CRM – the customer relationship is with the telcos. We are just the infrastructure provider,” he said. “There is a viable commercial business model and is based on skin-in-the-game, sharing costs and revenue with operators for completely untouched potential.”

Telefonica, Telstra and Vodafone are among the mobile operators to have tested the Project Loon platform so far.

Fakharany said Project Loon commercial operations are expected not only in the Southern Hemisphere, where its initial tests have taken place, but also in the Northern Hemisphere, including, notably, the United States.

“The idea right now, which we are very, very excited about, is that as we enter 2016 it’s all about scalability,” said Fakharany. “It’s all about marketing this as fast as possible not only in rural Africa, but rural India, parts of the US.”

That latter clause might be the most-significant portion of the statement. While Google Fiber continues to slowly add metro areas to its footprint, many would note that Google Fiber will take years and billions in new capital to build a business big enough to challenge the largest telcos and cable TV operators.

Project Loon will accelerate the number of households able to buy Internet access from Google, in less-dense areas beyond the Google Fiber footprint.

Hotspot Operators Expect Heavy Deployment of "Carrier Grade" Wi-Fi

With the caveat that executives sometimes are wrong about how much they will invest, and where, hotspot network operators presently believe carrier-grade hotspots represent 57 percent of all their locations, and will account for 90 percent of locations by 2020.

Among operators with hotspot networks in place, 57 percent have a timeline in place to deploy a next generation hotspot (Passpoint) standard network.

Some 61.5 percent of respondents already have NGH or plan to deploy it over the coming year, while a further 29.5 percent will roll it out in 2017 or 2018.

The dominant business driver is the need to enhance or guarantee customer experience for revenue streams such as TV everywhere or enterprise services.

Improving customer experience to reduce churn and boost average revenue per account or user was seen as the primary advantage by 28 percent of respondents.

Seamless access from hotspot to hotspot or hotspot to mobile also was a key concern.

Respondents tend to believe they will be able to generate revenue from location‑based services (69 percent), roaming (68 percent) and Wi‑Fi analytics (66 percent).

Compared to the 2014 survey findings, there is far less emphasis on Wi‑Fi offload, and more on Wi‑Fi first mobility, Wi‑Fi calling and support for entertainment video.

Is Internet Access Business Sustainable?

Is the Internet access business sustainable--able to earn a return exceeding its capital investment--at the moment, or over the longer term? It’s a key question, and at least some analysts think the answer is “no.” 
Others disagree.

“We haven’t actually got a sustainable system at the moment,” Dr. Neil Davies, Predictable Network Solutions principal, says.

It’s a “crisis for the world’s telecom industry, in that they are not being able to construct the returns on investment they need for the capital,” he notes. That’s one view.

Others suggest the access business is stable and profitable, at least relatively recently.


Comcast, supposedly the greatest cable monopolist, averaged just a 4.5 percent return on invested capital for the five-year period from 2007 to 2012. The Time Warner Cable five-year average is -1.3 percent, some would note.


The problem, in essence, is that the switched telephone network, because of its design, enabled control of quality in ways that Internet Protocol architectures prevent.


For those of us who are more “business types,” the reasons for those conclusions are the domain of “bit doctors” who understand statistical multiplexing and its implications for large networks. But if I understand the argument correctly, the problem is that the IP architecture effectively removes the ability to control quality, on the part of any single domain within the broader network of networks.


Simply, no single domain owns or controls all the other elements that affect quality. So quality itself cannot actually be guaranteed. And if capital investment to protect or enable quality becomes nonlinear, with unknown results, then costs cannot be determined, for quality of service of any expected level.


It then becomes difficult to set retail prices at levels that recover, with certainty, the cost of investments. As the protocols are statistical, so profit and loss become statistical.


One salient implication might therefore be that no single domain actually can be certain that its own investments in network quality actually will have the desired results.


In the switched telephone network, there was a clear cost identifiable for every connection, because all the resources along that path were associated with, and reserved for, that data stream. That is not actually possible with an IP network.


In the past, bits flowed along a fixed circuit, with one key advantage. A service provider could derive the actual cost of doing so, and price accordingly.


Packet networks are based on virtual and statistical processes at every turn, essentially. That means there are contention and congestion mechanisms happening “all over the place.”


In practice, this means no broadband provider can ever guarantee the quality and performance of the end-to-end transmission chain.

The corollary is that access providers might not be able to determine the cost of doing so, either, at least when levels of quality are part of the offer, and when there are performance guarantees, with financial penalties.

Saturday, October 10, 2015

5G Changes Everything

If you look at all the capabilities the coming fifth generation (5G) mobile network will have to support, and 5G’s relationship to the core network, you’d have to conclude that 5G is the “network of everything.” It will have to support low, medium and high use cases for bandwidth, latency, mobility, battery life and reliability.

The proposed 5G network will have to be location and context aware; flexible; efficient; secure; energy efficient; software optimized and therefore virtualized.

The new network will share spectrum and networks.

In other words, 5G will change the whole network, not simply air interfaces.


Mobile App Revenue Share to Double, Access Share to Fall 9 Points, Between 2013 and 2020

Global aggregated statistics, though useful, can hide significant regional or ecosystem differences, it always is fair to note.

Nice upward-sloping bar charts are compelling at a high level, but can obscure other trends. In the earlier parts of a lifecycle, such bar charts give the illusion of solidity. Only later, when growth tops out, and decline begins, do we see the expected full product lifecycle curve.

Also, many other trends often occur below the “headline” numbers. Consider the relative revenue shares in the mobile ecosystem, for example. As virtually all the participants are aware, shares of overall ecosystem revenue are shifting.

In 2013, mobile service providers claimed 59 percent of ecosystem revenue, app providers 10 percent of total revenue. By 2020, the GSMA estimates, service providers will earn half of ecosystem revenues, while app providers earn 20 percent of ecosystem revenue.

The global market still is growing, as many potential customers remain to be acquired. But, as tends to be the case, the incremental new users represent less revenue per account. And, inevitably, the market will saturate, rather sooner than many expect.

The key ecosystem change: the app provider share of revenue doubles, while the access provider share declines nine percentage points.
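
Expressed as arithmetic, the GSMA share figures above work out as follows (variable names are ours):

```python
# Mobile ecosystem revenue shares from the GSMA estimates cited above,
# as fractions of total ecosystem revenue.
shares_2013 = {"service_providers": 0.59, "app_providers": 0.10}
shares_2020 = {"service_providers": 0.50, "app_providers": 0.20}

# App provider share doubles (10 percent to 20 percent)...
app_share_growth = shares_2020["app_providers"] / shares_2013["app_providers"]

# ...while the access (service provider) share falls nine percentage points.
access_share_shift = (shares_2020["service_providers"]
                      - shares_2013["service_providers"])
```

Note that these are shares, not absolute revenues: with the overall market still growing, access revenue in dollars can grow even as its share of the ecosystem shrinks.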



Friday, October 9, 2015

There is no Net Neutrality for Wi-Fi, Which Increasingly is Going Carrier Grade

Perhaps oddly, given all the attention to network neutrality as applied to Internet access providers (mobile and fixed), it is Wi-Fi where the balance between “best effort” and “assured access” or carrier grade paradigms will be most important.

At least according to Rethink Research, there will be more “carrier grade” Wi-Fi hotspots than “best effort” hotspots by 2017. That means a majority of hotspots will use prioritization mechanisms.

So with or without network neutrality rules, hotspot providers will be able to groom traffic and take other measures to provide consistent user experience.


O3b Says it is the Fastest-Growing Satellite Constellation Ever

O3b Networks says it has become the fastest-growing constellation in satellite history, selling more capacity in its inaugural year of operation than any other satellite operator with global operations.

O3b also says it has sold capacity equivalent to nearly 10 percent of the contracted capacity of the three largest Fixed Satellite Service (FSS) operators combined.

O3b says it now is providing transport to 40 customers in 31 countries, supporting mobile operators expanding 3G and 4G/LTE services to rural populations; ISPs providing true broadband on isolated island chains; cruise lines bringing guests and crew high-speed broadband and mobile connections; oil and gas companies reducing costs and improving crew welfare; and governments.

The O3b Networks Medium Earth Orbit (MEO) satellite constellation greatly reduces latency and helps the constellation provide extremely high throughput.

Other contestants likewise are lining up to supply services using even lower orbits, despite some skepticism from providers of geosynchronous service, as one would expect, since MEO and LEO constellations pose new competition (and some would say better user experience).

Australia Orders 9.4% Lower Wholesale Acess Prices for Copper Network

The Australian Competition and Consumer Commission has ordered a 9.4 percent decrease in wholesale access prices on the Telstra copper access network, beginning on Nov. 1, 2015, and continuing until June 30, 2019.

That decrease comes despite fewer customers on the network, and incorporates other changing input prices.

Downward pressures come largely from lower expenditures, falling cost of capital and impact of migration of users to the National Broadband Network.

These more than offset upward pressures from a shrinking fixed line market due to consumers moving away from fixed line services and to mobile services, ACCC Chairman Rod Sims said.

The ACCC noted, however, that fewer customers on the copper network will mean higher costs to serve the remaining customers.

That is a generic issue for many fixed network operators. Sowmyanarayan Sampath, Verizon Communications SVP of transformation, says Verizon’s copper-based revenue is declining eight percent to 10 percent a year.

At that rate, the revenue stream shrinks by more than half within a decade. The business will have become unprofitable long before then. Some might argue the business already is unprofitable, if allocated overhead is included. Others would argue the business is slightly profitable, but getting worse as more customers churn off the network.
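
As a check on that trajectory, compounding an annual decline is straightforward arithmetic. A sketch, assuming a constant nine percent annual decline, the midpoint of the range Sampath cites (the function name is ours):

```python
# Compound an annual revenue decline: fraction of today's copper revenue
# remaining after a given number of years. The nine percent figure is
# the midpoint of the eight-to-10 percent range cited in the text.
def remaining_revenue(years: int, annual_decline: float = 0.09) -> float:
    """Fraction of today's revenue left after compounding the decline."""
    return (1.0 - annual_decline) ** years

after_10_years = remaining_revenue(10)  # roughly 0.39 of today's revenue
```

Compounding means the stream does not literally vanish in a decade, but losing some 60 percent of it is catastrophic for a business with largely fixed network costs.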

Verizon might have operations spanning 150 countries, but its revenue is highly concentrated in the U.S. mobile business. By 2016, the mobile business is likely to account for 85 percent of Verizon earnings (EBITDA).

In 2014, mobile contributed 70 percent of revenue, so mobile is generating an increasing share of earnings.

Comcast Continues to Grow Bandwidth at Moore's Law Rates

As crazy as it seems, U.S. Internet service provider Comcast, now the biggest supplier in that market, has doubled the capacity of its network every 18 months.


In other words, Comcast has increased capacity precisely at the rate one would expect if access bandwidth operated according to Moore’s Law.
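
Doubling every 18 months compounds quickly. A small sketch of the growth function (the function name is ours):

```python
# Capacity growth under a Moore's Law-style doubling: capacity multiplies
# by 2^(years / doubling_period). The 18-month (1.5 year) period is the
# figure cited in the text.
def capacity_multiplier(years: float, doubling_period_years: float = 1.5) -> float:
    """Factor by which capacity grows over the given number of years."""
    return 2.0 ** (years / doubling_period_years)

ten_year_growth = capacity_multiplier(10)  # about 100x in a decade
```

That is the scale of the claim: sustaining an 18-month doubling implies roughly hundredfold capacity growth every decade.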


U.S. telcos have generally not been able to increase speed at such rates. That, in large part, might account for Comcast’s leadership of the Internet access market.


That said, across the whole market, access bandwidth has grown at rates very close to what one would expect if Internet access were governed by Moore’s Law.


