Thursday, October 15, 2015

Not Only "3 or 4," but "Which" 3 or 4

In many mobile markets, all fundamental policy choices about the right mix of competition and investment center on the numbers "three" and "four." Those numbers correspond to the number of leading providers in the market.

It might be fair to qualify the notion by adding that "which three" and "which four" also are important. Some firms arguably are better able to compete, either because their cost structures are lower or because they have other key revenue streams to rely upon.

The sheer number of firms in a stable and sustainable market still matters. But so do the business models of those firms. How can Google, Facebook and others offer valuable services "for free"?

They have different business models than firms relying on subscriptions or transactions. How can cable TV firms sustainably offer lower prices than telcos? Their cost structures are lower.

Economics, alas, is not a science, any more than any of the “social” sciences. Beyond general principles, it is very difficult (impossible, many would say) to “scientifically” tune whole economies, or even reliably predict the actual impact of most proposed policies.

That applies to telecommunications service provider regulation, particularly the matter of how to fashion policies that stimulate investment in facilities and promote enough competition to improve consumer welfare.

Obviously, policies can do too little, or too much, in either case leading to sub-optimal levels of competition, investment and consumer welfare.

Generally speaking, the bigger problems are structural: rules that arguably “artificially” restrict the amount of competition, prevent rationalized markets or reduce incentives to invest. But finding the right balance is tricky.

That is precisely at the heart of regulatory thinking in the European Union, for example.

Broadly speaking, the matter of promoting investment and competition takes practical expression in policies related to the “right” number of providers in markets. In the European Union mobile market, the key numbers are “four” and “three,” referring to the minimum number of leading contestants believed to be necessary to support robust competition.

The obverse also holds: four versus three also is believed to shape the profitability of investments. Three, in that sense, is more inviting than four.

Industry and regulators do not agree on the numbers. Telcos argue more scale--and therefore more mergers--are necessary to reduce the number of suppliers. Regulators now argue that no more big mergers are desirable, as that would reduce the level of competition.

The parties are not talking past each other, just focusing on different problems. Policies that promote “more competition” often can create less-inviting prospects for “more investment.”

“Less competition” can create better prospects for “more investment.” That is why the balance matters so much. More of one outcome means less of the other outcome. But there are worse outcomes.

Getting the balance wrong ultimately implies--at least for a time--less competition and less investment, however.

That happens because too much competition inevitably leads to supplier death. That can happen in several ways. Struggling firms typically reduce capital investment to try to survive. In other cases, firms overinvest in facilities that ultimately do not produce a return. Either way, firms eventually exit the market.

What form the exits take is another matter. Firms can disappear, to be sure. But the more typical exit is absorption of failing firms by stronger firms. In those cases, there is at least a possibility that the level of competition actually is enhanced, not reduced.

That arguably will be the case in the U.S. market, for example, if two of the leading four U.S. mobile firms are acquired by cable TV, app provider or device supplier owners. In principle, that could happen in some EU markets as well.

So the issue is perhaps not only “three or four,” but “which three, and which four.”

Tuesday, October 13, 2015

U.S. Cable TV Operator Capex to Grow in 2015, Decline Afterwards

U.S. cable operators in 2015 will have made more capital investment than ever before in a single year (not adjusted for inflation).

According to SNL Kagan estimates, U.S. cable operators will invest $16.66 billion. Some of that capital will go to plant extensions and upgrades, but about $7 billion, or 42 percent, is for customer premises equipment. Much of that is for video set-tops, while some is for high speed access routers and modems.


Comcast plans to allocate 14.5 percent of cable segment revenue to capital investment. 

Other firms also will boost spending, while some will decrease capex. 

Time Warner Cable will boost capex to $4.45 billion, up 8.6 percent, year over year. 

Suddenlink will boost spending about 16 percent.

Charter will drop capex 27 percent from 2014 levels, as will Cablevision Systems.

SNL Kagan expects modest declines in 2016 capex, industry-wide.


Spending on scalable infrastructure, covering network virtualization, DOCSIS 3.1, the Converged Cable Access Platform, increased on-demand and multiscreen content delivery, enhanced cloud-based guides and increased reliance on unmanaged devices, also will grow modestly.


The upgrades to 1 Gbps broadband services are boosting spending on capacity upgrades, as well.


"Interconnected Era" Arrives, says Equinix

Direct connections between private Internet Protocol domains lead to six nines levels of availability, with 15 percent fewer network incidents and outages, according to studies produced for Equinix.

Increased direct interconnection of domains contributed to a 42 percent average reduction in latency and 40 percent reduction in bandwidth costs, Equinix says.
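To put "six nines" in perspective, the availability figure translates directly into an annual downtime budget. A short Python sketch (the function name is illustrative, not from any Equinix material) converts N-nines availability into allowed downtime per year:

```python
# Allowable annual downtime for "N nines" availability.
# 365.25 days/year is a common approximation.
MINUTES_PER_YEAR = 365.25 * 24 * 60  # 525,960

def downtime_minutes(nines: int) -> float:
    """Minutes of permitted downtime per year at the given number of nines."""
    availability = 1 - 10 ** (-nines)
    return MINUTES_PER_YEAR * (1 - availability)

for n in (3, 4, 5, 6):
    print(f"{n} nines: {downtime_minutes(n):.3f} minutes/year")
# Six nines works out to roughly half a minute of downtime per year.
```

In other words, a six-nines claim is a budget of about 32 seconds of outage a year, which is why direct interconnection, rather than best-effort transit, is pitched as the way to get there.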

“We are now in the interconnected era,” Equinix says. “This period is dominated by the need for a level of interconnection that delivers instant collaboration between and within dense industry ecosystems consisting of partners, employees, customers and data sources.”

The number of interconnected enterprises is set to more than double from 38 percent to 84 percent globally by 2017, Equinix argues.

There are a couple of reasons for that growth. Some 75 percent of enterprise employees reside in locations other than the corporate headquarters, while 82 percent of enterprises report a multi-cloud strategy. Hence the need to create connections with very low latency, enabled by domain-to-domain direct connections.

Dell Merger with EMC Points to Cloud Future

The merger of Dell and EMC, at $67 billion, is the biggest-ever merger in the information technology industry.

Whatever else the deal might mean, it shows the evolution of computing from the personal computer era to the era of cloud and mobile computing.

In the past, most computing happened locally, whether on mainframes, minicomputers or PCs. These days, most computing happens remotely, in data centers. So one might logically argue that the Dell merger with EMC is an effort to transition to the next computing era.

It is virtually impossible to compare the volume of local computing to that of remote--or cloud--computing. It is easier to quantify the amount of computing-related traffic.

According to Cisco, perhaps 85 percent of applications traffic volume now is directly produced by cloud computing.


Likewise, the installed base of computing devices has shifted dramatically to smartphones, devices that rely on cloud or remote computing for most of their value.

By 2017, 87 percent of the global smart connected device market will be tablets and smartphones, with PCs (both desktop and laptop) being 13 percent of the market.



Cloud-based apps now matter because nearly all apps--enterprise or consumer--now are consumed that way.

In 2015, 83 percent of all mobile apps used by U.S. consumers are cloud apps, according to Cisco. By 2019 cloud apps will represent 90 percent of all mobile apps.

The trend often will appear less advanced in the enterprise computing segment, as cloud computing, by volume of operations or enterprise spending, rarely is more than a small percentage of total spending.

By 2018, cloud data centers should represent as much as 76 percent of all data center workloads, according to Cisco.


Overall, most U.S. businesses use cloud computing to some extent, with most enterprises using a hybrid approach (both internal and external cloud operations).

Of the 37 percent of U.S. small businesses that already buy cloud-based services, most buy software as a service. Marketing, e-commerce, sales, collaboration and productivity rank among the top six functions being cloud sourced, although back office often is the function most likely to be moved to the cloud.

The core market might be about three million firms, largely in the services segment, followed by retail and healthcare. There are perhaps nine million additional small or home office based businesses (owner operated or with a maximum of four employees).

By definition, software as a service is consumed without any need to buy cloud infrastructure, and the largest segment of the cloud services market is SaaS, representing 81 percent of all cloud spending.

In other words, a direct purchase of cloud capability represents about 19 percent of cloud purchases by business users.


Monday, October 12, 2015

Arguably Different Regulatory Treatment of Video Providers Now an Issue in Tempe, Scottsdale, Ariz.

One characteristic of newly-competitive markets is the entry of suppliers not historically in the deregulated industry. That also tends to mean that legacy regulations, applied differently to companies in different industries, become new issues.

The unfairness is most obvious when companies selling the same services, to the same customers, are treated differently.

It is not immediately clear which clauses of the video services license Scottsdale, Ariz. wants to grant Google Fiber are objectionable to Cox Communications, which has sued to block the award on the grounds that it violates state and federal law.

Presumably, the objection is that the new license imposes fewer requirements than Cox and other licensees must comply with.

Google Fiber already had gotten a video franchise from Tempe, Ariz. that includes waivers of some fees or city codes, but Cox Communications has challenged that award as well.

The lawsuit against the city of Tempe claims the city has enacted regulations specifically to benefit Google Fiber's high-speed Internet service.

“As set forth below, the City has violated Federal law in a manner that directly harms Cox by establishing a discriminatory regulatory framework,” the lawsuit says. “The City’s regulatory framework imposes substantial statutory and regulatory obligations on providers of video services that the City deems to be cable operators (such as Cox).”

“The City exempts from such rules and obligations providers of video services that the City deems not to be cable operators (such as Google Fiber),” the lawsuit argues. “Legally, however, Google Fiber’s proposed video offering is indistinguishable from Cox’s cable service offering.”

Disparate treatment of fundamentally similar firms from historically different industries that now offer the same services has been an issue for decades.

How much that is the case in Scottsdale and Tempe, Ariz. is not clear. Irrespective of the particulars of those cases, like treatment of like firms competing in the same markets remains the underlying issue.

At a basic level, that is a factor in disputes over the lawfulness of municipal broadband, regulation of voice services, video franchise rules and other obligations levied on “carriers of last resort.”

The rules increasingly are nonsensical, to the extent such rules are applied differently to companies selling the same products, to the same customers.  

Can Regulators Actually Detect Network Neutrality Infractions? Study Suggests Answer is "No"

Is it possible to detect and measure network neutrality infractions?

It is not a rhetorical question, according to Ofcom, the U.K. communications regulator. After commissioning a study on whether it is possible to create meaningful network neutrality rules, Ofcom has produced a report suggesting it is not--at present--actually possible to determine whether impermissible traffic shaping actually has occurred.

The obvious implication: how can engineers trying to enforce network neutrality rules actually ascertain that the rules are respected, when measurement is not possible?

Those are questions raised by a study conducted by Predictable Network Solutions for Ofcom, the U.K. communications regulator.

The study found that none of the existing methods and tools actually succeeds at detecting most forms of traffic management.

The reason for the ineffectiveness of these tools is the complexity of the Internet: when there is a delay between two endpoints, the tools are unable to pinpoint the cause of the delay.

That poses a key problem for network neutrality: there is no way, at present, to determine if an infraction has happened.

Though it always is dangerous to infer too much from any single study, on a subject as complicated as traffic management, that is potentially highly significant.

The findings cast a shadow on net neutrality regulations, which are an attempt to ban behavior that generally can’t be detected, let alone measured, says Richard Bennett, a communications consultant.

And even if the behavior could be measured, it’s not always anti-consumer.

“If I was on a business Skype call from my house while the kids were watching multiple video streams, I would like it to be differentially treated because I would like my voice call not to fail,” says Neil Davies, Predictable Network Solutions principal. “Differential management itself is not necessarily against the end user’s interest. It could be very much for it.”

Differentiated traffic "obviously has a value to the end user," and could "potentially garner a price premium," Davies says. That, alas, is the argument many supporters of QoS would make.

Users should be able to choose which traffic gets priority, Davies says. Some would argue ISPs likewise should be able to offer such capabilities to their customers.

Though there will be debate about the findings, the PNS study suggests network neutrality rules are unenforceable, because application-specific discrimination isn’t detectable by any known tool, whether it’s NetPolice, NANO, DiffProbe, Glasnost, ShaperProbe, or ChkDiff.

QoS for Consumer Mobile Apps--Despite Net Neutrality--Is Possible, Even Likely in the Future

Like it or not, some important consumer applications actually benefit from, and under conditions of congestion might require, quality of service (packet prioritization) mechanisms. That is true whether "best effort only" is the mandated regulatory regime, or not.

And even where “best effort only” is the law, consumer services increasingly might take advantage of quality of service mechanisms, based on Wi-Fi capabilities.

The Wi-Fi Alliance, which created the Passpoint standard, also has promulgated quality of service mechanisms: Wi-Fi Certified WMM added quality of service (QoS) functionality to Wi‑Fi networks.

With WMM, introduced in 2004, network administrators and residential users can assign higher priority to real-time traffic such as voice and video, while assigning other data traffic to either best-effort or background priority levels.

Introduced in 2012, WMM-Admission Control further improves the performance of Wi‑Fi networks for real-time data such as voice and video by preventing oversubscription of bandwidth.

Prioritization of traffic includes categories for voice, video, best effort data, and background data, managing access based on those categories.
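The four-category scheme described above comes from WMM's access categories, which map the eight 802.1D user-priority values onto voice, video, best effort and background queues. A minimal Python sketch of that mapping (the function name is illustrative; this is not any Wi-Fi Alliance API):

```python
# WMM's four access categories and the 802.1D user priorities (0-7)
# that map into each, per the Wi-Fi Alliance WMM specification.
WMM_ACCESS_CATEGORIES = {
    "AC_VO (voice)":       {6, 7},
    "AC_VI (video)":       {4, 5},
    "AC_BE (best effort)": {0, 3},
    "AC_BK (background)":  {1, 2},
}

def access_category(user_priority: int) -> str:
    """Return the WMM access category for an 802.1D user priority."""
    for category, priorities in WMM_ACCESS_CATEGORIES.items():
        if user_priority in priorities:
            return category
    raise ValueError(f"user priority must be 0-7, got {user_priority}")
```

Note the deliberate quirk of the standard: priority 0 lands in best effort while priorities 1 and 2 fall into the lower-ranked background category, so voice and video traffic always win contention over bulk data.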

The business implications are that as more large hotspot networks deploy Passpoint, consumer Internet connections will be supported by quality of service mechanisms provided by the Wi-Fi network, not the access network.

And that makes the QoS lawful.

And many large hotspot network operators presently believe carrier-grade hotspots will represent 57 percent of all their locations, rising to 90 percent of locations by 2020.

Among operators with hotspot networks in place, 57 percent have a timeline in place to deploy a next generation hotspot (Passpoint) standard network, a survey conducted for the Wireless Broadband Alliance finds.

Some 61.5 percent of respondents already have NGH or plan to deploy it over the coming year, while a further 29.5 percent will roll it out in 2017 or 2018.

The dominant business driver is the need to enhance or guarantee customer experience for revenue streams such as TV everywhere or enterprise services.

Improving customer experience to reduce churn and boost average revenue per account or user was seen as the primary advantage by 28 percent of respondents.

Seamless access from hotspot to hotspot or hotspot to mobile also was a key concern.

Respondents tend to believe they will be able to generate revenue from location‑based services (69 percent), roaming (68 percent) and Wi‑Fi analytics (66 percent).

Compared to the 2014 survey findings, there is far less emphasis on Wi‑Fi offload, and more on Wi‑Fi first mobility, Wi‑Fi calling and support for entertainment video.

Many consumer services--especially those for which consumers are paying a fee--benefit from QoS mechanisms. Despite network neutrality rules, support for such apps likely is coming. All the technology tools are there to do so on big Wi-Fi hotspot networks.
