Friday, December 15, 2017

FTC Has Argued for a "Wait and See" Policy on Net Neutrality

The end of common carrier regulation of internet access returns oversight of internet access services to the Federal Trade Commission, which, together with the Department of Justice, oversees antitrust matters and consumer protection.

It is worth noting, though the fact is rarely mentioned in reporting, that the specific change made by the FCC is the end of Title II common carrier regulation of internet access.

Under “normal” circumstances, virtually all providers in the communications business would prefer lighter-touch regulation to common carrier regulation, so support for that change is hardly unusual.

The end of common carrier regulation is said to represent the end of “network neutrality.” That is a complicated matter. In principle, the change means that consumer internet access once again becomes a lightly-regulated data service.

Since common carrier regulation was the justification for requiring “best effort only” (first in, first out) handling of consumer internet traffic, that restriction is lifted along with the rules that imposed it.

Assuming Congress does not specifically request that the FCC reimpose the rules, ISPs are free to consider quality of service mechanisms for “internet” access, in the same way they are free to implement QoS for business grades of service.
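To make concrete what the difference amounts to at the packet level, here is a minimal sketch (illustrative only, not any ISP's actual implementation) contrasting “best effort” first-in, first-out handling with a simple strict-priority scheduler of the sort QoS mechanisms enable. The traffic classes and packet names are invented for the example.

```python
# Minimal sketch: "best effort" FIFO handling versus a strict-priority scheduler.
# Packets are (name, class) pairs; lower class numbers are more latency-sensitive.

def fifo_order(packets):
    """Best effort: transmit in arrival order, regardless of traffic class."""
    return [name for name, _ in packets]

def priority_order(packets):
    """Strict priority: lower class numbers transmit first; arrival order breaks ties."""
    indexed = ((prio, seq, name) for seq, (name, prio) in enumerate(packets))
    return [name for _, _, name in sorted(indexed)]

arrivals = [("bulk_download", 2), ("voip_sample", 0), ("web_page", 1), ("video_frame", 0)]
print(fifo_order(arrivals))      # ['bulk_download', 'voip_sample', 'web_page', 'video_frame']
print(priority_order(arrivals))  # ['voip_sample', 'video_frame', 'web_page', 'bulk_download']
```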

That worries net neutrality supporters.

So it is worthwhile to recall what the FTC said about its views on competition policy back in 2007, before internet access was regulated under common carrier rules, an approach that was not government policy until 2015.

Basically, the FTC noted both dangers and advantages, and argued that regulators should wait and see what happens, since moving quickly in a fast-changing environment means regulating without actual facts to consider.

“It is impossible to determine in the abstract whether allowing content and applications providers (or even end users) to pay broadband providers for prioritized data transmission will be beneficial or harmful to consumer welfare,” the FTC said. “Such prioritization may provide benefits, such as increased investment and innovation in networks and improved quality of certain content and applications that require higher-quality data transmission.”

The FTC noted that network neutrality proponents raise important issues, ranging from a diminution in innovation by content and applications providers, to the intentional or passive degradation of non-prioritized data delivery, to increased transaction costs resulting from negotiations between broadband providers and content and applications providers over prioritization.

But no a priori assessment can be made, the FTC argued. Instead, “evidence of particular conduct” is necessary to determine whether there is harm.

Thursday, December 14, 2017

Why Net Neutrality Discussion is So Difficult

There is a good reason why some find the “network neutrality” discussion so frustrating.

“The most confounding aspect of the contemporary net neutrality discussion to me is the social meanings that the concept has taken on,” says Gus Hurwitz, a professor at the Nebraska College of Law.

“These meanings are entirely detached from the substance of the debate, but have come to define popular conceptions of what net neutrality means,” Hurwitz notes. “They are, as best I can tell, wholly unassailable, in the sense that one cannot engage with them.”

“The most notable aspect is that net neutrality has become a social justice cause,” says Hurwitz. “Progressive activist groups of all stripes have come to believe that net neutrality is essential to and allied with their causes.”

One might argue that such weaponization of an issue is unfortunate. Network neutrality “raises important questions about the nature of regulation and the administrative state in complex technical settings,” he adds.

Beyond that, net neutrality as a concept is very complicated, despite the effort to paint it in simple, caricatured ways. Where legitimate network management ends and prohibited “blocking” of lawful content begins is often impossible to distinguish clearly.

Longtime observers of communication networks will tend to agree that no network, as a practical matter, is engineered to support any conceivable level of traffic. That is too costly. Instead, networks are built to support expected, typical peak traffic. That necessarily means spikes in traffic can tax any network.

When that happens, congestion occurs, and user experience degrades. In the old voice network, in fact, actual blocking of access was among the tools used to preserve network performance under unusual load. “I’m sorry, all circuits are busy now; please try your call again later” was a message users sometimes heard at such times.
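For readers who want the arithmetic behind that behavior, the classic Erlang B formula estimates the probability that a call is blocked when a trunk group sized for typical busy-hour load is hit by a spike. A minimal sketch follows; the trunk-group size and traffic figures are made up purely for illustration.

```python
def erlang_b(offered_load_erlangs, circuits):
    """Erlang B blocking probability, computed with the standard
    iterative recurrence (avoids huge factorials)."""
    b = 1.0
    for n in range(1, circuits + 1):
        b = (offered_load_erlangs * b) / (n + offered_load_erlangs * b)
    return b

# A 100-circuit trunk group engineered for ~90 erlangs of typical busy-hour load:
print(round(erlang_b(90, 100), 3))   # low blocking at the expected peak
# The same trunk group during an unusual spike to 130 erlangs:
print(round(erlang_b(130, 100), 3))  # far more callers hear "all circuits are busy"
```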

Keep in mind, actual blocking of attempted lawful communication was among the tools used to manage traffic. Network neutrality principles actually do not permit such practices. The Federal Communications Commission has uniformly acted to preserve consumer access to all lawful applications, without any blocking or interference.

Among the greatest “threats” posed by the end of common carrier regulation is “paid prioritization,” the practice whereby app providers pay transport networks for better quality of service.

The problem, rarely mentioned, is that “almost all of today’s big content providers--the Googles and Netflixes--have invested massively in content delivery networks,” Hurwitz notes. “These are networks that allow their content to bypass almost the entire Internet, dramatically improving performance. In other words, they have already paid for prioritization.”

Latency control is a technical means of improving end user quality of experience, and a routine way of optimizing content and application access. If anybody opposes such quality of service mechanisms, few of us have heard about it.
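Hurwitz's point about CDNs can be illustrated with back-of-the-envelope propagation arithmetic: serving content from a nearby edge cache rather than a distant origin cuts round-trip delay roughly in proportion to distance. The distances and fiber propagation factor below are illustrative assumptions, not measurements of any particular network.

```python
SPEED_OF_LIGHT_KM_PER_MS = 300.0   # in vacuum
FIBER_FACTOR = 0.67                # light propagates at roughly two-thirds of c in optical fiber

def round_trip_ms(distance_km):
    """Idealized round-trip propagation delay over fiber, ignoring
    queuing, serialization and processing delays."""
    one_way_ms = distance_km / (SPEED_OF_LIGHT_KM_PER_MS * FIBER_FACTOR)
    return 2 * one_way_ms

print(round(round_trip_ms(4000), 1))  # distant origin server ~4,000 km away: ~40 ms
print(round(round_trip_ms(50), 1))    # CDN edge cache in the same metro: ~0.5 ms
```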

Ironically, in an internet access market that is growing more competitive, and where internet service providers are investing to improve speed and other elements of the experience, the end of common carrier regulation is supposed to lead, inevitably, to major ISPs intentionally degrading their performance relative to peers, to stoke demand for CDN-style QoS.

Competition is a mechanism that controls such behavior, to the extent that better QoS (with some mechanism for maintaining it) is viewed as a “bad thing.”

Some argue there is rent seeking at work here. Rent seeking, broadly, is the effort to capture economic surplus without creating new value, typically by firms with market power or by influencing the rules. What is not clear, many ask, is whether rent seeking is happening, and if so, “by whom.”

By one common definition, rent seeking is attempted by virtually every entity in the economy.

“One of the most common ways companies in the 21st century engage in rent seeking is by using their capital to contribute to politicians who influence the laws and regulations that govern an industry and how government subsidies are distributed within it,” says Investopedia.

In that sense, every industry (and other groups, such as trade unions) attempts rent seeking. So rent seeking is a rather useless concept when it comes to analyzing industry behavior: any industry’s behavior.

AT&T Expects to Have 12.5 Million New FTTH Passings by Mid-2019

AT&T says it now markets its AT&T Fiber (fiber to the home) service to more than seven million locations across 67 metropolitan areas and 21 states.

AT&T plans to reach at least 12.5 million locations by mid-2019. In part, some will argue, that deployment is driven by terms of the Federal Communications Commission clearance of the AT&T acquisition of DirecTV.

Deal approval includes the obligation to deploy optical fiber to 12.5 million new homes. Some might argue (rightly, perhaps) that AT&T will have to stretch to hit that target.

It has taken roughly four years to add seven million locations, and AT&T has to add 5.5 million more in less than two more years. What that means for AT&T’s capital investment budget is clear: it has had to boost capex beyond prior years.
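To put the required stretch in perspective, a rough run-rate calculation using the figures above (the “less than two years” figure is an assumption of roughly seven quarters):

```python
# Rough run-rate comparison, using the figures cited above.
historical_locations = 7_000_000            # passings added over roughly four years
historical_years = 4
remaining_locations = 12_500_000 - 7_000_000
remaining_years = 1.75                      # "less than two more years" (assumed)

print(historical_locations / historical_years)  # ~1.75 million locations per year to date
print(remaining_locations / remaining_years)    # ~3.1 million locations per year required
```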

Some have argued that reaching that level of fiber-to-home household coverage would be difficult. But AT&T has boosted its capex beyond former levels, now spending about $6 billion a quarter.

AT&T has allocated as much as $22 billion in capex for 2017, in part to build out the FirstNet emergency responder network.

Some observers will worry about what that level of capex means for overall financial performance, including debt load. But, so far, AT&T seems to have handled it.

Video Entertainment "Market" Now Smashes Separate Regulatory Walls between "Content and Apps" and "Delivery"

The new move by T-Mobile US into the video streaming business is portrayed by the company itself, and by news reports, as representing competition with cable TV.

“The Un-carrier will build TV for people who love TV but are tired of the multi-year service contracts, confusing sky-high bills, exploding bundles, clunky technologies, outdated UIs, closed systems and lousy customer service of today’s traditional TV providers,” said John Legere, T-Mobile US CEO.

A few reports correctly described the service as a streaming offer more akin to over-the-top services offered by AT&T, Dish Network, YouTube and Netflix.

But that might be precisely the point. T-Mobile US itself describes the move as an entry into the $100 billion subscription TV market dominated by cable and telco suppliers.

“We’re in the midst of the Golden Age of TV, and yet people have never been more frustrated by the status quo created by Big Cable and Satellite TV,” said Mike Sievert, Chief Operating Officer of T-Mobile.

The over-the-top service represents the “successor” to linear TV, virtually all observers agree. That is why Disney is launching its own retail streaming services, for example.

And that is perhaps among the most-important ramifications of the move. In the application business--including the application businesses traditionally operated by telcos and cable TV--app delivery has been decoupled from the use of access networks.

Relevant competition for cable TV includes satellite and telco services, but also DirecTV Now, Netflix, Amazon Prime, Hulu, Sling TV and other services, with additional competition coming from Facebook, YouTube and many social networking apps.

In other words, the traditional regulatory distinction between unregulated “content or data services” and regulated access service providers is evaporating. Netflix and others create their own content, bundle and license content and deliver that content.

That makes Netflix (if not a “perfect” substitute) a rival for linear TV subscriptions. The move by T-Mobile US into the OTT video subscription business represents that evolution.

Streaming services might be owned by app providers (social media, YouTube), commerce providers (Amazon), content studios (Sony, Hulu), or distributors (AT&T, Dish Network, Verizon, Comcast).

Whoever owns the assets, the new reality is that content creation, packaging and delivery now is becoming independent of the access mechanism. That will--or should--eventually have regulatory implications of major scope.

Defining the scope of a “market” now is more complicated--and much broader--than it once was.

Wednesday, December 13, 2017

AT&T Expands AirGig Trials

AT&T has launched an international trial of its Project AirGig access technology, and also has launched a second trial in the United States.
 

Unlike a “data over power line” system, AirGig does not actually use the power conductor; the signal travels along the exterior of the power line.

AirGig, it is hoped, could deliver internet access speeds well over one gigabit per second using a millimeter wave (mmWave) signal guided by power lines. If so, internet access facilities would not require new towers or cables, but would be able to piggyback on existing electrical distribution lines.

The first international trial started earlier in 2017 with an electricity provider outside the United States. The second U.S. trial recently started in Georgia with Georgia Power. While this trial is located in a rural area, AirGig could be deployed in many areas not served by high speed broadband today – rural, suburban or urban, AT&T says.

Network Effects Explain Oligopolistic Structure of the Internet

(Image: network effects illustration; source: medium.com)
Oligopolies (functional, rather than enforced by law) now are a key characteristic of most parts of the internet ecosystem. In other words, there are functional "gatekeepers" across most of the ecosystem.

That "winner take all" structure might emerge as the natural consequence of consumer choices, supplier skill and timing.

The biggest driver, though, is that some markets have "network effect" characteristics. That is why the "platform" role is so desirable.

Platforms benefit from scale, and grow more valuable with increasing scale. That arguably applies to operating systems, access services, devices and apps.

So most markets with scale economics and network effects arguably develop in the same way.

The point is that markets where winners are able to exploit network effects virtually always produce oligopoly outcomes. Regulators can break up such markets, but to the extent that network effects actually matter, concentration always will recur.
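A toy model shows why: if each arriving user tends to join the platform that already offers the most value, and a platform's value grows super-linearly with its user base (Metcalfe-style, roughly proportional to the square of the number of users), small early leads compound into near-total concentration. The numbers below are purely illustrative.

```python
import random

def simulate(new_users=10_000, platforms=3, seed=1):
    """Toy winner-take-all dynamic: each arriving user joins a platform with
    probability proportional to that platform's value, assumed here to grow
    as the square of its user count (a Metcalfe-style network effect)."""
    random.seed(seed)
    users = [1] * platforms                  # every platform starts with one user
    for _ in range(new_users):
        values = [n * n for n in users]      # value grows super-linearly with users
        r = random.uniform(0, sum(values))
        cumulative = 0.0
        for i, v in enumerate(values):
            cumulative += v
            if r <= cumulative:
                users[i] += 1
                break
    total = sum(users)
    return [round(n / total, 3) for n in users]

print(simulate())  # typically one platform ends up with the overwhelming majority of users
```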

In the application space, advertising revenue is dominated by Google and Facebook, which claim 63 percent of U.S. digital ad revenue in 2017. In the operating system market, Android and Apple iOS are the leaders, with 99-percent market share. The device portion of the market is the least concentrated, although Apple and Samsung have earned most of the profits.
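One conventional way to quantify that concentration is the Herfindahl-Hirschman Index (HHI), the sum of squared market shares; values above 2,500 are typically treated as highly concentrated. The individual shares below are hypothetical splits, chosen only to be roughly consistent with the combined figures cited above.

```python
def hhi(shares_percent):
    """Herfindahl-Hirschman Index: the sum of squared market shares (in percent)."""
    return sum(s * s for s in shares_percent)

# Hypothetical splits, roughly consistent with the combined figures above:
os_market = [54, 45, 1]              # assumed Android / iOS / everything-else split
ad_market = [38, 25] + [1] * 37      # assumed Google / Facebook split, long tail as 1% firms

print(hhi(os_market))   # 4942 with these assumed shares: far above the 2,500 threshold
print(hhi(ad_market))   # 2106 with these assumed shares: moderately concentrated
```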

Mobile and fixed network access markets likewise are oligopolies, in virtually every market. Fixed markets in many cases remain virtual monopolies, while mobile markets tend to be oligopolies.



Tuesday, December 12, 2017

Ericsson to Supply Verizon Early 5G Deployments

Ericsson will supply Verizon with 5G radio infrastructure, allowing Verizon to launch commercial “pre-5G” networks in 2018. The expected deployments will include the launch of fixed wireless services in a few U.S. cities.

That is important for several reasons. Although the creation of new apps, services and revenues is a hoped-for development for 5G, that expectation has existed for 3G and 4G as well, where service providers expected new use cases and apps to develop, but were not sure precisely what would happen.

That remains the case for 5G as well, where the key issue is the business model: what incremental new revenue sources will develop?

Verizon, learning in part from history, is following a known deployment path. Just as 4G initially was launched in the first markets to support computing devices, not mobile voice, so Verizon will launch 5G as a platform for fixed wireless internet access, later adding full mobility network functions. That allows a scaling of investment and matches early investment with revenue generation.

Use of 5G to support fixed wireless access, both in-territory and out of region, is among the first new revenue sources to develop for 5G. Deployment of fixed wireless out of Verizon’s existing fixed network region also is a first.

So one other way to characterize Verizon’s early 5G deployments is to note that the platform will enable an out-of-region assault on consumer markets, where Fios has been totally in-region.

Will Generative AI Follow Development Path of the Internet?

In many ways, the development of the internet provides a model for understanding how artificial intelligence will develop and create value. ...