Thursday, December 14, 2017

Why Net Neutrality Discussion is So Difficult

There is a good reason why some find the “network neutrality” discussion so frustrating.

“The most confounding aspect of the contemporary net neutrality discussion to me is the social meanings that the concept has taken on,” says Gus Hurwitz, a professor at the Nebraska College of Law.

“These meanings are entirely detached from the substance of the debate, but have come to define popular conceptions of what net neutrality means,” Hurwitz notes. “They are, as best I can tell, wholly unassailable, in the sense that one cannot engage with them.”

“The most notable aspect is that net neutrality has become a social justice cause,” says Hurwitz. “Progressive activist groups of all stripes have come to believe that net neutrality is essential to and allied with their causes.”

One might argue that such weaponization of an issue is unfortunate. Network neutrality “raises important questions about the nature of regulation and the administrative state in complex technical settings,” he adds.

Beyond that, net neutrality as a concept is quite complicated, despite efforts to paint it in simple, caricatured ways. Where reasonable network management ends and “blocking” of lawful content begins is often impossible to distinguish clearly.

Longtime observers of communication networks will tend to agree that no network, as a practical matter, is engineered to support any conceivable level of traffic. That is too costly. Instead, networks are built to support expected, typical peak traffic. That necessarily means spikes in traffic can tax any network.

When that happens, congestion occurs, and user experience degrades. In the old voice network, in fact, actual blocking of access was among the tools used to preserve network performance under unusual load. “I’m sorry, all circuits are busy now; please try your call again later” was a message users sometimes heard at such times.
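The arithmetic behind that tradeoff is the classic Erlang B blocking formula, which traffic engineers have long used to size circuit groups for an acceptable blocking probability at expected peak load. A minimal sketch, with hypothetical traffic figures:

```python
# Classic Erlang B formula: probability a call arrives when all circuits
# are busy, given offered traffic (in erlangs) and the number of circuits.
# The traffic figures below are hypothetical, for illustration only.

def erlang_b(offered_erlangs: float, circuits: int) -> float:
    b = 1.0
    for k in range(1, circuits + 1):
        b = (offered_erlangs * b) / (k + offered_erlangs * b)
    return b

# A circuit group sized for its typical busy hour keeps blocking low ...
print(f"typical peak : {erlang_b(84, 100):.2%} of call attempts blocked")

# ... but an unusual spike in offered traffic pushes blocking up sharply,
# which is when callers heard "all circuits are busy now."
print(f"traffic spike: {erlang_b(110, 100):.2%} of call attempts blocked")
```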

Keep in mind that actual blocking of attempted lawful communication was among the tools used to manage traffic, yet network neutrality principles do not permit such practices. The Federal Communications Commission has uniformly acted to preserve consumer access to all lawful applications, without blocking or interference.

Among the greatest “threats” posed by the end of common carrier regulation is “paid prioritization,” the practice whereby app providers pay transport networks for better quality of service.

The problem, rarely mentioned, is that “almost all of today’s big content providers--the Googles and Netflixes--have invested massively in content delivery networks,” Hurwitz notes. “These are networks that allow their content to bypass almost the entire Internet, dramatically improving performance. In other words, they have already paid for prioritization.”

Latency control is a technical means of improving end-user quality of experience, and a routine way of optimizing content and application access. If anybody has seriously opposed such quality-of-service mechanisms, few of us have heard of it.
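To make the point concrete, here is a purely illustrative sketch of one such routine mechanism: an application marking its own packets with a DiffServ code point to request expedited treatment. The address, port and payload are placeholders, the IP_TOS socket option is available on Linux-style platforms, and whether any network along the path honors the marking is entirely up to the networks involved.

```python
# Hypothetical illustration: mark outbound datagrams with the DiffServ
# "Expedited Forwarding" code point (DSCP 46, per RFC 3246).
import socket

EF_DSCP = 46                 # Expedited Forwarding
TOS_VALUE = EF_DSCP << 2     # DSCP occupies the upper six bits of the TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

# Every datagram sent on this socket now carries the EF marking; routers
# that honor DiffServ may queue it ahead of best-effort traffic.
sock.sendto(b"latency-sensitive payload", ("192.0.2.10", 5004))
```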

Ironically, in an internet access market that is growing more competitive, where internet service providers are investing to improve speed and other elements of the experience, the end of common carrier regulation is supposed to lead--inevitably--to major ISPs intentionally degrading their performance, relative to peers, to stoke demand for CDN-style QoS.

Competition is a mechanism that controls such behavior, to the extent that better QoS (with some mechanism for maintaining it) is viewed as a “bad thing” at all.

Some argue there is rent seeking at work here. Rent seeking is the generation of revenue surplus by firms with market power. What is not clear, many ask, is whether rent seeking is happening, and if so, “by whom.”

By one common definition, rent seeking is attempted by virtually every entity in the economy.

“One of the most common ways companies in the 21st century engage in rent seeking is by using their capital to contribute to politicians who influence the laws and regulations that govern an industry and how government subsidies are distributed within,” says Investopedia.

In that sense, every industry (and other groups such as trade unions) attempts rent seeking. So rent seeking is a rather useless concept when it comes to analyzing industry behavior: any industry’s behavior.

AT&T Expects to Have 12.5 Million New FTTH Passings by Mid-2019

AT&T says it now markets its AT&T Fiber (fiber to the home) service to more than seven million locations across 67 metropolitan areas and 21 states.

AT&T plans to reach at least 12.5 million locations by mid-2019. In part, some will argue, that deployment is driven by the terms of the Federal Communications Commission’s clearance of the AT&T acquisition of DirecTV.

Deal approval includes the obligation to deploy optical fiber to 12.5 million new homes. Some might argue (rightly, perhaps) that AT&T will have to stretch to hit that target.

It has taken roughly four years to add seven million locations; AT&T must add 5.5 million more in less than two years. What that means for AT&T’s capital investment budget is clear: it has had to boost capex beyond prior years.

Some have argued that reaching that level of fiber-to-home coverage would be difficult. But AT&T has boosted spending beyond former levels, to about $6 billion a quarter in capex.

AT&T has allocated as much as $22 billion in 2017 annual capex, in part to build out the FirstNet emergency responder network.

Some observers will worry about what that level of capex means for overall financial performance, including debt load. But, so far, AT&T seems to have handled it.

Video Entertainment "Market" Now Smashes Separate Regulatory Walls between "Content and Apps" and "Delivery"

The new move by T-Mobile US into the video streaming business is portrayed by the company itself, and by news reports, as representing competition with cable TV.

“The Un-carrier will build TV for people who love TV but are tired of the multi-year service contracts, confusing sky-high bills, exploding bundles, clunky technologies, outdated UIs, closed systems and lousy customer service of today’s traditional TV providers,” said John Legere, T-Mobile US CEO.

A few reports correctly described the service as a streaming offer more akin to over-the-top services offered by AT&T, Dish Network, YouTube and Netflix.

But that might be precisely the point. T-Mobile US itself describes the move as an entry into the $100 billion subscription TV market dominated by cable and telco suppliers.

“We’re in the midst of the Golden Age of TV, and yet people have never been more frustrated by the status quo created by Big Cable and Satellite TV,” said Mike Sievert, Chief Operating Officer of T-Mobile.

The over-the-top service represents the “successor” to linear TV, virtually all observers agree. That is why Disney is launching its own retail streaming services, for example.

And that is perhaps among the most important ramifications of the move. In the application business--including the application businesses traditionally operated by telcos and cable TV--app delivery has been decoupled from the use of access networks.

Relevant competition for cable TV includes satellite and telco services, but also DirecTV Now, Netflix, Amazon Prime, Hulu, Sling TV and other services, with additional competition coming from Facebook, YouTube and many social networking apps.

In other words, the traditional regulatory distinction between unregulated “content or data services” and regulated access service providers is evaporating. Netflix and others create their own content, bundle and license content and deliver that content.

That makes Netflix (if not a “perfect” substitute) a rival for linear TV subscriptions. The move by T-Mobile US into the OTT video subscription business represents that evolution.

Streaming services might be owned by app providers (social media, YouTube), commerce providers (Amazon), content studios (Sony, Hulu), or distributors (AT&T, Dish Network, Verizon, Comcast).

Whoever owns the assets, the new reality is that content creation, packaging and delivery now are becoming independent of the access mechanism. That will--or should--eventually have regulatory implications of major scope.

Defining the scope of a “market” now is more complicated--and much broader--than it once was.

Wednesday, December 13, 2017

AT&T Expands AirGig Trials

AT&T has launched an international trial of its Project AirGig access technology, and also has launched a second trial in the United States.
 

Unlike a “data over power line” system, AirGig does not actually use the power conductor; its signal travels along the exterior of the power line.

 AirGig, it is hoped, could deliver internet access speeds well over one gigabit per second using a millimeter wave (mmWave) signal guided by power lines. If so, internet access facilities would not require new towers or cables, but would be able to piggyback on existing electrical distribution lines. 

The first international trial started earlier in 2017 with an electricity provider outside the United States. The second U.S. trial recently started in Georgia with Georgia Power. While this trial is located in a rural area, AirGig could be deployed in many areas not served by high-speed broadband today – rural, suburban, or urban, AT&T says.

Network Effects Explain Oligopolistic Structure of the Internet

Oligopolies (functional, rather than enforced by law) now are a key characteristic of most parts of the internet ecosystem. In other words, there are functional "gatekeepers" across most of the ecosystem.

That "winner take all" structure might emerge as the natural consequence of consumer choices, supplier skill and timing.

The biggest driver, though, is that some markets have "network effect" characteristics. That is why the "platform" role is so desirable.

Platforms benefit from scale, and grow with increasing scale. That arguably applies to operating systems, access services, devices and apps.

So most markets with scale economics and network effects arguably develop in the same way.

The point is that markets where winners are able to exploit network effects virtually always lead to oligopoly outcomes. Regulators can break up such markets, but to the extent that network effects actually matter, concentration always will recur.
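A toy simulation (not a model of any actual market) illustrates the dynamic. If the value a new user sees in a platform grows roughly with the square of its existing user base, a Metcalfe’s-law-style assumption, then even a modest early lead compounds into a dominant share. The starting user counts, the squared-value weighting and the random seed below are arbitrary.

```python
# Toy simulation of network-effect-driven concentration; all figures are
# arbitrary and purely illustrative.
import random

random.seed(7)
platforms = {"A": 120, "B": 100, "C": 80}   # arbitrary starting user bases

for _ in range(100_000):                    # each iteration adds one new user
    # Assume a platform's appeal grows with the square of its user base,
    # so choice weights are superlinear in current size.
    weights = [count ** 2 for count in platforms.values()]
    chosen = random.choices(list(platforms), weights=weights)[0]
    platforms[chosen] += 1

total = sum(platforms.values())
for name, users in sorted(platforms.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {users / total:.0%} of users")
```

Run it repeatedly with different seeds and whichever platform pulls ahead early almost always ends up with the overwhelming majority of users, which is the sense in which such markets tend toward winner-take-all outcomes.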

In the application space, advertising revenue is dominated by Google and Facebook, which claim 63 percent of U.S. digital ad revenue in 2017. In the operating system market, Android and Apple iOS are the leaders, with 99-percent market share. The device portion of the market is the least concentrated, although Apple and Samsung have earned most of the profits.

Mobile and fixed network access markets likewise are oligopolies, in virtually every market. Fixed markets in many cases remain virtual monopolies, while mobile markets tend to be oligopolies.



Tuesday, December 12, 2017

Ericsson to Supply Verizon Early 5G Deployments

Ericsson will supply Verizon with 5G radio infrastructure, allowing Verizon to launch commercial “pre-5G” networks in 2018. The expected deployments will include the launch of fixed wireless services in a few U.S. cities.

That is important for several reasons. Although the creation of new apps, services and revenues is a hoped-for development for 5G, that expectation has existed for 3G and 4G as well, where service providers expected new use cases and apps to develop, but were not sure precisely what would happen.

That remains the case for 5G as well, where the key issue is the business model: what incremental new revenue sources will develop?

Verizon, learning in part from history, is following a known deployment path. Just as 4G initially was launched, in its first markets, to support computing devices rather than mobile voice, so Verizon will launch 5G as a platform for fixed wireless internet access, adding the full mobility network functions later. That allows a scaling of investment and matches early investment with revenue generation.

Use of 5G to support fixed wireless access, both in-territory and out of region, is among the first new revenue sources to develop. Deployment of fixed wireless outside Verizon’s existing fixed network region also is a first.

So one other way to characterize Verizon’s early 5G deployments is to note that the platform will enable an out-of-region assault on consumer markets, where Fios has been totally in-region.

More Fixed Network ISP Competition Seems to be Coming

The fixed network internet access duopoly possibly is going to be challenged in new ways over the coming decade. New forms of mobile competition are going to develop, including both direct mobile substitution and mobile-enabled fixed wireless. Also, some new fixed network competitors are likely to enter the markets as well.

At least in principle, more than 100 Colorado communities could see some form of municipal broadband network created, as voters in those communities have approved such moves. That clears a legal hurdle, but each community now must grapple with the business model.

Longmont, Colo. already has built out a portion of its planned gigabit internet access network, aided by that city’s ownership of a municipal power utility, meaning Longmont owns rights of way, distribution facilities, rolling stock and other assets helpful to creating a city-wide internet access network.

In Centennial, Colo., private internet service provider Ting Internet will piggyback on a new government network to be built by the city of Centennial itself.    

In a few cases, state funds could play a role, as subsidies for middle-mile trunking can change the business model. Magellan Advisors, for example, identifies several roles cities can take, including streamlining the processes private ISPs need to build or upgrade infrastructure; providing access to city-owned dark fiber; selling city-owned wholesale capacity services; or actually provisioning municipal services for businesses or consumers.

Risk and capital investment grow as cities assume more active roles, up to that of actual service provider. One point worth making is that adoption rates vary based on the number of services offered, and on the way adoption is measured.

These days, in competitive consumer markets, penetration is measured in terms of revenue-generating units, not “locations” or “households.” Each unit sold (voice, video or internet access) is counted against the base of locations. So a single location buying three services counts as much as three other homes buying just one service each.

So it is that municipal retail providers in cities such as Morristown, Tenn.; Chattanooga, Tenn.; Bristol, Va.; and Cedar Falls, Iowa seem to have far higher penetration rates than Longmont, Colo.

That is partly because the Longmont network still is being built out, but also reflects the fact that Longmont’s network sells only internet access and voice, but not video entertainment services. The other networks have been in operation and marketing for three times as many years as Longmont.


Customer “penetration” by household therefore is different from penetration measured as a function of units sold. The distinction matters because determining the magnitude of stranded assets hinges on how many locations passed actually generate revenue.

Assume that, on average, a typical household buys 66 percent of the total suite of services (two of three triple-play services or three of five services, for example).

The difference is significant. Measuring “penetration” by units sold, penetration appears to be as high as 76 percent to 87 percent. Measured as a function of homes generating revenue, penetration could be as low as nine percent, or as high as 44 percent, with a “typical” value somewhere between 20 percent and 25 percent of homes passed (the short sketch after the table below works through the arithmetic).

Penetration: Units Sold or Homes Buying Service?

                                   Morristown  Chattanooga   Bristol  Cedar Falls  Longmont
Homes passed                           14,500      140,000    16,800       15,000     4,000
Subscribers (units sold)                5,600       70,000    12,700       13,000       500
Penetration by units sold                 39%          50%       76%          87%       13%
Services offered                            3            3         5            3         2
Services per buying home (0.66 x)           2            2         3            2         1
Homes served (estimated)                2,828       35,354     3,848        6,566       379
Penetration by homes served               20%          25%       23%          44%        9%
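A short sketch, using the figures from the table and the 66-percent take-rate assumption stated above, shows how the same subscriber counts yield two very different “penetration” numbers:

```python
# Two ways to measure penetration, using the table's figures. The 0.66
# factor is the stated assumption that a typical subscribing household
# buys about two-thirds of the services offered.
AVG_TAKE_RATE = 0.66

systems = {
    # name: (homes passed, units sold, services offered)
    "Morristown":  (14_500,   5_600, 3),
    "Chattanooga": (140_000, 70_000, 3),
    "Bristol":     (16_800,  12_700, 5),
    "Cedar Falls": (15_000,  13_000, 3),
    "Longmont":    (4_000,      500, 2),
}

for name, (passed, units, services) in systems.items():
    by_units = units / passed                           # RGU-style "penetration"
    homes_served = units / (services * AVG_TAKE_RATE)   # estimated buying households
    by_homes = homes_served / passed                    # share of passings earning revenue
    print(f"{name:12s} by units sold {by_units:4.0%} | by homes served {by_homes:4.0%}")
```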

It might be worth pointing out that all these communities (Morristown, Chattanooga, Bristol, Cedar Falls and Longmont) have municipally-owned utility companies, and might therefore represent a sort of best case for retail operations serving consumers.

That seems consistent with other evidence. In markets where a telco and a cable operator are competent, as is the attacking ISP (municipal or private), market share might take a structure of 40-40-20 or so, possibly 50-30-20 in areas where the telco does not have the ability to invest in faster broadband and the cable operator has the largest share.

Beyond the actual cost of the network, and the business role chosen by the municipality, details of revenue generation (homes that generate revenue as a percentage of total; number of services offered) are fundamental.

Beyond that are the retail service provider’s other operating and marketing costs, overhead, and the need to repay borrowed funds and make interest payments.

One might argue that most other communities, without the advantages ownership of an electric utility provides, will often find the lower risk of a shared public-private approach more appealing.

Also, some ISPs might find the availability of some amount of wholesale or shared infrastructure makes a meaningful difference in a business model.

One might suggest a couple of potential practical implications. Efforts by incumbent ISPs to raise retail prices the way video entertainment prices have grown (far faster than overall inflation) will increase the odds that new competitors enter a market, since higher prices make entry more attractive.

In at least some cases, the new competitors will be firms such as Verizon, which now has announced it will essentially overbuild AT&T and Comcast in Sacramento, Calif.

Though it is not easy, more competitive ISPs are likely to enter more markets, as lower-cost access platforms evolve, helped in some cases by municipal facilities support.

Where that happens, it is conceivable that the incumbents will see a new limitation on their market share, dipping from possibly 50-percent share to a maximum of perhaps 40 percent each, on a long-term basis, assuming the new competitor is not eventually bought out by one of the incumbents.

Cloud Computing Keeps Growing, With or Without AI

source: Synergy Research Group. With or without added artificial intelligence demand, cloud computing will continue to grow, Omdia anal...