Sunday, February 17, 2019

Impact of Local Access Market Structure on Potential Revenue

One of the most enduring problems faced by communications regulators and service providers is the market structure of any access business. Regulators generally want more competitors, while service providers understandably prefer fewer.

This chart, originally produced by the Federal Communications Commission, shows why connectivity providers dislike competition: it directly reduces both gross revenue (market share) and profits (through lower average revenue per user, for example). When there are just two competent competitors in a single market, potential market share is reduced from 100 percent to 50 percent.

Add one more competent provider and maximum theoretical market share (or take rate) drops to about 33 percent. In any capital-intensive business, especially one with high stranded-asset risk, those "small" changes in the number of market participants can dramatically shape sustainability.
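To make the arithmetic concrete, here is a minimal sketch, in Python, under the simplifying assumption that competent rivals split a market evenly:

```python
# A minimal illustration of the arithmetic above: with N equally
# competent competitors splitting a market evenly, the theoretical
# ceiling on any one provider's share is simply 1/N.

def max_theoretical_share(competitors: int) -> float:
    """Maximum theoretical market share with evenly matched rivals."""
    if competitors < 1:
        raise ValueError("need at least one competitor")
    return 1.0 / competitors

for n in (1, 2, 3, 4):
    print(f"{n} competitor(s): max share = {max_theoretical_share(n):.0%}")

# Output:
# 1 competitor(s): max share = 100%
# 2 competitor(s): max share = 50%
# 3 competitor(s): max share = 33%
# 4 competitor(s): max share = 25%
```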

At least traditionally, in fixed services markets where three anchor products are sold (voice, video, data), survival is possible when market share is at least 33 percent. At some point below that, there are serious questions about profitability, and therefore long-term survival.


With fixed network voice and video revenues under pressure, internet access is increasingly emerging as the anchor service for U.S. cable companies, while video entertainment seems poised to make the difference for telcos. Voice is declining in both segments of the access business.

Still, the chief determinant of revenue arguably is the number of competent competitors in the market, as that reduces potential upside the most.

Saturday, February 16, 2019

Laurel and Hardy Western Dance Routine: Hilarious

Comedians Stan Laurel and Oliver Hardy are hilarious in this dance routine. I cannot explain why.

Why Telecom Policy So Often "Gets it Wrong"

Telecom and Internet regulators often create policies that have effects opposite of what they intended, said Dr. George Ford, Phoenix Center chief economist, from the center stage at the recent PTC’19 conference.

“Modern policy making is plagued by notions about competition,” he said. The point, Ford says, is that a small number of suppliers in fixed networks is the result of economic conditions, not a failure of policy. “If only two firms can profitably offer the service, then demanding more is wishful thinking and prone to produce bad policy,” he says.

If you ask the FCC how many more competitors they want, the answer always is “one more,” Ford quipped. But the agency also wants lower prices. “Policymakers often call for aggressive price competition, not realizing that doing so will, in turn, reduce the number of sellers, which they then lament,” he said.


Long term, lower prices and more competitors are mutually exclusive in any capital-intensive business, so the normal expected outcome is just a few competitors, Ford said.

“We may not like that outcome, but it is economically driven and not some failure of public policy,” said Ford. Ironically, “where a small number of firms exist, that outcome may be the result of aggressive competition, rather than an indicator of lack of aggressive competition,” Ford notes.

“In my experience, ‘promoting competition’ is unlikely to have a material effect on actual competition,” he said. “In fact, it often has the opposite effect.”

When he first joined the staff of the Federal Communications Commission out of graduate school, the agency "would have been thrilled" if told policy could help create two competitors in the local telecom business, Ford said. "We'd have packed up and gone home."

The other big systemic problem is that policy debates often focus on desired outcomes rather than construction of policies to achieve those goals. “This confusion over ideas and policies is endemic to nearly all modern policy debates,” said Ford, pointing to network neutrality as a prime example.

An “open Internet” is an idea, not a policy; a desired outcome, not a set of procedures to achieve that goal, he said. The recent net neutrality debate, which confused the two, led to policies that were internally contradictory.

“No blocking” of lawful content is a Title I concept; common carrier regulation is a Title II approach. Title I “no blocking” means a zero price, but all Title II services require some non-zero price. That is incoherent.

At the same time, the network neutrality rules imposed prices (set at zero) but also banned tariffs, which meant there was no procedural way to challenge the specifics of the rules.

So it was not the idea of an “open Internet” that was the problem; it was the actual implementation that was problematic, said Ford.
 
Policies to promote broadband deployment suffer from similar problems. “I’ve seen little evidence of effective policy in this space,” he said, and incentives are the issue.

“Policymakers and advocates want more broadband because they believe it provides positive externalities—that is, social benefits that arise outside of the individual transactions,” Ford noted. “Put simply, if I use broadband, other people benefit.”

“But externalities, by definition, are not captured by sellers or buyers, so they affect neither the supply of nor the demand for broadband services,” he said.

Either firms must be incentivized, perhaps through subsidies, to increase supply or reduce prices, or buyers may be subsidized directly to increase effective demand, said Ford.

Most proposals, such as mandating zero pricing, prohibiting zero rating or outlawing different quality-of-service levels, are counterproductive, Ford said.

One example, he noted, is the effect of U.S. net neutrality regulation on capital spending. “Most of the analysis was silly,” he argued. Simply comparing capital spending from one year to the next several years after the new rules “is meaningless.”

“The question is what would capital spending have been absent the regulation, which requires the construction of what we call a counterfactual,” Ford says.

Ford is about the only human being in the communications industry to take seriously the notion of the counterfactual, a concept similar to opportunity cost.

As applied to communications policy, the problem is that claims are made about policies producing an outcome, without the ability to show what might have happened if a different policy choice had been made.

In an investment context, opportunity cost represents the benefits an investor might have reaped by making a different choice.

One clear example is the debate over whether infrastructure investment grew or declined because of network neutrality rules. A counterfactual analysis is always necessary when looking at policy outcomes, in other words.

It is possible that infrastructure investment might have been higher in the absence of net neutrality rules, for example. In principle, such investment could also have been lower, in the absence of the rules.
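As an illustration only, here is a toy sketch with made-up capital spending figures and a deliberately naive linear pre-trend as the counterfactual; real analyses use far more careful econometric methods. The point is simply that spending can rise every year after a rule and still fall short of its counterfactual path:

```python
# A toy illustration of a counterfactual, using made-up capex figures
# and a naive linear pre-trend projection. This only shows why
# "spending rose after the rules" proves nothing by itself.

pre_rule_capex = [70, 73, 76, 79]   # hypothetical $B, years before the rules
post_rule_capex = [80, 81, 82]      # hypothetical $B, years after the rules

# Estimate the pre-period trend (average year-over-year change).
trend = (pre_rule_capex[-1] - pre_rule_capex[0]) / (len(pre_rule_capex) - 1)

# Project that trend forward: the "counterfactual" spending path.
counterfactual = [pre_rule_capex[-1] + trend * (i + 1)
                  for i in range(len(post_rule_capex))]

for year, (actual, cf) in enumerate(zip(post_rule_capex, counterfactual), 1):
    print(f"year +{year}: actual={actual}, counterfactual={cf:.0f}, "
          f"gap={actual - cf:+.0f}")

# Actual spending rose every year, yet it trails the counterfactual:
# year +1: actual=80, counterfactual=82, gap=-2
# year +2: actual=81, counterfactual=85, gap=-4
# year +3: actual=82, counterfactual=88, gap=-6
```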

The same principle applies for analysis of fair use rules, or virtually any other proposed public policy.

The net neutrality argument originated in notions about payola, the practice of radio stations playing songs in exchange for monetary payments: they would play what they got paid to play. That is how people tend to think about the issue, Ford says.

In grocery stores, similarly, product suppliers often pay for placement on shelves. But those are examples of scarcity: a station can play only one song at a time, and a shelf holds only so much product. The internet does not have that kind of scarcity component, and artificial scarcity does not work; networks do not disable themselves at a standard level just so they can upsell a faster product.

Policymakers sometimes also focus on ways to encourage the fastest-possible internet access speeds. But networks are in the business of selling a product, Ford said. “Everybody does not want a 10-gig or 1-gig service.”

People buy only what they believe has value, so over-investment, supporting business models that are not sustainable, can happen. A lot of people argue everybody should have a gig, but where it is offered, most do not buy it, says Ford.

“So 75 percent of people in the past bought a 6-Mbps connection when they could have bought much faster service,” he noted.

Or consider the matter of unbundled sales of phones and service contracts. In the past, consumers would sign two-year contracts and in return get deeply discounted phones. Regulators thought that unfair and anti-competitive.

So was the end of mobile phone contracts pro-consumer? It is not anti-competitive to allow people to buy a phone for $200, instead of $800, in exchange for signing a two-year contract, Ford says.

You can outlaw such tying of service to phone subsidies, but the policy solves nothing. It does not improve consumer welfare.

Friday, February 15, 2019

66% of North American IT Pros Say They Use SD-WAN, or Will, by End of 2020

Software-defined WAN (SD-WAN) will be in use at 66 percent of surveyed North American companies by the end of 2020, a survey by IHS Markit finds.

Companies deploying SD-WAN use over 50 percent more bandwidth than those that have not deployed it, the survey suggests. Their bandwidth needs are also growing at twice the rate of companies using traditional WANs.

The first wave of SD-WAN deployments focused on cost reduction, and this is still clearly the case: survey respondents indicate their annual cost per megabit per second is 30 percent lower, and declining at a faster rate than in traditional WAN deployments.
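Taking the two survey figures at face value, a rough back-of-envelope sketch (with an arbitrary, hypothetical baseline) suggests why lower unit costs do not necessarily mean lower total spend:

```python
# Back-of-envelope combination of the two survey figures above
# (50% more bandwidth at a 30% lower unit cost), using an arbitrary
# baseline purely for illustration.

baseline_mbps = 100          # hypothetical traditional-WAN bandwidth
baseline_cost_per_mbps = 10  # hypothetical $ per Mbps per year

sdwan_mbps = baseline_mbps * 1.5                    # 50% more bandwidth
sdwan_cost_per_mbps = baseline_cost_per_mbps * 0.7  # 30% lower unit cost

traditional_spend = baseline_mbps * baseline_cost_per_mbps
sdwan_spend = sdwan_mbps * sdwan_cost_per_mbps

print(f"traditional WAN spend: ${traditional_spend:,.0f}/yr")
print(f"SD-WAN spend:          ${sdwan_spend:,.0f}/yr")
# traditional WAN spend: $1,000/yr
# SD-WAN spend:          $1,050/yr
# The lower unit cost is largely reinvested in more bandwidth,
# so total spend stays roughly flat.
```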

SD-WAN solutions not only address WAN transport costs, but also help enterprises create a fabric for multi-cloud.



51% of North American Firms Surveyed Will Use Hybrid Cloud in 2019

In 2019, 51 percent of North American network professionals surveyed by IHS Markit say they will use hybrid cloud. Some 37 percent will adopt multi-cloud for application delivery.

Enterprise IT architectures and consumption models are changing, from servers and applications placed at individual enterprise sites, to a hybrid-cloud model where centralized infrastructure-as-a-service (IaaS) complements highly utilized servers in enterprise-operated data centers, IHS Markit says.

Respondents also suggested that hybrid cloud is a stepping stone to multi-cloud.

Over time, certain functions requiring low latency will migrate back to the enterprise edge, residing on universal customer premises equipment (uCPE) and other shared compute platforms. This development is still in its infancy, though.

Performance is a top concern, and enterprises are not only adding more WAN capacity and redundancy, but also adopting SD-WAN, IHS Markit says. The primary motivation for deploying SD-WAN is to improve application performance and simplify WAN management.

Bandwidth consumption continues to rise: companies expect to increase provisioned wide-area network (WAN) bandwidth by more than 30 percent annually across all site types.

Data backup and storage is the leading reason for traffic growth, followed by cloud services.
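A quick compounding check of that 30 percent annual figure (with a hypothetical starting bandwidth) shows how fast such growth accumulates:

```python
# Compounding the 30%-per-year figure: bandwidth roughly doubles
# in under three years and nearly quadruples in five.

bandwidth = 100.0  # hypothetical starting Mbps at a site
for year in range(1, 6):
    bandwidth *= 1.30
    print(f"year {year}: {bandwidth:,.0f} Mbps")

# year 1: 130 Mbps
# year 2: 169 Mbps
# year 3: 220 Mbps
# year 4: 286 Mbps
# year 5: 371 Mbps
```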

More Support for U.S. Rural Broadband

The U.S. Federal Communications Commission has awarded $1.5 billion in support for rural internet access services, intended to upgrade some 713,000 locations, at an average subsidy of about $2,104 per location.

As is always the case, the small percentage of very rural locations have per-line capital investment costs far above those of urban and suburban locations.

The most isolated 250,000 U.S. homes, out of the seven million that in 2010 did not have fixed network internet access (or did not get a minimum 4 Mbps downstream speed), represent about 3.5 percent of those locations but require 57 percent of the capital needed to connect all seven million.

“The highest-gap 250,000 housing units account for $13.4 billion of the total $23.5 billion investment gap,” an FCC paper has estimated.
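The arithmetic behind these figures checks out; here is a quick sketch:

```python
# Checking the arithmetic in the figures above.

# Recent FCC award:
award = 1_500_000_000        # $1.5 billion in support
locations = 713_000
print(f"average subsidy: ${award / locations:,.2f} per location")
# average subsidy: $2,103.79 per location

# Broadband Availability Gap paper:
isolated = 250_000           # highest-gap housing units
total_unserved = 7_000_000
isolated_gap = 13.4e9        # $13.4B of the investment gap
total_gap = 23.5e9           # $23.5B total investment gap
print(f"share of locations: {isolated / total_unserved:.2%}")
print(f"share of capital:   {isolated_gap / total_gap:.0%}")
# share of locations: 3.57%
# share of capital:   57%
```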


“Our analysis indicates that there are seven million housing units (HUs) without access to terrestrial broadband infrastructure capable of meeting the National Broadband Availability Target of 4 Mbps download and 1 Mbps upload,” the FCC said in its Broadband Availability Gap technical paper.

Created in support of the FCC’s National Broadband Plan, the document says simply that “because the total costs of providing broadband service to those seven million HUs exceed the revenues expected from providing service, it is unlikely that private capital will fund infrastructure.”

Cost and density are inversely related, the FCC noted. The density pattern follows a basic Pareto rule: 80 percent of the population lives in perhaps 20 percent of the land area.

source: FCC

Network Slicing and Native Edge Apps

Network slicing, enabled by network virtualization in a broad sense, might be directly related to performance requirements of edge computing and native edge apps.

Edge-native applications, as the name suggests, are applications that require the unique characteristics of edge computing to function satisfactorily, or in some cases to function at all. These applications typically rely on the low latency, locality information or reduced cost of data transport that edge computing provides in comparison to the centralized cloud.

One practical issue is how to decide when edge computing is the preferred solution. In addition to typical scheduling attributes such as processor, memory, operating system and, occasionally, simple affinity/anti-affinity rules, edge workloads might want to specify some or all of the following (a minimal sketch of such a placement request follows the list):
• Geolocation
• Latency
• Bandwidth
• Resilience and/or risk tolerance (i.e., how many 9s of uptime)
• Data sovereignty
• Cost
• Real-time network congestion
• Requirements or preferences for specialized hardware (GPUs, FPGAs)
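As a minimal sketch only, with hypothetical field names, sites and thresholds, a placement request expressing these attributes, plus a naive constraint filter, might look like this (a real scheduler would also weigh soft preferences and real-time congestion):

```python
# Hypothetical data structures for an edge placement request, plus a
# naive filter that keeps only the sites satisfying hard constraints.

from dataclasses import dataclass

@dataclass
class EdgeSite:
    name: str
    latency_ms: float        # measured RTT from the workload's users
    bandwidth_mbps: float
    uptime_nines: int        # e.g., 4 means 99.99%
    region: str              # for data-sovereignty checks
    cost_per_hour: float
    has_gpu: bool

@dataclass
class PlacementRequest:
    max_latency_ms: float
    min_bandwidth_mbps: float
    min_uptime_nines: int
    allowed_regions: set
    max_cost_per_hour: float
    needs_gpu: bool = False

def candidate_sites(req: PlacementRequest, sites: list) -> list:
    """Filter sites against the request's hard constraints."""
    return [s for s in sites
            if s.latency_ms <= req.max_latency_ms
            and s.bandwidth_mbps >= req.min_bandwidth_mbps
            and s.uptime_nines >= req.min_uptime_nines
            and s.region in req.allowed_regions
            and s.cost_per_hour <= req.max_cost_per_hour
            and (s.has_gpu or not req.needs_gpu)]

sites = [
    EdgeSite("metro-pop-1", 8.0, 1000, 4, "EU", 0.40, has_gpu=True),
    EdgeSite("regional-dc", 25.0, 10000, 5, "EU", 0.25, has_gpu=False),
]
req = PlacementRequest(max_latency_ms=10, min_bandwidth_mbps=500,
                       min_uptime_nines=4, allowed_regions={"EU"},
                       max_cost_per_hour=0.50, needs_gpu=True)
print([s.name for s in candidate_sites(req, sites)])  # ['metro-pop-1']
```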

One important core network implication is that many of those attributes (geolocation, latency, bandwidth, resilience, cost and real-time congestion performance) are precisely the issues network slicing addresses.

The State of the Edge report (get it here) notes that edge computing grows out of the legacy of content delivery networks, which tells you much about the potential advantages and use cases: application acceleration and lower latency are where edge computing adds value.

“There are lots of edges, but the edge we care about today is the edge of the last mile network,” the report authors suggest.    

AI Will Improve Productivity, But That is Not the Biggest Possible Change

Many would note that the internet's impact on content media has been profound, boosting social and online media at the expense of linear form...