Thursday, April 16, 2015

Net Neutrality Founded on Bad Science

Analyst Martin Geddes has been arguing for “science-based” telecom policy. Unfortunately, he argues, U.S. network neutrality policy fails in that regard.

Discussing the Federal Communications Commission’s new rules, Geddes spares no words. “Regrettably, they have proceeded to issue rules without having their science in order first,” Geddes says. “As a result they have set themselves up to fail. My fear is that other countries may attempt to copy their approach, at a high cost to the global public.”

Consider the apparently easy issue of “no blocking of lawful packets.” Most people agree lawful packets should not be blocked (some governments disagree). But is it “blocking” when a specific Internet service provider does not interconnect with some other Internet domain?

“How will the FCC differentiate between ‘blocking’ and ‘places our ISP service doesn't happen to route to’?”

Geddes says there also are issues of business practice. “Why can't an ISP offer ‘100 percent guaranteed Netflix-free!’ service at a lower price to users who don't want to carry the cost of their neighbors' online video habit?”

“A basic freedom of (non-)association is being lost here,” Geddes notes. “To this foreigner, ‘no blocking’ is a competition issue for the FTC and antitrust law, not the FCC (and the FTC agrees, by the way).”

Similar problems exist with “no throttling” policies.

“Broadband is a stochastic system whose properties are entirely emergent (and potentially non-deterministic under load),” Geddes says.

How will a regulator distinguish between “throttling” and mere “unfortunate statistical coincidences leading to bad performance”?
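A toy queueing simulation (my illustration, with assumed parameters, not anything Geddes or the FCC has published) shows why the question is hard. With no throttling anywhere in the model, tail delays blow up as a shared link approaches saturation, purely as a statistical consequence of load:

```python
# M/M/1 queue sketch: a single shared link, random arrivals, random packet
# sizes. No policy, no throttling -- delay variation is emergent.
import random

def mm1_delays(arrival_rate, service_rate, n_packets=50_000, seed=1):
    """Simulate an M/M/1 queue; return per-packet sojourn (wait + service) times."""
    rng = random.Random(seed)
    t = 0.0            # arrival clock
    server_free = 0.0  # when the link next becomes idle
    delays = []
    for _ in range(n_packets):
        t += rng.expovariate(arrival_rate)   # Poisson arrivals
        start = max(t, server_free)          # queue behind earlier packets
        server_free = start + rng.expovariate(service_rate)
        delays.append(server_free - t)
    return delays

for load in (0.5, 0.95):  # utilisation = arrival_rate / service_rate
    d = sorted(mm1_delays(arrival_rate=load, service_rate=1.0))
    print(f"load {load:.0%}: median {d[len(d) // 2]:.1f}, "
          f"p99 {d[int(len(d) * 0.99)]:.1f} service-time units")
```

At 50 percent load the 99th-percentile delay is a few service times; at 95 percent load it is an order of magnitude worse, with no “throttler” anywhere in sight.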

And there is the question of fairness. “Why should someone who merely demands more resources be given them?” Geddes asks rhetorically. “Where's the fairness in that!”

What metric would be used to determine whether “throttling” has taken place? User behavior matters, too.

Optimizing a network for “speed,” for example, produces better results for large file downloads than for interactive apps.
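Back-of-the-envelope arithmetic (the numbers are assumptions, chosen only for illustration) makes the trade-off concrete: the deep buffers that keep a link full for downloads add queueing delay that interactive applications feel directly.

```python
# Bufferbloat in one calculation: a buffer sized to keep a 100 Mbit/s link
# busy for bulk transfers adds delay every interactive packet must sit through.
link_rate_bps = 100e6        # 100 Mbit/s access link (assumed)
buffer_bytes = 1_000_000     # 1 MB of queued bulk-transfer data (assumed)

queueing_delay_ms = buffer_bytes * 8 / link_rate_bps * 1000
print(f"Added queueing delay: {queueing_delay_ms:.0f} ms")  # -> 80 ms
```

An extra 80 ms is invisible in a file download's completion time, but it pushes a voice or gaming session well past interactive comfort.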

What are the proposed metrics for performance, and what are the methods for measuring traffic management? What is the reference performance level for the service? Without these, “no throttling” is meaningless and unenforceable, Geddes notes.
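One way such a reference level could be made concrete (a sketch of the general idea, not a proposal from Geddes or the FCC) is to define the service by latency percentiles over a measurement window, then test “throttling” claims against that baseline:

```python
# Nearest-rank percentiles over hypothetical latency probes (values invented).
def percentile(samples, p):
    """Return the nearest-rank p-th percentile of a list of latency samples."""
    s = sorted(samples)
    k = max(0, min(len(s) - 1, round(p / 100 * len(s)) - 1))
    return s[k]

latencies_ms = [12, 14, 13, 15, 120, 14, 13, 16, 14, 300]  # probe results (ms)
print("median:", percentile(latencies_ms, 50), "ms")
print("p99:   ", percentile(latencies_ms, 99), "ms")
```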

The real issue is whether service performance is good enough to deliver the quality of experience outcomes the user seeks. And that is a problem: by definition, “best effort” is just that, best effort.

The other problem is that such an approach necessarily prevents the creation of classes of service that users would benefit from, and might well desire to buy.
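Such classes already have hooks in today's plumbing: an application can ask for differentiated treatment by marking its packets with a DSCP code point. A minimal sketch using Python's standard socket API (it requires a platform that supports IP_TOS; the address and port are invented):

```python
import socket

DSCP_EF = 46  # "expedited forwarding" code point (RFC 3246), often used for voice

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# The IP TOS byte carries the DSCP value in its upper six bits.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP_EF << 2)
sock.sendto(b"voice frame", ("192.0.2.10", 5004))  # example destination
sock.close()
```

Whether the network honors the marking is, of course, exactly the commercial question at issue.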

In other words, traffic scheduling (packet “prioritization”) is a good thing, even if it violates the rules.
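Mechanically, such scheduling is unexotic. A toy strict-priority queue (my sketch, with invented traffic classes and packet names) that lets latency-sensitive packets jump ahead of bulk transfers looks like this:

```python
import heapq
import itertools

class PriorityScheduler:
    """Strict priority: lower traffic_class dequeues first, FIFO within a class."""
    def __init__(self):
        self._heap = []
        self._seq = itertools.count()  # tie-breaker preserving arrival order

    def enqueue(self, packet, traffic_class):
        heapq.heappush(self._heap, (traffic_class, next(self._seq), packet))

    def dequeue(self):
        return heapq.heappop(self._heap)[2] if self._heap else None

sched = PriorityScheduler()
sched.enqueue("bulk-1", traffic_class=1)
sched.enqueue("voip-1", traffic_class=0)
sched.enqueue("bulk-2", traffic_class=1)
print([sched.dequeue() for _ in range(3)])  # ['voip-1', 'bulk-1', 'bulk-2']
```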

Net neutrality, he says, “undermines rational market pricing for quality.”

We already have “paid priority,” he notes. “All CDNs offer de facto priority by placing content closer to the user, so it can out-compete the control loops of content further away. Paid peering is perfectly normal.”
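The “de facto priority” of a nearby CDN node follows from basic TCP arithmetic: a sender's throughput is bounded by roughly its window divided by the round-trip time, so cutting the RTT multiplies the achievable rate. A quick illustration, with assumed numbers:

```python
# Same TCP window, different distances: the closer server wins automatically.
window_bytes = 256 * 1024  # 256 KB window (assumed)

for label, rtt_s in [("CDN edge, 10 ms RTT", 0.010),
                     ("distant origin, 100 ms RTT", 0.100)]:
    mbps = window_bytes * 8 / rtt_s / 1e6
    print(f"{label}: <= {mbps:.0f} Mbit/s")
# -> roughly 210 Mbit/s vs 21 Mbit/s for the same window
```

No one calls that “paid prioritization,” yet it buys a tenfold performance edge.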

“If you tried to make spectrum policy rules that broke the laws of physics, you'd be ignored by informed people,” Geddes says. “Broadband is similarly constrained by ‘laws of mathematics.’ Why don't we try making rules that fit within those, for a change?”

“We need a new regulatory approach, grounded in the science of network performance, that directly constrains market power,” Geddes argues.
