Showing posts with label network neutrality.

Saturday, October 1, 2011

Net Neutrality Wasn't a Story in 2011

"Network neutrality," believe it or not, was among the marketing trends that consultant Gini Dietrich thought in October 2010 would highlight 2011. She was dead on about some things, correct about the direction of others, but some would argue missed the mark about one or two, including network neutrality.

Among the clearly correct calls, she argued that "content, content, content" would be key. "All companies should become media companies," she said.

Some of the other trends weren't so pronounced during the year, though partly correct in terms of direction. Dietrich mentioned heightened Federal Trade Commission scrutiny of the "blogging" world. That was correct to an extent, but the scrutiny has not gone a step further, and some would argue it will not.

Dietrich argued that "next" in 2011 would be rules "around ethics and how we approach traditional journalists and bloggers." That didn't happen and, some would argue, will not.

Some of us would argue that Dietrich flatly misunderstood the "network neutrality" issue, both in timing and implications. She argued that "being able to write a blog post at 6:00 in the morning and post it two hours later and letting it reach audiences around the world for free will be gone." That's a common argument by supporters of strong versions of network neutrality, but it is mistaken. The Federal Communications Commission has operated for years on some fundamental "Internet Freedoms" principles that enshrine consumer access to all lawful applications.

"Consumers and innovators have a right to send and receive lawful traffic--to go where they want, say what they want, experiment with ideas--commercial and social, and use the devices of their choice," the Federal Communications Commission clearly has said. "The rules thus prohibit the blocking of lawful
content, apps, services, and the connection of devices to the network." In other words, non-blocking of lawful content already is policy, and already has been enforced by the Commission, on the couple of occasions when it even became an issue.

But there is another important issue that people become confused about, namely the need to manage traffic on any network to ensure the best possible performance for all users, especially when networks get congested.

"The rules recognize that broadband providers need meaningful flexibility to manage their networks to deal with congestion, security, and other issues," the FCC continues to believe. Some people have experienced "all circuits are busy now, please try your call again later" messages when trying to make a landline telephone call.

More have simply found that, from time to time, they are unable to make a mobile call. Those are examples of lawful network management. When the network gets overwhelmed with admission requests, it simply blocks some attempts until the congestion is alleviated. That is neither illegal nor illogical.

Much of the confusion about network neutrality flows from not distinguishing between the "unimpeded access to lawful apps" and the "need to manage a network for congestion." Some network neutrality supporters argue that an ISP should be forbidden to manage its traffic demand in any way to optimize network performance. That can have several unpleasant implications for end users.

Congestion management on the Internet, or any Internet Protocol network (and all networks are becoming IP networks), typically involves some sort of blocking or delayed response at times of congestion. The networks "slow down" and some connection requests simply "time out." If ISPs cannot establish any priorities for traffic, then everything randomly slows down.

That isn't a major problem for email or web surfing, which will simply be "slower." But random lags in packet arrival are highly disruptive for video and voice, and both media types will be carried on all-IP networks, everywhere, in the near future. In principle, the best end user experience would be provided if, under congestion, priority is given to voice and video bits, while email, web surfing and software update packets are delayed.
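
To make the prioritization idea concrete, here is a minimal sketch, in Python, of the kind of strict-priority scheduling described above. The traffic classes, priority values and scheduler are invented for illustration; real networks use packet markings (such as DSCP) and far more elaborate queuing, and this is not any particular ISP's mechanism.

```python
import heapq
from itertools import count

# Illustrative traffic classes; the class names and priority values are
# invented for this sketch (lower number = dequeued first).
PRIORITY = {"voice": 0, "video": 1, "web": 2, "email": 3, "update": 3}

class PriorityScheduler:
    """Strict-priority queue: drains voice and video before bulk traffic."""

    def __init__(self):
        self._heap = []
        self._seq = count()  # preserves arrival (FIFO) order within a class

    def enqueue(self, kind):
        heapq.heappush(self._heap, (PRIORITY[kind], next(self._seq), kind))

    def dequeue(self):
        return heapq.heappop(self._heap)[2] if self._heap else None

# Under congestion everything arrives at once, but real-time traffic
# leaves the queue first instead of timing out at random.
sched = PriorityScheduler()
for kind in ["email", "voice", "update", "video", "web", "voice"]:
    sched.enqueue(kind)

print([sched.dequeue() for _ in range(6)])
# -> ['voice', 'voice', 'video', 'web', 'email', 'update']
```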

Strict network neutrality rules would prevent that practice. There are some business practices issues of concern, such as ISPs favoring their own content over content supplied by rivals.

But that already happens, all over the Internet, as content delivery networks such as those operated by Akamai optimize content for faster delivery. It is not "equal treatment." That is the whole point. Akamai and other content delivery networks charge content providers money to expedite delivery of their packets. There are potential legitimate restraint of trade issues posed by packet prioritization. But content providers do this today, all the time.

In the future, when all traffic is carried over IP networks, there will be clear end user issues. Are you willing to pay for voice or video services that randomly fall apart, or do you expect some reasonable quality standards? Without packet prioritization, it will not be possible to ensure that the voice or video services a customer has paid for can actually be delivered with minimum quality levels. Calls will become garbled and then suddenly disconnect and video will freeze.

The point is that end user access to all lawful applications is not the issue. Whether quality measures can be taken, especially for latency-sensitive applications such as voice, online gaming, video or video conferencing and many transaction processes related to shopping and banking, is the issue.

The other issue is that all IP networks are shared. So what should an ISP do about the fact that a very small percentage of heavy users can disrupt quality of service for the 97 percent of other users who have to share a network? Right now, the way ISPs deal with the issue is to set a quota for total usage, and then throttle the few heavy users when they exceed the quota of usage. It's a crude way of managing heavy usage.

Some would argue the better approach is to allow users to decide whether they'd rather pay some premium so that, under heavy congestion, they'd get priority access, much as content and application providers now can pay Akamai to expedite packet delivery.
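
The contrast between the two approaches can be sketched in a few lines of Python. Everything here is hypothetical: the quota, the speeds and the tier names are invented for illustration, not drawn from any provider's actual policy.

```python
# Two congestion policies, sketched with invented numbers: a blunt monthly
# quota with throttling, versus a user-chosen priority tier that only
# matters when the network is actually congested.

MONTHLY_QUOTA_GB = 250   # hypothetical cap
THROTTLED_MBPS = 1.0     # hypothetical throttled speed
NORMAL_MBPS = 25.0       # hypothetical provisioned speed

def quota_policy(usage_gb):
    """Crude approach: throttle anyone who exceeds the monthly quota."""
    return THROTTLED_MBPS if usage_gb > MONTHLY_QUOTA_GB else NORMAL_MBPS

def priority_policy(tier, network_congested):
    """Alternative: users pick a tier; it only matters under congestion."""
    if not network_congested:
        return "full speed"  # plenty of capacity for everyone off-peak
    return "priority access" if tier == "premium" else "best effort"

print(quota_policy(usage_gb=300))                           # -> 1.0 (throttled)
print(priority_policy("premium", network_congested=True))   # -> priority access
print(priority_policy("standard", network_congested=True))  # -> best effort
```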

The confusion about network neutrality is widespread, and for good reasons. But the issue is not a matter of content access or freedom of speech. All networks have to be managed. All networks can become congested. The issue is how to preserve end user experience when that happens. Some network neutrality proponents say "do nothing." Few network engineers or architects would agree that is a wise choice.

Friday, October 22, 2010

Mobile Video Hiccups One Second Out of Every Six

Despite the protestations of network neutrality advocates, there's a very good reason why real-time services ranging from voice to conferencing to entertainment video actually require some form of optimization and even packet prioritization.

Recent data collected in mid-year 2010 by Bytemobile from networks operated by the likes of AT&T, China Mobile, China Telecom, KDDI, KPN, O2, Orange, Orascom, Sprint Nextel, T-Mobile, Telecom Italia Mobile, Telefónica, TeliaSonera and Vodafone show that every minute of mobile video consumed by end users includes about 10 seconds of stalling.

That's about 17 percent of every minute of video, or one second of stalling out of every six. That is hugely disruptive to the viewing experience and will not be acceptable once users become accustomed to using video content. It will be completely unacceptable once users start paying for video content services.
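
The arithmetic is easy to check; the only input is Bytemobile's reported figure of roughly 10 seconds of stalling per minute of viewing, and the rest follows from it:

```python
# Bytemobile's reported figure: about 10 seconds of stalling per minute of video.
stall_seconds_per_minute = 10

stall_fraction = stall_seconds_per_minute / 60                      # 0.1666... ~ 17 percent
playback_seconds_per_stall_second = 60 / stall_seconds_per_minute   # 6

print(f"{stall_fraction:.0%} of viewing time spent stalled")                            # -> 17%
print(f"one second of stalling out of every {playback_seconds_per_stall_second:.0f}")   # -> 6
```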

The problems currently are worst at about 10 p.m. local time, and lightest at about 5:30 a.m., says Bytemobile. Expecting such congestion and disruption, most mobile users opt to watch videos at lower-quality settings to improve their media experience. That probably isn't what content owners or their business partners prefer, to say nothing of mobile service providers who will inevitably be tarnished by that sort of performance.

Nor does "more bandwidth" solve such problems. Bytemobile data shows that stalling occurs on even the fastest of networks and a quality user experience requires optimization of video content. In other words, packet prioritization and, or, other measures to keep latency and jitter performance optimal.

That said, as network bandwidth decreases, video stalling dramatically increases.

Consumption of high-definition video is nearly non-existent on wireless networks, at 0.07 percent of video-specific traffic volume.

Moreover, video traffic directly impacts bandwidth availability on wireless networks all over the world.

read more here

Friday, September 3, 2010

FCC Wants More Input on Wireless, Managed Services

The Federal Communications Commission's Wireline and Wireless Bureaus are seeking further public comment on issues related to specialized or "managed" services and mobile broadband, at least partially, and perhaps largely, because Verizon and Google have reached their own agreement about how to implement network neutrality on Verizon's fixed networks, but have agreed not to apply the rules to wireless access.

The FCC wants further input on the exemption of new managed services from the "best effort only" Internet access agreement. In essence, Google and Verizon have agreed to what network neutrality advocates have asked for on the fixed networks. That virtually ends discussion about Internet access and network neutrality.

But the mobile network now emerges as the area where policy advocates will focus their energy, and many will not be happy with the exemption for managed services, though the policy foundation for prohibiting such services seems quite weak. Lots of services, such as private network services, cable TV or telco TV, routinely use the same physical facilities, but represent services different from "Internet access" and in fact are regulated under entirely different rules.

link

Thursday, August 12, 2010

Google Defends its Verizon Net Neutrality Deal

Richard Whitt, Google senior policy director, defends Google's agreement with Verizon, and implicitly its belief that the compromise makes sense as a wider framework, for any number of reasons, not the least of which is that it moves the ecosystem forward at a time of apparently "intractable" obstacles.

"At this time there are no enforceable protections, at the Federal Communications Commission or anywhere else, against even the worst forms of carrier discrimination against Internet traffic," he says.

As is true with all grand compromises, the Verizon-Google deal represents "the best policy solution we could devise together," says Whitt. "We’re not saying this solution is perfect, but we believe that a proposal that locks in key enforceable protections for consumers is preferable to no protection at all."

Whitt likely is right about the "best achievable policy solution" angle. Given the serious business repercussions, no durable solution is possible that fails to give key participants key victories.

If adopted, this proposal would for the first time give the FCC the ability to preserve the open Internet through enforceable rules on broadband providers. At the same time, the FCC would be prohibited from imposing regulations on the Internet itself, says Whitt.

Though ISPs might prefer another outcome, including unrestricted ability to create new tiers of service that optimize end user experience for real-time services, the Verizon-Google compromise does not preclude such Internet-based services. But the decision is left in the hands of application providers, and is a prohibited option for ISPs. That is a big deal.

Nor is Google foreclosing its ability to act later on wireless network neutrality, should it become necessary, and also gains an important regulatory precedent. "In the spirit of compromise, we have agreed to a proposal that allows this market to remain free from regulation for now, while Congress keeps a watchful eye," says Whitt.

The deal also implicitly creates other precedents. The logic implies that network neutrality is needed when markets are not robustly competitive. Some observers would strongly contest the notion that the fixed broadband market is functionally uncompetitive. But the point is that Google gets recognition that, in the future, if wireless networks become less competitive, rules might need to be extended.

Whitt argues that the wireless market is more competitive than the wireline market, given that consumers typically have more than just two providers to choose from. Whitt also concedes that wireless carriers need to manage their networks more actively for several reasons that make wireless technologically more challenging than fixed networks.

"In our proposal, we agreed that the best first step is for wireless providers to be fully transparent with users about how network traffic is managed to avoid congestion, or prioritized for certain applications and content," says Whitt. "Our proposal also asks the Federal government to monitor and report regularly on the state of the wireless broadband market."

The other angle is that the compromise does not prevent Congress from acting to impose new safeguards on wireless broadband providers. Whitt further argues that the new fourth-generation networks already are more open than 3G networks have been.

"So consumers across the country are beginning to experience open Internet wireless platforms, which we hope will be enhanced and encouraged by our transparency proposal," says Whitt.

There is no danger of Internet "cannibalization" because all Internet access services would have to remain "best effort" services. Non-Internet services could be offered. The best examples likely are the voice and video entertainment services consumers already buy, or private network services businesses buy.

"So, for example, broadband providers could offer a special gaming channel, or a more secure banking service, or a home health monitoring capability, so long as such offerings are separate and apart from the public Internet," he says.

If needed, the FCC could step in, should abuses in those separate areas occur.

http://feedproxy.google.com/~r/blogspot/MKuf/~3/icZfrW2iPuc/facts-about-our-network-neutrality.html

read more

What's in the Deal for Google?

The recently announced agreement between Google and Verizon on network neutrality has just a few key provisions: the exemption of wireless services; "best effort" as the only service level Verizon can offer for fixed consumer broadband; Google's ability to offer quality-assured services if it chooses; and Verizon's ability to create new managed services that do feature quality-of-service guarantees (such as today's voice or video services).

Some might wonder why Google would agree to a deal that many network neutrality supporters think is too generous to Verizon. Others might wonder why Verizon would agree to permanently limit its fixed consumer access services to "best effort only."

The short answer is that it is a compromise giving each company something each considers important for its own future revenue growth, while trading away other provisions that might have been nice, but which are less central to the future business.

From Verizon's point of view, the agreement puts pressure on the Federal Communications Commission not to adopt rules that could be worse. The deal also protects Verizon's ability to manage its wireless networks, which always will have less physical bandwidth than its fixed networks, and therefore poses the more-difficult network management challenge.

Though it might like to have had the ability to offer something other than "best effort" levels of service on its fixed network, Verizon does retain the ability to create new managed services that are more like today's voice and entertainment video services, which must have quality of service measures, even if it cannot do so for Internet access services.

Google's wins might be more complicated to assess, on the surface. Some might argue Google gains recognition of a sort of asymmetrical framework: it can create quality-assured services (such as a streaming video service), if it likes, but Verizon is blocked from ever doing so. That importantly means Google can shape its own destiny without worrying that Verizon might create some form of paid quality assurance service that would raise Google's costs of doing business.

In a business sense, that is at the heart of much of the network neutrality position. Since Google's suite of businesses is based on the Internet, not managed or private network services, the whole gamut of things Google might want to do, now and in the future, in terms of services or applications that control latency, remains subject to its exclusive control. Equally important, Verizon cannot do so.

You might argue Google gave up too much in allowing more network management on wireless networks, but those networks always face bandwidth and congestion challenges that might technically require much more management. Google always can take up those issues later, should abuses arise.

The other angle is that if Google decides it wants to create a low-latency service of some sort and deploys it for wired access, it likely will work on mobile as well. Just as users routinely encounter options for "low bandwidth" or "high bandwidth" application interaction, they might be offered a lower-bandwidth mobile experience and a higher-bandwidth fixed-access version. The point is that if Google goes to the effort and expense of creating low-latency applications, the same techniques should allow such apps to work on mobile networks as well.

But it is the "cost of doing business" angles that likely are equally important. As matters now stand, if consumers decide they want to consume lots more bandwidth, then it is Verizon's problem to make the investments, without direct hope of offsetting the investment costs by essentially getting video providers to pay some of the cost (creating video tiers that cost more, for example).

Verizon might hope to create and sell lots of accounts that feature higher bandwidth and cost more, but that's it. Verizon cannot expect to receive business partner revenues for doing so. As most observers think that is an essential requirement for mobile operators and telcos going forward, that means in the broadband access business, Verizon will be restricted to an end-user-only revenue source.

Verizon will have to hope it can create such partner revenue models in other ways. The agreement does not specifically "commoditize" the broadband access business, but it does complicate matters for Verizon to the extent that it bans any effort to create higher-priced "quality assured" access services.

On the other hand, should consumer demand for such services arise, Google retains the ability to create them. At the same time, Google gains assurance that, at least for Verizon users (and it likely hopes the agreement will ultimately apply to all broadband ISPs), it does not have to worry about paying for the upgraded access bandwidth and capabilities the ISPs surely will have to keep providing.

That said, there are always reasons why grand compromises are reached in the communications or other businesses: each of the key parties gets something really important, and avoids something that could be dangerous.

The Google-Verizon compromise is such an agreement. Each gives up something important; and each gains something equally important.

Monday, August 9, 2010

Tiered Access Pricing the Result of Google-Verizon Net Neutrality Deal?

Well, yes, in a manner of speaking, but probably only in the sense that "cable TV" or multichannel video entertainment services are sold.

Google and Verizon Reach Net Neutrality Agreement

In a move intended to break the current logjam over network neutrality discussions, Google and Verizon have reached their own agreement on network neutrality principles, and the compromise offers something for most key stakeholders.


The agreement enshrines "best effort" access as the mandatory form of service consumers are sold. Internet access providers could not apply their own packet priorities to legal traffic. You might assume this precludes creation of new quality-assured applications. The agreement, though, seems to preserve this option, but makes it an option only application providers can supply.


Application providers, on the other hand, could create quality-assured versions of their applications, while ISPs cannot. 


The agreement also exempts wireless networks from any of the rules, and allows ISPs to create new managed services (sort of like cable TV or satellite TV) that are not limited to best effort features. 


The companies agree that there should be a new, enforceable prohibition against discriminatory practices. This means that for the first time, wireline broadband providers would not be able to discriminate against or prioritize lawful Internet content, applications or services in a way that causes harm to users or competition.

In addition to not blocking or degrading Internet content and applications, wireline broadband providers also could not favor particular Internet traffic over other traffic. That is a key provision. It means an ISP cannot favor its own video services over rival video services, for example.

The proposal, however, also would allow broadband providers to offer additional, differentiated online services, in addition to the Internet access and video services (such as Verizon's FIOS TV) offered today. Such "managed services" would not be traditional "Internet access" or "broadband access" services, but rather new and separate services.

The Google-Verizon proposal also includes safeguards to ensure that such new online services must be distinguishable from traditional broadband Internet access services and are not designed to circumvent the rules.

The FCC would also monitor the development of these services to make sure they don’t interfere with the continued development of Internet access services.

Wireless broadband is different from the traditional wireline world, so the proposal refrains from applying new rules to wireless networks and services.

The Government Accountability Office would be required to report to Congress annually on developments in the wireless broadband marketplace, and whether or not current policies are working to protect consumers.

Both firms agree also about enforceable transparency rules, for both wireline and wireless services. Broadband providers would be required to give consumers clear, understandable information about the services they offer and their capabilities.

The two firms also call for new ability by the FCC to enforce these openness policies on a case-by-case basis, using a complaint-driven process. The FCC could move swiftly to stop a practice that violates these safeguards, and it could impose a penalty of up to $2 million on bad actors.

Both firms support reform of the Federal Universal Service Fund, so that it is focused on deploying broadband in areas where it is not now available.

Both companies say they favor turning the Federal Communications Commission's "Internet Freedoms" principles into enforceable rules. Those principles, already in place, stipulate that consumers have access to all legal content on the Internet, and can use what applications, services, and devices they choose.

Both firms hope the agreement can serve as the framework for the FCC's broader network neutrality rules.




Wednesday, June 30, 2010

Consumers, App Providers and Service Providers All Lose from Net Neutrality, Stratecast Argues

Some network neutrality proponents say users will benefit if all forms of packet priority are prohibited. In this view, more innovation and value will be produced if no applications can be given  favored use of the access pipe.

That would include streaming video, voice or any other real-time service.

Analysts at Stratecast do not believe the argument. Their analysis suggests application providers themselves, as well as end users and service providers, will be harmed if such policies are adopted.

In truth, nobody knows what might happen if all ability to prioritize bits were prohibited. The key thing, says Stratecast, is that there would be so much uncertainty that service providers would likely behave as though the downside were quite large in magnitude.

Higher prices for end users, less movement towards higher-speed access and ultimately even application experience degradation would occur, long term. The main reasons are the higher costs to "over-provision" physical networks, lower returns for such investment and less robust development of new services and revenue streams, Stratecast argues.

read the full position paper here

Friday, June 25, 2010

U.K. Regulator Not Initially Convinced Net Neutrality Rules Needed

Ofcom, the U.K. communications regulator, has opened an inquiry into network management and network neutrality issues by suggesting it does not presently see evidence of anti-competitive behavior that requires "ex ante" regulation (rules instituted before any obvious problems arise).

Ofcom's proceeding is noteworthy for its refreshing honesty about the "network neutrality" debate; namely, that the stakes include the ultimate division of revenue and profit in the developing broadband ecosystem.


"As the telecommunications market, content sector and online sector change, points of friction will inevitably arise over who controls customer relationships and the rate of innovation," Ofcom said. "Firms across these sectors are also competing for a share of advertising revenues and consumers’ expenditure at a time when there are concerns about the sustainability of many of the existing business models, not just for traditional telco and content distribution businesses but also a surprisingly large number of online businesses."

"As the value chain is taking shape, network operators and content providers are bargaining over how future rents will be divided and technical measures such as DPI and DRM are being deployed in part to strengthen relative negotiating positions," Ofcom noted.

The situation is especially acute in the mobile space, where bandwidth consumed, and hence network cost, is growing far faster than revenue.


link

Thursday, June 10, 2010

Is There a Need for Economic Regulation of the Internet?

Two necessary preconditions must be satisfied to justify market intervention in the form of economic regulation on the part of the government, says Dennis Weisman, professor of economics at Kansas State University, an editor of the Review of Network Economics and a member of the Free State Foundation's Board of Academic Advisors.

The first asks whether there is a problem; the second asks whether there is a solution. Only if both questions can be answered in the affirmative can such intervention be justified.

He says the case for economic regulation of broadband markets is weak at best. The Federal Communications Commission can point to, at most, two cases where things went awry — Madison River and Comcast.

Madison River was resolved with dispatch; and in the case of Comcast, the supposed cover-up was arguably worse than the alleged crime, Weisman says. "There is no offense in reasonable network management practices designed to prevent congestion and maintain service quality," he adds.

Nor is there evidence that the major incumbent telecommunications carriers or the cable companies were earning supra-normal returns suggestive of market power, which might imply there is a problem waiting to be solved. http://ssrn.com/abstract_id=1525568

The structure of broadband prices is a problem in the economics of two-sided markets, though. The issue is that it is difficult to determine how the price structure should be changed to enhance economic welfare. "In other words, there can be no reasonable assurance that regulatory intervention to alter the price structure would not do more harm than good," says Weisman.

Apple Bans Google Mobile App Ads

Apple has changed the terms of its application developer agreement to block apps from using competitive ad networks operated by rivals such as Google.

That's ironic in light of "network neutrality" debates that some claim involve packet blocking, in the "restraint of trade" sense. Others point out that network management and grooming, as well as ability to create value-added services and features, are more the issue.

What is striking are the many ways packets are being groomed, blocked and shaped by application and device providers. Apple blocking Google ad network ads, or Apple refusing to share analytics with some third-party ad networks, are new examples.

Blunt instruments do not work well in a business and an ecosystem that changes this fast, especially when content pay walls, app stores, even operating systems and browsers can favor or deny access to "Internet bits."

Thursday, May 6, 2010

"Third Way?" Between Title I and Title II? Are you "Sorta Pregnant?"

One might argue that there's nothing wrong with the Federal Communications Commission trying to find some "middle way" or "third way" between common carrier and data services regulation. FCC Chairman Julius Genachowski, for example, notes that "heavy-handed prescriptive regulation can chill investment and innovation, and a do-nothing approach can leave consumers unprotected and competition unpromoted, which itself would ultimately lead to reduced investment and innovation."

Nor are many likely to disagree completely with the notion that "consumers do need basic protection against anticompetitive or otherwise unreasonable conduct by companies providing the broadband access service."

Likewise, most probably would agree that "FCC policies should not include regulating Internet content, constraining reasonable network management practices of broadband providers, or stifling new business models or managed services that are pro-consumer and foster innovation and competition."

But there is likely to be fierce disagreement about the proposal to regulate broadband access service as a common carrier offering governed by Title II regulations, even though the chairman says the FCC would "forbear" (not impose) all of the obligations and rules that cover Title II services.

The difference is that right now, the government "may not" regulate terms and conditions of service. Under the proposed rules, the government only says it "has the right to do so, but voluntarily agrees not to" impose such rules. There is a vast difference between those two approaches.

The first is a clear "thou shalt not" injunction; the new framework is only a "we promise not to" framework. The chairman argues that this new approach "would not give the FCC greater authority than the Commission was understood to have" before the "Comcast v. FCC" case.

A reasonable person would find that hard to believe. Moving any service or application from Title I to Title II has unambiguous meaning. One can agree or disagree with the change. One can hardly call this a "reassertion of the status quo." Between Title I and Title II there is a gulf that would have to be crossed. Never before have any Internet services been considered "common carrier."

A mere promise not to act, after the change has been made, will hardly satisfy those who believe Title I is the better framework. Those who believe Title II is the better way to regulate likely will find the proposal satisfying. That would be reason enough to suggest it is not a "third way." There is in fact no third way, except for the Congress to direct the FCC to regulate broadband access as a Title II service.

The problem is that what the "service" is changes over time, making difficult the task of clearly separating what "access" is from what an enhanced feature is. Nor is it easy to differentiate between a "business" access and a "consumer" access. If business access is covered, is packet shaping still permissible? Are quality of service measures still permissible? Are virtual private networks still allowed?

Should consumer services acquire the richness of business services, or should business services be dumbed down to consumer grade? And who gets to decide? Even if one is willing to accept that an ISP cannot, on its own, provide any quality of service measures, can a customer request them? Can a customer demand them?

These are tough questions and there must be scores more people could ask. The problem is that the Title I and Title II frameworks are binary. We do have alternate models in Titles III and VI, as I recall, though I suppose both of those titles would provide more freedom, not less, and Title II is a move in the direction of less freedom.

read it here

FCC Goes for "Tactical" Nukes in Net Neutrality Fight; ISPs Will React as Though "Strategic" Weapons will Ultimately be Used

Federal Communications Commission officials seem well enough aware that proposed new "network neutrality" rules could lead to a reduction of investment in broadband facilities, which is why, reports the Wall Street Journal, FCC officials are briefing market analysts who cover cable and telco equities before the market opens on Thursday, May 6.

The fear is that even before the rules have been announced, financial analysts will issue downgrades of cable and telco stocks as future revenue streams are jeopardized. Those analyst briefings will happen even before other FCC officials or congressional members are told how the FCC plans to proceed.

Chairman Julius Genachowski apparently plans to circulate a notice of inquiry to the other FCC commissioners next week on his plans to reclassify broadband Internet access, provided by cable or telco providers, as common carrier services under Title II of the Communications Act.

That would put cable companies under common carrier regulation for the first time, something cable industry executives always have opposed, and will fight. Telco executives are hardly any more likely to support the changes.

The problem with the FCC's approach, which is to apply "some" Title II rules, but not all, is that there are no protections from future action that would simply apply all common carrier rules. The FCC wants to believe it can leave ISPs "sort of pregnant." They either are, or aren't, and can be expected to fight as though the outcomes were binary.

As often is the case, a natural desire for a "third way" is not possible. Title I or Title II is the issue. Forbearance rules or not, one or the other is going to apply. Get ready for war.

Wednesday, May 5, 2010

FCC Will Try to Apply Some Title II Rules to Broadband Access

Federal Communications Commission Chairman Julius Genachowski reportedly has decided to attempt Title II regulation of broadband access services, according to a report by the Wall Street Journal, despite some other reports that he was leaning against such rules.

We should know more on Thursday, May 6. Apparently the FCC will try to thread a camel through a needle, regulating only some parts of  broadband access using Title II rules, without applying every Title II provision that applies to voice services.

It does not appear the chairman will propose new wholesale access rules, but it isn't clear whether strict rules about packet non-discrimination will be sought, theoretically barring quality-of-service features from being offered. That seems unlikely, but much will depend on whether industry participants think the actual new rules open the way for further rules, down the road, that would be highly unacceptable, even if the new immediate rules are not viewed as burdensome. We shall see.

Monday, April 19, 2010

Revenue Sharing is the Heart of the Net Neutrality Matter

“The problem with mobile broadband so far has been most of the revenue it has generated has gone to over-the-top Internet content services, not to the operators,” says Pat McCarthy, Telcordia VP. “That’s what they are trying to change.”

And that is the heart of the matter as far as wrangling over network neutrality. Over time, consumers will have many options for buying customized wireless broadband plans, McCarthy says. And nearly everyone believes that will mean very-heavy users will have to pay more, in some way.

The key notion is that retail price will be related, in some way, to the cost of the services consumed. That doesn't necessarily mean billing by the byte, but probably a range of options for basic access that are similar to wireless voice plans, where users buy buckets of minutes or text messages at various prices, or unlimited use for higher prices.
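
A small, purely illustrative sketch of that kind of bucket pricing follows; every bucket size, price and overage rate in it is an invented number, used only to show the shape of such plans.

```python
# Illustrative bucket-style data pricing, analogous to voice-minute buckets.
# All bucket sizes, prices and overage rates are invented for this sketch.

PLANS = {
    "small":     {"bucket_gb": 2,            "price": 15, "overage_per_gb": 10},
    "medium":    {"bucket_gb": 5,            "price": 25, "overage_per_gb": 10},
    "unlimited": {"bucket_gb": float("inf"), "price": 60, "overage_per_gb": 0},
}

def monthly_bill(plan_name, usage_gb):
    """Base price for the bucket, plus overage for usage beyond it."""
    plan = PLANS[plan_name]
    overage_gb = max(0.0, usage_gb - plan["bucket_gb"])
    return plan["price"] + overage_gb * plan["overage_per_gb"]

print(monthly_bill("small", 1.5))     # -> 15.0 (within the bucket)
print(monthly_bill("small", 4.0))     # -> 35.0 (2 GB of overage)
print(monthly_bill("unlimited", 40))  # -> 60.0 (flat rate)
```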

Some have suggested pricing based on the value of services and applications, and most providers tend to believe there should be the ability to buy optional services that maintain quality of service when the network is congested.

Standard users might get messages during peak congestion periods--perhaps rush hour or at a major sports or concert venue--that the network is congested, with their services shaped in some way. Premium users might get priority access and all users might be offered a temporary "power boost," for an additional fee, during the period of congestion.

link

Saturday, April 17, 2010

Title II A Potentially "Dangerous" Turn, Says Dvorak

"There is a proposal afoot developed by Sen. John Kerry that could undermine free speech on the Internet," says technology analyst and commentator John Dvorak. The proposal is to regulate broadband access as a regulated common carrier service, like telephone service.

You can ask yourself whether you think such a change would lead to more, or less, innovation; more or less investment; more or less choice. You can ask whether a 1934 method of regulating a stodgy monopoly service is appropriate in the 21st century for Internet services.

You can ask yourself whether having more choices, rather than less, is the likely outcome of such a move. You may ask yourself whether application priorities routinely available on private networks used by businesses, or application acceleration, as practiced by hundreds to thousands of content providers already, is a good thing or a bad thing to be forbidden.

So here comes the great idea from John Kerry: reclassifying broadband services as "telecommunications services" rather than "information services."

"This is the worst possible outcome as the FCC will eventually regulate the Internet like it does all the entities under its jurisdiction," says Dvorak.

"Why would we want the FCC to regulate the Internet?" Dvorak asks. "It's a terrible idea."

By redefining information services as telecommunications services, the Internet as we know it will be neutered as the FCC begins to crack down on foul language, porn, and whatever else it sees fit to proscribe, he argues.

"No matter the net neutrality outcome, it has nothing to do with increasing broadband penetration and speeds," he says. "It's a total scam invented to censor the Internet once and for all.

"I'm surprised people, no matter how idealistic, cannot see through it," he says.

source

Internet Oversight Needed, Just Not Title II

"Should the FCC have sway over the Internet?" a Washington Post Co. editorial asks.

For the past eight years, the FCC has rightly taken a light regulatory approach to the Internet, though it believed it had authority to do more. Now that the agency has lost in court, some advocates in the technology industries are urging the agency to invoke a different section of law and subject ISPs to more aggressive regulation, until now reserved for telephone companies and other "common carriers."

Such a move could allow the FCC to dictate, among other things, rates that ISPs charge consumers. This level of interference would require the FCC to engage in a legal sleight of hand that would amount to a naked power grab. It is also unnecessary: There have been very few instances where ISPs have been accused of wrongdoing -- namely, unfair manipulation of online traffic -- and those rare instances have been cleared up voluntarily once consumers pressed the companies. FCC interference could damage innovation in what has been a vibrant and rapidly evolving marketplace.

Some oversight of ISPs would serve the public interest as long as it recognizes the interests of companies to run businesses in which they have invested billions of dollars. Transparency and predictability are essential to encourage established companies and start-ups to continue to invest in technologies dependent on the Internet. ISPs, for example, should be required to disclose information about how they manage their networks to ensure that these decisions are legitimate and not meant to interfere with applications that compete with the ISPs' offerings.

Congress should step in to strike the appropriate balance. Enacting laws would take some time, but the process would allow for robust debate. In the meantime, any questionable steps by ISPs will be flagged by unhappy consumers or Internet watchdog groups. If ISPs change course and begin to threaten the openness of the online world, Congress could and probably would redouble its efforts.

source

Net Neutrality: Time for Evidence-Based Policy

By Thomas W. Hazlett, published in the Financial Times

A federal appeals court has bopped the Federal Communications Commission yet again. In Comcast v. FCC – the “network neutrality” case – the agency was found to be making up the law as it went. In sanctioning the cable operator for broadband network management it found dubious, the Bush-era FCC exceeded its charter. Cable modem services and digital subscriber line (DSL) connections provided by phone carriers compete – officially – as unregulated “information services.”

Congress could now mandate broadband regulation. This could have happened four years ago, when the Democrats took majority control and announced that they would impose network sharing mandates. That has not happened, and – with unemployment running at above 9 per cent – is not likely now. Net neutrality is seen, bluntly, as a jobs killer. That’s one take Congress has actually gotten right.

Alternatively, the FCC could flip its own rules, going back to a DSL regime discarded in 2005. But it would have to go further, extending "open access" to cable broadband, something it has always rejected. In 1999, when AOL and phone carrier GTE lobbied hard for cable regulation, Clinton-appointed regulators stood firm. "We don't have a monopoly, we don't have a duopoly," stated FCC Chair Bill Kennard, "we have a no-opoly." Forget regulation, encourage investment, get amazing new stuff.

But “open access” rules for DSL remained. These permitted phone company rivals to lease capacity at rates determined by regulators. It was not until February 2003 that the major requirements were ended. In August 2005, remaining rules were scrapped. A test was created. Deregulation would further investment and deployment, or quash competition and slow broadband growth. FCC member Michael Copps predicted the latter. He challenged the Commission to see if the policy would “yield the results” anticipated. “I’ll be keeping tabs,” he warned.

Yet the market's verdict is in, and the proponents of regulation have ignored it. Obama economic adviser Susan Crawford, arguing in the New York Times for broadband re-regulation, said that ending government DSL mandates was "a radical move… [that] produced a wave of mergers," raising prices and lowering quality.

It is simply untrue. Mergers, governed by the FCC and antitrust agencies, have had no material impact on broadband rivalry. And the rate of broadband adoption significantly increased following deregulation. This pattern continued a trend.

Cable, unregulated, led DSL in subscribers by nearly two-to-one through 2002. Then, with DSL deregulated, phone carriers narrowed the gap, adding more customers, quarter-to-quarter, than cable operators by 2006. The spurt in DSL growth relative to cable modem usage takes place at precisely the time the former was shedding "open access" mandates, and cannot be explained by overall changes in technology. In short, DSL subscribership was up 65 per cent by year-end 2006 compared to the predicted (pre-2003) trend under regulation.

The story in ultra-high-speed fiber-to-the-home (FTTH) services is similar. There was virtually no deployment until the Commission, in late 2004, declared that fiber networks would not be subject to access regulation. That move, according to industry analysts, unleashed investment. FTTH is now offered to over 15m homes, and networks are capable of supplying 100 Mbps downloads, on a par with services delivered anywhere.

Not only has access regulation been shown to retard advanced networks, the Internet is loaded with “non-neutral” business deals where Internet Service Providers (ISPs) give preference to favored firms or applications. These negotiated contracts rationalize resource use, and drive incentives for innovation.

Data flows, unregulated, across large backbone networks that pay no fees to exchange their traffic, but collect billions from smaller networks that must fork out to inter-connect. This pay-to-play structure pushes networks to invest, grow, and cooperate.

Cable TV systems reserve broadband capacity for their own branded "digital phone" services. This special "fast lane" provides a premium service not available to independent VoIP applications. It has also transformed the competitive landscape, helping to forge fixed line competition for over 100m US households -- what the 1996 Telecommunications Act tried, and failed, to do via network sharing mandates (tossed out by a federal court in 2004).

And the corporate history of Google offers a landmark date: on Feb. 1, 2002, the company's search engine popped up as the default choice on 33m AOL subscribers' home page. The coveted spot was purchased; the young firm mortgaged its future to outbid search engine rivals. An application provider paying the country's largest ISP for preferred access to its customers. That may not be a violation of net neutrality. But if not, many lawyers will be very busy explaining why.

Today’s FCC Chair, Julius Genachowski, has made a pledge: the Commission’s “processes should be open, participatory, fact-based, and analytically rigorous.” That would be a refreshing approach. In addressing new regulations for broadband, let’s first see how these markets actually work, and how well the last batch of network sharing mandates performed.

Let’s all keep tabs.

source

Monday, April 12, 2010

Verizon CEO Says Market Can Sort Out Tough Issues

Ivan Seidenberg, Verizon CEO, said at a Council on Foreign Relations meeting that there was a danger of government regulatory overreach of several types in the current environment.

" I always worry about unintended consequences of government reaching into our business," Seidenberg said. "But I believe the players in the industry--like Google, like Microsoft, like the Silicon Valley players, as well as AT&T, and us and the rest of the industry--we're creating a better dialogue."

Seidenberg also thinks the industry has to do a better job of self-policing, though, more on the model of the advertising industry. That would lessen the need for very-detailed rules crafted "in advance" of a particular problem occurring, rather than a focus on fixing such problems as actually do arise.

"In the telecom business we need industry to do a better job at policing behavior, because, in the final analysis, government could never possibly regulate every condition, in every single circumstance that could ever happen, and do it efficiently," Seidenberg said.

Seidenberg thinks one of the key problems with proposed "network neutrality" rules that would prohibit virtually any sort of packet prioritization is that such rules make it very hard to provide different types of service to customers who may want them, at the lowest possible prices.

 "Most people think a carrier wants to charge for every minute on a linear basis in perpetuity, infinity," he said. But "we don't really want to do that."

"What we want to do is give you a chance to buy a bundle, a session of 10 megabits or a session of 30 megabits," he says. "The problem we have is five percent or 10 percent of the people are the abusers that are chewing up all the bandwidth."

"So what we will do is put in reasonable data plans, but when we now go after the very, very high users, the ones who camp on the network all day long every day... we will throttle and we will find them and we will charge them something else," he says.

"We don't want to have a linear pricing scale," he said. "We do want to find a way to give the majority of people value for bundles, but we have to make sure we find a pricing plan that takes care of that 10 percent that's abusing the system. And it's that simple."

"And therefore you have to have rules, give us discretion to run our business," Seidenberg said. "Net neutrality could negate the discretion to run your business."

"Anytime government, whether it's the FCC or any agency-decides it knows what the market wants and makes that a static requirement, you always lose," he said.  Seidenberg noted that although access speeds might be higher in Korea or France, household penetration in the U.S. market is higher than in any country in Europe, he said.

"Japan may have faster speeds, but we have higher utilization of people using the Internet," said Seidenberg.  "So our view is, whenever you look at these issues, you have to be very careful to look at what the market wants, not what government says is the most important issue."

"If you look at minutes of use, the average American uses their cell phone four times as much as the average European," Seidenberg says. But what about penetration rates?

"If you look at Europe, they publish penetration rates of 150 (percent), 160 (percent), 170 percent meaning that people have more than one phone, two phones, three phones," he notes. Seidenberg suggests the high roaming rates are the explanation.

"My guess is you probably have two or three different phones to carry to use in different countries because your roaming rates are so high," he adds. "So my point is it's a fallacy to allow a regulatory authority to sit there and decide what's right for the marketplace when it's not even close."

In fact, Seidenberg argues that the U.S. market is more advanced in ways that count.

"Verizon has put more fiber in from Boston to Washington than all the Western European countries combined," he notes. Also, "if you look at smart phones, they have exploded this market in the U.S. market."

"Ask any European if they're not somewhat envious of the advancements of smart-phone technology in the United States," he says.

The FCC is "overreaching in regulations," he says. "It's a real problem to have well-intentioned people in Washington regulating the business as they understood it to be in 1995. Bad idea."

"I don't think there is no role for government," he says. "I just worry about, when you allocate capital and you look at consumer behavior, that is not a strength of, I think, everyday transactional activity of government agencies, particularly federal government agencies."

On the technology front, Seidenberg pointed out that the opportunities for distributed, remote or cloud-based applications are growing very fast.

"But here's the thing about the iPad that's very interesting," Seidenberg said. "We look at it as a fourth screen."

"Now, the interesting thing about the iPad, from how Verizon looks at it, from a network person, first of all, it has no hard drive, right?" he said. That means lots of need to get applications from the network, sort of reversing the trend of the client-server era to put more processing and storage at the edge of the network. That has postive implications for a firm such as Verizon.

Seidenberg also does not think the FCC should attempt to take spectrum away from broadcasters and reallocate it for mobile use, although Verizon has said it generally supports FCC plans to reallocate spectrum for mobile use. "I think the market's going to settle this," he said.

link

Tuesday, April 6, 2010

Court Deals Blow to Network Neutrality: Will FCC Overreach?

Wall Street Journal "Digits" video about the overturning of Federal Communications Commission authority over broadband access services.
