
Tuesday, April 6, 2010

FCC Loses Net Neutrality Case on Appeal: No Authority to Regulate Broadband

Flash! The Federal Communications Commission does not have authority to regulate broadband services, the U.S. Court of Appeals for the District of Columbia Circuit has ruled.

The FCC had sanctioned Comcast in 2008 for subjecting BitTorrent traffic to throttling as a traffic-management practice. The appeals court has now vacated that action, ruling, in essence, that the FCC does not have authority to regulate broadband. The full text of the ruling is not yet available, but the decision potentially sets in motion a new direction in broadband regulation by the FCC, which now must either get new legislative authority from Congress to regulate broadband services, or take a potentially divisive alternative approach: attempting to regulate broadband services as "common carrier" services.

That would set off a nuclear war between the FCC and the telecom, and possibly cable, companies, which would feel compelled to fight the change with every weapon at their disposal. Should the FCC ultimately prevail, the nation will face years of ruinous lawsuits, bringing new broadband investment to a grinding halt as private investment dries up.

The FCC can appeal the decision, but the big question now is whether it is willing to risk nuclear war with the telecom and cable industries.

Sunday, April 4, 2010

Is Apple the New and Most-Important Gatekeeper?

Is Apple now more of a transformative force in technology-using businesses than Google? Some observers have pointed out that it is possible, perhaps even likely, that Apple's equity value will exceed that of Microsoft in the near future. Apple already is worth more than Google, with a market capitalization around $214 billion, compared to Google's $159 billion.

To be sure, content companies tend to pay more attention to Apple's moves, while communication companies tend to pay more attention to Google. One might argue that in the communications business, Apple mostly has changed the handset business, and user expectations about what can be done with handsets.

It likely will wind up being more transformative than that. Apple has proven that mobile application stores can be a source of huge end user value in the mobile ecosystem, and potentially a huge driver of revenue and margin in a business that is shifting inexorably towards applications as the driver of communications value overall.

Keep in mind that Apple previously redefined the online music business, if not the whole music business, by some estimates. Apple has had less success in the video arena, but the iPad could signal a potential shift of distribution in the print content business, or at least Apple has been arguing that is its objective.

In the mobile business, virtually everyone agrees that Apple changed end user expectations about what a handset should do and how it should work, and also cracked, if not broke, the historic stranglehold mobile service providers have had over handset features.

In the developing mobile Internet experience, for example, there already are glimmers of a shift from "Web" pages to applications. For a firm such as Google, dependent on search for nearly all of its revenue, that potentially shifts revenue away from search and towards the mobile application as the way people find things.

To the extent that the mobile application, supplied by the application store, becomes the gateway for use of Web-based applications, power and financial success shift towards the app store and the device, and away from the access provider.

So the issue that naturally arises is what to make of Apple's influence on the broader technology industry, which generally has moved to an "open" model, whereas Apple continues to operate on a "partially open" model: closed in terms of operating systems and hardware, but open--with editorial control--for applications.

As attention in the U.S. and some other markets now turns to "gatekeeper" functions, the implication is that gatekeepers of many sorts are arising in the Internet ecosystem. Though government regulators typically look at access providers, application providers are emerging as equally important gatekeepers, just as operating systems and browsers caused concern in the past.

"The iPad is seen by many in the print business as a way of delivering high-value digital content to customers paying real money," notes U.K. technology observer John Naughton. There is a price: Apple will control the distribution channel and take a slice of every transaction.

The iPhone and iPad are really just gateways to the Internet, but are controlled (a better word than "closed") experiences, not the "open" or unfiltered way PCs have been used to access the Internet and its resources.

To the extent that gatekeepers are an issue, we likely will see new concerns about application and experience gatekeepers; nothing so crude as "access" gatekeeping.

Some think Google is the greatest emerging gatekeeper. Perhaps it is Apple.


Tuesday, March 30, 2010

Three Things Verizon and Google Agree On

Despite differences on some other important issues, Verizon CEO Ivan Seidenberg and Eric Schmidt, Google CEO, agree on some matters related to the Federal Communications Commission's "National Broadband Plan."

In an opinion piece authored for the Wall Street Journal, the two executives say three plan elements are praiseworthy.

Not surprisingly, both agree on the plan's nod to health-care information technology, education and job training, and a smart electricity grid. All of those initiatives will tend to create opportunities for both companies.

Both agree on spurring the highest-quality broadband possible, dependent on private investment.

Both say they agree on the importance of making high-speed Internet connections available to all Americans.

The Internet has thrived in an environment of minimal regulation, they say. "While our two companies don't agree on every issue, we do agree generally as a matter of policy that the framework of minimal government involvement should continue," Schmidt and Seidenberg say.

The FCC's plan underscores the importance of creating the right climate for private investment and market-driven innovation to advance broadband, and that, the two executives say, is the right approach and the reason they are encouraged by the plan.

You might argue all of these are "motherhood and apple pie" sorts of issues, which is true. But it might be significant that both can agree to support, in principle, "minimal government involvement." That doesn't mean the two firms agree on key network neutrality principles or rules. But it does seem to signal a willingness to consider approaches which allow markets to sort out issues.

As typically is the case in communications regulation, regulators will weigh what is possible and prudent, taking the different interests into account and crafting solutions that balance them, giving each side something important while neither side gets all it wants.

That is likely to be the case for network neutrality as well.

Wednesday, March 24, 2010

FCC Has No Authority to Regulate Internet, Verizon EVP Says

The Federal Communications Commission does not have the explicit power to regulate the Internet, and should wait for Congress to grant it that authority, says Tom Tauke, Verizon EVP. The statement is not as controversial as some might think, as Comcast has challenged such authority in federal court, and many observers think Comcast will prevail.

Comcast has challenged the FCC’s authority to punish it for throttling the bandwidth of customers using BitTorrent programs to share huge files.

“The authority of the FCC to regulate broadband providers under the so-called ‘Information Services’ title, or Title I, of the Communications Act [is] at best murky,” Tauke said. “In confronting this hard question about jurisdictional authority, we [are] also faced with this policy question: If Title I and Title II don’t apply to the Internet space, what are you saying about the authority of government in this space?”

“In a market developing at these speeds, the FCC must follow a piece of advice as old as Western Civilization itself: first, do no harm," said Tauke.

“Today about 96 percent of Americans have access to at least two providers of wireline broadband and as many as three wireless providers, and more than 55 million Americans can connect to a broadband network capable of delivering at least a 50 Mbps stream," Tauke said.

Saturday, March 13, 2010

Google to Leave China?

Google has drawn up detailed plans for the closure of its Chinese search engine and is now “99.9 per cent” certain to go ahead with the shutdown, as talks over censorship with the Chinese authorities have reached an apparent impasse, according to the Financial Times.

Google's search results are censored in China, as are results provided by all other search engines.

Google is also seeking ways to keep its other operations in China going, although some executives fear that a backlash from the Chinese authorities could make it almost impossible to keep a presence in the country, the Financial Times says.

But Google’s executives have made it clear that they still hope to stay in the country, whatever the fate of Google.cn. “It’s very important to know we are not pulling out of China,” Eric Schmidt, Google’s chief executive, told the Financial Times at the time. “We have a good business in China. This is about the censorship rules, not anything else.”

The company’s other operations, which pre-date the launch of Google.cn four years ago, include its research centre in Beijing and a sales force that sells advertising on the Chinese-language Google.com search service, based outside China, to advertisers inside the country.

This sort of issue has been tough for companies doing business in China in the past. Software and hardware sold into China can be, and are, used in ways that violate sensibilities in the West. Suppression of dissent, spying on citizens and so forth do happen in China, and technology supplied by Western firms is used to do so.

Google might have to take steps that many would agree are principled and just, but will harm its business interests. Similar thorny decisions have been made by other software and hardware suppliers to the Chinese market, with different outcomes. It's an area of moral tension executives cannot escape, though most seem to prefer not to talk about it.

Beyond all that, the dilemma shows that the old Internet, where any user could communicate freely with any other user, is gone. When any government can shut down the applications people use to communicate with each other, that Internet no longer exists.

Financial Times article

Thursday, March 11, 2010

Two-Tier Internet is Not Necessarily a Bad Thing, Says Esther Dyson

"The biggest problem that net neutrality has is nobody knows what they’re talking about when they talk about it," says Esther Dyson, noted computer industry analyst. "The issue is who pays and whether they’re monopolies or not, so there’s a whole lot of, I think, disingenuous discussion about control without ever really looking at the fundamental issue, which is somebody’s got to pay for more bandwidth if consumers are gonna be uploading and downloading video."

"As long as there’s healthy competition, I have no problem if someone pays extra for additional bandwidth, as long as that doesn’t cut off people’s access to the other stuff," says Dyson. That does not mean she believes access providers should be able to put up walls around Internet content. 

"There’s this disingenuous discussion of if you don’t allow us to pay extra, you’re not gonna get free content," she says. "Well, of course not, but let the consumer decide whether they want paid or subsidized."

Monday, March 8, 2010

Academy Awards High Stakes Standoff Ends 13 Minutes into Telecast

A high-stakes "Academy Awards" game of chicken ended 13 minutes into the telecast when Walt Disney Co. and Cablevision Systems Corp. settled their dispute over a new contract.

Disney had said it would pull the ABC feed from Cablevision if the cable operator did not negotiate a more-favorable contract, potentially affecting about 3.1 million homes in the New York area.

The drama, some might say, could have been higher only if the contract dispute had occurred in the hours and minutes leading up to the Super Bowl.

The contract dispute, and temporary programming interruption, underscores the increasing financial stress in the multi-channel video entertainment ecosystem. Both broadcasters and distributors face rising programming costs, lower profit margins and growing competition.

In past years broadcasters have struck different deals, agreeing to allow "no incremental cost" carriage of local broadcast feeds in exchange for operators agreeing to add new cable networks to their program lineups. Programmers essentially bartered "free" local station carriage in exchange for carriage of new cable networks.

But that was then, and this is now. These days, both broadcasters and distributors are trying to squeeze more profit out of their video operations. And consumer opposition to ever-increasing monthly subscription fees is a background issue, at the same time distribution alternatives are growing.

In a sense, the broadcast networks also are looking over their shoulders at the potential threat Internet distribution represents. But so are the cable operators. After watching the music industry become disrupted by online distribution, as well as the continued decline of newspapers, video content owners are trying to avoid "no incremental cost" distribution of their content.

Given those pressures, it does not seem likely this will be the last tussle threatening program carriage. Versus, for example, now is dark on DirecTV and has been for months, as those two firms have not agreed on new contract terms, either.

As content ecosystems are rearranged, disputes between partners are bound to grow. The same sort of ecosystem change is behind the network neutrality debate as well.

Thursday, March 4, 2010

Net Neutrality Would Increase Likelihood of Content Discrimination, Phoenix Center Says

"Net neutrality regulation is motivated fundamentally by the belief that broadband service providers will,
at some future date, seek to extract profits from the content segment of the Internet marketplace, and
net neutrality aims to stop it," says a new white paper issued by George S. Ford, Phoenix Center for
Advanced Legal and Economic Public Policy Studies chief economist, and Michael Stern, Assistant
Professor of Economics at Auburn University.

Net neutrality supporters fear that surplus profit extraction will take the form of “exclusionary” practices such as unfair or discriminatory access prices, “fast lanes” and “slow lanes” where preferential delivery is given to content firms willing and able to pay more, or outright monopolization of content, the authors say.

Such concerns about business advantage, whether "unfair" or not, are different from the separate issue of whether currently-envisioned network neutrality rules actually provide incentives to engage in such behavior, the authors say.

Some observers might be shocked to learn that net neutrality rules could actually encourage such business behavior, not restrain it.

In fact, the latest Phoenix Center analysis suggests that net neutrality regulation actually increases incentives to engage in exclusionary conduct in the content sector.

"Firms always have an incentive to take those steps, which increase their profits," the authors say.
"Ironically, net neutrality rules, which are supposed to suppress privately profitable exclusionary conduct,
will actually have an effect opposite of what is intended."

Because net neutrality regulations now under consideration will not reduce the profits associated with monopolization of content, but only those associated with the participation in a competitive content market, the proposed rule encourages broadband service providers to take steps to reduce the diversity of voices on the Internet to the detriment of the public interest, Ford and Stern argue.

The point is that network neutrality rules impose pricing rules, and the issue is whether such pricing rules are likely to encourage or discourage business policies that increase or restrict content options.

An important question is whether or not the proposed price regulations “promote consumer choice and competition among providers of lawful content, applications, and services” by addressing an ISP’s alleged motivation “to exclude independent producers of applications, content, or portals from their networks.”

The answer is “no,” the authors say. "Net neutrality rules of the type proposed by the FCC and the Markey-Eshoo Bill encourage exclusionary behavior rather than impede it."

The policy implications of this analysis are numerous, but can be summarized at a very high level as follows: the analytical foundation for net neutrality remains in its infancy, and the concept needs more time to evolve, the authors argue.

Since even the advocates of net neutrality regulation admit that there exists a “de facto net neutrality regime” today, there seems to be little reason for a headlong rush into bright-line regulatory rules when so little is known about the issue.

The rules proposed by both the FCC and Congress create incentives that may not even exist absent the regulation, and increase whatever incentives do exist for ISPs to behave badly in the content market.

Most troubling about the proposed rules is that net neutrality, it now appears, has become little more than a quibble over profits between providers, a far cry from the origins of the concept, wherein the focus was on the freedom to distribute and consume information without undue interference.


Tuesday, March 2, 2010

50% Faster Email Performance Using Akamai and AppRiver

The amount of time it takes users to connect and sync with Exchange was reduced by half when using an Akamai-enabled AppRiver Secure Hosted Exchange solution, reports Compuware Gomez.

In other words, not all email services are created equal. Some can use optimization techniques such as Akamai to improve performance.

"AppRiver's relationship with Akamai is an excellent example of how network optimization can create a differentiated and improved IT service, in this case hosted Exchange," says Peter Christy, Internet Research Group principal analyst.

The enhanced performance also is an example of why overly strict "network neutrality" rules that do not allow any forms of bit prioritization are problematic. There are lots of reasons to allow users, application providers and service providers to differentiate services and features, and better performance is foremost among them.

The AppRiver service offers a "one-of-a-kind, optimized hosted Exchange" service. The AppRiver and Akamai service essentially "privatizes" the connection between the user and the Exchange servers in order to create a high-speed virtual private network for AppRiver customers.

"By implementing Akamai's IP Application Accelerator service, AppRiver securely provides fast and reliable hosted e-mail for both wireline and mobile users," said . "Akamai improves the overall global experience for mobile device users by optimizing the Internet and minimizing the impact introduced by oversubscribed wireless networks," says Willie Tejada, Akamai VP.


Sunday, February 28, 2010

Regulatory Pendulum Swings: But Which Way?

In the telecommunications business, the regulatory pendulum swings all the time, though slowly. So periods of relatively less-active regulation are followed by periods of relatively more active rule-making, then again followed by periods of deregulation.

It has been apparent for a couple of years that the regulatory pendulum in the U.S. telecom arena was swinging towards more regulation.

What now is unclear, though, is whether such new rules will largely revolve around consumer protection and copyright or might extend further into fundamental business practices.

Current Federal Communications Commission inquiries into wireless handset subsidies and contract bundling, application of wireline Internet policies to wireless service providers, as well as the creation of new "network neutrality" rules are examples.

The setting of a national broadband policy also is likely to result in more regulation. And some voices are calling for regulating broadband access, which always has been viewed as a non-regulated data service, as a common carrier service.

One example is a recent speech given by Lawrence Strickling, National Telecommunications and Information Administration assistant secretary, to the Media Institute.

He said the United States faces "an increasingly urgent set of questions regarding the roles of the commercial sector, civil society, governments, and multi-stakeholder institutions in the very dynamic evolution of the Internet."

Strickling notes that “leaving the Internet alone” has been the nation’s Internet policy since the Internet was first commercialized in the mid-1990s. The primary government imperative then was just to get out of the way to encourage its growth.

"This was the right policy for the United States in the early stages of the Internet," Strickling said. "But that was then and this is now."

Policy issues have been growing since 2001, he argued, namely privacy, security and copyright infringement. For that reason, "I don’t think any of you in this room really believe that we should leave the Internet alone," he said.

In a clear shift away from market-based operation, Strickling said the Internet has "no natural laws to guide it."

And Strickling pointed to security, copyright, peering and packet discrimination. So government has to get involved, he said, for NTIA particularly on issues relating to "trust" for users on the Internet.

Those issues represent relatively minor new regulatory moves. But they are illustrative of the wider shift of government thinking. Of course, the question must be asked: how stable is the climate?

Generally speaking, changes of political party at the presidential level have directly affected the climate for telecom policy frameworks. And while a year ago it might have seemed likely that telecom policy was clearly headed for a much more intrusive policy regime, all that now is unclear.

A reasonable and informed person might have argued in November 2008 that "more regulation" was going to be a trend lasting a period of at least eight years, and probably longer, possibly decades.

None of that is certain any longer. All of which means the trend towards more regulation, though on the current agenda, is itself an unstable development. One might wonder whether it is going to last much longer.

That is not to say some issues, such as copyright protection or consumer protection from identity theft, for example, might not continue to get attention in any case. But the re-regulatory drift on much-larger questions, such as whether broadband is a data or common carrier service, or whether wireless and cable operators should be common carriers, might not continue along the same path.

You can make your own decision about whether those are good or bad things. The point is that presidential elections matter, and the outcome of the 2012 election no longer is certain.

Friday, February 19, 2010

NARUC Calls for Controls on "Unreasonable" Packet Discrimination, Not "All" Packet Discrimination

The National Association of Regulatory Utility Commissioners has called for protecting “the right of all Internet users, including broadband wireline, wireless, cable modem, and application-based users, to have access to and the use of the Internet that is unrestricted as to viewpoint and that is provided without unreasonable discrimination as to lawful choice of content.”

The key language there is "unreasonable" discrimination. NARUC is not calling for network neutrality rules that ban "all" packet discrimination.  The problem is that some traffic types are  "latency sensitive" and can suffer at times unless packet discrimination mechanisms are used. Applications such as video, gaming and VoIP would suffer, at times of peak congestion, without priority mechanisms that users themselves may wish to have in place.

NARUC therefore has asked that policymakers and regulators keep in mind that "unreasonable restrictions or unreasonable discrimination" be areas of protection, not "all" forms of packet discrimination.
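
As a concrete example of the kind of prioritization a user or application might itself want, the minimal sketch below (Python on Linux; the address, port and payload are placeholders, not anything drawn from NARUC's resolution) marks a VoIP-style UDP socket with the DSCP "Expedited Forwarding" value. Whether the network honors that marking, and on what terms, is exactly the kind of "reasonable versus unreasonable" policy question at issue.

import socket

# DSCP "Expedited Forwarding" (EF, decimal 46) is conventionally used for
# latency-sensitive traffic such as VoIP. The IP TOS byte carries the DSCP
# value in its upper six bits, so EF becomes 46 << 2 = 184.
DSCP_EF = 46
TOS_VALUE = DSCP_EF << 2

# A UDP socket for voice-style traffic (illustrative address and port).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

# Datagrams sent on this socket now carry the EF marking. Routers may
# prioritize, re-mark or ignore it; that is the policy layer the NARUC
# resolution is talking about.
sock.sendto(b"voice-frame", ("192.0.2.10", 5004))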

NARUC also asks for rules and regulations that will give providers incentive for continual innovation and a fair return on their investment, without jeopardizing consumer access to, and use of, affordable and reliable broadband services.

Discrimination that is solely or primarily intended to protect business advantage is an area of valid concern for policymakers. But the Internet has changed. It is a network increasingly used to support isochronous (real-time) applications that are highly susceptible to degradation from latency, for example.

NARUC's position will seem to many a well-reasoned and balanced approach.

http://www.digitalsociety.org/2010/02/naruc-resolution-on-net-neutrality/

Killer Apps and Devices of 2020 Are Not Knowable

What will the killer apps and devices of 2020 be? About 80 percent of experts surveyed by the Pew Internet & American Life Project agreed that the “hot gadgets and applications that will capture the imaginations of users in 2020 will often come ‘out of the blue.’”

"The experts’ record is so lousy at spotting key technologies ahead of time that there is little chance they will see the killer gadgets and applications of 2020," Pew says. "If you had asked this question a decade ago, no one would have predicted the iPhone."

In other words, we don't know.

But some trends are clear, because they already have begun: Mobile connectivity and location-based services will grow in the next decade.  Still, it takes a generation to figure out which technologies have real impact and which are just fads, so many other application and device trends we now see might, or might not, be actual "killer apps."

 Significantly, just 61 percent of respondents suggested the Internet would remain a place where any user can communicate directly with any other user. About 33 percent think “the Internet will mostly become a technology where intermediary institutions will control the architecture and content, and will be successful in gaining the right to manage information and the method by which people access it.”

A significant number of respondents argued there are too many powerful forces pushing towards more control of the Internet for the end-to-end principle to survive. Governments and businesses have all kinds of reasons to control what happens online, Pew reports. There will be alternative networks for companies and individuals that prefer to have a more controlled environment for sharing and consuming content, many believe.

The future will produce a hybrid environment with a bit more control exercised in the core of the internet for some purposes, but for other purposes will enable end-to-end practices, researchers at Pew conclude, based on the responses. "Some things will have to be managed, especially if the capacity of the current internet becomes strained," Pew analysts say.

"The dictates of business will shape large parts of the online experience and more pay-to-play business models will affect information flows online," Pew says.

"The needs of users themselves will sometimes drive changes that bring more control of online material and less end-to-end activity," Pew notes. There will be “content service providers” who are gatekeepers of many users’ online experiences.

The point, one might argue, is that although the "open, end-to-end" Internet will continue to exist, so will many relatively closed experiences, sites, networks, applications and devices.

Monday, February 8, 2010

Mobile Broadband Will Need a New Business Model

One way or the other, something has got to change in the mobile business as voice ceases to be the industry revenue driver. Today mobile service providers get 86 percent of their revenue from low-bandwidth applications like voice and text. But that will keep changing in predictable ways.

Namely, most capacity requirements will be driven by low-margin data rather than high-margin voice and text. Over the long term, it is irrational not to price services more closely in relationship to cost, attributing more revenue directly to the data services that are driving capital investment.

That doesn't mean every single service or application necessarily has to be priced in relationship to cost. Loss leaders at supermarkets, promotional DVD prices at Target and other promotional pricing happens all the time, in every business. Some products have high margin, others low or even negative margins.

The point is that current retail pricing will get more irrational as data demand grows, and that something will have to be done about it.

Carriers are investing in new capacity, but that alone will not be enough to bring revenue and capacity into balance. By 2013, virtually all traffic load will be driven by broadband data of one sort or another, especially video. That means, over time, new ways of charging for network usage will have to be created.

Like it or not, network management is going to be necessary, plus traffic offload and policy management. The issue, in part, is that demand is unevenly distributed. Even at peak hours of congestion, only a minor percentage of cell sites actually account for most of the congestion. Speaking of congestion management at the "whole network" level therefore misses the issue.

The key issue is peak-hour congestion at perhaps 10 percent to 15 percent of sites. Put another way, even at peak congestion, 85 to 90 percent of sites do not experience difficulty. That means it might be necessary to use different policies at a small number of physical sites, not the entire network, even at peak hours.

So even if traffic shaping, bit priority policies and other tools are not generally required at every site, for every application or user, there will be a need to apply them at some sites, some of the time.
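
A rough sketch of that idea, with entirely hypothetical site names, utilization figures and threshold, might look like the following: policy is applied only at the handful of congested sites, not across the whole network.

# Illustrative only: site IDs, utilization figures and the threshold are
# hypothetical, not drawn from any real network.
CONGESTION_THRESHOLD = 0.85  # fraction of sector capacity in use at peak

peak_hour_utilization = {
    "site-001": 0.42,
    "site-002": 0.91,  # congested
    "site-003": 0.67,
    "site-004": 0.96,  # congested
    "site-005": 0.58,
}

def apply_peak_hour_policy(utilization, threshold=CONGESTION_THRESHOLD):
    """Return a per-site policy: shape heavy flows only where congestion exists."""
    policy = {}
    for site, load in utilization.items():
        if load >= threshold:
            policy[site] = "deprioritize bulk transfers, protect voice and video"
        else:
            policy[site] = "no intervention"
    return policy

for site, action in apply_peak_hour_policy(peak_hour_utilization).items():
    print(site, "->", action)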

Tuesday, February 2, 2010

99% of BitTorrent Content Illegal?

A new survey suggests that about 99 percent of available BitTorrent content violates copyright laws, says Sauhard Sahi, a Princeton University student who conducted the analysis.

Some question the methodology, pointing out that the study only looks at content that is available, not content transferred. That might not be such a big distinction, though. Copyright holders are growing more insistent that Internet service providers actively block delivery or sending of such illegal material.

That, in turn, raises lots of issues. BitTorrent can be used in legal ways, so blocking all torrents clearly violates Federal Communications Commission guidelines about use of legal applications on the Internet. That said, the fact that the overwhelming majority of BitTorrent files consist of copyrighted material raises huge potential issues for ISPs that might be asked to act as policemen.

The study does not claim to make judgments about how much copyrighted content actually is downloaded. But it stands to reason that if such an overwhelming percentage of material is copyrighted, that most uploads and downloads will be of infringing content.

The study classified a file as likely non-infringing if it appeared to be in the public domain, freely available through legitimate channels, or  user-generated content.

By this definition, all of the 476 movies or TV shows in the sample were found to be likely infringing.

The study also found seven of the 148 files in the games and software category to be likely non-infringing—including two Linux distributions, free plug-in packs for games, as well as free and beta software.

In the pornography category, one of the 145 files claimed to be an amateur video, and the study gave it the benefit of the doubt as likely non-infringing.

All of the 98 music torrents were likely infringing. Two of the fifteen files in the books/guides category seemed to be likely non-infringing.

"Overall, we classified ten of the 1021 files, or approximately one percent, as likely non-infringing," Sahi says.

"This result should be interpreted with caution, as we may have missed some non-infringing files, and our sample is of files available, not files actually downloaded," Sahi says. "Still, the result suggests strongly that copyright infringement is widespread among BitTorrent users."

Thursday, January 28, 2010

YouTube "Feather" Beta Seeks Lowest-Latency Connections

YouTube, or any video content for that matter, is tough to watch on a  low-bandwidth Internet access connection or even a computer with insufficient processing power, such as some netbooks.

So YouTube is beta testing "Feather," a way of optimizing latency performance on limited hardware or low-bandwidth connections. Feather is said to work by “severely limiting the features" and "making use of advanced Web techniques for reducing the total amount of bytes downloaded by the browser."

The video playback page of YouTube Feather is fully transferred after downloading 52 kilobytes of data, compared to the 391 kilobytes that the standard page requires, some note.
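
Taking those reported page weights at face value, the difference is dramatic; the short calculation below also illustrates the effect on a slow access link (the 256 kbps link speed is an assumption chosen purely for illustration).

# Reported page weights for the Feather beta versus the standard playback page.
feather_kb, standard_kb = 52, 391

reduction = 1 - feather_kb / standard_kb
print(f"Feather downloads {reduction:.0%} fewer bytes")  # about 87% fewer

# Rough transfer time on an assumed 256 kbps access link.
link_kbps = 256
for name, kb in (("standard page", standard_kb), ("Feather page", feather_kb)):
    seconds = kb * 8 / link_kbps
    print(f"{name}: roughly {seconds:.1f} seconds to transfer")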

YouTube Feather achieves the better performance partially by removing standard YouTube features such as posting comments, rating videos, viewing all comments or customizing the embedded player.

The Feather beta suggests why strict versions of "network neutrality" might hinder innovation or end user experience. Feather works by blocking some bits and features. It is an opt-in feature, and that also is part of the danger over-zealous network neutrality rules represent. Users might want to selectively tune their use of some applications, blocking some features and bits, to optimize the experience.

Tuesday, January 19, 2010

Is Net Neutrality a Case of "Feeling Good" Rather than "Doing Good"?

With typical wit, Andrew Orlowski at the U.K.-based "The Register" skewers "network neutrality" as a squishy, intellectually incoherent concept. It is so nebulous it can mean anything a person wants it to be, and often is posed as a simple matter of "goodness." Which makes people feel righteous, without having to noodle through the logical implications.

Yes, there often is a difference between feeling good, and doing good, and Orlowski wants to point that out.

"As a rule of thumb, advocating neutrality means giving your support to general goodness on the Internet, and opposing general badness," he says. "Therefore, supporting neutrality means you yourself are a good person, by reflection, and people who oppose neutrality are bad people."

"Because neutrality is anything you want it to be, you have an all-purpose morality firehose at your disposal," he says. "Just point it and shoot at baddies."

Beyond that, there are fundamental issues that seem hard to reconcile, because they are hard to reconcile. Consider the analogy to freedom of speech.

In the United States, at its founding, the right of free speech was said to belong to citizen "speakers" engaged in clearly political speech. Recently, the opposite view has been taken: that the right belongs to "hearers of speech." But that means there is tension: is it the creator of speech who is to be protected, or those who might, or might not, want to listen?

Does copyright protect creators of intellectual content, or those who might want to access it? Do property rights in real estate protect those who own property, or those who want to own it?

Network neutrality essentially poses similar issues, and they will not be easy to reconcile.

Sunday, January 10, 2010

FCC has No Current Authority to Impose Network Neutrality Rules?

The U.S. Federal Appeals Court reviewing whether the Federal Communications Commission currently has authority to create or enforce "network neutrality" rules has not yet ruled.

But initial questioning suggests the court doubts whether the Federal Communications Commission has current jurisdiction to write, much less enforce, net-neutrality rules for the Internet. So some legal observers now suggest the appeals court will in fact rule that the FCC had no authority to sanction Comcast for the way it managed peer-to-peer services.

A 2008 FCC order forced Comcast to stop throttling BitTorrent applications as a means of managing network congestion.

U.S. Court of Appeals for the District of Columbia Circuit Judge Raymond Randolph pointed out to an FCC attorney that “you have yet to identify a specific statute.”

Since the Congress has passed no laws relating to network neutrality, the FCC had, and has, no authority to take action on the matter, the judge seems to suggest.

A ruling of that sort would at least temporarily delay any new efforts by the FCC to codify new network neutrality rules, and shift the battle over such rules to the Congress.

FCC Chairman Julius Genachowski has argued the agency has authority to set net neutrality rules because of the "Internet Freedoms Principles" set in 2005, which say that users have the right to use lawful applications, which P2P is, though the use of P2P sometimes includes transfers of copyrighted content without permission.

But Comcast argues it has the right to manage its network, which it interprets as permitting rate limiting of P2P services, when necessary to preserve user experience and relieve congestion.

To be sure, the specific issue at hand seems primarily about whether the FCC’s decision was improper for statutory reasons, as Congress has not given the FCC legislative permission to create such rules, observers say.

On a wider legislative front, some observers think the White House is dialing back its efforts to get "strong" network neutrality rules adopted. The evidence is indirect, but some point to the late-October resignation of Susan Crawford, University of Michigan law professor, previously a key adviser to the president on technology and communications, and a proponent of "strong" network neutrality rules.

According to the American Spectator, Crawford's version of Net neutrality was too radical for White House economic adviser Lawrence Summers, contributing to her early departure. If that observation is correct, it would be a sign that any new rules would not strictly ban "every" form of packet prioritization.

Many observers note that quality of service measures typically are needed when users want to interact with important video or voice services, especially as video already has become the primary driver of bandwidth consumption on a global level.

Those observers also would note that strict versions of net neutrality, which would absolutely ban any packet prioritization, would prevent Internet access providers from applying prioritization on behalf of their users, even when those users might specifically ask for, and desire, such prioritization.

"Packet discrimination" sounds bad, and is, when it is used as a business weapon, allowing unfair competition. But packet discrimination is a good thing when it helps maintain quality of experience for the emerging applications users say are important, especially video and voice.

Also, at the recent Consumer Electronics Show, White House deputy CTO Andrew McLaughlin said the FCC had yet to determine whether Net neutrality is needed to preserve the "open Internet."

If that seems unremarkable, consider that in 2009 McLaughlin had said network management practices of cable companies that limited the speeds of large file downloads were essentially the same thing as Chinese-style Internet censorship.

Management of bandwidth-heavy applications by some users at times of network congestion is not application "blocking" or censorship. It is an effort to maintain quality of service for most users. Some methods will be more palatable than others.

The analogy is access to the old voice network. Telcos do not "censor" speech when, at times of peak load, a user might encounter a "fast busy" signal indicating that no circuits are available. The point is that every network gets congested at least some of the time.

And it always has been recognized that some method of regulating access at such times is a legitimate network management matter. In fact, a fast busy tone does mean a user has temporarily been "blocked" from the network. Sometimes a mobile voice call experiences the same sort of temporary blocking.

That sort of access blocking is not any suppression of freedom of communication or expression. It is not an infringement of Internet freedom. It is a simple way of managing a congested resource at times of peak load.

The immediate matter at hand, though, is the simple matter of legislatively granted authority. The appeals court seems to be signaling its belief that Congress has granted the FCC no authority to impose rules about network congestion management or methods of doing so.

Monday, December 21, 2009

Video Represents 99% of Consumer Information Consumption



Reduced to bytes, U.S. consumers in 2008 imposed an information transfer "load" of about 34 gigabytes a day, say Roger E. Bohn, director, and James E. Short, research director, of the Global Information Industry Center at the University of California, San Diego. That works out to about seven DVDs worth of data a day.

And that isn't even the most-significant potential implication. We are used to hearing about consumption of media or information in terms of "time," such as hours consumed each day. But Bohn and Short also look at information flows in terms of "bandwidth."

If one looks at consumption based on the "hours of use," video accounts for possibly half of total daily consumption.

If one looks at the flows in terms of compressed bytes, or actual bandwidth required to deliver the information, then video represents 99 percent of the flow volume.

That has huge implications for the design of any nation's communications and "broadcasting" networks. To the extent that virtually all information now is coded in digital form, a shift of consumption modes (from watching linear satellite, cable or telco TV to Internet delivery) can have huge effects.

Recall that video bits now represent 99 percent of bandwidth load. But also note that most of that load is delivered in the most-efficient way possible, by multicasting a single copy of any piece of information to every potential consumer all at once. It requires no more bandwidth to serve up an event watched by 500 million people than one person.

That is why video and audio networks historically have been designed as "multicast" networks. They are the most efficient way of delivering high-bandwidth information.

If more video starts to move to Internet delivery, the bandwidth requirements literally explode. To deliver one identical piece of content to 500 million Internet users requires 500 million times as much bandwidth as the "old" multicast method, in at least the access link. If network architects are ruthlessly efficient and can cache such content at the edge of the network, wide area bandwidth consumption is reduced and the new load is seen primarily on the access networks.
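
A back-of-the-envelope comparison shows the scale of that difference; the stream bit rate and audience size below are assumptions chosen only for illustration, not figures from the study.

# Assumed figures: a 5 Mbps video stream and 500 million simultaneous viewers.
stream_mbps = 5
viewers = 500_000_000

# Multicast or broadcast: one copy of the stream serves every viewer.
multicast_load_mbps = stream_mbps

# Unicast: every viewer receives an individual copy.
unicast_load_mbps = stream_mbps * viewers

print(f"multicast load: {multicast_load_mbps} Mbps")
print(f"unicast load:   {unicast_load_mbps / 1e6:,.0f} Tbps")
# Edge caching shifts much of the unicast load from the wide area network
# onto the access networks, but does not eliminate it.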

All of this suggests a rational reason for maintaining "multicast" video entertainment networks, and not shifting all consumption to unicast Internet delivery, which is extremely inefficient and wasteful of network resources. To the extent that much "on demand" viewing of popular professional content can be satisfied by local storage (digital video recorders), this should be done.

On-demand viewing of YouTube content is harder to rationalize in that manner. For the same reason, local storage of computer games, where possible, makes sense. Interactive, "live" gaming does not allow such offloading, and will contribute hugely to Internet bandwidth demand, just as viewing of YouTube videos is doing.

The “information” total, representing flows of data delivered to people from 20 sources, is likely to be much higher the next time the researchers replicate the study, because television, which accounts for nearly half of total consumption, now has shifted from analog NTSC to high-definition, which imposes a greater information load.

Television consumption represents about 41 percent of daily consumption, but computer and video games represent 55 percent of the flow. Add radio and TV and those two sources represent 61 percent of consumption.

But there is another important implication: the researchers counted "compressed" information, or "bandwidth," in addition to more-familiar metrics such as hours of consumption.

Looking at flows this way, the researchers say, "led to a big surprise." In fact, only three activities--television, computer games and movies--account for 99 percent of the flow. All other sources, including books, mobile or fixed voice, newspapers, radio or music, contribute only one percent of total load.

The researchers also point out that they count bytes as part of the "information flow" only when users actually consume the information. Data stored on hard drives, or TV or radio signals not being watched or listened to, do not count in the research methodology.

The researchers also point out that if “personal conversation” is considered a source of information, then high-quality "tele-presence" applications that actually mimic talking to a person in the same room would require about 100 Mbps worth of communications load.

Three hours of personal conversation a day at this bandwidth would be 135 gigabytes of information, roughly four times today's average daily consumption.
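
Those figures are easy to check; the short calculation below simply restates the study's numbers, taking a single-layer DVD as roughly 4.7 gigabytes.

# Daily per-capita information load reported by the study.
daily_gb = 34
dvd_gb = 4.7  # approximate capacity of a single-layer DVD
print(f"{daily_gb} GB/day is about {daily_gb / dvd_gb:.0f} DVDs of data")  # ~7

# Hypothetical high-quality telepresence: 100 Mbps for 3 hours a day.
telepresence_mbps = 100
hours = 3
telepresence_gb = telepresence_mbps / 8 * 3600 * hours / 1000  # MB/s * seconds -> GB
print(f"3 hours of 100 Mbps telepresence: {telepresence_gb:.0f} GB")  # 135
print(f"Roughly {telepresence_gb / daily_gb:.1f}x today's daily average")  # ~4x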

Thursday, December 10, 2009

By 2012, "Closed" Mobile Business Will be Over


Today, the wireless sector is on the edge of a seismic shift, says Deloitte. A survey of wireless industry executives found that 53 percent of surveyed network service provider executives believe their current closed business models will no longer exist by 2012.

That is a shocking finding, for several reasons. Many in the policy community seem convinced the only way to "change" the mobile industry is to legislate more "openness." Mobile industry executives, on the other hand, already believe openness will be the normal way they compete, within a shockingly short period of time.

One way of putting matters is that before the major legal challenges to any new set of wireless "neutrality" rules can be clarified, the industry already might have moved to an open business model, and arguably would have done so without any government action.

If some readers believe this is highly unlikely, one need look no further than the last major revision of U.S. telecommunications policy, the Telecommunications Act of 1996. Despite the fact that many observers argue the Act "failed," you would be hard pressed to find any user of communications who argues their services, prices and features are "worse" or even "the same" as prior to 1996.

Despite the current mistrust of markets, the recent record suggests that "regulatory failure" did not impede market success, defined as better and richer services for end users.

It appears the same thing is happening in the mobile business, and that mobile industry executives widely believe a shift to open models, precisely the state of affairs many policy advocates desire, already is happening at rapid speed.

In just three short years, economic power in the mobile business will be held by third party application providers, not service providers, mobile executives themselves believe.

More than half of the executives surveyed believe by 2010 the future of mobile will be driven by open mobile content, with 67 percent of the respondents believing it will be a “game changing” force within wireless in the short-term, Deloitte reports.

"When asked which mobile operating system has the greatest potential to be the U.S. de facto standard in five years, Google’s open source Android operating system was the runaway favorite with 43 percent of all votes, more than double the score of the next highest finisher," Deloitte says.

"In fact, 27 percent of those surveyed say that Internet companies, rather than network
carriers and handset makers, will dominate the U.S. wireless sector in five years," says Deloitte.

Nearly 60 percent of industry executives surveyed agreed that the future of mobile will be driven by open content and mobile software application providers.

"While almost two thirds of the survey respondents believe that open access regulations will accelerate the commoditization of U.S. wireless network carriers, companies that focus too narrowly on regulatory issues as the key catalyst for change may in fact miss the real market opportunities being driven by open platforms and technologies," Deloitte says.

The regulatory debate over "openness" obscures what will happen, irrespective of any new regulatory intervention. "In fact, when respondents were asked on the best course of action for network carriers to sustain their competitive advantage, keeping network access, devices and services tightly controlled and retaining as much as possible current proprietary business models was the least popular response."

In fact, 74 percent of the executives said that the key to their businesses in the future was to embrace open application and content models. One can argue that regulatory protections to open up networks are important because they will help this "natural" state of affairs to develop on its own.

It might not be politically popular at the moment to argue that a regulatory "light touch" still is the best course of action. But industry executives themselves seem committed to the view that open mobile networks are in fact the fast-coming and basic industry reality.

Whether one agrees that the Telecom Act was a success or failure does not seem to matter. The market seems to have led to success, in spite of regulatory failure. Maybe we should not be in such a hurry to tinker with the process too much. It looks like openness is the future, no matter what interventions happen, or do not happen.

Net Neutrality and Free Speech: Issue More Complicated Than You Might Think

"Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances."
Most of us likely think we understand what the First Amendment to the U.S. Constitution actually means. Most of us might be surprised at how complicated the matter has proven to be. It comes as no surprise that there is vociferous debate about what speech is, what a "speaker" is and whose speech is to be protected.

Among the issues jurists and courts have had to wrestle with are "whose" rights of speech are protected. Originally, it was the rather narrow right of political speech, a right possessed by the speaker, that was protected. Over time, though, there have been refinements or travesties, depending on one's point of view.

The classic example is a free speech restriction based on time or place. There is no constitutional right to "yell fire in a crowded theater," setting off a panic.

Over time, courts have had to grapple with what a "speaker" is. Under the law, a corporation, for example, is a "person." Does a person have the right of free speech?

Over time, the definition of "speech" has widened, and now is a mix of the rights of the speaker and the "rights" of the listener.

To the extent that network neutrality touches off yet another round of debates about how the right of free speech applies, we likely will find serious debate yet again. It's a lot more complicated than most of us might think.




