Showing posts with label network neutrality. Show all posts

Sunday, November 29, 2009

Content Delivery Networks and Network Neutrality: Net Is Not Neutral

Much discussion about network neutrality seems to assume that the issue is bit or application "blocking," and from one perspective that is correct. The existing Federal Communications Commission rules about users' right to use all lawful applications already prohibit blocking of legal applications on wired networks. The issue is whether those rules, and the other "Internet Freedoms" principles, should also be extended to the wireless domain.

In another sense, popular perceptions are misguided or worse. There is a separate issue: whether it is ever permissible, for any legal reason, to shape traffic, whether to maintain network performance, provide an enhanced service to a user, or create a new level of service.

Some will maintain there are other ways of maintaining end user experience aside from traffic shaping. That is arguably correct, but might cost so much that the entire consumer access pricing regime has to change in ways people will find objectionable.

Some argue that any traffic shaping of legal bits should be banned, because such practices have undesirable business impact. "No bits should have any priority," that line of reasoning suggests.

One might simply note that about 60 percent of video bits--almost universally served up by media companies--already enjoy such "unequal treatment." Indeed, that is the purpose of a content delivery network: to expedite the delivery of some bits, compared to others, so that a better end user experience is possible.
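The basic mechanism is simple to sketch: a CDN steers each request to the nearest (lowest-latency) edge cache rather than to the distant origin server, so those bits arrive faster than ordinary traffic. A minimal illustration in Python; the node names and latency figures are invented for the example:

```python
# Hypothetical sketch of CDN request routing: serve each request from
# the edge node with the lowest measured latency, not the origin.
# Node names and latencies below are made up for illustration.

def pick_edge(nodes):
    """Return the node with the lowest latency, in milliseconds."""
    return min(nodes, key=lambda n: n["latency_ms"])

nodes = [
    {"name": "origin-datacenter", "latency_ms": 95},
    {"name": "edge-chicago", "latency_ms": 12},
    {"name": "edge-dallas", "latency_ms": 28},
]

best = pick_edge(nodes)
print(best["name"])  # the Chicago edge wins: 12 ms beats 95 ms to origin
```

The point of the sketch is only that "expedited delivery" is an ordinary engineering choice, not an exotic one.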

In fact, about $1.4 billion was spent in 2008 precisely to deliver such expedited bits. The U.S. market currently generates an estimated 55.8 percent of the global CDN traffic, though international traffic is now increasing at a faster rate than its domestic counterpart, according to Research and Markets.

And though video delivery historically has been the CDN staple, new growth areas include whole site delivery, dynamic content, "live" video, high-definition video, mobile and smartphone applications, other non-PC devices and adaptive bit rate streaming, Research and Markets notes.

Of the 22.5 billion professional video views served during 2009, Akamai delivered 31.9 percent, Limelight Networks 12 percent and Level 3 11.2 percent, says Research and Markets. Additional CDNs active in the market include CDNetworks, Velocix, Liquid Compass, Abacast, Mirror Image, Edgecast Networks, Highwinds, BitGravity, Cotendo and Internap, the firm notes.

The point is that preferential delivery of bits already is an established part of the way the Internet works. Private network users, especially businesses, also commonly set up traffic priority systems for their internal communications and content.

The ability of a consumer end user to choose to use such services and applications is one of the implications of the network neutrality debate that often is lost. To reiterate, preferential treatment of bits already is happening on a wide scale, and for very good reasons: to preserve end user experience. Perhaps we ought not to be in such a rush to foreclose practices and capabilities of obvious value.

Tuesday, November 24, 2009

Do Usage Caps for Wireless and Mobile Broadband Make Sense?


Consumers say 60 percent of the wireless broadband decision is based on two factors: monthly recurring charge and existence or size of a usage cap. For that reason, "data caps" are a particularly unfriendly way to manage overall traffic, says Yankee Group analyst Philip Marshall. 

A better approach, from a service provider perspective, is to offer unlimited usage and then manage traffic using real-time, network intelligence-based solutions such as deep packet inspection and policy enforcement, Marshall argues.

Some would argue that fair use policies that throttle maximum speeds when policies are violated are no picnic, either. But temporary limits on consumption, applied only at peak hours, arguably are more consumer-friendly than absolute caps with overage charges.

To test consumer preferences, Yankee Group conducted a custom survey that included a "choice-based conjoint analysis," which allowed Yankee Group analysts to estimate the relative importance to consumers of key wireless broadband service attributes. The survey was taken by 1,000 mobile consumers who also use broadband access services. 

From the conjoint analysis, "we found that, on average, 59 percent of a wireless broadband purchase decision depends on two factors: service price, and the presence or absence of a 2 GByte per month usage cap," Marshall says. 

The results also indicate that 14.5 percent of a typical purchase decision is affected by service bandwidth, and that the implied average revenue per user lift when increasing bandwidth from 768 Kbps to 2 Mbps ranges between $5 and $10 per month.

The results also indicate, however, that there are diminishing returns for service plans that offer speeds above 3 Mbps, though speed increases might be useful for other reasons, such as competitive positioning. 

"Our price elasticity analysis implies that consumers are willing to pay $25 to $30 more per month for plans that offer unlimited usage, compared to plans that have a 2 GBytes a month usage cap," says Marshall.

"In a competitive operating environment, consumers will tend to migrate toward higher bandwidth services, all else being equal, but they are not necessarily willing to pay a significant premium for the added performance capability," says Marshall.

The most recent survey results indicate that consumers require 2 Mbps to 3 Mbps of bandwidth for their broadband service. This requirement is likely to increase dramatically over the next two to three years, but the consumer survey suggests dramatically higher bandwidth does not affect purchase decisions as much as recurring price and the existence of usage caps.

For example, when offered a choice between one package featuring a 2 GByte per month usage cap with 6 Mbps bandwidth, and another package with unlimited monthly usage but just 2 Mbps service speed, 63 percent of consumers opted for the 2 Mbps service with no cap.

Even when the choice is between an unlimited package offering only 768 Kbps bandwidth, compared to an alternative plan with 6 Mbps bandwidth and a 2 GByte per month usage cap, 57 percent preferred the 768 Kbps package.

Service providers still must manage bandwidth demand, though, with or without usage caps. Usage caps work to regulate demand, but users do not like them.

The other approach is not to impose usage caps, but instead to use policy management and deep packet inspection to manage traffic flows.

If such solutions are implemented in a non-discriminatory manner, so that all like services are treated equally, they can be implemented irrespective of network neutrality regimes currently under consideration, Marshall believes.

Thursday, November 19, 2009

If You Wanted to Build a National 100-Mbps Access Network, Could You?

The Federal Communications Commission says it will cost $350 billion to build a single, nationally available broadband access network operating at 100 Mbps and reaching virtually every American. The FCC also says it is studying whether telcos and cable companies should be forced to offer open access to third parties that want access to their networks.

Assuming one believes that both ubiquitous access and 100-Mbps speeds are desirable--and virtually everyone might agree, in principle, that this is a worthy goal--the issue becomes "how to get there."

At some fundamental level, policymakers will have to decide whether they want maximum deployment and innovation in terms of new physical facilities, or maximum third-party access.

Some will argue this is a false choice. That is possible. There is no way to predict with certainty what will happen if robust open access policies are instituted. 

That would be especially true if cable operators, for the first time in industry history, also were forced to open up their facilities for open access. 

Many will point to mandatory open access policies existing elsewhere in the world, and argue the same sorts of benefits can accrue in the U.S. setting. Some consumer advocates say open access is one reason why Internet service is cheaper and faster in those countries. It's a complicated question to answer, however.

In most, if not all countries where robust open access rules apply to telcos, the competitive landscape is quite different from that of the United States. Few other countries have ubiquitous cable broadband and telco broadband. 

That might not seem, at first blush, to be much of an issue. It is, and the reason is as simple as pointing out that competitive markets are distinctly different from monopoly markets. Keep in mind that a single provider of very-high-speed access, operating on an open access model, still is a monopoly. There is one network and all comers can pay to use it. 

The issue is that such a provider, or providers, as would be the case in the United States, would not be able to operate as a monopoly, because there no longer is any such thing in the U.S. broadband communications business. 

In most communities, there already exist two fixed broadband access providers: the cable company and the telephone company. In addition, there are places where a third fixed operator exists, or one or more fixed wireless providers.

Then there are two national satellite broadband providers, WildBlue and HughesNet.

Beyond that, there are four mobile providers with existing or partially-built mobile broadband networks, as well as Clearwire, also in the process of building its own national broadband network.

So here's the problem. Where open access broadband networks are most successful, there is not a ubiquitous cable competitor fighting head to head for customers. Assume for the sake of argument that cable providers, nationally, have about 48 percent share of the fixed market, all telcos collectively have 38 percent, and other providers have the rest. 

Assume away all the issues of changing the business models of the whole industry so that one provider in each locality is charged with building a 100-Mbps access network, and is then free to provide service to all comers, at government-mandated rates.

Assume away the problem of the actual wholesale rate, which was part of the Telecommunications Act of 1996. That Act imposed just such an open access policy on major U.S. telcos.

To simplify what happened in the aftermath: telcos argued the mandated wholesale rates were too low, while competitors argued just as vociferously that they were too high. At the same time, investment in faster broadband facilities slowed dramatically, for one simple reason: telcos saw no advantage in investing in expensive new facilities that offered a financial return unappealing to the lenders who would have to supply the money.

All of that changed when new rules were written that exempted new fiber-based facilities from the open access requirements. Keep in mind that cable companies still do not have any open access requirements of any sort, and that any new broadband policies might well require them to provide wholesale access as well, and that they might also object to the mandatory wholesale rates. 

But ignore that for the moment. Here's the investment problem. Companies have to raise $350 billion in private capital to build the network. And when they develop their financial projections, they will have to note that the new revenue from building the $350 billion network is based on the incremental difference between what typical customers now pay for broadband access, and what they will pay for 100 Mbps access.

But there are other services on the network, you might point out. That's true. But here's the problem. The new network only replicates voice and video revenue already earned on the existing networks. No smart lender is going to okay huge sums based on replicating existing revenues. They will want to know what new and additional sources of revenue will exist. 

The providers can argue that where consumers now pay $40 a month for single-digit megabits per second of access, they will pay $100 to $200 a month for 100 Mbps access. Then the providers will have to model what percentage of customers will do so. When the number turns out to be quite small, the money will not be raised.

There just aren't all that many customers willing to pay $100 to $200 a month to get 100 Mbps when they can do nicely with 20 Mbps to 40 Mbps for lots less money. Ask people. They will tell you what they'll do.

You might argue that take rates will be very high if people can buy 100 Mbps for $40 a month. And that's correct. The problem is again that $350 billion cannot be raised if the new network has no ability to pay a return, in a reasonable amount of time, on the investment. And at anything like $40 a month, no lenders are going to cooperate. 

But matters actually are more complicated than that, as if that was not a show stopper. Recall that most people who want broadband access already buy it. Recall that cable providers, with their own networks, serve about 48 percent of the customers. 

Ask any cable executive you can find whether they would be willing to stop using their own network and just buy access from the telco. Go ahead. Ask anybody you can find. Let me know when you find anybody that says they will do so. 

But ignore that. Say the local telco is charged with building the 100-Mbps access network, and that somehow lenders are convinced that large numbers of people will buy the more-expensive 100 Mbps service. How many of its own customers, and customers of other providers, will switch to buying the 100-Mbps service? 

Be generous and say 20 percent of all broadband access customers can be convinced to buy the 100-Mbps service. That means about eight percent of the telco's own retail customers will do so. 

Say 20 percent of cable customers desert. That adds another 10 percent of U.S. broadband customers. Then assume 20 percent of all the other customers likewise make the move. That adds another three percent of current broadband customers.

What that all works out to is that about one in five homes or locations the new 100-Mbps network passes will buy the higher-priced access service. So the issue is whether an adequate financial payback can be built on serving one in five locations passed, with a single new service.
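The take-rate arithmetic above can be checked directly, using the market shares and 20 percent switching rate assumed in the text:

```python
# Reproduce the take-rate arithmetic: assume 20 percent of each
# provider segment switches to the new 100-Mbps service.
# Market shares are the ones assumed in the text above.
telco_share = 0.38
cable_share = 0.48
other_share = 1 - telco_share - cable_share   # 0.14
switch_rate = 0.20

from_telco = telco_share * switch_rate   # roughly 8 percent of all homes
from_cable = cable_share * switch_rate   # roughly 10 percent
from_other = other_share * switch_rate   # roughly 3 percent

total = from_telco + from_cable + from_other
print(round(total, 2))  # 0.2 -- about one in five homes passed
```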

You might argue there also is voice and video, but the problem is that the existing networks already provide those services. Additional revenue is not created just because the network changes.  

But assume an investment of $2,700 per passing to build the network. Assume the 20 percent take rate and $60 a month in incremental revenue per customer (a $100 monthly price, versus roughly $40 today).

Based on those assumptions, the network costs $13,500 per customer, since only one in five homes is a buyer. At an incremental $60 a month in revenue, breakeven (even at zero interest cost) is 225 months, or 18.75 years per customer.
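That breakeven arithmetic, stated as a sketch using the figures above:

```python
# Reproduce the breakeven arithmetic from the text above.
cost_per_passing = 2700      # dollars per home passed
take_rate = 0.20             # one in five homes passed actually buys
incremental_revenue = 60     # incremental dollars per customer per month

# Only one in five passings yields a customer, so each customer must
# carry the construction cost of five passings.
cost_per_customer = cost_per_passing / take_rate          # 13,500
breakeven_months = cost_per_customer / incremental_revenue
breakeven_years = breakeven_months / 12

print(cost_per_customer, breakeven_months, breakeven_years)
# 13500.0 225.0 18.75 -- and that assumes zero interest on the capital
```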

Nobody will lend money for a breakeven of 18.75 years, and that is assuming zero interest on borrowed money.

An open-access 100-Mbps network might be a worthy public policy goal. But it is hard to see how money can be raised to build it. 

Thursday, November 12, 2009

Does "Open Access" Lead to More or Less Consumption of Broadband?

Samuel Clemens famously quipped that there are "lies, damned lies and statistics." Something like that seems to be at the heart of conflicting analyses of the impact of widespread open access requirements on consumer buying of broadband access services.

The Berkman Center for Internet & Society suggests robust open access regulation increases consumer buying of broadband, while analysts at the Phoenix Center say the opposite is true.

The interpretation matters. Good public policy requires decisions that are based on facts, as difficult as it may be to determine precisely what the "facts" are. The wrong "fact base" will lead to policies that could harm the intended public policy goal.

http://www.fcc.gov/stage/pdf/Berkman_Center_Broadband_Study_13Oct09.pdf

http://www.phoenix-center.org/perspectives/Perspective09-05Final.pdf

Tuesday, November 3, 2009

"Surprising" AT&T Stance on Net Neutrality?

Some people might be shocked to learn that AT&T complies with existing Federal Communications Commission rules. Some people might be shocked to learn that AT&T actually already agrees that "best effort" Internet services ought to treat every packet the same as every other.

“We use the principle of ‘us on us,’” says AT&T CTO John Donovan. “If we take an external developer and ourselves, we should not be advantaged in how long it takes or how much expertise is required."

"I don’t think it needs to be that complicated," he says. Does any application run by any third party work as well on the network as an AT&T-provided application?

"Outside applications need to be on an equal footing with our own applications," Donovan says.

But that's part of the problem with net neutrality: it is very hard to define. It covers a range of business discrimination issues, network management and performance practices, as well as potential future services that consumers might very well want to buy--services that provide value precisely because they allow users to specify which of their applications take priority when the network is congested.

As a working definition, net neutrality is the idea that ISPs cannot "discriminate" between packets based on the owner or sender of packets, or on the type of lawful application, or block lawful packets.

The latter principle already applies to fixed broadband access connections, and the new change might be the extension of such rules to wireless providers. What is "new" in the current net neutrality debate is the concept that no packet can be afforded expedited handling, compared to another.

At some level, this is common sense. One wouldn't want video packets or voice packets sold by a third party to be disadvantaged, compared to video packets sold by the Internet access provider, for example.

But that isn't the issue in the current round of discussions and the possible FCC rulemaking. The issue is more whether "affirmative" packet handling, as opposed to "negative" packet handling, will be lawful in the future.

"Negative" packet handling is sort of a "thou shalt not" approach: application providers should have a reasonable expectation that their best-effort Internet traffic will be handled the same way as any other application provider's traffic is treated. So ISPs "shalt not" provide any quality-of-experience advantage for their own application bits, as compared to any other bits delivered over the network.

All that sounds fair and reasonable, and in fact ISPs (after a few notable cases of interference) have concluded it is not worth the public outrage to block or delay packets to heavy users, even when networks are congested, for the purpose of maintaining overall user experience for all the other users.

But there are several issues here. Good public policy would forbid business discrimination, a situation where any ISP could attempt to favor its own applications over those provided by its competitors. Back in the "old days," an example might have been a refusal by one telephone company to deliver calls from a rival.

But the network neutrality debate is far more complicated than that. There is a broad area where network management policies designed to maintain performance might be construed as business discrimination, even when the purpose is simply to protect 95 percent of users from heavy demand created by five percent of users.

Under heavy load, real-time applications such as video and voice suffer the most. So either end users might want, or ISPs might prefer, to give priority to those sorts of applications, at peak load, and slow down packets less sensitive to delay.

The problem with crudely-crafted net neutrality rules is that they might make illegal such efforts to maintain overall network performance for most applications and most users. One can hope that will not be the result, but it remains a danger.

The other issue is creation of new services or applications that can take advantage of expedited handling. Users might want their video or voice packets to have highest priority when there is network congestion. Crude net neutrality rules might make that impossible. But one can hope policymakers will take that sort of thing into consideration.

Net neutrality is a very-complicated issue with multiple facets. Ironically, end users might, in some cases, actually want packet discrimination.

Friday, October 30, 2009

Pandemic Would Impair Residential Broadband, GAO Says

In a serious pandemic, residential Internet access demand is likely to exceed the capacity of Internet providers’ network infrastructure, says the Government Accountability Office. That means enterprise and government disaster recovery efforts that depend on residential broadband connections may not work as planned, GAO warns.

In a serious pandemic, U.S. businesses, government agencies and schools could experience absenteeism (or forced dispersal of workers as a precautionary measure) reaching 50 percent or more, displacing Internet access demand from normal daytime sites to homes, GAO says.

But residential broadband networks are not designed to handle this unexpected load, and the resulting congestion could interfere with teleworkers in the securities market and other sectors, according to the Department of Homeland Security.

Oddly enough, robust network neutrality measures, such as forbidding any prioritization of bits, could render impotent one obvious way of handling the sudden explosion of traffic.

"Private Internet providers have limited ability to prioritize traffic or take other actions that could assist critical teleworkers," GAO says. "Some actions, such as reducing customers’ transmission speeds or blocking popular Web sites, could negatively impact e-commerce and require government authorization."

In other words, laws and rules that forbid "packet discrimination" would impair ability to prioritize more-important work-related uses of the residential Internet.

"Increased use of the Internet by students, teleworkers, and others during a severe pandemic is expected to create congestion in Internet access networks that serve metropolitan and other residential neighborhoods," GAO warns.

"Localities may choose to close schools and these students, confined at home, will likely look to the Internet for entertainment, including downloading or 'streaming' videos, playing online games, and engaging in potential activities that may consume large amounts of network capacity," GAO says.

"Additionally, people who are ill or are caring for sick family members will be at home and could add to Internet traffic by accessing online sites for health, news, and other information," GAO adds. "This increased and sustained recreational or other use by the general public during a pandemic outbreak will likely lead to a significant increase in traffic on residential networks."

"If theaters, sporting events, or other public gatherings are curtailed, use of the Internet for entertainment and information is likely to increase even more," GAO says. At-home workers will only compound the problem.

Oddly enough, the mechanisms ISPs could use to prioritize bandwidth so that a suddenly-scarce resource can be managed are precisely the tools strong "network neutrality" forbids.

"A provider could attempt to reduce congestion by reducing the amount of traffic that each user could send to and receive from his or her network," says GAO. "Such a reduction would require adjusting the configuration file within each customer’s modem to temporarily reduce the maximum transmission speed that that modem was capable of performing—for example, by reducing its incoming capability from 7 Mbps to 1 Mbps."

"However, according to providers we spoke with, such reductions could violate the agreed-upon levels of services for which customers have paid," GAO points out.

And that is even before any new regulations that specifically would outlaw packet shaping that could, for example, limit video streaming, gaming, and peer-to-peer and other bandwidth-intensive applications during daytime work hours, when teleworkers will have an arguably greater need to maintain functioning connections for voice and data operations essential to their work.

Overly-casual positioning of the need for "packet equality" rules can be dangerous, as the GAO points out.

Thursday, October 29, 2009

Google Blocks Calls to About 100 High-Cost Telephone Numbers

Google says it still blocks use of Google Voice to terminate calls to fewer than 100 U.S. telephone numbers with unusually high termination costs. Earlier, Google Voice had been blocking calls to thousands of numbers in some exchanges.

In a letter to the Federal Communications Commission, Google says a June 2009 study it conducted found that the top 10 U.S. telephone prefixes Google Voice was terminating accounted for 1.1 percent of its monthly call volume, about 161 times the expected volume for a "typical" prefix. That 1.1 percent of calls also accounted for 26.2 percent of its monthly termination costs.

Google says terminating those calls costs as much as 39 cents a minute. Google therefore blocked Google Voice calls to fewer than 100 U.S. telephone numbers, based on that study.

The difference is that where Google before had been able only to block calls to entire prefixes, it now can block specific telephone numbers with the highly asymmetric traffic typical of free conference call services, for example, which never place outbound calls but simply receive them.

Wednesday, October 28, 2009

Consumption-Based Billing Coming?

Sandvine has released Usage Management 2.5, a software solution that enables fixed-line network operators to implement consumption-based billing models, real-time subscriber communications and multiple service plan tiers. The move is significant because it suggests retail pricing might move in that direction, a major shift from today's flat-rate models.

Historically, consumption-based billing has been problematic for Internet service providers. Time Warner Cable tested and then decided not to implement metered billing earlier in 2009, after widespread consumer resistance to tests in Rochester, N.Y.; Austin and San Antonio, Tex.; and Greensboro, N.C.

User behavior also is powerfully affected by billing methods. At one point, America Online charged users by the minute for dial-up Internet access. When it converted to flat-fee billing, usage and subscribers exploded, and AOL became the largest U.S. ISP.

Similar results have been seen when other types of services, such as voice calls, also moved from per-minute to flat rate or "buckets" of usage. Generally, users spend more time talking or using the Internet when they are not metered for that usage.

Mobile voice services take a halfway approach that combines usage limits with much of the perceived freedom users feel when they are not charged strict per-minute rates. Such "buckets" of usage are a likely direction for much retail Internet access pricing as bandwidth-intensive applications become more important and if new "network neutrality" rules forbid ISPs from shaping overall demand at times of peak congestion.

The alternative to traffic shaping then would shift to other measures such as increasing raw bandwidth or providing incentives for users to limit their consumption at peak hours. The former obviously requires more investment, which then would have to be reflected in higher prices, while the latter would allow for more gradual investments and therefore stable or more slowly increasing prices.

One problem today is that few consumers have any idea how much bandwidth they use. The new Sandvine tool would simultaneously allow users to monitor and understand their own behavior, as well as provide ISPs with better ways to create plans matched to end user behavior.

The Sandvine tool also would help ISPs create quality-sensitive service or personalized plans, assuming Federal Communications Commission or Congressional rules allow them to be offered.

Real-Time Internet Traffic Doubles

Real-time entertainment has more than doubled its share of total Internet traffic from 2008 to 2009, while gaming has increased its share by more than 50 percent, says Sandvine. Real-time entertainment traffic (streaming audio and video, peer-casting, place-shifting, Flash video) now accounts for 26.6 percent of total traffic, up from 12.6 percent in 2008, according to a new analysis by Sandvine.

As the percentage of real-time video and voice traffic continues to grow, latency issues will become more visible to end users, and will prompt new efforts by Internet access providers to provide better control of quality issues not related directly to bandwidth.

One reason is that video downloads, for example, are declining in favor of real-time streaming. Downloaded content is less susceptible to latency and jitter impairments.

Traffic to and from gaming consoles increased by more than 50 percent per subscriber as well, demonstrating not only the popularity of online gaming, but also the growing use of game consoles as sources of “traditional” entertainment such as movies and TV shows, says Sandvine.

Gaming, especially fast-paced action gaming, likewise is susceptible to experience impairment caused by latency and jitter.

The growth of real-time entertainment consumption also is leading to a decline in peer-to-peer traffic. At a global level, P2P file-sharing declined by 25 percent as a share of total traffic, to just over 20 percent of total bytes, says Sandvine.

The changes have key implications for ISPs and end users. One way to protect performance for real-time applications such as voice, video, videoconferencing and gaming is to take extra measures to protect latency for those applications. And that is where clumsy new network neutrality rules might be a problem.

Whatever else might be said, user experience can be optimized at times of peak congestion by prioritizing delivery of real-time packets, compared to other types of traffic that are more robust in the face of packet delay. File downloads, email and Web surfing are examples of activities that are robust in the face of congestion.

So it matters greatly whether ISPs can condition end user traffic--especially with user consent--to maintain top priority for streaming video, voice or other real-time applications when networks are congested. Enterprises do this all the time. It would be a shame if consumers were denied the choice to benefit as well.
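The standard enterprise mechanism for this is Differentiated Services (DiffServ): real-time packets are marked with a DSCP code point such as Expedited Forwarding, and DSCP-aware routers queue them ahead of bulk traffic. A minimal sketch in Python of how an application marks its own traffic; whether routers honor the mark is a separate policy question, which is exactly what the neutrality debate is about:

```python
import socket

# Sketch of DiffServ-style prioritization: mark a UDP socket's outgoing
# packets with the Expedited Forwarding (EF) code point, the marking
# conventionally used for voice and other real-time media.

EF_DSCP = 46                # Expedited Forwarding code point
TOS_VALUE = EF_DSCP << 2    # DSCP occupies the top 6 bits of the TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

# A real application would now send RTP media over this socket; here we
# just confirm the option took effect (on Linux this reads back 184).
print(sock.getsockopt(socket.IPPROTO_IP, socket.IP_TOS))
sock.close()
```

Marking is only half the system: the routers along the path must be configured to give EF-marked packets priority queuing, which enterprises do routinely on their own networks.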

Monday, October 26, 2009

Net Neutrality: What Verizon and Google Can Agree On

Though there are many issues upon which Verizon and Google disagree, both companies say they agree on some elements of network neutrality.

"For starters we both think it's essential that the Internet remains an unrestricted and open platform. where people can access any content (so long as it's legal), as well as the services and applications of their choice," say Lowell McAdam, CEO Verizon Wireless and Eric Schmidt, CEO Google.

That should come as no surprise. Those rules already are part of the Federal Communications Commission "Internet Freedoms" principles.

Both executives say the current debate about network neutrality is about the best way to "protect and promote the openness of the Internet."

Both executives say "it's obvious that users should continue to have the final say about their web experience, from the networks and software they use, to the hardware they plug in to the Internet and the services they access online."

"Second, advanced and open networks are essential to the future development of the Web," McAdam and Schmidt say. "Policies that continue to provide incentives for investment and innovation are a vital part of the debate we are now beginning."

"The FCC's existing wireline broadband principles make clear that users are in charge of all aspects of their Internet experience--from access to apps and content--so we think it makes sense for the Commission to establish that these existing principles are enforceable, and implement them on a case-by-case basis," McAdam and Schmidt say.

"We're in wild agreement that in this rapidly changing Internet ecosystem, flexibility in government policy is key," they emphasize. "Policymakers sometimes fall prey to the temptation to write overly detailed rules, attempting to predict every possible scenario and address every possible concern," and that "can have unintended consequences."

Both executives say "broadband network providers should have the flexibility to manage their networks to deal with issues like traffic congestion, spam, 'malware' and denial of service attacks, as well as other threats that may emerge in the future, so long as they do it reasonably, consistent with their customers' preferences, and don't unreasonably discriminate in ways that either harm users or are anti-competitive."

"They should also be free to offer managed network services, such as IP television," both men say.

"While Verizon supports openness across its networks, it believes that there is no evidence of a problem today -- especially for wireless -- and no basis for new rules and that regulation in the US could have a detrimental effect globally," they say. "While Google supports light touch regulation, it believes that safeguards are needed to combat the incentives for carriers to pick winners and losers online."

That isn't to say the two firms have identical interests or views. But as we have seen in prior discussions about net neutrality, there is more room for compromise than sometimes seems to be the case. That undoubtedly will be the case this time around, as well.

Thursday, October 22, 2009

Will Net Neutrality Curtail Broadband Investment?

Nobody knows what final shape proposed new network neutrality rules might take. What already is clear is the shape of the debate over the impact of such rules on network investment. Predictably, proponents of strong new rules say carriers are bluffing about the stifling effect new rules might have.

Just as predictably, leading industry executives say that is precisely the danger.

“We’ve invested more than $80 billion over the last five years to build these platforms for growth, and that’s Verizon alone,” says Verizon Chairman Ivan Seidenberg.

Speaking about the transformative role communication and information technologies can, and should have, Seidenberg cautioned that “while this future is imminent, it is not inevitable, and the decisions we make today – as an industry and as a country – will determine whether the benefits of these transformational networks will be felt sooner or much, much later.”

“Our industry has shown that we can work with the government as well as our partners and competitors to achieve mutually desirable goals of more competition, consumer choice and broadband expansion," Seidenberg says. "But we can’t achieve these ends if we interrupt the flow of private capital and delay the cascading productivity impacts of a more networked world."

“Rather than impose rigid rules on a rapidly changing industry, the FCC should focus on creating the conditions for growth,” he says.

Frank Gallaher, a Stifel Nicolaus analyst, warned of just that outcome, suggesting that at least some policy advocates are too sanguine about the impact on investment if harsh new rules are enacted. Likewise, Matt Niehaus of Battery Ventures warned that telecom investment capital has been declining over the past 10 quarters. The capital flight is caused in large part by a perception that there is too much competition in telecom, and that further investment therefore is less likely to provide an adequate return on invested capital.

"It's a perception on Wall Street: there's too much competition, and therefore it's difficult for entities to obtain a great return," he says.

"One of the things that worries me is you can execute very well, and the problem is you may do all those things right, yet it's not clear you will be rewarded on the back end for it," Niehaus says.

But S. Derek Turner, Free Press research director, says carrier investment decisions are driven by a variety of factors, but regulation plays only a minor role.

"In general, firms’ investment decisions are driven primarily by six factors: expectations about demand; supply costs; competition; interest rates; corporate taxes; and general economic confidence -- making the overall decision to invest a complex process that is highly dependent on the specific facts of a given market," says Turner. "It is simply wrong to suggest that network neutrality, or any other regulation, will automatically deter investment."

Turner argues that "at the end of 2006, AT&T, as a condition of its acquisition of BellSouth, was required by the FCC to operate a neutral network for two years. During this period, while operating under network neutrality rules, AT&T’s overall gross investment increased by $1.8 billion, more than any other ISP in America."

"In its wireline segment (which was specifically subject to the FCC’s fifth principle of nondiscrimination, in addition to the other four open Internet principles in the agency’s Internet policy statement), AT&T’s gross capital investment increased by $2.3 billion," says Turner.

As a percentage of wireline revenues, AT&T’s wireline investments grew from 13.5 percent in 2006 to 20.2 percent in 2008, he also argues.

"During the years following the imposition of pro-competitive regulations on incumbent phone companies as stipulated in the 1996 Telecom Act, investment as a percentage of revenue by these companies rose from nearly 20 percent before the enactment of the law to a high of 28 percent in 2001," Turner argues. "In the years following the dismantling of these rules, relative investment levels declined to below 17 percent in 2008."

In fairness, the issue is fairly complex. One might argue that AT&T was willing to invest, even under temporary "neutrality" rules, precisely because those rules were temporary. One might argue that some investment was driven by competitive concerns, not necessarily because of high return on invested capital.

Indeed, investment as a percentage of revenue may have grown precisely because the returns from broadband services are lower than the returns from voice services.

Also, investment might have declined in 2008 because of the recession, or because such investment is powerfully affected by the general level of competition. In other words, executives might have been investing more than they believed they "should," not to gain revenue or share but simply to hold it. That, in fact, is precisely what executives say privately.

The other imponderable is that current net neutrality rules are fairly benign, simply allowing end users access to all lawful applications. Proposed new rules might go much further, prohibiting development of new services that could drive new revenue.

To argue that benign rules have had benign impact is one thing. It is quite another thing to extend rules in ways that might actually choke off needed new revenue opportunities, at a time when everybody agrees the current revenues are unsustainable. Forcing wireless companies to follow the same rules that might be applied to wired networks with vastly more bandwidth is one example.


Saturday, October 17, 2009

End User Danger from Overly-Broad Net Neutrality?

Keep in mind that there is nothing the government can do about the Internet--the quality of our services, the amount of innovation, or investment in innovation--that will not benefit or harm somebody's interests.

That doesn't mean any particular policy is wrong or right, simply that there is nothing "good" anybody can do in Washington, D.C. that does not at the same time have huge financial implications. The way I have always understood this principle is that "for every public purpose there is a corresponding private interest."

Perhaps nothing would have greater potential impact than any move to apply regulations--of any new sort--to IP networks generally, not just the "public Internet."

The reason that would be troubling is that all sorts of networks now use IP technology, not just the "Internet." Private corporate networks, satellite TV, cable TV, telco TV, satellite and terrestrial networks of many sorts use the same technology as the public Internet, but are not part of the public Internet.

From a policy perspective, that implies great danger. The reason is that radio, TV, print and communications all are regulated in very different ways. But as all services now can be delivered using IP technology or the public Internet, definitions that are too broad will ensnare any "net neutrality" rulemaking in a broader regulatory discussion that simply cannot be entertained at the FCC's level.

Raise the number of affected interests, as such a broad move to regulate all IP traffic would, and nothing will happen. Some might find this the best outcome, but to the extent that anything rational gets accomplished, the discussion must be contained in some real ways.

The nature of broadband access lines is that they can carry any sort of traffic, and some of that traffic is regulated in very different ways, some of which the government has little right to regulate. Phone services are the most heavily regulated; content of the sort we once associated with newspapers is least regulated.

Radio and TV broadcast content is more regulated than print, less regulated than voice. Cable TV is slightly more regulated than "broadcast," in some ways, slightly less regulated in other ways. Private data networks used by businesses tend not to be regulated at all.

The danger is that too broad an approach accidentally will be taken, ensnaring the entire discussion in broader areas that arguably do need review, but frankly are so complicated now that nothing could be accomplished.

The specific goal of proposed new non-discrimination rules is precisely that: protecting application providers from access provider discrimination. The problem is that "packet discrimination" is at the heart of many other services of extreme value to end users.

Voice, video entertainment and core enterprise business processes are prime examples. Whole ecosystems of end user value are based on the ability to maintain quality of experience at a high level.

On any communications network with congestion, and that is virtually all networks, some applications have higher end user value than others. Packet prioritization of some sort might, under such conditions, be valuable to end users.

So long as business discrimination is not the result of such prioritization, there are lots of good reasons for continuing to allow IP-based businesses to do so, especially when they have the right to do so, based on their differing regulatory regimes.

The danger here for end users and providers of applications is an overly broad treatment of "net neutrality." The question of whether we are talking about private IP networks or the "public" Internet is one such example, especially as Web browsers might be used as the client-side access to private services.

Friday, October 16, 2009

Do Prices, Speeds Benefit From Robust Broadband Wholesale Policies?

“Open access” policies—unbundling, bitstream access, collocation requirements, wholesaling, and/or functional separation—have played a core role in the first generation transition to broadband in most countries with high access rates and lower prices, a new study by the Berkman Center for Internet & Society suggests.

The authors suggest the same principles will be important in the next phase of development, where higher speeds must be provided, as well.

The highest prices for the lowest speeds are overwhelmingly offered by firms in the United States and Canada, all of which inhabit markets structured around "inter-modal" competition--that is, competition between one incumbent owning a telephone system and one incumbent owning a cable system, the report argues.

The lowest prices and highest speeds are almost all offered by firms in markets where, in addition to an incumbent telephone company and a cable company, there are also competitors who entered the market, and built their presence, through use of open access facilities, the report says.

The argument, in essence, is that robust wholesale policies contribute meaningfully to providing consumers with faster speeds and lower prices.

There is a logic to the argument that is hard to agree or disagree with in the abstract, since another huge issue is the setting of policy frameworks that encourage robust investment in new broadband networks by private entities.

No policy will be effective, in any particular country, if private capital cannot be raised to build the networks. Conversely, any policy can work so long as adequate capital can be raised.

And though the temptation is to argue about the implications for strong "network neutrality" policies, that is a different issue. The issue here is the same argument national policymakers had when the Telecommunications Act of 1996 was weighed, namely, "what is the role for wholesale policies" in setting pro-growth and pro-competitive policies?

Friday, October 9, 2009

3% Consume 40% of Mobile Bandwidth, AT&T Says


The top three percent of smartphone users consume 40 percent of all mobile data bandwidth, says AT&T Mobility and Consumer Markets president Ralph de la Vega. Those three percent of users also consume 13 times the data of the average smartphone user, he adds. Another way of quantifying such usage is to note that the users who consume 40 percent of AT&T's mobile data bandwidth constitute just 0.9 percent of all AT&T postpaid mobile subscribers.

The point was clear enough: Without adequate management of network access, most customers will find their experience damaged because of a small number of other users.
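As a rough consistency check on the figures cited above: if the top 3 percent of users each consume 13 times the average, their share of total traffic works out to just about the 40 percent quoted. A quick calculation, treating "average" as the average across all smartphone users:

```python
# Share of total traffic attributable to heavy users: if 3% of users
# each consume 13x the overall average per-user volume, their combined
# share of total traffic is simply 0.03 * 13.
heavy_share_of_users = 0.03
multiple_of_average = 13

share_of_traffic = heavy_share_of_users * multiple_of_average
print(f"{share_of_traffic:.0%}")  # close to the 40 percent figure quoted
```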

There are legitimate public policy concerns about anti-competitive behavior in the wireless and wireline businesses where it comes to gatekeepers of any sort using that power to impair competition. But that is a different and distinct matter from the obvious need to manage shared network resources in ways that actually preserve reasonable access for all other users.

De la Vega used the word "crowd out" to describe such contention, and it is a legitimate issue. Anti-competitive actions certainly are to be protected against. But there are valid network resource management issues that obviously have to be addressed as well, especially in the wireless domain.

Beyond that, there are valid reasons for wanting competition protected, but without stifling consumer access to new products that offer mass market customers features enterprise users take for granted, such as the ability to prioritize their own use of bandwidth to preserve performance of mission-critical applications. If any consumer end user wants to prioritize their own video, voice or other bits, they ought to be able to do so.

There is nothing anti-competitive about this, so long as any applications in the class can receive such prioritization. Consumer advocates are right to note that issues can arise if voice bits sold by the ISP can be prioritized, but not voice bits sold by other competing service providers.

Some approaches will work better than others, and that is an issue one would hope policymakers take seriously into account as new "neutrality" rules are crafted.


Monday, October 5, 2009

Does Net Neutrality Pose Credit Risk for U.S. Wireless Providers?

While there is still uncertainty around potential new rules regarding net neutrality and their impact on wireless operators, Fitch Ratings does not believe potential regulatory changes will materially affect the credit profiles of wireless companies over the longer term.

Fitch does believe the controversial plans outlined by the FCC chairman could face process delays and potential legal challenges once there is clarity about the proposed rules. In other words, there will be no clarity for some time after promulgation of new rules.

In Fitch's opinion, the competitive environment would have likely dictated that the wireless industry naturally evolve in this direction but the conditions and rules currently contemplated by the FCC will likely accelerate the pace at which this transition occurs and place more definitive regulatory restrictions on wireless operators.

Consequently, carriers will likely need to adapt access plans to mitigate the impact that devices with more data intensive applications could have on network quality.

Since nearly all markets experience lower demand when prices are raised, access pricing likely will evolve in ways that generally match price to usage. That does not have to take the form of strict metering, but more likely will take the form of buckets of use, one would suspect.
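A minimal sketch of what "bucket of use" pricing might look like; every tier and price here is invented for illustration, not any carrier's actual plan:

```python
# Hypothetical "bucket of use" tiers: a flat monthly fee covers an
# included allowance, with overage billed per GB, rather than strict
# per-byte metering. All numbers are made up for illustration.
BUCKETS = [
    # (monthly fee $, included GB, overage $/GB)
    (15.00, 0.2, 10.00),
    (30.00, 2.0, 10.00),
    (60.00, 10.0, 5.00),
]

def monthly_bill(gb_used: float, bucket: tuple) -> float:
    """Flat fee, plus per-GB overage for usage beyond the allowance."""
    fee, included_gb, overage_per_gb = bucket
    overage = max(0.0, gb_used - included_gb)
    return fee + overage * overage_per_gb

# A user on the middle tier who consumes 5 GB pays for 3 GB of overage.
print(monthly_bill(5.0, BUCKETS[1]))  # → 60.0
```

The design choice is the one described in the post: the price signal arrives in coarse steps rather than continuously, which most consumers find easier to budget for than metered billing.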

Fitch also believes that 4G networks offer the potential to generate additional revenue from several new sources like machine-to-machine applications which could more than offset pressure from further erosion of voice related average revenue per user.

From a credit perspective, Fitch believes the dominant market share, higher margins, strong free cash flow, and robust spectrum portfolios of Verizon Wireless and AT&T Wireless strongly position the companies to capture additional share and future market growth opportunities, at least partially offsetting structural changes that could pressure certain revenue and cash flow streams.

However, the market strength of Verizon and AT&T has implications for the remaining national, regional and niche wireless operators, which will likely face increasing credit risk as the wireless industry evolves to 4G and the competitive market intensifies for certain products and services.

Net Neutrality Structurally Flawed, Entropy Economics Says

Network neutrality has deep structural flaws, says Bret Swanson, Entropy Economics president.

Though aiming to ensure equal treatment of applications on the best-effort Internet, network neutrality would ban packet prioritization that might actually benefit consumers, denying them the ability to voluntarily buy services that ensure best performance for voice and video, or any other applications they may deem equally important.

Network neutrality as envisioned by the Federal Communications Commission also would prohibit creation and offering of new differentiated services, he argues.

The FCC seems to argue that although the Internet and the Web have been wild successes, the market cannot be counted on to take the Internet to the next level, Swanson argues.

"The events of the last half-decade prove otherwise," he says. Since 2004, bandwidth per capita in the U.S. grew to three megabits per second from just 262 kilobits per second, and monthly Internet traffic increased to two billion gigabytes from 170 million gigabytes—both tenfold leaps.
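Both of Swanson's figures work out to a bit more than tenfold. A rough back-of-the-envelope, assuming the roughly five-year 2004-to-2009 window, puts the implied compound annual growth above 60 percent:

```python
# Swanson's two growth figures, as multiples and as implied compound
# annual growth rates over an assumed five-year window.
bandwidth_multiple = 3_000 / 262  # 3 Mbps vs. 262 kbps per capita
traffic_multiple = 2_000 / 170    # 2 billion GB vs. 170 million GB monthly

years = 5
bandwidth_cagr = bandwidth_multiple ** (1 / years) - 1
traffic_cagr = traffic_multiple ** (1 / years) - 1

print(round(bandwidth_multiple, 1), round(traffic_multiple, 1))
print(f"{bandwidth_cagr:.0%}", f"{traffic_cagr:.0%}")
```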

The FCC's desire to extend wireline rules to wireless likewise is dangerous, he argues. No sector has boomed more than wireless, yet the FCC wants to extend new regulations to the technically complicated and bandwidth-constrained realm of wireless, he argues.

Wireless carriers invested $100 billion in just the past three years, and the United States vaulted past Europe in fast 3G mobile networks while Americans enjoy mobile voice prices 60 percent cheaper than foreign peers, he argues.

The danger is that heavy-handed new rules will stifle needed investment in new networks.

"My research suggests that U.S. Internet traffic will continue to rise 50 percent annually through 2015, and hundreds of billions of dollars in fiber optics, data centers, and fourth-generation mobile networks will be needed," Swanson says. "But if network service providers can't design their own networks, offer creative services, or make fair business transactions with vendors, will they invest these massive sums to meet, and drive, demand?"
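Compounding that projected 50 percent annual growth from 2009 through 2015 implies another roughly elevenfold increase in traffic, which is the scale of demand behind Swanson's investment warning:

```python
# Traffic multiple implied by 50% annual growth over the six years
# from 2009 to 2015 (Swanson's projection, compounded).
growth = 1.5
years = 2015 - 2009
multiple = growth ** years
print(round(multiple, 1))  # → 11.4
```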

"If you don't build it, they can't come," he says. And that is the danger.

Sunday, October 4, 2009

Is Net Neutrality Possible?

It has been quite some time since the typical service provider executive has had confidence about the stability of rules governing the communications business. To greater or lesser degrees, there has been some element of regulatory instability and uncertainty since the mid-1990s, to go along with heightened market uncertainty.

Investors and executives do not like uncertainty. Yet greater uncertainty is likely what the industry now faces as the Federal Communications Commission ponders new rules about network neutrality, wireless competition and national broadband policy.

The latest reason for heightened uncertainty is the fundamental nature of questions inevitably raised by some of the regulatory discussions and rule makings, and the time it will take to sort out the application of the rules.

How does regulation separate "common carrier" obligations carriers may have from content rights they may have as providers of their own information and content services?

What does "common carriage" mean in an Internet era, for Internet-delivered services that might not work reliably and consistently in a strict "best effort" delivery mode?

What scope exists for "private IP" services provided to consumer users, much as business users have the right to buy "private IP" services that allow prioritization of packets?

How can regulation provide fair and equitable treatment of like services when the fundamental regulatory frameworks apply to different providers?

How does the framework handle instances where a "service" or "application" provider also acts as a "carrier"? When it is impossible to prevent a single legal entity from acting simultaneously as an information provider and a service provider and an application provider and a carrier, how does regulation handle the contradictions between treatment of roles?

Above all, will the sum total of new rules create more freedom, or less? And when freedom for one actor conflicts with freedom for another, how will balance be maintained?

The answers ultimately will matter for reasons other than perhaps-abstract notions about extending or squashing freedom, or protecting individuals and companies from the power of government. At a time when everybody agrees that continued robust investment in facilities is in the public and national interest, how will the new rules affect investment and innovation?

The issue is not so much whether the outcome is greater freedom for application providers--that certainly will happen no matter what the outcome--but whether facilities providers also have freedom to change their business models to take advantage of new freedoms.

Virtually all regulators assume that the proof of deregulatory success is that incumbents lose market share and revenue. Some financial pain, inflicted on incumbents, therefore is the whole point of deregulation.

But there is some point beyond which the infliction of pain must stop, or wider deployment of core facilities is impaired.

A rational observer might argue that "level playing fields" have yet to be fully created. The issue is when such a point will have been reached, and how we will know it.

Friday, October 2, 2009

Will Net Neutrality Affect Teleworkers?

New "net neutrality" rules proposed by the Federal Communications Commission could affect teleworkers across the country, says Irwin Lazar, Nemertes Research VP.

If new rules strictly require Internet access providers to treat all traffic equally, which has a nice ring to it, it follows that it would also be unlawful to prioritize traffic, such as giving top priority to voice traffic or conferencing traffic at remote work sites.

That might have implications for the sorts of broadband access services organizations are able to buy for remote workers.

Enterprises and organizations might very well require the ability to prioritize voice and conferencing sessions, while assigning lower priority to Web surfing or entertainment video, for example.

That might mean it is not possible to buy standard broadband access services, and might require sourcing of private business class connections where such prioritization is possible. The ability to create virtual tunnels and virtual private networks might be required, and therefore might preclude buying of consumer broadband connections.

That could prove troublesome for teleworker support, as Nemertes Research now estimates that 86 percent of companies are planning to increase the number of teleworkers.

Proposed net neutrality rules may hinder their ability to utilize latency-sensitive applications without purchasing a business-class service with performance guarantees. Uses of client-based optimization as well as desktop virtualization are likely to increase.

"Do not continue to assume that your employees will always have cheap access to high-speed residential services," says Lazar. "Develop contingency plans that include purchasing of business class services, use of optimization, and, or desktop virtualization to guarantee application performance."
