
Wednesday, May 5, 2010

FCC Leaning Against Title II Regulation of Broadband Access

Julius Genachowski, Chairman of the Federal Communications Commission, apparently now is leaning away from any attempt to re-regulate broadband access as a common carrier service, a move that would have set off a political firestorm.

The Washington Post reports that the chairman "is leaning" toward keeping in place the current regulatory framework for broadband services but making some changes that would still bolster the FCC's chances of overseeing some broadband policies.

The sources said Genachowski thinks "reclassifying" broadband to allow for more regulation would be overly burdensome on carriers and would deter investment, a belief likely bolstered by the constant criticism Verizon Communications has taken from investors who have questioned Verizon's investment in fiber-to-the-home almost every step of the way.

Congress could "remedy" the situation by passing new legislation directing the FCC to take action along the lines of reclassifying broadband access as a common carrier service, but prospects for any such legislation are unclear.

Aside from the historic objections cable and telco industry segments have had to common carrier regulation of data services, both industries are widely expected to oppose in the strongest possible way any moves to limit their ability to innovate in the area of services and features for broadband services, especially any moves to prohibit any forms of quality of service features.

"Network neutrality" rules that prohibit any form of packet discrimination would effectively prevent the creation of QoS features guaranteeing video or voice performance, for example, even if those are features end users actually want.

Some policy advocates fear that Internet access providers will not voluntarily and adequately police themselves, but end user pressure has proven quite effective in the applications space, and even firms that have attempted some forms of network management have voluntarily agreed not to use techniques that essentially "block" legal applications.

That isn't to argue that there are no dangers, but simply that market pressure and end user outrage have so far proven to be effective inhibitors of anti-competitive behavior. Even without Title II common carrier regulation, the amount of end user and policy attention now paid to anti-competitive behavior in the Internet business would effectively encourage responsible ISP behavior.

Those opposed to "over-regulating" the developing business have argued that any abuses that do arise can be dealt with as they occur, and that this is preferable to regulating in advance, or that the proper venue is the Federal Trade Commission or Justice Department, in any case.

Aside from all those issues, nobody really believes that anything but growth lies ahead for the broadband access business. "More bandwidth" does not solve all problems, but it does address many of the concerns users or policy advocates might have, so long as progress on the bandwidth front continues.

source

Monday, May 3, 2010

Apple Gets DoJ, FTC Antitrust Attention

The Department of Justice and Federal Trade Commission reportedly are discussing which of the watchdog agencies will begin an antitrust inquiry into Apple’s new policy of requiring software developers who devise applications for devices such as the iPhone and iPad to use only Apple’s programming tools.

Regulators apparently are concerned the policy harms competition by forcing programmers to choose between developing apps that can run only on Apple devices and building platform-neutral versions.

The apparent interest shows that Apple has gotten big enough now to come under the typical scrutiny dominant firms always face.

The inquiry does not mean that there will be a full-blown investigation, only that there is some level of concern. Now that Apple's equity value ($237.6 billion) is bigger than Wal-Mart's ($201.7 billion), such scrutiny will become an ongoing concern for Apple, which will henceforth have to consider antitrust implications as part of its strategy.

That isn't to suggest Apple will face any immediate restriction of its freedom of movement. But that day is coming.

link

Friday, April 23, 2010

The U.S. Mobile Voice Market Is Saturated: So What?

The Cellular Market In The US Is Saturated – 24/7 Wall St

Verizon Wireless, AT&T, Sprint and T-Mobile have almost 260 million wireless subscribers. The U.S. population is 305 million people and some of those are too young to need or use a phone. Others don’t want one.

During the last quarter, Verizon added only 423,000 new contract subscribers and AT&T only 512,000 customers, rates that are lower than in past quarters.
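As a rough sanity check on the saturation argument, the arithmetic implied by those figures looks like this (subscriber and population counts are the approximations quoted above):

# Back-of-the-envelope check of the saturation argument, using figures quoted above.
subscribers = 260e6      # Verizon Wireless + AT&T + Sprint + T-Mobile, approximate
population = 305e6       # total U.S. population, including children

print(f"Penetration of the total population: {subscribers / population:.0%}")  # ~85%

# Quarterly net adds quoted above for the two largest carriers
net_adds = 423_000 + 512_000
print(f"Combined quarterly growth: {net_adds / subscribers:.2%}")  # well under 1%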

So what does that mean? What it always means: providers will have to create new products to sell to a base of existing customers, rather than selling more of the existing product to new customers. In the cable and telecom business, that has meant both getting into new lines of business as well as "bundling."

For wireless providers, the new product is wireless broadband, immediately in the form of more smartphone data plans, but over time more use of wireless to support sensor networks of various types.

But there are wider policy implications as well. U.S. regulators sometimes behave as though nothing they do will seriously impede the ability of U.S. service providers to continue to invest and innovate. But both the wireline and wireless segments of the communications business face huge challenges. Existing growth models are exhausted and competition is growing.

Instead of behaving in ways that essentially are punitive, perhaps regulators should ask what they can do to allow the fastest-possible transition to new business models as the old models continue to waste away.

Telecom is not a growth industry; that should be obvious to all observers. The big challenge is to foster a transition to a sustainable model that will support continued investment in state-of-the-art facilities. Telecom, to put it bluntly, is not an industry that needs to be punished; it needs to be fostered.

Friday, April 16, 2010

FCC Has National Broadband Authority, Say 2 Former FCC Chairmen

Former FCC chairmen Reed Hundt (D) and Michael Powell (R) say that, contrary to much speculation, the Federal Communications Commission continues to have the authority it requires to set in motion the "National Broadband Plan."

Both Powell and Hundt agreed that the FCC still has jurisdiction on the Broadband Plan and net neutrality and that there isn’t “Armageddon” because of the DC Circuit ruling on the Comcast complaint.

Powell pointed out that Title II reclassification would have a "destabilizing" effect on the industry because it would change decades-long policy and would frustrate investments made under the current regulatory environment (for example, the $23 billion Verizon has invested in its FiOS service).

The argument that the FCC now "lacks jurisdiction," though incorrect, is being used to advance the notion of wider and more-disruptive changes in the basic regulatory framework governing broadband access services.

Post Tech - The Full Video: Ex-FCC heads Hundt and Powell discuss broadband policy

Thursday, April 15, 2010

FCC National Broadband Plan Does Not Require Title II, AT&T Says

Robert Quinn, Senior VP, federal regulatory, for AT&T, argues that the FCC does not need to redefine broadband as a Title II telecommunications service in order to implement its proposed national broadband plan, particularly its changes to the Universal Service Fund.

The "Open Internet Coalition" including  Google, Sony, Public Knowledge and the Free Press have been arguing for that classification as necessary for the plan's recommendations.

AT&T filed an analysis with the FCC Monday saying it thought the commission still has "all the authority it needs" to migrate the Universal Service Fund from phone to broadband service or to implement the online privacy recommendations. "The FCC has all the authority it needs to go out and do the things it has identified in the national broadband plan," Quinn argues.

He said suggestions that the court decision could significantly impede the broadband plan were overblown, and that classifying it as a more regulated Title II (common carrier) service would chill investment, which could adversely impact broadband deployment.

"I think at a time when we need more than anything else is infrastructure investment, I think it would provide a huge disincentive for entities to invest in this space," Quinn says.

Quinn said, ultimately, Congress may need to step in and clarify the scope of the FCC's broadband oversight, but that in the meantime the FCC has authority over changes to universal service, protecting proprietary customer information online and making broadband accessible to people with disabilities, for example.

Monday, April 5, 2010

Title II Debate Redux

If you were following debates over Federal Communications Commission policy relating to the Internet back in 2006, you might remember that we were debating whether the Internet, and broadband access, should continue to be regulated as other data services are, under Title I, or as common carrier services, under Title II.

The economic, financial and policy stakes are no less important this time around, as we might be setting up for yet another lengthy battle over how best to regulate broadband access. Lots has changed since 2006. Broadband access by fixed line networks has become a legacy service. Mobile broadband is about to explode. Application innovation arguably is more robust than it was in 2006, and almost all of the innovation has something to do with mobility, not the fixed line Internet.

Congress could "remedy" the situation by passing new laws directly the FCC to take regulatory control of broadband access services. A majority of Americans might regard almost any such congressional moves with derision, given the general contempt that institution now inspires in the overwhelming majority of Americans who are polled about their impressions of Congress.

Wednesday, March 24, 2010

FCC Has No Authority to Regulate Internet, Verizon EVP Says

The Federal Communications Commission does not have the explicit power to regulate the Internet, and should wait for Congress to grant it that authority, says Tom Tauke, Verizon EVP. The statement is not as controversial as some might think, as Comcast has challenged such authority in federal court, and many observers think Comcast will prevail.

Comcast has challenged the FCC’s authority to punish it for throttling the bandwidth of customers using BitTorrent programs to share huge files.

“The authority of the FCC to regulate broadband providers under the so-called ‘Information Services’ title, or Title I, of the Communications Act [is] at best murky,” Tauke said. “In confronting this hard question about jurisdictional authority, we [are] also faced with this policy question: If Title I and Title II don’t apply to the Internet space, what are you saying about the authority of government in this space?

“In a market developing at these speeds, the FCC must follow a piece of advice as old as Western Civilization itself: first, do no harm," said Tauke.

“Today about 96 percent of Americans have access to at least two providers of wireline broadband and as many as three wireless providers, and more than 55 million Americans can connect to a broadband network capable of delivering at least a 50 Mbps stream," Tauke said.

Thursday, March 11, 2010

Two-Tier Internet is Not Necessarily a Bad Thing, Says Esther Dyson

"The biggest problem that net neutrality has is nobody knows what they’re talking about when they talk about it," says Esther Dyson, noted computer industry analyst. "The issue is who pays and whether they’re monopolies or not, so there’s a whole lot of, I think, disingenuous discussion about control without ever really looking at the fundamental issue, which is somebody’s got to pay for more bandwidth if consumers are gonna be uploading and downloading video."

"As long as there’s healthy competition, I have no problem if someone pays extra for additional bandwidth, as long as that doesn’t cut off people’s access to the other stuff," says Dyson. That does not mean she believes access providers should be able to put up walls around Internet content. 

"There’s this disingenuous discussion of if you don’t allow us to pay extra, you’re not gonna get free content," she says. "Well, of course not, but let the consumer decide whether they want paid or subsidized."

Tuesday, March 9, 2010

What Future for Telecom Business of 2015 or 2020?

The telecommunications industry has experienced more change in the last decade than in its entire history, says IBM. Consider that, in 1999, only 15 percent of the world’s population had access to a telephone; by 2009, nearly 70 percent had mobile phone subscriptions.

So where will the industry be in five years, in 2015? While nothing is certain, forecasters at the IBM Institute for Business Value say they see four possible outcomes, and none of them offer rosy futures.


In fact, IBM's scenarios likely mean further, and major, industry consolidation at a very minimum. The more-radical alternatives include fundamental industry restructuring in ways that separate network operations from retail operations.

In some of the scenarios where radical industry restructuring occurs, today's service providers might find themselves competing against device manufacturers or even today's suppliers of network infrastructure.

The key observation is that IBM presents a range of five-year scenarios that all involve significant pressure on service provider profit margins or gross revenue, or both. Further service provider consolidation is the least disruptive change in industry structure that could happen.

In half of the most-likely scenarios, the industry is structurally separated into wholesale network services operations and separate retail operators.

Keep in mind IBM believes it will take only five years for one of these scenarios to develop.

In one scenario, which IBM calls "survivor consolidation," consumer spending for communications drops, leading to industry "stagnation or decline."

In this rather-bleak scenario, developed market operators have not significantly changed their voice communications and "closed" connectivity service portfolios and also have failed to expand horizontally or into new verticals.

That will trigger an investor loss of confidence in the telecommunications sector, which produces a cash crisis and leads to industry consolidation.

In an alternate scenario IBM calls "market shakeout," carriers are structurally reshaped into separate wholesale and retail businesses, and the market is further fragmented by government, municipality and alternative providers.

In this scenario private capital is available only to dense urban areas. Telecom provider growth occurs in large part through sales of services to business partners.

In a third scenario called "clash of giants,"  carriers consolidate, cooperate and create alliances to compete with "over the top" providers and device manufacturers or even equipment suppliers.

In a fourth scenario IBM calls the "generative bazaar," open access infrastructure leads to more competition from "asset light" and over the top competitors.

It is easy to dismiss the level of change the last 10 years has wrought. It might be easy to dismiss the level of change IBM believes can happen in just another five years. As always, the forecast might be too aggressive in terms of its timetable.

The major implication, though, is that the telecom industry might well be a very-different sort of business by 2020, if not by 2015. If you look at revenue sources, it is virtually certain that in developed markets, less revenue--in some cases far less revenue--will be earned from voice and text services.

More revenue will be earned from broadband services, and possibly from business partners rather than end users.

Thursday, March 4, 2010

Net Neutrality Would Increase Likelihood of Content Discrimination, Phoenix Center Says

"Net neutrality regulation is motivated fundamentally by the belief that broadband service providers will,
at some future date, seek to extract profits from the content segment of the Internet marketplace, and
net neutrality aims to stop it," says a new white paper issued by George S. Ford, Phoenix Center for
Advanced Legal and Economic Public Policy Studies chief economist, and Michael Stern, Assistant
Professor of Economics at Auburn University.

Net neutrality supporters fear that surplus profit extraction will take the form of "exclusionary" practices such as unfair or discriminatory access prices, "fast lanes" and "slow lanes" where preferential delivery is given to content firms willing and able to pay more, or outright monopolization of content, the authors say.

Such concerns about business advantage, whether "unfair" or not, are different from the separate issue of whether currently-envisioned network neutrality rules actually provide incentives to engage in such behavior, the authors say.

Some observers might be shocked to learn that net neutrality rules could actually encourage such business behavior, not restrain it.

In fact, the latest Phoenix Center analysis suggests that net neutrality regulation actually increases incentives to engage in exclusionary conduct in the content sector.

"Firms always have an incentive to take those steps, which increase their profits," the authors say.
"Ironically, net neutrality rules, which are supposed to suppress privately profitable exclusionary conduct,
will actually have an effect opposite of what is intended."

Because net neutrality regulations now under consideration will not reduce the profits associated with monopolization of content, but only those associated with the participation in a competitive content market, the proposed rule encourages broadband service providers to take steps to reduce the diversity of voices on the Internet to the detriment of the public interest, Ford and Stern argue.

The point is that network neutrality rules impose pricing rules, and the issue is whether such pricing rules are likely to encourage or discourage business policies that increase or restrict content options.

An important question is whether or not the proposed price regulations “promote consumer choice and competition among providers of lawful content, applications, and services” by addressing an ISP’s alleged motivation “to exclude independent producers of applications, content, or portals from their networks.”

The answer is “no,” the authors say. "Net neutrality rules of the type proposed by the FCC and the Markey-Eshoo Bill encourage exclusionary behavior rather than impede it."

The policy implications of this analysis are numerous, but can be summarized at a very high level as follows: the analytical foundation for net neutrality remains in its infancy and the concept needs more time to evolve, the authors argue.

Since even the advocates of net neutrality regulation admit that there exists a “de facto net neutrality regime” today, there seems to be little reason for a headlong rush into bright-line regulatory rules when so little is known about the issue.

The rules proposed by both the FCC and Congress create incentives that may not even exist absent the regulation, and increase whatever incentives do exist for ISPs to behave badly in the content market.

Most troubling about the proposed rules is that net neutrality, it now appears, has become little more than a quibble over profits between providers, a far cry from the origins of the concept, wherein the focus was on the freedom to distribute and consume information without undue interference.

source

Wednesday, February 24, 2010

Not Every Telecom Market Did as Well as U.S. in 2009

The U.S. telecommunications and network-based video entertainment markets (cable, satellite, telco) grew revenue in 2009, largely on the strength of performance by the large incumbents that account for most of the industry's revenue.

That was not the case in all markets, though. The Colombian market, for example, declined about eight percent in 2009, according to researchers at Pyramid Research.

The Colombian market also is in the midst of a major deregulation shift, so new competitors are expected, especially in the wireless area. Pyramid Research does not think any such new competitors will be able to alter the current market structure, though. Incumbency has its advantages, it seems.

Tuesday, February 2, 2010

99% of BitTorrent Content Illegal?

A new survey suggests that about 99 percent of available BitTorrent content violates copyright laws, says Sauhard Sahi, a Princeton University student who conducted the analysis.

Some question the methodology, pointing out that the study only looks at content that is available, not content transferred. That might not be such a big distinction, though. Copyright holders are growing more insistent that Internet service providers actively block delivery or sending of such illegal material.

That, in turn, raises lots of issues. BitTorrent can be used in legal ways, so blocking all torrents clearly violates Federal Communications Commission guidelines about use of legal applications on the Internet. That said, the fact that the overwhelming majority of BitTorrent files consist of copyrighted material raises huge potential issues for ISPs that might be asked to act as policemen.

The study does not claim to make judgments about how much copyrighted content actually is downloaded. But it stands to reason that if such an overwhelming percentage of material is copyrighted, that most uploads and downloads will be of infringing content.

The study classified a file as likely non-infringing if it appeared to be in the public domain, freely available through legitimate channels, or user-generated content.

By this definition, all of the 476 movies or TV shows in the sample were found to be likely infringing.

The study also found seven of the 148 files in the games and software category to be likely non-infringing—including two Linux distributions, free plug-in packs for games, as well as free and beta software.

In the pornography category, one of the 145 files claimed to be an amateur video, and the study gave it the benefit of the doubt as likely non-infringing.

All of the 98 music torrents were likely infringing. Two of the fifteen files in the books/guides category seemed to be likely non-infringing.

"Overall, we classified ten of the 1021 files, or approximately one percent, as likely non-infringing," Sahi says.

"This result should be interpreted with caution, as we may have missed some non-infringing files, and our sample is of files available, not files actually downloaded," Sahi says. "Still, the result suggests strongly that copyright infringement is widespread among BitTorrent users."

Sunday, January 31, 2010

Fundamental Changes to PSTN: What Would You Do?

Legacy regulation doesn't make much sense in the context of a new, non-legacy "public switched network." Nor do legacy concepts work very well for a communications market that changes faster than regulators can keep pace with, both in terms of technology and the more-important changes of business model.

In a world of loosely-coupled applications, old common carrier rules don't make as much sense. Nor is it easy to craft durable rules when perceived end user value, which relates directly to revenue streams, is changing rapidly.

Consider the public policy goal of ensuring ubiquitous broadband networking capability within a competitive framework, to promote the fastest rate of application creation and development, under circumstances where the government has neither the financial resources nor the ability to build such networks itself.

The typical way one might approach the problem is to regulate intramodally, treating wired access providers as the relevant domain. The other way is to regulate intermodally, comparing all broadband access providers, irrespective of network technology.

Then consider how a major broadband provider might look at the same problem. No wired services provider, as a practical matter, is allowed for reasons of antitrust to serve more than about 30 percent of total potential U.S. customers. Mobile providers are allowed, indeed encouraged, to serve 100 percent of potential customers, if possible.

Would a provider rationally want to invest to compete for 30 percent of customers on a landline basis, or 100 percent, using wireless?

Ignoring for the moment the historically different regulatory treatment of wired networks and wireless networks, in the new historical context, is it rational to spend too much effort and investment capital chasing a 30-percent market opportunity, or is it more rational to chase a 100-percent market opportunity?

Granted, network platforms are not "equal." Satellite broadband networks have some limitations, both in terms of potential bandwidth and network architecture, compared to wired networks. Mobile networks have some advantages and disadvantages compared to fixed networks. Mobility is the upside; spectrum limitations impose some bandwidth issues. But fourth-generation networks can deliver sufficient bandwidth to compete as functional substitutes for many fixed applications.

Verizon already has said it will launch LTE at somewhere between 5 Mbps and 12 Mbps downstream. LTE theoretically is capable of speeds up to 80 Mbps, but that assumes light subscriber demand and short distances from towers.

The point is simply that discussions about national broadband frameworks will have to open some cans of worms. It is a legitimate national policy goal to foster ubiquitous, high-quality broadband access.

It may not be equally obvious that the best way to do so is to impose "legacy" style regulations that impede robust mobile capital investment and business strategies. That isn't to discount the value of fixed broadband connections. Indeed, broadband offload to the fixed network could play an invaluable role for mobile providers.

Still, aligning policy, capital investment and business strategy will be somewhat tricky.

Friday, January 29, 2010

In 2014, 80% of Broadband Access Will Be Mobile, says Huawei

By 2014, 80 percent of the world's two billion broadband users will be using mobile networks for their access, says Huawei. Of those two billion users, 1.5 billion will be first-time subscribers.

Predictions such as that are one reason regulators and suppliers need to be much more cognizant of how much is changing in the global communications business. Policies that relate to broadband access and deployment must reorient to reflect user behavior and supply that will be overwhelmingly mobility-based in just a few years.

Huawei also points out that voice service revenues are steadily declining. "In the past five years, the revenue for fixed voice services decreased by 15 percent, reflected by a decreasing growth rate for mobile voice services in 2009," Huawei says.

If that is a fundamental trend, as Huawei believes it is, then policies cannot be designed on the assumption that voice revenues, traditionally the underpinning of the whole global business, will continue to play that role in the future.

In other words, instead of assuming service providers are powerful gatekeepers who need to be restrained, it might be more apt to view them as endangered suppliers who must replace the bulk of their revenues over the next decade or so, simply to remain in business. That certainly is not how telecom companies have been viewed in the past, but to ignore the changes could be dangerous.

U.S. regulators were so intent on introducing more competition in voice services in the early 1990s that they nearly completely missed the fact that the Internet, broadband and over-the-top applications and services were about to change the industry. Basically, the intended market result was to cause incumbents to lose market share while competitors were to gain share, precisely at the point that nearly every competitor was about to face a declining market for voice services.

It takes little insight to observe that a narrow focus on fixed broadband might likewise be dangerous at a time when usage is shifting so profoundly to mobile modes.

To use an analogy, regulators must resist the temptation to "fight the last war," rather than the different new war that is coming.

Saturday, January 23, 2010

Information Technology Industry Council Reaches Common Ground on Net Neutrality

The "network neutrality" debate is becoming more nuanced, with possibly greater understanding by many participants that it is important to find common ground that does not jeopartdize the Internet's future in a misguided attempt to preserve its past.

The Information Technology Industry Council, which includes Microsoft, eBay, Intel, Apple, Qualcomm, Adobe and Cisco, seems to be threading a needle, for example.

Everybody seems to agree that "certainty" is needed or innovation will be impeded. Everybody also seems to agree that innovation "at the edge of the network" likewise should not be impeded.

One way of getting there is by avoiding the temptation to write overly-detailed rules in advance of issues that could arise. That means the ITIC prefers that issues be settled on a case-by-case basis, as needed, rather than by creating new rules in advance of any conceivable set of issues that could arise.

"The FCC cannot posibly anticipate all future circumstances, and it is entirely possible that conduct that may appear to be harmful today will in fact be beneficial to consumers in light of future circumstances," the ITIC now says.

Managed services, for example, should be allowed unless it is proven that the services are "anticompetitive or harmful to consumers." That suggests a new openness to the possibility of enhanced services that take advantage of user-defined and user-requested packet prioritization features.

Quality of experience, especially during periods of congestion, almost requires that such mechanisms be available for users and applications that want to make use of such features.

Cbeyond Asks FCC for Mandatory Wholesale Optical Access

Cbeyond has asked the Federal Communications Commission to reverse its rules on wholesale obligations for fiber-to-customer networks. On copper access networks, competitors have rights to buy wholesale access. The FCC has ruled that on new fiber-to-customer networks, competitors have no similar rights.

Predictably, incumbents say the current rules should remain in place, which allow any voluntary wholesale deals, but do not require incumbents to offer wholesale access. The rules are consistent with rules that apply to U.S. cable companies, which likewise have no obligation to sell wholesale access to competitors.

The Telecommunications Industry Association  and the Fiber-to-the-Home (FTTH) Council have filed comments opposing the change.

The debate is an old one. Incumbents argue that the business case for FTTH is troublesome, and that they need the ability to profit from FTTH investments without being required to make those facilities available to competitors who do not have to build expensive facilities of their own when they can simply lease capacity from others.

Though it is difficult to prove, one way or the other, the FCC has faced a dilemma. It can seek to spur competition by mandating robust wholesale access, or it can spur deployment of new optical access facilities, but might not be able to achieve both goals.

The reason is that incumbents can simply refuse to upgrade their networks when they do not feel they will get an adequate financial return. There is some important evidence that incumbents are right about the difficulty of raising investment capital for FTTH.

Investors punished Verizon Communications for pushing ahead with its FTTH program, preferring AT&T's less-costly FTTN approach, for example. Cable and telco executives point out that all competitors are free to build their own facilities if they want, and most observers would note that in markets where there are three ubiquitous FTTH or FTTN networks, it has proven difficult to sustain business models allowing all three competitors to remain in business.

The calls for mandatory wholesale come at a time when everybody acknowledges that the business case for traditional cable TV and voice services is becoming more difficult, and that neither cable companies nor telcos can rely on their mainstay businesses (video and voice) for future growth. In fact, both types of companies are seeing steady shrinkage of those legacy businesses.

Under such circumstances, and given the shift to Internet-based applications, it might not make much sense to weaken the business case for robust optical access investments at a time when the financial returns for doing so are under pressure in any case.

Supporters of mandatory optical access obviously would benefit from a rule change, as they could offer optical access without incurring the expense of building new facilities. So the dilemma the FCC faces is an emphasis either on innovation or competition, in some clear sense.

Since virtually all applications now can be delivered over IP-based connections, it no longer makes as much sense as it once did to directly link "access" and "competitive" services. With or without owning access facilities, companies now can deliver virtually any service over the top, on any broadband connection.

Under such circumstances, robust competition occurs at the application level, not the access level. In fact, that is precisely the problem telcos face with VoIP, and that cable companies face with online video.

Wednesday, January 6, 2010

More Regulation Needed to Spur Broadband Competition? Really?


The U.S. Federal Communications Commission should consider regulations for broadband providers in an effort to increase competition, says Lawrence Strickling, National Telecommunications and Information Administration assistant secretary, as reported by IDG News Service.

"We urge the Commission to examine what in many areas of the country is at best a duopoly market and to consider what, if any, level of regulation may be appropriate to govern the behavior of duopolists," Strickling says.

With all due respect for Strickling, who is a smart, experienced regulatory hand who knows the terrain, and without disagreeing with the full content of his filing on behalf of NTIA, acting on the notion that competition somehow is so stunted that new regulations are required likely would lead to greater harm, despite its good intentions.

Here's the argument. Consider, if you will, any large industry with critical implications for the entire U.S. economy. Now consider the following mandate: "you will be forced to replace 50 percent of your entire revenue in 10 years."

"During that time, for a variety of reasons, incumbents will be forced to surrender significant market share to competitors, so that in addition to replacing half of the industry's revenue, it also will have to do so with dramatically fewer customers."

"After that, in another decade, the industry will be required to replace, again, another 50 percent of its revenue. All together, the industry will required to relinquish at least 30 percent of its market share, in some cases as much as half, and also will be required to replace nearly 100 percent of its revenue, including the main drivers of its profitability."

Does that sound like the sort of industry that desperately needs additional competition? Really?

Nor is the argument theoretical. Over the 10-year period between 1997 and 2007, the U.S. telephone industry was so beset with new technology and competition that almost precisely half of its revenue (long distance), the revenue driver that provided nearly all of its actual profit, was lost.

The good news is that the revenue was replaced by wireless voice. Then, because of the Internet, cable company entry into voice and the Telecommunications Act of 1996, market share began to wither. That, after all, is the point of deregulation: incumbents are supposed to lose market share to competitors.

Now we have the second decade's project, when mobile voice revenues similarly will have to be replaced, in turn, as IP-based voice undermines the high-margin voice services that have been the mainstay of the mobile business.
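The cumulative arithmetic is worth spelling out. The sketch below uses a stylized 50/50 revenue split, an assumption for illustration rather than reported industry data, to show how two successive waves of substitution add up to replacing nearly 100 percent of the original revenue base:

# Illustrative arithmetic only; the 50/50 split is a stylized assumption.
revenue_1997 = {"long_distance_voice": 0.50, "local_and_other": 0.50}

# Decade one: long distance collapses and is replaced by mobile voice.
wave_one_replaced = revenue_1997["long_distance_voice"]   # 0.50

# Decade two: the mobile voice revenue that filled that gap erodes in turn.
wave_two_replaced = wave_one_replaced                      # another ~0.50

total_new_revenue_needed = wave_one_replaced + wave_two_replaced
print(f"New revenue needed over two decades is roughly {total_new_revenue_needed:.0%} "
      "of the original revenue base")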

If you follow the telecom industry as a financial matter, you know that service providers have maintained their profitability only partly by growing topline revenues. They also have been downsizing workforces and slashing operating costs.

If you talk to ex-employees of the telecom industry, they will tell you the industry seems no longer to be a "growth" industry. That's why millions of people who used to work in telecom no longer do so.

So what about the other big incumbent industry, cable TV operators? As you clearly can see, and can read about nearly every day, there are huge questions about the future business model for what used to be known as "cable TV." Many observers already predict that such services will move to Internet delivery, weakening or destroying the profitability of the U.S. cable industry.

Industry executives, no dummies they, already have moved into consumer voice and data communications, and now are ramping up their assault on business communications. Why? Their core video business is going in reverse.

Imposing regulatory burdens on incumbents--either telco or cable--that are losing their core revenue drivers on such a scale might not be wise. Few industries would survive back-to-back decades where the core revenue drivers must be replaced by "something else."

Imagine the U.S. Treasury being asked to replace virtually 100 percent of its revenue with "something else" in about 20 years. Imagine virtually any other industry being asked to do the same.

The point is that industries asked to confront such challenges and surmount them are not typically the sort of industries that need to have additional serious obstacles placed in their way.

Granted, many of them are niche suppliers, but Strickling also is well aware that there are two satellite broadband providers battling for customers, plus five mobile broadband providers, and then hundreds of independent providers offering terrestrial fixed wireless access or packaging wholesale capacity to provide retail services.

Granted, only cable, satellite, telcos and several mobile providers have anything like ubiquitous footprints, but that is a function of the capital intensity of the business. Most markets will not support more than several suppliers in either fixed or wireless segments of the business.

One can argue there is not more facilities-based competition because regulation is inadequate, or one can argue investment capital no longer can be raised to build a third ubiquitous wired network.

The point is that wired network scarcity might be a function of rational assessments of likely payback. Cable TV franchises are not a monopoly in any U.S. community. But only rarely have third providers other than the cable TV or incumbent phone companies attempted to build city-wide third networks. Regulatory barriers are not the issue: capital and business potential are the problems.

I also would grant that mobile broadband is not a full product substitute for fixed broadband. But where we might be in five to 10 years cannot yet be ascertained. And we certainly do not want to make the same mistake we made last time.

The Telecommunications Act of 1996, the first major revamping of U.S. telecom regulation since 1934, was supposed to shake up the sleepy phone business. But the Telecom Act of 1996 occurred just as landline voice was fading, and the Internet was rising.

Virtually every person with a long enough memory would say their access to applications, services, features and reasonable prices is much better now than before the Telecom Act of 1996, even if the act itself failed. The reason is that the technology and the market moved too fast for regulators to keep up.

The Telecom Act tried to remedy a problem that is fast becoming irrelevant: namely, competition for voice services. In fact, voice services rapidly are becoming largely irrelevant, or marginal, as the key revenue drivers for most providers in the business.

Yes, there are only a few ubiquitous wired or wireless networks able to provide broadband. But that might be a function of the capital required to build such networks, the nature of payback in a fiercely-competitive market and a shift of potential revenue away from "network access" suppliers and towards application providers.

It always sounds good to call for more competition. Sometimes it even is the right thing to do. But there are other times when markets actually cannot support much more competition than already exists. Two to three fixed broadband networks in a market, plus two satellite broadband providers, plus four to five mobile providers, plus many smaller fixed wireless or reseller providers does not sound much like a "market" that needs to stimulate more competition.

There's another line of reasoning one might take, but it would make for a very long post. That argument would be that, judged simply on its own merits, the availability and quality of broadband service in a continent-sized country such as the United States, with its variegated population density, is about what one would expect.

Even proponents of better broadband service in the United States are beginning to recognize that "availability" is not the problem: "demand" for the product is the key issue.

Sunday, January 3, 2010

Incumbent Telco VoIP Transition is Not Technology-Led

The fact that AT&T has asked the Federal Communications Commission for a definite date to shut down the public switched telephone network is, like most regulatory filings made in Washington, D.C., more complicated than it might appear.

Virtually all telecom service provider executives believe IP voice is the future, whether in the mobile or fixed domains. But the economics of the transition are complicated, at least for an incumbent provider.

Attackers, such as cable companies or independent VoIP providers, have no installed base of customers to cannibalize. Incumbents most certainly do, and that makes all the difference in perspective.

A Verizon executive recently noted that, “at this point in time, the business case does not support a technology-led migration off of the PSTN with the combination of land line loss, the economy, competing priorities and competitive dynamics.”

The key phrase is "technology led." Cable digital voice, Skype and Vonage build on VoIP: the technology directly supports the business case.

For an incumbent telecom provider, the technology in some cases harms the business case. To the extent that VoIP services largely replace an existing service with no incremental revenue, added investment is not met by added revenue. To the extent that VoIP services are priced lower than the voice services they replace, the business case is negative.

Under such circumstances it is rational to harvest PSTN voice as long as possible, despite market share losses. At some point, the logic reverses, however. As the fixed costs of the old PSTN are shared over a smaller base of customers, it will at some point be advantageous to switch to IP voice, strictly on the basis of operating cost savings.

That point has not yet been reached, but it is inevitable. The issue right now is what regulatory regime will apply to incumbents as that transition occurs. And one might argue that is the real point of the AT&T request for the FCC to specify a firm timetable for shutting down the PSTN.

The replacement of PSTN technology with IP telephony also creates an opportunity for new rules about carrier obligations that directly affect the costs of providing such service. That is why the AT&T request also argues that legacy rules must be altered as the transition is made.

Those rules are arcane and of little visible consequence for the typical consumer user of fixed voice. But they have enormous impact on the voice business case, as viewed from an incumbent perspective. Basically, all the rules that govern how networks compensate each other for terminating traffic are the heart of the matter.

So incumbent sunsetting of the PSTN will not be "technology led." The institutional and business frameworks remain the key issue.

Wednesday, December 2, 2009

FCC Seeks Input on Transition to VoIP

The Federal Communications Commission wants public and industry comment on the policy framework for a transition from circuit-switched voice to voice services on all-IP networks. The FCC will use the comments to issue a possible "notice of inquiry" on the subject.

"In identifying the appropriate areas of inquiry, we seek to understand which policies and
regulatory structures may facilitate, and which may hinder, the efficient migration to an all IP world," the FCC says. "In addition, we seek to identify and understand what aspects of traditional policy frameworks are important to consider, address, and possibly modify in an effort to protect the public interest in an all-IP world."

Among other issues, the FCC will be looking at consumer protection issues such as how the needs of people with disabilities can be assured. A look at the role of "carrier of last resort" obligations in an all-IP framework also is expected.

All comments should refer to GN Docket Nos. 09-47, 09-51, and 09-137 and title comment filings as “Comments – NBP Public Notice #25.”
 
Filers using the Commission’s Electronic Comment Filing System should enter the following text in the “Custom Description” field in the “Document(s)” section of the ECFS filing page:  “Comments – NBP Public Notice # 25."

Tuesday, November 24, 2009

Do Usage Caps for Wireless and Mobile Broadband Make Sense?


Survey data suggest about 60 percent of the wireless broadband purchase decision is based on two factors: the monthly recurring charge and the existence or size of a usage cap. For that reason, "data caps" are a particularly unfriendly way to manage overall traffic, says Yankee Group analyst Philip Marshall.

A better approach, from a service provider perspective, is to offer unlimited usage and then manage traffic using real-time, network intelligence-based solutions such as deep packet inspection and policy enforcement, Marshall argues.

Some would argue that fair use policies that throttle maximum speeds when policies are violated are no picnic, either. But temporary limits on consumption, imposed only at peak hours of usage, arguably are more consumer-friendly than absolute caps with overage charges.

To test consumer preferences, Yankee Group conducted a custom survey that included a "choice-based conjoint analysis," which allowed Yankee Group analysts to estimate the relative importance to consumers of key wireless broadband service attributes. The survey was taken by 1,000 mobile consumers who also use broadband access services. 

From the conjoint analysis, "we found that, on average, 59 percent of a wireless broadband purchase decision depends on two factors: service price, and the presence or absence of a 2 GByte per month usage cap," Marshall says. 

The results also indicate that 14.5 percent of a typical purchase decision is affected by service bandwidth, and that the implied average revenue per user lift when increasing bandwidth from 768 Kbps to 2 Mbps ranges between $5 and $10 per month.

The results also indicate, however, that there are diminishing returns for service plans that offer speeds above 3 Mbps, though speed increases might be useful for other reasons, such as competitive positioning. 

"Our price elasticity analysis implies that consumers are willing to pay $25 to $30 more per month for plans that offer unlimited usage, compared to plans that have a 2 GBytes a month usage cap," says Marshall.

"In a competitive operating environment, consumers will tend to migrate toward higher bandwidth services, all else being equal, but they are not necessarily willing to pay a significant premium for the added performance capability," says Marshall.

The most recent survey results indicate that consumers require 2 Mbps to 3 Mbps of bandwidth for their broadband service. That requirement is likely to increase dramatically over the next two to three years, but the survey suggests dramatically higher bandwidth does not affect decisions as much as recurring price and the existence of bandwidth caps.

For example, when offered a choice between one package featuring a 2 GByte per month usage cap with 6 Mbps bandwidth, and another package with unlimited monthly usage but just 2 Mbps service speed, 63 percent of consumers opted for the 2 Mbps service with no cap.

Even when the choice is between an unlimited package offering only 768 Kbps bandwidth, compared to an alternative plan with 6 Mbps bandwidth and a 2 GByte per month usage cap, 57 percent preferred the 768 kbps package.
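For readers unfamiliar with conjoint studies, the sketch below shows how relative attribute importance typically is derived: the range of each attribute's part-worth utilities divided by the sum of ranges across attributes. The part-worth numbers here are invented for illustration and only loosely scaled to echo the shares reported above; they are not Yankee Group's estimates.

# Hypothetical part-worth utilities for each attribute level (illustrative only)
part_worths = {
    "monthly price": {"$40": 1.2, "$60": 0.0, "$80": -1.2},
    "usage cap": {"none": 0.6, "2 GB/month": -0.6},
    "bandwidth": {"768 kbps": -0.5, "2 Mbps": 0.1, "6 Mbps": 0.4},
    "contract length": {"month-to-month": 0.8, "two-year": -0.7},
}

# Relative importance = range of an attribute's part-worths / sum of all ranges
ranges = {attr: max(levels.values()) - min(levels.values())
          for attr, levels in part_worths.items()}
total = sum(ranges.values())

for attr, r in ranges.items():
    print(f"{attr}: {r / total:.1%} of the purchase decision")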

Service providers still must manage bandwidth demand, though, with or without usage caps. Usage caps work to regulate demand, but users do not like them.

The other approach is not to impose usage caps, but instead to use policy management and deep packet inspection to manage traffic flows.

If such solutions are implemented in a non-discriminatory manner, so that all like services are treated equally, they can be implemented irrespective of network neutrality regimes currently under consideration, Marshall believes.
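What a "non-discriminatory" policy might look like in practice is easy to sketch. The example below is a hypothetical placeholder, not any vendor's actual API: during congested intervals it deprioritizes the heaviest current users, without ever inspecting which application generated the traffic.

# Application-agnostic congestion policy (illustrative thresholds, hypothetical hook)
CONGESTION_UTILIZATION = 0.85   # link utilization that counts as congested
HEAVY_USER_SHARE = 0.05         # top 5 percent of current usage get deprioritized

def apply_congestion_policy(link_utilization, usage_by_subscriber, deprioritize):
    """usage_by_subscriber maps a subscriber id to bytes used in the current window."""
    if link_utilization < CONGESTION_UTILIZATION:
        return []  # no congestion, no action taken
    ranked = sorted(usage_by_subscriber, key=usage_by_subscriber.get, reverse=True)
    heavy = ranked[: max(1, int(len(ranked) * HEAVY_USER_SHARE))]
    for sub in heavy:
        deprioritize(sub)  # e.g., move the subscriber to a lower scheduling weight
    return heavy

# Example: at 90 percent utilization, only the heaviest users are touched
usage = {f"sub{i:02d}": i * 10_000_000 for i in range(1, 21)}
print(apply_congestion_policy(0.90, usage, deprioritize=lambda s: None))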
