Monday, August 17, 2009

BT Rolls Out 20 Mbps, FTTH Danger Remains

BT is rolling out 20-Mbps broadband access services to about 40 percent of its network terminations, up from the 8 Mbps possible before BT upgraded many of its Digital Subscriber Line Access Multiplexers to the ADSL2+ standard.

Actual connection speeds people see will vary largely based on the amount of metallic cable between their home and the telephone exchange, as is always the case for DSL. Speed increases will be most noticeable in the upstream direction.

Most BT retail customers connect at up to 448 kbps upstream when using ADSL, but that will roughly double, to up to 1 Mbps, using ADSL2+. The upgrades will be provided at no charge.

Faster speeds are possible if a further upgrade to VDSL is made, but that also would entail deploying much more fiber in the network.

And while BT faces criticism for not deploying fiber to the home, Fitch Ratings investment analysts say FTTH deployments are highly risky for European service providers facing cable operators. Fitch analysts estimate a telco FTTH upgrade in the U.K. market might cost BT about six times more money per connected home than it will cost cable operators to upgrade using DOCSIS 3.0.

The obvious risk is financial return. In a competitive sales environment, BT likely could not charge six times more for its offering than a cable operator would. For that reason, Fitch analysts say fiber "to the curb," which costs significantly less, will be the preferred upgrade strategy.
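To make the risk concrete, here is a minimal back-of-the-envelope sketch in Python. The six-to-one cost ratio is Fitch's estimate; every currency figure below is a purely hypothetical assumption, used only to show why equal retail pricing stretches the FTTH payback period.

```python
# Back-of-the-envelope payback comparison. The 6x cost ratio is Fitch's
# estimate; all currency amounts below are hypothetical, illustrative only.

def payback_months(cost_per_home: float, monthly_incremental_revenue: float) -> float:
    """Months of incremental revenue needed to recover the per-home upgrade cost."""
    return cost_per_home / monthly_incremental_revenue

docsis3_cost = 100.0            # hypothetical cable DOCSIS 3.0 upgrade cost per home
ftth_cost = 6 * docsis3_cost    # Fitch: telco FTTH costs roughly six times as much

# In a competitive market, assume both sides can charge about the same
# incremental monthly price, so revenue per home is identical.
monthly_revenue = 10.0          # hypothetical incremental revenue per home, per month

print(f"Cable payback: {payback_months(docsis3_cost, monthly_revenue):.0f} months")
print(f"FTTH payback:  {payback_months(ftth_cost, monthly_revenue):.0f} months")
# Equal pricing means the FTTH investment takes six times longer to pay back,
# which is the financial risk Fitch describes.
```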

"FTTH is not commercially viable for much of the incumbents’ networks," though some greenfield builds will be feasible, Fitch Ratings argues.

The point is that Fitch believes wide FTTH deployments are highly risky for European telcos facing serious cable competition.

The broader point is that though fiber to the home likely is the "best" access technology in terms of bandwidth, it is highly financially risky, because the incremental revenue might not cover the incremental cost of the new facilities. It is one thing to mandate national broadband policies. It is quite another thing to require service providers to make investments that are demonstrably highly risky.

That said, there are financial risks either way. Underinvest and telcos are vulnerable to cable operators taking dangerous amounts of market share. Overinvest and a return is not assured.

All that said, actual facts on the ground are decisive. FTTH is more feasible where densities are high and where there is lots of aerial plant, rather than underground plant. The reason is that it is cheaper to rebuild aerial plant than underground plant. High density helps because loops are shorter and more infrastructure can be shared. Shared infrastructure always is cheaper than dedicated infrastructure.

But cable operators face the same general financial problem. As fiber is pulled closer to end user locations, the cost rises dramatically. At some point, after DOCSIS 3.0 is deployed and bandwidth is reclaimed by moving to digital video, raw bandwidth might still have to be increased. And that is going to take significant capital investment.

For the most part, then, copper drops of one sort or another are likely to be with us for quite some time.

Sunday, August 16, 2009

National Broadband Plans Not That Effective?

As the Federal Communications Commission starts work on creation of a "national broadband plan," it is worth keeping in mind the relatively slight impact such policies have.

In fact, "91 percent of the differences in fixed broadband adoption rates in the 30 Organization for Economic Cooperation and Development member countries can be explained by reference solely to differences in income, education, population age, and other demographic factors that bear little relationship to broadband or telecommunications policy," the Phoenix Center for Advanced Legal & Economic Public Policy Studies says in a new study.

Indeed, perhaps 87 percent of the variation in broadband subscription rates across the OECD can be explained by just a few inputs.

On average, a $10,000 increase in gross domestic product per capita increases the connection rate per capita by 1.97 percentage points. A 10 percentage point rise in the percentage of a population living in an urban area, or a 10 percentage point decline in the share of persons over 65 years of age, both increase the subscription rate by about 3.0 percentage points, on average.
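Those averages amount to a simple linear relationship. Here is a minimal sketch treating the quoted coefficients as a toy linear model; the function and variable names are mine, not the Phoenix Center's actual regression specification.

```python
# Toy linear model using the coefficients quoted above. This illustrates
# how the averages combine; it is not the Phoenix Center's actual
# regression specification.

def predicted_subscription_change(gdp_per_capita_change: float,
                                  urban_share_change_pts: float,
                                  over_65_share_change_pts: float) -> float:
    """Predicted change in fixed broadband subscriptions per 100 people (percentage points)."""
    effect = (gdp_per_capita_change / 10_000) * 1.97   # +1.97 pts per $10,000 of GDP per capita
    effect += (urban_share_change_pts / 10) * 3.0      # +3 pts per 10-pt rise in urban share
    effect += (-over_65_share_change_pts / 10) * 3.0   # +3 pts per 10-pt fall in over-65 share
    return effect

# Example: $5,000 more GDP per capita, a 5-point more urban population,
# and a population 2 points older predicts roughly +1.9 percentage points.
print(round(predicted_subscription_change(5_000, 5, 2), 2))
```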

Beyond that, one of the deficiencies of the OECD data is that it does not include other popular methods people use for Internet access, such as libraries and public Internet connection centers. Nor do OECD studies include mobile Internet access, a problem that grows more serious as mobile connections grow.

Fully 56 percent of Americans say they have "at some point used wireless means for online access," according to researchers at the Pew Internet & American Life Project. Notebook PCs are the main way most Americans get online wirelessly, with 39 percent saying it is their "most prevalent means of wireless access," and 32 percent saying they have used a cell phone "or other hand-held device to check e-mail, access the Internet for information, or send instant messages."

More important, 69 percent of Americans also are starting to use their cell phones for texting, e-mailing, getting directions, and snapping and sending photos, Pew says.

Ignoring mobile broadband access under such conditions is a major flaw.

"Mobile broadband is likely to be very important for users who do not own or know how to use a computer, since Internet access is also possible through smart mobile phones and other small, portable devices such as Netbooks," the Phoenix Center says. "Indeed, broadband provided over mobile networks may replace fixed connectivity for many users."

In Portugal, for example, more than half of all broadband connections use mobile technologies, and 10 percent of broadband connected persons in the country use only a mobile access method.


Are Younger Users Cooling to Social Networking?

U.K. communications regulator Ofcom says the percentage of 15- to 24-year-olds with a profile on a social networking site has dropped for the first time, from 55 percent at the start of last year to 50 percent this year.

Some have suggested this means younger users are abandoning sites such as Facebook that no longer are attractive now that their parents use the sites as well.

The other explanation is that users are starting to settle in at fewer sites, says comScore.

Younger users are increasingly moving towards Facebook as their primary social networking destination, and using other sites less.

Inertia A Challenge for Yahoo, Microsoft

It's always hard to get users to change their habits. That is one reason media and content companies spend so much time and money on promotion and marketing. And it appears that applies to the ways people find content as well.

According to comScore, one obstacle the Yahoo!-Microsoft partnership faces is changing user habits. Users who search on Google tend to stick with Google for most searches, comScore notes: they conducted about 69 percent of their searches on Google-owned sites.

Users of the engines at the combined Yahoo! and Microsoft Sites conducted only 32.6 percent of their searches on those sites, but a much higher 60.7 percent of their searches on Google sites.

In the content business as well as in the real world, friction and inertia require inputs of energy to "force" objects to change direction.

Saturday, August 15, 2009

More U.S. Mobile Internet Than PC Users by 2010?


"By 2010, the number one way U.S. users will interact with Web is through the phone, not the PC," says Rodney Mason, Moosylvania CMO. That would be a huge change, more in line with what forecasters have been predicting about Internet access methods in the developing world.

If it turns out the mobile device becomes the most-common means of access, the way Web applications and services are designed also will start to change.

Smart phones and mobile broadband networks should lead to "TV everywhere" services, for example. And that could break the hold multichannel video services have on the delivery of video.

Sales of smart phones matter for several reasons, among them the creation of new markets for mobile applications. Handset suppliers and mobile service providers have a huge stake as well.

Nearly a quarter of all handsets sold in the U.S. market during the fourth quarter of 2008 were smart phones, up from 12 percent of all phone sales in the same quarter of 2007, according to the NPD Group. But the rate of growth seems to have slowed because of the recession.

IDC forecasts a U.S. smart phone growth rate of between four and five percent for 2009, a far cry from last year’s 68 percent growth, and Stela Bokun, Pyramid Research analyst, warns that slowdowns in Europe and other markets could negatively affect service provider data revenue growth.

"If smart phones do not get cheaper and if mobile customers remain the only ones bearing the risks related to currency fluctuations in individual markets, uptake will suffer in a prolonged recession and post-recession data services revenue will take longer to recover," she says.

Of the 263 million new handsets sold in Europe in 2008, 14 percent were smart phones. These 36 million units accounted for roughly 24.4 percent of all smart phones sold globally that year, Bokun says.
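Those two figures can be cross-checked with simple arithmetic; the quick calculation below is mine, not Pyramid Research's, and the implied global total is only what the quoted percentages suggest.

```python
# Quick consistency check on the European handset figures quoted above.
# This arithmetic is mine, not Pyramid Research's own calculation.

europe_handsets_2008 = 263_000_000
europe_smartphone_share = 0.14

europe_smartphones = europe_handsets_2008 * europe_smartphone_share
print(f"European smart phones, 2008: {europe_smartphones / 1e6:.1f} million")  # ~36.8 million

europe_share_of_global = 0.244
implied_global = europe_smartphones / europe_share_of_global
print(f"Implied global smart phone sales: {implied_global / 1e6:.0f} million")  # ~151 million
```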

"We expect handset unit sales in Western Europe to fall 20 percent in 2009; the situation is even worse in Central and Eastern Europe, where new handset sales are expected to fall 25 percent this year," she says.

Globally, Nokia leads in sales of new smart phones, while Research in Motion is second and Apple is third.

Nokia sold 18,441,000 smart phones in the second quarter of 2009, RIM sold 7,678,900 units, and Apple sold 5.4 million devices.

Friday, August 14, 2009

Fixed Wireless Likely Big Winner in Broadband Stimulus First Round

Perhaps the biggest first-round winners of broadband stimulus grants, after application deadlines close on Aug. 14 or Aug. 20, 2009 (larger projects have the Aug. 20 deadline by virtue of an extension granted for electronic filers), will be wireless providers, especially firms using terrestrial wireless broadband for access.

There are several reasons. Wireless networks can be built faster, and at lower cost, than wired networks, giving wireless providers a better chance of completing larger projects in the required time frame. The largest wired service providers appear to have decided not to apply, for a variety of reasons having to do with the way the rules are constructed and the strings attached to receipt of funds.

Also, existing wireless providers, especially independents, have the infrastructure and business acumen required to run such networks, and huge incentive to build out their networks.

In the last major investment wave in the U.S. telecommunications market, though hundreds of upstart firms were launched, most market share was controlled by just two companies, AT&T and MCI. Since it appears the largest carriers will sit out the broadband stimulus program, the field is cleared for medium-sized firms to get the funding.

Most of those companies, even those building new middle-mile optical trunking facilities, will rely on wireless for the final mile connections.

Thursday, August 13, 2009

Zer01 Severs Ties with Buzzirk Mobile

Zer01 Mobile says it has severed its business relationship with Buzzirk Mobile for distribution services, citing breach of contract. It isn't immediately clear what impact the termination will have, as Zer01 is a mobile virtual network enabler and can supply its unusual and interesting approach to mobile voice to other distributors, but Buzzirk seems to have been the most active of the MVNO marketing partners.

What remains interesting is the approach Zer01 has taken to creating its "VoIP over mobile" capabilities. Essentially, the company uses what it says are national interconnect agreements with GSM providers, plus VoIP from VoX, to create its service, instead of the traditional MVNO route whereby the operator buys wholesale capacity from networks, then repackages and resells those capabilities.

Zer01's approach has been to establish itself as a "carrier" for purposes of interconnection, which allows it to exchange traffic with other carriers using the industry-standard rules without buying capacity on one or more networks to resell.

It remains an intriguing approach, though the effort has been clouded by some controversy, which Zer01 now appears to want to put behind it. Zer01's corporate parent is privately held, and has no obligation for the fuller reporting a public company must provide, but a bit more transparency would not hurt, one might argue.
