Tuesday, February 2, 2010

Text Rules, Even for Older Users

A survey by Tekelec shows that text messaging, once seen as the main communications tool for teenagers and young adults, has become prevalent among older generations. The 500-person survey shows that 60 percent of users older than 45 are just as likely to use SMS as to make voice calls from their mobile phones.

That's perhaps not good news for voice usage but shows the value of text messaging plans. About 40 percent of female users say they "mainly text," rather than talk. About 30 percent of male respondents reported they are likely to text rather than call.

Text messaging also is catching up to email as the preferred means of daily international communication: 32 percent of respondents across all ages prefer SMS, compared to 33 percent who prefer email.

So is the fact that text messaging is displacing some amount of voice a good thing for mobile service providers? Not entirely. More than 80 percent of mobile service provider revenue still is derived directly from voice, says Alan Pascoe, Tekelec senior manager.

"Of the remaining data piece, SMS has the largest chunk of revenue and the highest profitability," he says. "Texting is particularly appealing for operators because nearly every subscriber can do it and networks have sufficient signaling bandwidth."

"Still, profitability isn’t quite keeping up with usage, thanks to all-you-can-eat plans, but operators can reduce costs with a more efficient SMS network infrastructure," Pascoe says.

Pascoe says Tekelec is not sure how much email volume is being displaced by texting. But as a general rule younger users are more comfortable with texting than older users and businesses still prefer email.

"A key reason is that an SMS message implies an urgent request, whereas email is typically less urgent," he says. "Personal communication often revolves around an immediate need, like making plans, so texting is the more natural approach outside of the office."

But email is also more conducive for business tasks like sending attachments, he adds.

So will text messaging ultimately be as "archivable" as email? Certainly operators are looking at a number of ways to "add value and stickiness to SMS offerings, including archiving," Pascoe says.

"The most common ideas we hear discussed are email-like functionalities: archiving, copying, forwarding, black and white lists and group distribution," says Pascoe. "The wild card for text message archiving demand is Google Voice, which allows subscribers to store SMS in Gmail instead of on their phones, keeping messages indefinitely."

"With Google providing this for free, it may be difficult for operators to generate revenue from it," Pascoe notes.

Person-to-person messages are the foundation of SMS, and will dominate for the foreseeable future, he thinks. "But the model is evolving so that growth is strongest for person-to-application, application-to-person and machine-to-machine communications."

Why Cloud Computing is the Finger Pointing at the Moon, Not the Moon


The thing about "cloud computing" is that it is very difficult to isolate and separate from other broader changes in computing infrastructure, all of which are happening simultaneously. We are, most would agree, on the cusp of a basic change in computational architecture from "PC-centric" to something that might be called "mobile Internet computing," for lack of a more descriptive and well-understood term.

The point, simply, is that the shift to "cloud-based" computing is inextricably bound up with other crucial changes: the shift to mobile devices as the key end-user access devices, and the rise of Web-based, hosted and remote applications and user experiences.

For most people, businesses and organizations, the shift in the physical "places" where computing takes place will occur in the background. The main change is the evolution in the things that can be done with computational resources.

Aside from something like an order of magnitude more devices connected to computing resources, the new mobile Internet will mean that something like a "sensing" fabric is put into place. Cameras will create "eyes," microphones "ears" to hear, and speakers "mouths" to speak. Kinesthetic capabilities will create new ways to interact with information overlaid on the "real" or physical world.

All those new devices also will create new possibilities for enriching "location" information. GPS is fine for fixing a location in terms of latitude and longitude. But what about altitude? What about locating devices, people or locations that are in high-rise buildings? Emergency services and first responders need that additional information.

But the possibilities for "sensing" networks grow exponentially once communications, altitude, attitude and other three-dimensional information is available to any application. Lots of medical and recreational devices now can capture biomedical information in real time. Add real-time communications and many other possibilities will open up.

The point is simply that cloud computing as a computational architecture will enable other changes, going well beyond the simple ability to send and receive information of any sort. The shift to distributed computing will, with mobile sensors, devices and people, lead to a vastly different ability to monitor the environment and to process, annotate or contextualize events and objects in the real world with granularity.

That is not to understate the challenges and opportunities for a wide range of companies in the ecosystem, caused directly by a shift of core competencies. By definition, a change of computing eras has always been accompanied by a completely new list of industry leaders.

Keenly aware of that historic precedent, none of today’s computing giants will take anything for granted as the new era begins to take hold. At the same time, it is hard not to predict that key stakeholders of just about every sort might find themselves severely disrupted by the shift.

So far, whole industries ranging from media and music to telecom, advertising and retailing have found themselves struggling to adjust to a world with lower barriers to entry and radically different ways of creating and delivering products and services people want.

As the shift to the next computing paradigm occurs, many more human activities and business models will find themselves subject to attack and change.

Within the global communications business, it should be noted that the incremental growth of just about everything “mobile” will hit an inflection point. Whether that happened in 2009, will happen in 2010 or takes just a bit longer is not the point.

To talk about a world where a trillion devices are connected, in real time, to the Internet, to servers, software and applications, is to talk about a world where mobility IS communications. Mobility will not be merely an important segment of the business, it will be THE business at the end user level.

That is not to say the core backbone networks, data centers and other long-haul and even access networks are unimportant; to the contrary they will be the fundamental underpinning of the “always on, always connected” ecosystem of applications and business activity which will depend on those assets.

Without denigrating in any way the “pipes,” dumb or otherwise, that will be the physical underpinning of all the applications, there is only so much value anybody can wring out of plumbing. Most of the economic value is going to reside elsewhere.

That said, there already are numerous ways to look at cloud computing infrastructure, as it is used to build businesses that create added value.

Almost by definition, cloud computing enables consumption of software and applications that use remote computing facilities. We sometimes call this “software as a service” and the trend is an early precursor of what happens in the shift from PC-based to mobile and cloud-based computing.

Such uses of cloud computing will have intermediate effects on end user experiences. Lots of everyday computing or application experiences will shift away from local computing or storage, and towards on-the-fly rendering.

The shift to utility computing—enterprise use of cloud computing—will shift data centers from “owned and operated” facilities to outsourced services. But that likely will have less impact than the shift to SaaS-based applications.

The former is an “industrial” shift; the latter is more an “end user” shift. And all cloud computing effects will have most impact when they directly touch end user experiences.

Utility computing contributes to many end user experiences, but much utility computing is “behind the scenes.” Hosted applications are, and increasingly will be, everyday experiences for most human beings.

Web services are the area where end user impact will be noticed most strikingly, and where the most-profound transformations will occur, as Web services—mostly mobile—will touch end users with services and features that cannot be provided any other way.

Cloud computing is important, to be sure. But we will miss the bigger picture in focusing too narrowly on what it means for data centers, utility computing services, transport and access providers. Even the huge trend towards mobility is a sub-plot.

Cloud computing will enable an era of ubiquitous computing, with social and economic consequences we cannot begin to imagine. It is a huge business change for all of us in communications. But it is just a finger pointing at the moon; not the moon itself.

Google to Launch App Store for "Google Apps"

Google is preparing to launch an online store in which it will sell third-party business software to Google Apps customers, the Wall Street Journal reports.

The Wall Street Journal says that Google's store could arrive as early as March with the works of third-party developers available as enhancements to Google's office productivity software suite. It appears the store would allow Gmail and Google Docs users to purchase add-ons for niche features too specialized for the mainstream Google Apps product.

The Google Solutions Marketplace contains lists and reviews of third-party software for Google Apps and Enterprise Search, but it does not let you buy the applications directly from Google. That might be what is about to change.

Developers would have to share revenue with Google from sales of their software through the store, and it would be reasonable to assume revenue splits similar to those used by mobile application stores run by Google, Apple, and several other companies.

Typically, the developer gets 70 percent of the revenue.

As iTunes was the "secret sauce" that helped propel the iPod to prominence, and as the App Store has been the surprise attraction for the iPhone, perhaps app stores might provide similar value for service and device providers.

99% of BitTorrent Content Illegal?

A new survey suggests that about 99 percent of available BitTorrent content violates copyright laws, says Sauhard Sahi, a Princeton University student who conducted the analysis.

Some question the methodology, pointing out that the study only looks at content that is available, not content transferred. That might not be such a big distinction, though. Copyright holders are growing more insistent that Internet service providers actively block delivery or sending of such illegal material.

That, in turn, raises lots of issues. BitTorrent can be used in legal ways, so blocking all torrents clearly violates Federal Communications Commission guidelines about use of legal applications on the Internet. That said, the fact that the overwhelming majority of BitTorrent files consist of copyrighted material raises huge potential issues for ISPs that might be asked to act as policemen.

The study does not claim to make judgments about how much copyrighted content actually is downloaded. But it stands to reason that if such an overwhelming percentage of the material is copyrighted, most uploads and downloads will be of infringing content.

The study classified a file as likely non-infringing if it appeared to be in the public domain, freely available through legitimate channels, or user-generated content.

By this definition, all of the 476 movies or TV shows in the sample were found to be likely infringing.

The study also found seven of the 148 files in the games and software category to be likely non-infringing—including two Linux distributions, free plug-in packs for games, as well as free and beta software.

In the pornography category, one of the 145 files claimed to be an amateur video, and the study gave it the benefit of the doubt as likely non-infringing.

All of the 98 music torrents were likely infringing. Two of the 15 files in the books/guides category seemed likely non-infringing.

"Overall, we classified ten of the 1021 files, or approximately one percent, as likely non-infringing," Sahi says.

"This result should be interpreted with caution, as we may have missed some non-infringing files, and our sample is of files available, not files actually downloaded," Sahi says. "Still, the result suggests strongly that copyright infringement is widespread among BitTorrent users."
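The per-category figures reported above can be tallied in a few lines. The sketch below uses only the counts cited in the article; note that the listed categories do not account for the full 1,021-file sample, since the study also examined categories not itemized here.

```python
# Tally of the study's per-category findings as reported in the article.
# Each entry is (total_files, likely_non_infringing); the 1,021-file
# sample also includes categories not itemized above.
categories = {
    "movies/TV":    (476, 0),
    "games/software": (148, 7),
    "pornography":  (145, 1),
    "music":        (98, 0),
    "books/guides": (15, 2),
}

non_infringing = sum(ok for _, ok in categories.values())
sample_size = 1021  # reported overall sample size

share = non_infringing / sample_size
print(f"{non_infringing} of {sample_size} files "
      f"({share:.1%}) likely non-infringing")
```

The tally reproduces the study's headline figure: ten likely non-infringing files, or approximately one percent of the sample.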

Monday, February 1, 2010

Private Line Market Starts Decline

After years of steady growth, the $34 billion private line services market is entering a period of declining revenue, says Insight Research. It could hardly be otherwise. Just as IP-based services are displacing TDM-based voice, so IP-based and Ethernet-based bandwidth services are displacing SONET bandwidth services, frame relay and ATM services.

U.S. enterprises and consumers are expected to spend more than $27 billion over the next five years on Ethernet services provided by carriers, Insight Research predicts. With metro-area and wide-area Ethernet services now available from virtually all major data service providers, the market is expected to grow at a compounded rate of over 25 percent, increasing from $2.4 billion in 2009 to reach nearly $7.8 billion by 2014.
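Insight Research's projection can be sanity-checked with the standard compound-annual-growth-rate formula, using only the start and end figures the firm cites:

```python
# Sanity check of Insight Research's Ethernet-services projection:
# revenue growing from $2.4 billion (2009) to $7.8 billion (2014).
start, end, years = 2.4, 7.8, 5

# Compound annual growth rate: (end / start) ** (1 / years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # about 26.6%
```

The implied growth rate of roughly 26.6 percent per year is consistent with the "over 25 percent" compounded rate the report describes.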

The decline in revenue will continue from 2009 to 2012. But Insight Research also believes private line revenues will tick up a bit after 2012, presumably as additional applications drive demand for more bandwidth. Why the growth would not come in the form of alternative IP bandwidth is not precisely clear, though.

Insight believes additional demand for wireless backhaul and video will lead to more buying of SONET products. Some of us would disagree, but we shall see.

"The transition away from frame and ATM will put a brake on overall private line industry revenue growth for a couple of years," says Robert Rosenberg, company president. "However, private line demand remains strong for wireless backhaul, local bandwidth for caching IPTV video services, and for facilitating VoIP."

Google Nexus One for AT&T?

A device that's almost certainly an AT&T-compatible version of the Google Nexus One has been approved by the Federal Communications Commission. The version now sold by Google works on all T-Mobile USA 3G spectrum, but not on all AT&T 3G bands.

Versions running on Verizon's CDMA air interface and also for Vodafone are expected at some point.

Both the Nexus One and the newly-approved phone are being made by HTC. And while the name of the product in question isn't given, its model number is: 99110. The model number for the current version of Google's smartphone is 99100. These are so close it seems very likely they are from the same series.

Sunday, January 31, 2010

Fundamental Changes to PSTN: What Would You Do?

Legacy regulation doesn't make much sense in a new "public switched network" context. Nor do legacy concepts work very well for a communications market that changes faster than regulators can keep pace with, both in terms of technology and the more important changes of business model.

In a world of loosely coupled applications, old common carrier rules don't make as much sense. Nor is it easy to craft durable rules when rapid changes in perceived end-user value, which relate directly to revenue streams, are anything but stable.

Consider the public policy goal of ensuring a ubiquitous broadband networking capability using a competitive framework, to promote the fastest rate of application creation and development, under circumstances where the government has neither the financial resources nor the ability to build such a network itself.

The typical way one might approach the problem is regulate intramodally, looking at wired access providers as the domain. The other way might be to regulate intermodally, comparing all broadband access providers, irrespective of the network technology.

Then consider how a major broadband provider might look at the same problem. No wired services provider, as a practical matter, is allowed for reasons of antitrust to serve more than about 30 percent of total potential U.S. customers. Mobile providers are allowed, indeed encouraged, to serve 100 percent of potential customers, if possible.

Would a provider rationally want to invest to compete for 30 percent of customers on a landline basis, or 100 percent, using wireless?

Ignoring for the moment the historically different regulatory treatment of wired networks and wireless networks, in the new historical context, is it rational to spend substantial effort and investment capital chasing a 30-percent market opportunity, or is it more rational to chase a 100-percent market opportunity?

Granted, network platforms are not "equal." Satellite broadband networks have some limitations, both in terms of potential bandwidth and network architecture, compared to wired networks.
Mobile networks have some advantages and disadvantages compared to fixed networks: mobility is the upside, while spectrum limitations impose some bandwidth constraints. But fourth-generation networks can deliver sufficient bandwidth to compete as functional substitutes for many fixed applications.

Verizon has already stated that it will launch LTE at somewhere between 5 Mbps and 12 Mbps downstream. LTE theoretically is capable of speeds up to 80 Mbps, but that assumes light subscriber load and short distances from cell towers.

The point is simply that discussions about national broadband frameworks will have to open some cans of worms. It is a legitimate national policy goal to foster ubiquitous, high-quality broadband access.

It may not be equally obvious that the best way to do so is to impose "legacy" style regulations that impede robust mobile capital investment and business strategies. That isn't to discount the value of fixed broadband connections. Indeed, broadband offload to the fixed network could play an invaluable role for mobile providers.

Still, aligning policy, capital investment and business strategy will be somewhat tricky.

Directv-Dish Merger Fails

Directv's termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...