Tuesday, May 19, 2009

AT&T Launches "Synaptic Storage" Cloud Service

AT&T is now selling enterprises "AT&T Synaptic Storage as a Service," a storage-on-demand offer that gives enterprise customers control over the storage, distribution and retrieval of their data from any location, at any time, using any Web-enabled device.

The service automatically scales storage capacity up or down as needed, and users pay only for the amount they use, AT&T says.

AT&T is introducing the service to customers on a controlled basis this month, with plans to make the service generally available in the third quarter. The service is deployed in AT&T Internet data centers (IDCs) in the U.S. and will be accessible by customers connecting to the Web anywhere. In time, AT&T plans to add the service to select global IDCs to meet customer demand internationally.

"Build it and They Will Come..." Again?

"Build it and they will come" became a demolished business strategy in the global bandwidth business, even though for a brief moment around the turn of the century, people believed that to be the case.

Still, the logic behind fiber-to-the-home projects in many ways represents the same sort of thinking. "New applications will flourish on a 100Mbps FTTP rollout even though nobody knows what those apps will be," says Khoong Hock Yun, assistant chief executive of the Infocomm Development Authority, as reported by CommsDay.

Still, it might turn out to be correct, at least for providers of access connections, at some point. FTTH Council of Europe President Karel Helsen notes that content, gaming and entertainment companies are now being invited to join the FTTH Council.

“If you provide the pipes, people will make sure that they fill it,” Helsen says. "Companies such as Nintendo, Sony, and Time Warner we welcome into our council and we’ve started talking to those companies since the beginning of this year.”

“We just had the first gaming association also join the European council and we believe also by having those people as members, you also create the pull effect from the market side to stimulate the rollout of fiber to the home,” Helsen says.

That isn't to argue such networks should not be built, or that new revenue-generating applications will not ultimately be developed. But it is likely to take some time.

Monday, May 18, 2009

Goodbye Sarbox?

The Supreme Court apparently is going to test the constitutionality of Sarbanes-Oxley rules. Personally, I hope the Supremes strike the rules down. Sarbox has been a major burden for smaller and mid-sized firms, adding millions of dollars in annual cost in many cases, and killing the Initial Public Offering market.

http://www.businessinsider.com/henry-blodget-supreme-court-may-kill-sarbanes-oxley-and-resurrect-ipo-market-2009-5

Why Broadband "Penetration" Measures Often are Misleading

If you were trying to figure out how prevalent televisions, radios, digital video recorders, Slingboxes, PCs or DVD players were in people's lives, would it make more sense to measure how many Best Buy retail locations sold such products, or how many units are sold in any given time period?

Alternatively, if you were trying to measure the penetration of such devices, would you track the number of homes, businesses, or both, that have such devices in use?

Would you try to measure "personal" devices such as mobile phones or MP3 players the same way?

The questions aren't as "academic" as might first appear to be the case.

While it makes sense to measure the penetration of mobile and personal technology on a per-capita basis, because that is how people buy and use such products and services, it arguably makes less sense to measure other products, such as T1 lines, Ethernet or other fixed broadband connections, the same way, because that is not how people buy or consume them.

Were we to measure Ethernet connections on a per-capita basis, penetration would be quite low, for example. Most people intuitively would understand that sort of issue.

But when it comes to fixed broadband penetration, that is precisely the problem we face. Agencies are accustomed to measuring fixed broadband in just about that fashion: per capita, even though people do not buy such services that way.

The point simply is that we need to measure things in a way that reflects the way people actually use a given product or technology.

People do not buy fixed broadband subscriptions the same way they buy mobile phones.

So per capita indexes are more suited to some products than others. Per-capita fixed broadband indexes are affected by mundane things such as household size, business adoption and consumer preferences.

"Consider Portugal, in which there are approximately three persons per household," says George Ford, Phoenix Center for Advanced Legal and Economic Public Policy Studies chief economist. "If every household had a broadband connection, then the per capita subscription rate in Portugal would be 33 percent"

"In Sweden, alternately, there are approximately two persons per household," says Ford. "So, if every home had a
connection, then the per-capita subscription rate is 50 percent."
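Ford's arithmetic can be checked directly: with full household adoption, the per-capita rate is simply the household rate divided by average household size. A minimal sketch (the function name and inputs are illustrative, not from the source):

```python
def per_capita_rate(household_adoption: float, persons_per_household: float) -> float:
    """Fixed-broadband subscriptions per person, given the share of
    households subscribing and the average household size."""
    return household_adoption / persons_per_household

# Portugal: ~3 persons per household; Sweden: ~2 persons per household.
portugal = per_capita_rate(1.0, 3.0)
sweden = per_capita_rate(1.0, 2.0)
print(f"Portugal: {portugal:.0%}, Sweden: {sweden:.0%}")  # Portugal: 33%, Sweden: 50%
```

The same household saturation thus yields very different per-capita figures, which is exactly Ford's objection to the metric.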

"The number of fixed broadband connections per person is a flawed measure because it will vary based on the average size of a household or business establishment," Ford notes.

"In the United States, nearly every business and household had a fixed line telephone when the 1996 Telecom Act was passed," Ford notes. "Yet, telephone subscriptions per capita were only 49 percent at the time."

"In Sweden, which also had near ubiquitous telephone adoption, the telephone per-capita subscription rate was 69 percent.

The point, says Ford, is that per-capita measures are not meaningful tests of fixed broadband adoption, especially when comparing different regions or nations.

Verizon Wireless Reduces Overage Charges

Verizon Wireless has increased the data allowance for all mobile broadband customers on its lowest priced monthly plan and also has reduced overage pricing on the standard plan.

Users of the $39.99 monthly access plan used to have a cap of 50 MB with an overage charge of 25 cents per megabyte. The new plan includes a 250 MB monthly allowance and a 10-cents-per-megabyte overage charge.

Users of the $59.99 monthly access plan have an unchanged 5 GB monthly allowance and an overage charge of five cents per megabyte, compared to the older charge of 25 cents per megabyte.
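The impact of the new terms is easy to quantify. A quick sketch using the caps and rates quoted above (the 300 MB usage figure is an assumption for illustration):

```python
def monthly_bill(usage_mb: float, base: float, cap_mb: float, overage_per_mb: float) -> float:
    """Monthly charge: base price plus per-megabyte overage beyond the cap."""
    overage_mb = max(0.0, usage_mb - cap_mb)
    return base + overage_mb * overage_per_mb

usage = 300  # MB in a month (assumed, for illustration)
old = monthly_bill(usage, 39.99, 50, 0.25)   # 250 MB over: 39.99 + 62.50 = 102.49
new = monthly_bill(usage, 39.99, 250, 0.10)  # 50 MB over: 39.99 + 5.00 = 44.99
print(f"old terms: ${old:.2f}, new terms: ${new:.2f}")
```

For that usage level, the new terms cut the bill by more than half.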

4G will Grow 33% Faster than 3G, Juniper Predicts

It took nearly six years for third generation mobile services based on UMTS/HSPA to reach 100 million subscribers but it will take Long Term Evolution just four years to reach the same milestone, say researchers at Juniper Research.

The number of LTE subscriptions worldwide will grow at a compound annual growth rate of 404 percent from 2010 to 2014, reaching 136 million subscriptions by year-end 2014, Juniper forecasts.
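A growth rate that steep implies a tiny starting base. Back-solving from the forecast endpoint (treating the 404 percent figure as a compound annual rate over four years, which is an assumption about how Juniper computed it):

```python
# If subscriptions end 2014 at 136M after four years of 404% annual growth,
# the implied 2010 base is end / (1 + rate)^years.
end_2014 = 136e6
rate = 4.04  # 404 percent, expressed as a growth rate
base_2010 = end_2014 / (1 + rate) ** 4
print(f"implied 2010 base: {base_2010 / 1e6:.2f} million")
```

That works out to roughly 200,000 subscriptions in 2010, which is plausible for a technology whose first commercial networks launched in late 2009.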

You might think this has something to do with spectrum efficiency, more efficient coding, signal propagation or some other technology attribute, but if the forecast proves accurate, it will be more a result of a changed mobility market than anything else.

When 3G networks were launched, the expectation was that new data services would fuel revenue growth. That largely failed to happen, at least early on. What is different now is that mobile broadband is approaching mass market status.

Mobile broadband demand is growing about 30 percent a year, while video usage is growing only a bit slower.

Wednesday, May 13, 2009

DPI Raises Consumer Ire, Should it?

"Network bandwidth is a finite resource, especially so in wireless networks, so it is reasonable and indeed expected that carriers will manage their network bandwidth to assure sufficient quality of service for all subscribers," says Brian Wood, Continuous Computing's VP. That tends to mean use of deep packet inspection, and that tends to raise hackles in some quarters.

The problem is that Internet access, and Internet backbones and servers, are shared resources. There is a "tragedy of the commons" problem if a few users have behavior patterns dramatically different from those of the typical user, because all networks are engineered statistically.

Nobody builds a network that provisions bandwidth on a "nailed up" basis, because nobody could afford to do so. Instead, bandwidth is "underprovisioned," on purpose. Network architects assume that not every user will be putting load on the network, all at the same time.

That works remarkably well most of the time. What causes problems are unexpected loads that haven't been engineered into the network.
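The statistical engineering described above can be sketched with a toy binomial model. This is an illustration under assumed numbers, not any carrier's actual traffic engineering: if each subscriber is active independently with some probability, the chance that simultaneous demand exceeds provisioned capacity is a binomial tail probability.

```python
from math import comb

def p_congestion(n_users: int, p_active: float, capacity: int) -> float:
    """Probability that more than `capacity` of `n_users` are active at once,
    assuming independent activity with probability `p_active` each."""
    return sum(comb(n_users, k) * p_active**k * (1 - p_active)**(n_users - k)
               for k in range(capacity + 1, n_users + 1))

# 100 users, each active 10% of the time, on a link sized for 20 simultaneous
# users: congestion is rare, even though the link is heavily oversubscribed.
print(f"{p_congestion(100, 0.10, 20):.4f}")
```

The model also shows why "bandwidth hogs" break the math: the rarity of congestion depends on the assumption that users behave roughly alike, and a few users who are active nearly all the time shift the whole distribution.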

"Without traffic management, a few 'bandwidth hogs' can easily degrade the user experience of many other users on the same network," says Wood. That might especially be true in the access network, but the entire Internet transmission  network, including all the servers, are shared resources.

"For consumers, DPI-based traffic management ensures that subscribers get the quality of service that they expect, or at least that they pay for," says Wood.  So, for example, a business user might opt to pay a slight premium for a guaranteed level of service (e.g., guaranteed minimum bandwidth) while a frugal college student might go for a cheaper "best effort" rate plan.

Basically, DPI or other traffic shaping mechanisms are about fairness: making sure most users get reasonable performance most of the time. The other advantage is the ability to learn or be instructed by a user on what activities are most important, so those activities get the highest priority during congested periods.

DPI can be viewed as an automated, or self-learning, way for the network to provide those kinds of benefits, says Wood. "It's all a matter of filtering out the stuff that, based on past behavior or the behavior of similarly-profiled individuals, is deemed to not be of value and, instead, prioritizing the stuff that is deemed to be of value."

Behavioral tracking is an issue, though. "Cookie-based tracking seems to be a generally-accepted practice with web sites these days, but there was great concern when cookies were first introduced," says Wood. There are end user advantages, of course, such as sites "remembering" who you are and what your preferences are.

Behavior-based tracking has raised more concern. Deep packet inspection is deemed by some as intrusive and too personal, says Wood. The same sorts of concerns are raised about DPI-based ad insertion.

"What's interesting to me, though, is that Google has been offering Gmail for free to users in exchange for content-based advertisements being displayed next to their emails, and I haven't heard of any uprising against Gmail," he says.

Subscriber notification, how subscribers are notified, and whether those subscribers have any say in the matter, seem to be the key sticking points here, he muses. "Nobody likes the idea of being monitored without their consent, especially if they feel like information gathered through such monitoring will be used in an attempt to profile or manipulate them in the future."

But behavior-based marketing seems to work well for Netflix and Amazon, Wood notes. The difference seems to be one of perception. Lots of people are afraid technology will be used "on" them, rather than "for" them.
