Tuesday, September 8, 2009

EU Issues Report on Mobile Phone Cancer Risk

The European Commission is the latest group to issue a report on non-ionizing radiation: electromagnetic radiation, including near ultraviolet, visible light, infrared, microwave, radio waves and low-frequency (longwave) RF, that does not carry enough energy to ionize atoms or molecules.

The EU report basically argues there is a "significant risk of brain tumors from cellphone use." The issue has been studied for decades, and perhaps thousands of studies have investigated one form or another of non-ionizing radiation, with results that might fairly be called inconclusive.

Still, the EU study says there is enough uncertainty to warrant keeping "certain establishments free of wireless device radiation, including schools, child day care centers, retirement homes and health care institutions."

Dr. George Carlo, leader of the Cellular Telecommunications Industry Association’s $25M research project, for example, is said by the EU report to have found in 1999 "a statistically significant doubling of brain cancer risk" from mobile use. But three of the five subsequent brain tumor studies published between 2000 and 2002 found “non-significant” elevated risks.

But those studies also are said to show an elevated risk as the years of mobile phone use lengthen.

"Studies led by Professor Lennart Hardell in Sweden found significantly increased risk of brain tumors from 10 or more years of cellphone or cordless phone use," the report suggests. "For every 100 hours of cellphone use, the risk of brain cancer increases by five percent," the report suggests.

"For every year of cellphone use, the risk of brain cancer increases by eight percent," while "after 10 or more years of digital cellphone use, there was a 280 percent increased risk of brain cancer."

"For digital cellphone users who were teenagers or younger when they first starting using a cellphone, there was a 420 percent increased risk of brain cancer," the report says.

The study suggests that dangers are greatest for children.

The study also suggests, as you might expect, more studies of greater rigor. That is not a bad idea. Up to this point, the studies have been contradictory, and therefore inconclusive. But it would not be fair to say no studies have suggested any danger: some have.

Right now, some of us would say that mobile technology is highly useful and might carry some risk, as do automobiles, airplanes or even common household tools. As with any tool, use it wisely.

SES: Back to the Future for Satellites

U.S. and European broadband stimulus plans are dampening prospects for delivering broadband Internet access using satellite networks, SES, the largest satellite operator, says. On the other hand, demand for satellite-delivered high-definition television likely will keep growing.

In many ways that is a "back to the future" move, as satellite point-to-multipoint networks always have been optimal for delivery of linear TV signals. Specifically, SES sees a growing role for satellite as the delivery mechanism for multi-channel TV by telcos in situations where fixed broadband networks do not have the capacity to deliver TV signals.

That might represent a market including 40 percent to 50 percent of locations.

“I personally believe the rollout of terrestrial broadband will be such that you can’t demonstrate the viability of satellite in the long term,” Romain Bausch, SES CEO, told the Financial Times.

SES provides two-way satellite broadband to 45,000 customers in Europe but would not invest in new capacity for purposes of serving Internet access demand, Bausch says. Support for mobile voice and video is a different matter, though.

SES plans to add eight satellites to its 40-strong fleet within three years, boosting capacity 19 percent. About 170 of the 200 new transponders will cover emerging markets where SES supplies the “backbone” to mobile networks in remote areas.

High-definition television, which requires twice the satellite capacity of standard definition channels, continued to power SES’s video revenues.

BSkyB, an SES customer, has announced plans to pioneer 3D television in the UK in 2010, which will require a third more satellite capacity than current HD programming. Ultra-HD, being tested in Japan, could consume four times as much capacity, Bausch also notes.
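Strung together, those ratios imply a steep capacity curve per channel. A rough back-of-the-envelope sketch, assuming (the figures quoted do not pin the baselines down precisely) that both the 3D and Ultra-HD multiples are relative to current HD:

# Rough per-channel capacity multiples, relative to one standard-definition
# channel, using the ratios quoted above. Baseline assumptions noted inline.
SD = 1.0
HD = 2.0 * SD               # HD needs twice the capacity of SD
THREE_D = HD * (1 + 1 / 3)  # 3D quoted as "a third more" than current HD
ULTRA_HD = 4.0 * HD         # assumption: "four times as much" read against HD

for name, units in [("SD", SD), ("HD", HD), ("3D", THREE_D), ("Ultra-HD", ULTRA_HD)]:
    print(f"{name}: {units:.2f}x an SD channel")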

Gmail User Engagement Seems Higher: Why?


Behavioral differences between users of similar products always are profoundly important, whether because one provider has uncovered a better user interface, better features or an unmet user need, or because distinct user segments reveal themselves through their choices.

According to a new study by ChimpMail, which analyzed about 184 million email messages, there are distinct differences between the major webmail services in recipients' engagement with email, as measured by open rates, click rates, bounces and abuse complaints.

Open rates, for example, were highest among Gmail users (31 percent) and lowest among AOL users (20 percent). Gmail also ranked highest for click rate with 7.4 percent compared to Yahoo's lowest of 4.2 percent.

Messages sent to Gmail accounts also had the lowest hard bounce rate, though other data indicates Gmail’s spam protection may be so stringent that messages disappear without producing a bounce. A 2009 Return Path study, for example, found a 23 percent nondelivery rate for marketing messages sent to Gmail.
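For reference, the metrics being compared here are simple ratios over sent and delivered mail. A generic sketch of how such rates are typically computed; the function, field names and example numbers are illustrative, not ChimpMail's actual methodology:

# Illustrative engagement-rate arithmetic for one batch of email sends.
# Names and the example numbers are hypothetical, not taken from the study.
def engagement_rates(sent, hard_bounces, opens, clicks):
    delivered = sent - hard_bounces
    return {
        "hard_bounce_rate": hard_bounces / sent,
        "open_rate": opens / delivered,
        "click_rate": clicks / delivered,
    }

# Example chosen to land near the Gmail figures quoted above (31%, 7.4%).
print(engagement_rates(sent=100_000, hard_bounces=1_000, opens=30_700, clicks=7_330))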

According to comScore, Gmail is the third-most-popular e-mail property among U.S. Internet users, though it posted the highest growth rate between July 2008 and July 2009. Unique visitors to the service rose 46 percent to nearly 37 million.

ChimpMail executives suspect the data show there is some demographic difference between Gmail and other Web-based email users that accounts for the higher engagement rates.

Some also think better junk mail filtering by Gmail accounts for the difference in engagement. Perhaps fewer messages, better tailored to actual end users, are being delivered to Gmail users. It is possible that this better matching of interests and messages is having an impact.

Monday, September 7, 2009

What Makes a Business "Social"?

For the past couple of years, businesses have been trying to figure out what it means to be "social," to create "communities" of users, prospects and customers.

The concept is hard to understand, in some ways. Every business satisfies some understood end user want or need, selling products or services that answer those needs or wants. So social networking is seen as a better way to connect with people, and to build connections between people, in an environment that is conducive to the company’s success.

Some of us call this a shift from "push" marketing to "pull" marketing, from "promoting products" to "inviting people to be part of a conversation about shared interests."

Jahin Mahindra points to Chick-Fil-A as an example. The firm knew it would have a hard time competing head-on against giants such as Kentucky Fried Chicken. So instead of "selling chicken," Chick-Fil-A created a gathering spot for mothers with children.

Almost all Chick-Fil-A buildings are constructed with indoor play areas for children. Wi-Fi has been added to many locations to ensure the parents can flip open the laptop at the table while junior plays in the jungle gym. Employees routinely pass through and refill drinks or even clear tables as if users were dining in a more formal establishment.

What this has done, in many Chick-Fil-A locations, is create a place for the desperate housewives to gather and nosh on weekday afternoons, Mahindra says.

"It’s not about the food; It’s about the social environment created that is conducive to buying the food," says Mahindra.

"If you have a location, make visiting your location a social event," he says. "Why do you think many bookstores now have coffee shops built in?"

"If you’re a meeting place as well as a place to buy things, people will frequent your location for reasons other than buying stuff," he says.

That's admittedly a tougher thing to do in a business-to-business setting than in a business-to-consumer environment, but the principles are the same.

Sunday, September 6, 2009

Verizon Exec Says "We Don't Want to Upgrade" Charges are "Absurd"

Some policy advocates are taking shots at Verizon Communications for taking the position that a workable minimum definition of "broadband," for purposes of setting national broadband policy, is 768 kbps downstream and 200 kbps upstream.

"There seems to be some confusion around Verizon’s filing suggesting that the FCC keep a baseline definition for broadband as

768 kbps down and 200 kbps up," says David Young, Verizon VP. "The implication here is that we want to keep the speed set low so we won’t have to upgrade our networks."
"This is clearly absurd," he says. Indeed, in its filing Verizon itself suggests a goal of 50 Mbps for fixed broadband and 5 Mbps for mobile broadband.
But Verizon suggests that for reporting, tracking and measurement purposes, the FCC should maintain the current definition used in

the FCC broadband data reporting program (Form 477) for a basline, while continuing to track multiple higher “speed tiers” to get

a full view of what’s happening in the broadband marketplace.

This threshold definition also has the benefit of being the same one used by NTIA and RUS for the broadband stimulus program, Young argues.
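To make the distinction concrete: the 768 kbps/200 kbps figure is a reporting floor a connection either meets or does not, while the 50 Mbps and 5 Mbps figures Verizon cites are aspirational goals. A small sketch of that classification, with thresholds taken from the figures quoted in this post; the tier labels are illustrative, not the FCC's actual Form 477 categories:

# Classify a connection against the quoted baseline and goals.
# Thresholds come from figures cited in this post; labels are illustrative.
BASELINE_DOWN_KBPS, BASELINE_UP_KBPS = 768, 200
FIXED_GOAL_DOWN_KBPS = 50_000   # Verizon's suggested fixed-broadband goal (50 Mbps)
MOBILE_GOAL_DOWN_KBPS = 5_000   # Verizon's suggested mobile goal (5 Mbps)

def classify(down_kbps, up_kbps, mobile=False):
    if down_kbps < BASELINE_DOWN_KBPS or up_kbps < BASELINE_UP_KBPS:
        return "below baseline definition"
    goal = MOBILE_GOAL_DOWN_KBPS if mobile else FIXED_GOAL_DOWN_KBPS
    return "meets goal" if down_kbps >= goal else "meets baseline, below goal"

print(classify(3_000, 768))             # typical 2009 DSL line
print(classify(50_000, 20_000))         # FiOS-class service
print(classify(600, 128, mobile=True))  # below even the baseline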

There are lots of good reasons for being consistent about national data collection, the most obvious being that it is impossible to track progress over time if we keep changing the definitions. Many important changes in national communications take 10 years or more, and it can make a huge difference if, along the way, the definitions of what we are tracking have changed frequently.

The other reason is that the definitions will cover networks with very different physical properties. Satellite, fixed wireless, mobile and fixed networks all have different cost and capability profiles.

Beyond that, if what we are after is the fastest possible broadband availability, from the widest array of suppliers, with the most-robust growth in speeds and quality of service, at the lowest cost to consumers, the different speeds and investment profiles have to be harmonized.

Every communications engineer realizes there is a trade-off between capability and cost whenever a network is designed. Every engineer knows a network can be optimized--both in terms of performance and deployment cost--if a single application is supported. But multi-purpose networks inherently are tougher to design because there are more trade-offs.

Broadband networks generally are more expensive than narrowband networks, and networks featuring higher bandwidth generally cost more than networks of lower bandwidth.

If definitions are too stringent in the near term, it is possible some potential users, especially users in thinly-settled areas, will find that few, if any, providers can provide them service at any price those consumers would be willing to pay.

Finally, it would strike many as odd to accuse Verizon, which has been the most aggressive tier-one U.S. broadband investor, of being unwilling to invest heavily in broadband. Its FiOS fiber-to-the-home network is the most aggressive such program in North America, and Verizon is preparing to build a 4G wireless network even as it upgrades its 3G network.

Floors are different from ceilings, and floors for some networks are ceilings for others. Customers, for example, cannot buy satellite broadband operating at more than 5 Mbps downstream, for any amount of money. Yet satellite is, by anybody's estimation, the most affordable way to provide broadband to isolated locations. Fixed and mobile broadband are somewhat more expensive, but can support higher bandwidth.

In an isolated area, optical fiber or even digital subscriber line or cable modem service offers the most bandwidth, but at the highest cost. Cost and bandwidth, in other words, represent a standard engineering trade-off. The highest bandwidth also comes at the highest cost.

To the extent that retail prices are to be kept relatively low, the network investment must be matched to the anticipated revenue. It might, in some cases, be necessary to trade off some capability to keep costs low.
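One simple way to see that trade-off is a payback calculation: capital spent per location has to be recovered out of the monthly margin the service can realistically earn. The figures below are entirely hypothetical, chosen only to show the shape of the arithmetic, not drawn from any carrier's actual costs:

# Hypothetical payback arithmetic: months to recover per-location capital
# from monthly revenue net of operating cost. All numbers are invented.
def payback_months(capex_per_location, monthly_revenue, monthly_opex):
    margin = monthly_revenue - monthly_opex
    return float("inf") if margin <= 0 else capex_per_location / margin

# A cheaper, lower-bandwidth build versus a costlier, higher-bandwidth build.
print(payback_months(capex_per_location=300, monthly_revenue=40, monthly_opex=15))   # 12 months
print(payback_months(capex_per_location=1500, monthly_revenue=60, monthly_opex=20))  # 37.5 months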

It would be a mistake to confuse that problem, or the separate need to maintain comparable statistics to measure progress, with provider unwillingness to keep pace with growing market demands for broadband speed.

Providers that fail to keep pace will lose customer share. They know that. But unreasonable near-term definitions will not help potential customers get service, or even help existing customers get faster speeds, more quickly.

http://policyblog.verizon.com/BlogPost/661/title.aspx

Friday, September 4, 2009

European Mobile Broadband Penetration to Double by 2014

European mobile broadband penetration will more than double, from 17 percent in 2009 to 39 percent in 2014, Forrester Research now projects.


This forecast of course looks quite linear, but that's a hazard of the forecasting business. It isn't possible to model unexpected effects.
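For what it is worth, the two endpoints alone imply a compound annual growth rate of roughly 18 percent, which is what gives the curve its near-linear look over a five-year window. The arithmetic, using only the figures Forrester cites:

# Compound annual growth rate implied by the forecast's two endpoints.
start_share, end_share, years = 0.17, 0.39, 5  # 2009 -> 2014
cagr = (end_share / start_share) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 18 percent per year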

Penny Wise, Pound Foolish Complaining About Communications?

Lots of people enjoy complaining about how bad their mobile service is, how expensive and slow their broadband is or how useless their landline voice service is. It isn't that the complaints have no foundation.

And even if unfounded, consumers are under no obligation to be "happy" about products they believe do not offer proper value-price-quality relationships.

Oddly enough, people are less happy when change is occurring. Students of revolution often have made that observation: that unhappiness is highest when there is hope for change, compared to situations where there is no hope of change.

The simple way of maintaining perspective is to ask oneself how much one was paying for such services, compared to the value one was getting, as they existed in the 1980s or in 1995. Then evaluate what one now gets for the price, compared to when one first began using any new service (mobility, broadband or the Web).

The exercise conducted less often is to compare how much complaining gets directed at all communication services as a whole with what one spends on fuel for one's automobile. As it turns out, U.S. consumers spend 2.2 percent to 2.5 percent of disposable income on all the communication services they use and buy.

Of late, they have been spending 2.5 percent to four percent of disposable income on fuel for autos and about 11 percent on transportation overall.

Of course, most people likely feel there is only so much they can do about spending on fuel or transportation. But considering that people spend more on fuel than on all of their communications, and perhaps four to five times more on transportation, one wonders if people are not being "penny wise, pound foolish."
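The underlying arithmetic is simple, and worth making explicit, using the shares of disposable income cited above (the upper end of the communications range is used here purely for illustration):

# Shares of disposable income cited above, expressed relative to communications.
comms = 0.025                      # upper end of the 2.2%-2.5% communications range
fuel_low, fuel_high = 0.025, 0.04
transport = 0.11

print(f"Fuel vs. communications: {fuel_low / comms:.1f}x to {fuel_high / comms:.1f}x")
print(f"Transportation vs. communications: {transport / comms:.1f}x")  # about 4.4x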

In other words, people seem to worry lots about a smaller amount of spending, but seemingly worry less about larger amounts of spending.

Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...