Tuesday, April 13, 2010

Social Networking Changing Collaboration at Work

Social networking is starting to change the nature of worker collaboration within companies, a new poll conducted by Harris Research suggests. Of workers who use social networking at work, 59 percent say their usage has increased over the past year. But only about 17 percent of the 1,000 workers surveyed report using social networking at all.

The study found the most frequently used application for collaborating with others is email (91 percent), but that what people want from their email is changing. In addition to email, the Harris poll found that other applications being used by respondents to collaborate with others in the workplace include shared spaces (66 percent), voice calls and teleconferencing (66 percent), web conferencing (55 percent), video conferencing (35 percent), instant messaging (34 percent), and social networking (17 percent).

Respondents like the fact that email provides an easily accessible record of communication and the ability to reach many people at once. Users also rank email highly among collaboration tools because they are comfortable using it to communicate with others inside and outside their organizations. However, the poll showed there are many pain points associated with the way most email solutions function today.

While email remains the preferred method of collaboration, many respondents complained they receive too much irrelevant email (40 percent) and that they lack the ability to collaborate in real time (32 percent). End users also dislike the fact that they have very limited storage (25 percent) and that large volumes of email come into their inbox with no organizational structure (21 percent).

Half of those using social networking for work bypass company restrictions to do so. The study participants who prefer to use social networks said they would like control over who sees their content, as well as the ability to share with groups of users using different tools. The respondents also indicated a desire to collaborate in real time without having to open an additional application.

Video Substitution Still A Marginal Activity

In 2009 an estimated 800,000 U.S. households stopped subscribing to a cable, satellite or telco video service, say researchers at the Convergence Consulting Group. By the end of 2011, that number is forecast to double to 1.6 million, the group predicts.

Cord cutters don’t yet represent a serious threat to the $84 billion cable/satellite/telco TV access industry, which counts an estimated 101 million subscribers, the analysts suggest. But they might be a leading indicator of the shift to TV viewing on the Web.

So far, Web video viewing clearly is ancillary to other linear TV modes. The cord-cutters make up less than three percent of all full-episode viewing on the Web. The rest comes from people who are only beginning to watch occasionally online. An estimated 17 percent of the total weekly viewing audience watch at least one or two episodes of a full-length TV show online. Last year, that percentage was 12 percent, and next year it is forecast to grow to 21 percent, Convergence Consulting says.

Nor will major programmers be compelled to speed up their online distribution efforts, as the amount of incremental advertising remains quite small.

U.S. online TV advertising made up 2.5 percent of major-network ad revenues of $62 billion in 2009. Convergence Consulting estimates the incremental revenue at $1.56 billion.
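A quick sanity check (using the figures cited above) shows the incremental-revenue estimate is consistent with the stated share:

```python
# Back out online TV advertising's share of major-network ad revenue
# from the two figures Convergence Consulting cites.
major_network_ad_revenue = 62e9  # 2009 major-network ad revenue, USD
online_tv_ad_revenue = 1.56e9    # estimated incremental online ad revenue, USD

share = online_tv_ad_revenue / major_network_ad_revenue
print(f"{share:.1%}")  # → 2.5%
```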
source

Will LTE Bend the Cost Curve?

Mobile service providers hope Long Term Evolution will "bend the cost curve." They also hope it will provide the foundation for new services, but many of us would guess the primary advantage lies in bending the cost curve.

Google CEO Lauds Professional News Organizations, Steps in a Mess?

A smart chief executive officer knows how to tailor his or her remarks to an important ecosystem partner. The trick is to do so without alienating another important part of the same ecosystem. I'm not sure Eric Schmidt, Google CEO, completely succeeds on that score.

He makes the point that the importance of "journalism" is its quality, compared to much content produced by bloggers. At some level, that's simply a reflection of reality. Blog content is uneven. And Schmidt is right in catering to the professional content producers whose help could be invaluable in creating more-powerful advertising models for Google.

Still, there are relatively more artful, and less artful, ways of phrasing things. Perhaps another approach would have made the same point without risking some amount of potential blogger ire.

What's the ROI from Telepresence?

Unfortunately, "usage" is not the same thing as "return on investment." If those two metrics were in fact directly related, nobody would ever have a problem figuring out the return on investment from deploying any unified communications solution.

Generally speaking, one has to assess "success" using soft measures, though some will point to offset travel costs. The problem is that it is difficult to quantify "better quality communications" or "faster development time" or "reduced friction," though those are the sorts of benefits one would expect to see.

The trouble is that most of what one can quantify is "usage."

source

Monday, April 12, 2010

Verizon CEO Says Market Can Sort Out Tough Issues

Ivan Seidenberg, Verizon CEO, said at a Council on Foreign Relations meeting that there was a danger of government regulatory overreach of several types in the current environment.

"I always worry about unintended consequences of government reaching into our business," Seidenberg said. "But I believe the players in the industry--like Google, like Microsoft, like the Silicon Valley players, as well as AT&T, and us and the rest of the industry--we're creating a better dialogue."

Seidenberg also thinks the industry has to do a better job of self-policing, though, more on the model of the advertising industry. That would lessen the need for very-detailed rules crafted "in advance" of any particular problem, in favor of fixing such problems as actually do arise.

"In the telecom business we need industry to do a better job at policing behavior, because, in the final analysis, government could never possibly regulate every condition, in every single circumstance that could ever happen, and do it efficiently," Seidenberg said.

Seidenberg thinks one of the key problems with proposed "network neutrality" rules that would prohibit virtually any sort of packet prioritization is that such rules make it very hard to provide different types of service to customers who may want them, at the lowest-possible prices.

"Most people think a carrier wants to charge for every minute on a linear basis in perpetuity, infinity," he said. But "we don't really want to do that."

"What we want to do is give you a chance to buy a bundle, a session of 10 megabits or a session of 30 megabits," he says. "The problem we have is five percent or 10 percent of the people are the abusers that are chewing up all the bandwidth."

"So what we will do is put in reasonable data plans, but when we now go after the very, very high users, the ones who camp on the network all day long every day... we will throttle and we will find them and we will charge them something else," he says.

"We don't want to have a linear pricing scale," he said. "We do want to find a way to give the majority of people value for bundles, but we have to make sure we find a pricing plan that takes care of that 10 percent that's abusing the system. And it's that simple."
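The pricing model Seidenberg sketches can be illustrated in a few lines of code. This is purely a hypothetical sketch of the logic he describes: flat-rate bundles for most users, with throttling for the heaviest few percent. The bundle speeds come from his remarks; the throttle threshold and factor are illustrative assumptions, not Verizon policy.

```python
# Hypothetical sketch of a tiered-bundle-plus-throttling policy:
# most users get their bundle's full speed; the heaviest users
# ("the ones who camp on the network all day") are throttled.

BUNDLES_MBPS = {"basic": 10, "premium": 30}  # "a session of 10 ... or 30 megabits"
HEAVY_USER_PERCENTILE = 0.95                 # "five percent or 10 percent ... abusers"
THROTTLE_FACTOR = 0.25                       # illustrative assumption

def effective_speed_mbps(plan: str, usage_rank: float) -> float:
    """usage_rank: the user's position in the monthly-usage distribution (0..1)."""
    speed = BUNDLES_MBPS[plan]
    if usage_rank >= HEAVY_USER_PERCENTILE:
        return speed * THROTTLE_FACTOR  # throttle the heaviest users
    return speed

print(effective_speed_mbps("premium", 0.50))  # typical user: 30
print(effective_speed_mbps("premium", 0.99))  # heavy user: 7.5
```

The point of the sketch is that pricing stays flat for the "majority of people," while discretion is applied only at the top of the usage distribution.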

"And therefore you have to have rules, give us discretion to run our business," Seidenberg said. "Net neutrality could negate the discretion to run your business."

"Anytime government, whether it's the FCC or any agency--decides it knows what the market wants and makes that a static requirement, you always lose," he said. Seidenberg noted that although access speeds might be higher in Korea or France, household penetration in the U.S. market is higher than in any country in Europe.

"Japan may have faster speeds, but we have higher utilization of people using the Internet," said Seidenberg.  "So our view is, whenever you look at these issues, you have to be very careful to look at what the market wants, not what government says is the most important issue."

"If you look at minutes of use, the average American uses their cell phone four times as much as the average European," Seidenberg says. But what about penetration rates?

"If you look at Europe, they publish penetration rates of 150 (percent), 160 (percent), 170 percent meaning that people have more than one phone, two phones, three phones," he notes. Seidenberg suggests the high roaming rates are the explanation.

"My guess is you probably have two or three different phones to carry to use in different countries because your roaming rates are so high," he adds. "So my point is it's a fallacy to allow a regulatory authority to sit there and decide what's right for the marketplace when it's not even close."

In fact, Seidenberg argues that the U.S. market is more advanced in ways that count.

"Verizon has put more fiber in from Boston to Washington than all the Western European countries combined," he notes. Also, "if you look at smart phones, they have exploded this market in the U.S. market."

"Ask any European if they're not somewhat envious of the advancements of smart-phone technology in the United States," he says.

The FCC is "overreaching in regulations," he says. "It's a real problem to have well-intentioned people in Washington regulating the business as they understood it to be in 1995. Bad idea."

"I don't think there is no role for government," he says. "I just worry about, when you allocate capital and you look at consumer behavior, that is not a strength of, I think, everyday transactional activity of government agencies, particularly federal government agencies."

On the technology front, Seidenberg pointed out that the opportunities for distributed, remote or cloud-based applications are growing very fast.

"But here's the thing about the iPad that's very interesting," Seidenberg said. "We look at it as a fourth screen."

"Now, the interesting thing about the iPad, from how Verizon looks at it, from a network person, first of all, it has no hard drive, right?" he said. That means applications largely have to come from the network, somewhat reversing the client-server-era trend of putting more processing and storage at the edge of the network. That has positive implications for a firm such as Verizon.

Seidenberg also does not think the FCC should attempt to take spectrum away from broadcasters and reallocate it for mobile use, although Verizon has said it generally supports FCC plans to reallocate spectrum for mobile use. "I think the market's going to settle this," he said.

link

Sunday, April 11, 2010

Another Reason Why Handset Suppliers Have Gained Value in the Mobile Ecosystem

The mobile user experience keeps getting more complex as mobile operators add spectrum bands, even though most users do not directly encounter any of the particular issues. The reason is that it is harder to maintain connections moving from cell to cell and network to network as new frequencies are added.

Voice and Internet connectivity issues also become marginally harder as handset antennae have to accommodate more signals at different frequencies. Also, mobile Internet handsets have to conduct all sorts of signaling operations to support social networking, email and other applications. And then there is the simple matter of different air interfaces.

New fourth-generation Long Term Evolution networks will make the problem worse, especially for "world phones" that are supposed to work in many regions of the world.

When GSM, the first "digital" air interface, was first used in Europe, there was only a single frequency band at 900 MHz. Then an 1800 MHz band was added, then 2100 MHz.

In the United States, the 850, 1900, 1700 and 2100 MHz bands are used. That has led to "quad-band" and "tri-band" devices. And now LTE frequencies will have to be added.

In Europe LTE will likely start on 2600 MHz and potentially also be used on 1800 MHz and 2100 MHz bands, with some use at 800 MHz.

In Japan, LTE will be used on 2100 MHz with an additional band likely to follow. In the United States, the situation is even more divergent. Verizon uses a 10 MHz block in the 700 MHz range.

Some other operators might launch LTE in the 1700 and 2100 MHz bands. Finally, there are rumors of Clearwire jumping from WiMAX to LTE in the 2600 MHz band but with TD-LTE.
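The band fragmentation described above can be captured as a simple data structure. The following is an illustrative sketch only: the region-to-band mapping reflects the plans and rumors mentioned in the text as of 2010, and the function checks which regions a given handset could roam in.

```python
# Illustrative regional LTE band plans (MHz), as sketched above.
# Actual deployments vary by operator and have changed since 2010.
REGION_LTE_BANDS = {
    "Europe": {2600, 1800, 2100, 800},
    "Japan": {2100},
    "US_Verizon": {700},
    "US_AWS": {1700, 2100},    # possible AWS-band launches by other operators
    "US_Clearwire": {2600},    # rumored TD-LTE (a different duplexing scheme)
}

def roamable_regions(device_bands: set) -> list:
    """Regions where a device supports at least one locally deployed LTE band."""
    return [region for region, bands in REGION_LTE_BANDS.items()
            if bands & device_bands]

# A hypothetical "world phone" supporting four LTE bands:
print(roamable_regions({700, 1800, 2100, 2600}))
```

Note that band overlap alone understates the problem: as the text goes on to say, air interface variants (such as TD-LTE versus FDD LTE) and operator-specific customization mean a shared frequency does not guarantee a working connection.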

Global roaming capabilities of devices therefore will be challenging. So how does this all work out for the consumer? First, cost becomes an issue. Battery life is affected. In some cases, there are form factor issues and reception issues, as the physical placement of the antenna makes a difference.

The potential band and technology combinations for GSM, CDMA, UMTS and LTE are huge, as air interfaces also are different between operators in the U.S. market. All of that means there also are volume manufacturing issues, as devices have to be customized to a certain extent, by operator and by intended region of operation.

All of that means some devices will work better, quite apart from the obvious user interface issues, because of hidden requirements such as the networks each device is intended to work with, signaling operations and even the physical placement of elements within each device.

More-efficient producers will benefit as well, as the complexity of these decisions gives an advantage to manufacturers and designers that can streamline the customizing process.

source
