Sunday, October 2, 2011

Eric Schmidt on Google's "Monopoly"

Google Executive Chairman Eric Schmidt raises an interesting point about traditional antitrust regulation and Google's business, and by extension about any number of other leading software firms that do not charge users anything to use their apps.

Traditionally, the antitrust test has been harm to consumer welfare, mediated by markets in which welfare is assumed to be a matter of choice, and choice a matter of robust competition among firms.

But how does traditional antitrust thinking apply to a firm such as Google? It hasn't harmed any user by its existence, its products or its dominance of the search advertising business. Virtually all of Google's products in the consumer space, and many, if not most, in the business space, are offered free of charge.

"So we get hauled in front of the Congress for developing a product that’s free, that serves a billion people," he says. "I don’t know how to say it any clearer: it’s not like we raised prices."

"We could lower prices from free to…lower than free? You see what I’m saying?"

Monopolies, or perceived monopolies, have in the past been regulated because regulators want to prevent consumer harm in the form of higher prices. Typically that requires a finding of fact that such harm actually has occurred, not that it could potentially happen in the future. That is a tough argument to make in Google's case.

Saturday, October 1, 2011

Groupon gets into online retailing

Groupon has launched Groupon Goods, moving for the first time into online goods retailing, at least for U.S. customers.

Groupon’s move into online retail might be seen as partly defensive, since Amazon recently has gotten into the daily deals business. As an offensive move, the addition of goods sales is not too different from what Groupon already does to sell its daily offers. In essence, Groupon still is selling a program to an advertiser, since the retailers handle all the fulfillment. Groupon sells a code that the buyer then enters on a retailer's online site.

Groupon already has a formidable customer base of more than 35 million registered users, mostly in North America and Europe.

Does the Sun Cause Climate Change?

Do cosmic rays set the earth's thermostat?

In recent years, the idea that the climate is driven by clouds and cosmic rays has received plenty of attention. Despite quasi-religious insistence in some quarters that the matter is settled, the theory is significant because it suggests the fundamental driver of "climate change" is the sun.

The notion sometimes is ridiculed, but we recently have seen a reminder that even settled theories of major consequence are subject to revision in light of new experimental evidence.
Einstein's famous equation E=mc² is among the fundamental parts of modern physics that would have to be revised if other scientists can replicate the recent finding, reported by physicists at CERN, of a subatomic particle that travels faster than the speed of light.
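As a refresher on what is at stake: the equation ties rest mass to energy through the speed of light, so even a small error in the measured value of c would ripple through every calculation built on it. A rough back-of-the-envelope illustration, using the approximate value of c:

\[ E = mc^{2}, \qquad c \approx 3\times10^{8}\ \mathrm{m/s} \;\Rightarrow\; E_{1\,\mathrm{kg}} \approx (1\,\mathrm{kg})\,(3\times10^{8}\ \mathrm{m/s})^{2} = 9\times10^{16}\ \mathrm{J}. \]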

If "C" is off, it means that all nuclear physics has to be recalibrated, says scientist Michio Kaku, theoretical physics professor at City College of New York.


Modern physics is based on two theories, relativity and the quantum theory, so half of modern physics would have to be replaced by a new theory.


The point is that all scientific theories are just that: theories. When the evidence changes, theories have to change.

Interest in the idea that solar activity explains Earth's dramatic changes of climate in the past is credited to the Danish physicist Henrik Svensmark, who first suggested it in the late 1990s.

Using satellite data on cloud coverage, which became available with the establishment of the International Satellite Cloud Climatology Project in 1983, Svensmark found a correlation between lower troposphere cloud cover and the 11-year solar cycle.
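As an illustration of the kind of analysis involved (not Svensmark's actual code or data), a minimal Python sketch that correlates a hypothetical monthly low-cloud series with a hypothetical cosmic-ray count series might look like this:

# Minimal sketch: correlating two climate-related time series.
# The numbers below are made up for illustration only; they are NOT
# the ISCCP cloud data or any real cosmic-ray record.
import statistics

cloud_cover = [28.1, 28.4, 27.9, 28.8, 29.2, 28.6, 28.0, 28.9]   # % low cloud (hypothetical)
cosmic_rays = [6100, 6180, 6050, 6300, 6390, 6210, 6080, 6320]   # counts/min (hypothetical)

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

print(f"correlation: {pearson(cloud_cover, cosmic_rays):.2f}")

A strong positive correlation in data like this is what the hypothesis predicts; it does not by itself establish causation.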


He proposed that cosmic rays initiate the formation of aerosols in the lower atmosphere that then form condensation nuclei for cloud droplets, increasing cloud formation from water vapor. Since low-level clouds increase Earth’s albedo (the amount of incoming solar radiation that is reflected back into space), more clouds mean cooler temperatures.

Svensmark claimed that this mechanism was responsible for virtually every climatic event in Earth history, from ice ages to the Faint Young Sun paradox to Snowball Earth to our current warming trend. Needless to say, this would overturn decades of climate research, and challenge the notion that human activity is primarily responsible for climate change.

The assertion will be met in some instances by quasi-religious opposition. But that opposition is just that: quasi-religious, not scientific.

56% of Millennials Stream Video Weekly


According to online research provider Knowledge Networks, 56 percent of Millennial Internet users stream video on a weekly basis, twice the rate among boomers. Millennials also are four times more likely than boomers to watch video on mobile devices.

Although the 13-to-31 age group may be leading the charge, digital video viewing is increasing among web users of all ages. Knowledge Networks research indicates that monthly use of alternative methods of movie and television viewing increased from 26 percent in 2010 to 35 percent in 2011. Use of streaming video rental services has doubled since 2010, as has video viewing on internet-connected game consoles and on mobile devices.

[Chart: Usage of Digital Video Services Among US Internet Users, 2010 & 2011]

Net Neutrality Wasn't a Story in 2011

"Network neutrality," believe it or not, was among the marketing trends that consultant Gini Dietrich thought in October 2010 would highlight 2011. She was dead on about some things, correct about the direction of others, but some would argue missed the mark about one or two, including network neutrality.

Among the clearly correct calls, she argued that "content, content, content" would be key. "All companies should become media companies," she said.

Some of the other trends weren't so pronounced during the year, though partly correct in terms of direction. Dietrich mentioned heightened Federal Trade Commission scrutiny of the "blogging" world. That was correct to an extent, but the scrutiny has not gone much further, and some would argue it will not.

Dietrich argued that "next" in 2011 would be rules "around ethics and how we approach traditional journalists and bloggers." That didn't happen and, some would argue, will not.

Some of us would argue that Dietrich flatly misunderstood the "network neutrality" issue, both in timing and implications. She argued that "being able to write a blog post at 6:00 in the morning and post it two hours later and letting it reach audiences around the world for free will be gone." That's a common argument by supporters of strong versions of network neutrality, but it is mistaken. The Federal Communications Commission has operated for years on fundamental "Internet Freedom" principles that enshrine consumer access to all lawful applications.

"Consumers and innovators have a right to send and receive lawful traffic--to go where they want, say what they want, experiment with ideas--commercial and social, and use the devices of their choice," the Federal Communications Commission clearly has said. "The rules thus prohibit the blocking of lawful
content, apps, services, and the connection of devices to the network." In other words, non-blocking of lawful content already is policy, and already has been enforced by the Commission, on the couple of occasions when it even became an issue.

But there is another important issue that people become confused about, namely the need to manage traffic on any network to ensure the best possible performance for all users, especially when networks get congested.

"The rules recognize that broadband providers need meaningful flexibility to manage their networks to deal with congestion, security, and other issues," the FCC continues to believe. Some people have experienced "all circuits are busy now, please try your call again later" messages when trying to make a landline telephone call.

Many more have simply found themselves unable, from time to time, to complete a mobile call. Those are examples of lawful network management: when the network gets overwhelmed with admission requests, it simply blocks some attempts until the congestion is alleviated. That is neither illegal nor illogical.
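A minimal sketch of that sort of call admission control, assuming a hypothetical switch with a fixed number of circuits (illustration only, not any carrier's actual logic):

# Minimal sketch of call admission control on a circuit-limited network.
# All names and numbers are hypothetical.
class Switch:
    def __init__(self, capacity):
        self.capacity = capacity      # circuits available
        self.active = 0               # circuits currently in use

    def request_call(self):
        """Admit the call if a circuit is free; otherwise block it."""
        if self.active < self.capacity:
            self.active += 1
            return "connected"
        return "all circuits are busy now, please try your call again later"

    def end_call(self):
        if self.active > 0:
            self.active -= 1

switch = Switch(capacity=2)
print(switch.request_call())  # connected
print(switch.request_call())  # connected
print(switch.request_call())  # blocked until a circuit frees up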

Much of the confusion about network neutrality flows from not distinguishing between the "unimpeded access to lawful apps" and the "need to manage a network for congestion." Some network neutrality supporters argue that an ISP should be forbidden to manage its traffic demand in any way to optimize network performance. That can have several unpleasant implications for end users.

Congestion management on the Internet, or any Internet Protocol network (and all networks are becoming IP networks), typically involves some sort of blocking or delayed response at times of congestion. The networks "slow down" and some connection requests simply "time out." If ISPs cannot establish any priorities for traffic, then everything randomly slows down.

That isn't a major problem for email or web surfing, which simply get "slower." But random lags in packet arrival are highly disruptive for video and voice, and both media types will be carried on all-IP networks, everywhere, in the near future. In principle, the best end-user experience would be provided if, under congestion, priority were given to voice and video bits, while email, web-surfing and software-update packets are delayed.
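A minimal sketch of that kind of class-based scheduling, assuming hypothetical traffic classes and a single congested link (not any ISP's actual scheduler):

import heapq

# Lower number = higher priority under congestion (hypothetical classes).
PRIORITY = {"voice": 0, "video": 1, "web": 2, "email": 3, "update": 4}

class CongestedLink:
    def __init__(self):
        self.queue = []
        self.seq = 0  # tie-breaker so equal-priority packets stay FIFO

    def enqueue(self, kind, packet):
        heapq.heappush(self.queue, (PRIORITY[kind], self.seq, kind, packet))
        self.seq += 1

    def send_next(self):
        """Transmit the highest-priority packet waiting on the link."""
        if self.queue:
            _, _, kind, packet = heapq.heappop(self.queue)
            return kind, packet
        return None

link = CongestedLink()
link.enqueue("email", "msg-1")
link.enqueue("voice", "rtp-frame-7")
link.enqueue("update", "patch-chunk-3")
print(link.send_next())  # voice goes first; email and updates wait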

Strict network neutrality rules would prevent that practice. There are some legitimate business-practice concerns, such as ISPs favoring their own content over content supplied by rivals.

But that already happens, all over the Internet, as content delivery networks such as those operated by Akamai optimize content for faster delivery. It is not "equal treatment." That is the whole point. Akamai and other content delivery networks charge content providers money to expedite delivery of their packets. There are potential legitimate restraint of trade issues posed by packet prioritization. But content providers do this today, all the time.

In the future, when all traffic is carried over IP networks, there will be clear end-user issues. Are you willing to pay for voice or video services that randomly fall apart, or do you expect some reasonable quality standards? Without packet prioritization, it will not be possible to ensure that the voice or video services a customer has paid for can actually be delivered with minimum quality levels. Calls will become garbled or suddenly disconnect, and video will freeze.

The point is that end user access to all lawful applications is not the issue. Whether quality measures can be taken, especially for latency-sensitive applications such as voice, online gaming, video or video conferencing and many transaction processes related to shopping and banking, is the issue.

The other issue is that all IP networks are shared. So what should an ISP do about the fact that a very small percentage of heavy users can disrupt quality of service for the 97 percent of other users who have to share a network? Right now, the way ISPs deal with the issue is to set a quota for total usage, and then throttle the few heavy users when they exceed the quota of usage. It's a crude way of managing heavy usage.
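In rough terms, the quota-and-throttle approach works like the sketch below (the thresholds are hypothetical, not any actual ISP's policy):

# Minimal sketch of usage-quota throttling; all numbers are hypothetical.
MONTHLY_QUOTA_GB = 250      # cap before throttling kicks in
NORMAL_MBPS = 15            # advertised speed
THROTTLED_MBPS = 1          # reduced speed for the rest of the month

def allowed_speed(usage_gb):
    """Return the speed a subscriber gets given month-to-date usage."""
    return NORMAL_MBPS if usage_gb < MONTHLY_QUOTA_GB else THROTTLED_MBPS

for usage in (40, 249, 251, 400):
    print(f"{usage} GB used -> {allowed_speed(usage)} Mbps")

Note that this treats every byte the same way; it caps volume rather than prioritizing by application, which is part of why it is crude.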

Some would argue the better approach is to allow users to decide whether they'd rather pay some premium so that, under heavy congestion, they'd get priority access, much as content and application providers now can pay Akamai to expedite packet delivery.

The confusion about network neutrality is widespread, and for good reasons. But the issue is not a matter of content access or freedom of speech. All networks have to be managed. All networks can become congested. The issue is how to preserve end user experience when that happens. Some network neutrality proponents say "do nothing." Few network engineers or architects would agree that is a wise choice.

Little "Virtual" About Facebook Credits for In-App Sales

The only thing that is "virtual" about Facebook Credits used to buy merchandise inside games played on Facebook is the store of value. Software developers can sell goods inside their games, and then be paid in "real" currency. Since July 1, 2011, Facebook has required all game developers that sell goods through the site to use Facebook Credits for the transactions.

When developers go to convert Credits into cash, Facebook gets 20 percent of the transaction amount. Credits are used to purchase items such as movie views, or tickets to concerts. In principle, that is analogous to Apple taking a 30-percent slice of all retail sales from iTunes.

But there are other angles. Facebook also allows users to exchange gift cards for Facebook Credits through Plastic Jungle. The value of the gift card, less a fee of about eight percent, gets translated into Credits added to a user's account.
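The arithmetic behind the developer cut and the gift-card conversion is simple; a quick sketch using the percentages cited above (the dollar amounts are hypothetical):

# Fee math for Facebook Credits, per the figures cited above.
FACEBOOK_CUT = 0.20        # Facebook keeps 20% when developers cash out
GIFT_CARD_FEE = 0.08       # roughly 8% fee on gift-card-to-Credits conversion

def developer_payout(credits_value_usd):
    """Cash a developer receives after Facebook's 20 percent cut."""
    return credits_value_usd * (1 - FACEBOOK_CUT)

def credits_from_gift_card(card_value_usd):
    """Credit value a user gets for a gift card, less the ~8 percent fee."""
    return card_value_usd * (1 - GIFT_CARD_FEE)

print(developer_payout(100.00))        # 80.0 -> developer keeps $80 of $100
print(credits_from_gift_card(50.00))   # 46.0 -> $50 card yields about $46 in Credits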

Likewise, Shopkick allows users to earn "kickbucks" just for walking into participating retailers like Best Buy, Macy's, Sports Authority, American Eagle and SIMON Malls, and to convert those kickbucks into Facebook Credits. So far, with the exception of the in-app purchases program, the other programs are ways to turn "real" currency into Facebook Credits. Only developers can actually export Facebook Credits back out to the real world as real-world currency. But you get the point: there is some potential here for blending virtual and real currencies more seamlessly.

Klout vs. Kred in Social Media Analytics

Social media metrics are difficult, and measuring "social influence" is no exception. You might say "it is all about the algorithms" used to mine lots of different data to create a picture, and some think the effort is flawed to begin with. But social influence measurement is one of those analytic processes marketers seem to be asking for, which means the tools likely will be used. Right now "Klout" is the tool many have heard of.
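Nobody outside these companies knows the actual formulas, but the general approach amounts to a weighted blend of engagement signals. A toy sketch (every signal name and weight below is invented for illustration; neither Klout nor Kred has published this formula):

# Toy influence score: a weighted blend of normalized engagement signals.
# Signals and weights are hypothetical, for illustration only.
WEIGHTS = {"followers": 0.2, "retweets": 0.4, "mentions": 0.3, "replies": 0.1}

def influence_score(signals):
    """Combine normalized (0-1) engagement signals into a 0-100 score."""
    return 100 * sum(WEIGHTS[name] * value for name, value in signals.items())

sample_user = {"followers": 0.05, "retweets": 0.30, "mentions": 0.20, "replies": 0.10}
print(round(influence_score(sample_user)))  # prints 20; raw reach is not the whole score

The interesting questions are which signals go into the blend and how they are weighted, which is exactly what an open, transparent algorithm would expose.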

A new service from social media analytics firm PeopleBrowsr, Kred, will launch early in October, and it promises an open application programming interface (API).

PeopleBrowsr says the Kred algorithm’s data will be fully transparent and accessible using the API, which means that once it’s launched we’ll have insight into how influence gets measured.

It is argued that influence-guessing services like Klout are more about vanity than about really identifying reach and influence. But it's the sort of thing marketers keep wanting to know.

It Will be Hard to Measure AI Impact on Knowledge Worker "Productivity"

There are over 100 million knowledge workers in the United States, and more than 1.25 billion knowledge workers globally, according to one A...