
Friday, December 17, 2010

Whether Content or Distribution is King Depends on Who You Are

Pundits and participants have always argued about whether "content" or "distribution" is king in the media ecosystem. The truth is, both are essential for a complete value chain and revenue model. But at any point in time, the question can only be answered concretely, based on who the actor is.

Large, powerful content creators and networks (networks also are distributors of content) tend to have more power, and one generally can say that in those cases (ESPN, for example) it is the content packager that is "king." But ask any of the lightly-viewed small networks carried on cable, telco and satellite networks whether content or distribution is "king," and you are likely to get a much more nuanced answer. Small networks have to fight to get distribution, and might well say distribution is king.

To become a top 1,000 website you need at least 4.1 million visitors per month. To become a top 500 website you need at least 7.4 million visitors per month. See http://royal.pingdom.com/2010/07/05/what-it-takes-to-be-a-top-100-website-charts/ for more detail.

To become a top 100 website you need at least 22 million visitors per month. To become a top 50 website you need at least 41 million visitors per month.

For most websites, distribution would appear to be "king." If you are one of the small number of market leaders, content is king; for most other organizations, distribution is going to be the bigger business issue, and so distribution remains king.

Thursday, December 16, 2010

Twitter Illustrates Pareto Distribution

The "long tail" distribution is a standard Pareto distribution, popularly thought of as the "80/20" rule, where a disproportionate share of just about anything comes from a fraction of the causes.

Twitter followers in December 2010 show a clear Pareto distribution, as do people that Twitter users "follow."
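
For readers who want to see how sharply such a curve concentrates, here is a minimal Python sketch. The account count and shape parameter are illustrative assumptions, not Twitter data; the point is simply that a heavy-tailed distribution puts most followers in the hands of a tiny fraction of accounts.

import numpy as np

# Illustrative assumptions only: 100,000 accounts and a Pareto shape of 1.2.
# Neither number comes from Twitter; they just show how the curve behaves.
rng = np.random.default_rng(42)
followers = rng.pareto(a=1.2, size=100_000) + 1  # heavy-tailed follower counts

followers_sorted = np.sort(followers)[::-1]      # largest accounts first
top_one_percent = followers_sorted[: len(followers_sorted) // 100]

share = top_one_percent.sum() / followers_sorted.sum()
print(f"The top 1% of accounts hold about {share:.0%} of all followers")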

The clear implication is that things such as market share, in virtually any sphere of business, will also show a Pareto distribution.

The implication for businesses and organizations that use Twitter as a social tool is that, in all likelihood, modest expectations should be the watchword. It is highly unlikely that most companies and organizations will ever appear at the head of the tail. Those spots normally are held by celebrities of one sort or another.

That isn't a reason not to use Twitter, just a reminder to be realistic about expectations.

Wednesday, September 29, 2010

Not So Many Twitter Replies and Retweets

There is a notion that social networking communication patterns "should be" symmetrical, or somewhat symmetrical, or at least highly interactive.

Sysomos finds this is not the case. After analyzing 1.2 billion tweets, Sysomos found that 29 percent of all tweets produced a reaction of any sort, either a reply or a retweet.

Of this group of tweets, 19.3 percent were retweets and the rest were replies. This means that of the 1.2 billion tweets examined, six percent, or 72 million, were retweets.
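
As a quick check on that arithmetic, multiplying the rounded percentages Sysomos reports lands at roughly six percent of all tweets; the small gap versus the 72 million figure reflects rounding in the published inputs.

total_tweets = 1_200_000_000        # tweets analyzed by Sysomos
reaction_rate = 0.29                # share of tweets that drew a reply or retweet
retweet_share = 0.193               # share of those reactions that were retweets

retweets = total_tweets * reaction_rate * retweet_share
print(f"{retweets:,.0f} retweets, about {retweets / total_tweets:.0%} of all tweets")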

Sysomos also discovered that 92.4 percent of all retweets happen within the first hour of the original tweet being published, while an additional 1.63 percent of retweets happen in the second hour, and 0.94 percent take place in the third hour.

That's a classic "Pareto" distribution, often known as the "80/20" rule or a "long tail" distribution. Since so many processes and distributions in the natural world follow a Pareto curve, this should come as no surprise.

Pareto would suggest that a small number of tweets produce most of the replies or retweets. And that is precisely what Sysomos found.

Saturday, May 22, 2010

How Much Competition Is Possible in Telecommunications?

What makes a market workably competitive? That might not be a tough question in the abstract. Most people would probably agree that multiple competitors in any market are good for competition, and therefore good for consumer welfare. Matters are tougher when looking at capital-intensive industries.

Most people, and most economists, might agree that dams, highways, electrical and water systems tend to be so capital intensive that they are "natural monopolies." In such cases, competition from other firms likely is unworkable because there simply is no way that even two providers could make money over the long term.

Typically, such firms are allowed to operate as highly-regulated monopolies. 

At the other end of the spectrum, most people might agree that consumer goods tend to be wildly competitive and do not typically require much economic regulation as such, though "product safety" regulations might be appropriate. Where markets are robust and can function, most people likely believe there is no fundamental need for price regulation and other forms of "monopoly provider" regulation, as consumer choice restrains predatory supplier behavior.

But there are some industries in between these relatively clear cases. Airlines once were highly regulated, though the airline industry perhaps never had the perceived monopoly characteristics of the telephone industry. Many are too young to remember it, but there once was no choice in telecom services. Everybody bought from one supplier, AT&T, in about 85 percent to 90 percent of cases (there always have been some areas served by other providers, also on a monopoly basis).

The point is that the number of firms that a market can sustain is directly related to the size of the potential addressable market and the cost of entering that market. In fact, says Ford, "having only a few providers does not imply poor economic performance, but might indicate intense competition."


Neither regulators nor most people likely believe anymore that telecommunications actually is a natural monopoly. 


But the industry is hugely capital intensive, so the question does arise: how many competitors in a single market are required so that most of the benefits of competition are reaped? 


There are subsidiary questions such as what the relevant "market" is, but the key question is the number of sustainable competitors a given telecom market can support. Some people used to debate whether services provided by wireless networks were, in fact, part of the same market as the wireline segment of the market. 


The point is that it is possible, perhaps likely, that telecommunications markets cannot sustain facilities-based competition by more than a smallish number of viable competitors. If that is the case, then a small number of competitors is not, by itself, evidence of an uncompetitive market.


In voice services, this already has proven to be true. There now are three times as many mobile "voice" accounts in service as there are fixed voice lines, and the disparity is growing. In the multi-channel video markets, fixed providers now see the satellite firms eating away at fixed-network market share as well. 


And the next question is the extent to which wireless will likewise expand and displace significant portions of the fixed broadband market as well. The point is that wireless and wireline contestants are in the same market, though not each contender competes in every segment of the market. 

Lots of people appear to believe two competitors are too few. Such views tend to point to cable-versus-telco competition as the salient example. But recent pricing and product trends in the high-speed broadband and voice markets show a clear trend of price declines in both markets, as well as a continual decline in price per megabit per second. The former is important because it suggests competition is working; the latter is important because it suggests competition is forcing providers to upgrade the quality and features of the product over time.

That is not to say everyone is happy with the level of competition, which is workable, if not "complete." But it also remains the case that the number of competitors in either the wireless or wireline business "always" will be limited to a relatively small number, because of the capital intensity of the business and the sharp impact that even a few competitors in a market have on achievable business results.

Simply put, beyond several competitors in a single market, it might not be possible for any firm to sustain a business in either the wireless or fixed portions of the market.

For example, a theoretical market with $1 million in revenue potential, a monopoly price of $100 per customer, a $100,000 cost to enter the market, variable costs of $10 per customer, and each additional firm reducing profit margins by 10 percent would typically result in a market structure where no more than seven firms could make a profit of any sort.
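
The "each additional firm reduces profit margins by 10 percent" assumption can be read more than one way, so the exact break-even firm count depends on the interpretation. The sketch below uses one illustrative reading (customers split evenly, per-customer margins eroding multiplicatively with each entrant) rather than any definitive model; varying those choices moves the cutoff up or down.

# A rough sketch of the entry-profitability arithmetic described above.
MARKET_REVENUE = 1_000_000   # total revenue potential at monopoly pricing ($)
MONOPOLY_PRICE = 100         # monopoly price per customer ($)
ENTRY_COST = 100_000         # fixed cost to enter the market ($)
VARIABLE_COST = 10           # variable cost per customer ($)
MARGIN_EROSION = 0.10        # assumed margin reduction per additional firm

customers = MARKET_REVENUE / MONOPOLY_PRICE      # 10,000 customers at monopoly price

def profit_per_firm(n_firms: int) -> float:
    """Per-firm profit when n_firms split the market evenly."""
    margin = (MONOPOLY_PRICE - VARIABLE_COST) * (1 - MARGIN_EROSION) ** (n_firms - 1)
    return margin * (customers / n_firms) - ENTRY_COST

for n in range(1, 11):
    p = profit_per_firm(n)
    flag = "" if p > 0 else "  (unprofitable)"
    print(f"{n} firms: profit per firm = ${p:,.0f}{flag}")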

And a normal Pareto distribution would have 80 percent of the profits earned by the first two players, with the typical long tail of profit for the remaining players.

The point is that it is not unusual for a Pareto distribution to exist, though not in "idealized" form, in most markets, including telecommunications, which is a scale business. In fact, if one looks at a single retailer's sales of products over a month's time, what one sees is another Pareto distribution: most of the revenue comes from the sale of just seven percent of products. Highly-uneven, highly-unequal Pareto distributions are commonplace.

So are two players enough to create workable competition? Maybe, though not always. That arguably is true for the consumer high-speed access market.

But one might argue from history that the U.S. wireless market was only somewhat competitive in the 1980s, when a duopoly essentially existed, but became vigorously competitive when additional spectrum was granted to other players in the "Personal Communications Service" spectrum awards. Since then, the U.S. market has shown strong signs of being robustly competitive on virtually all consumer metrics. In the U.S. wireless market, a two-player market does not seem to have produced as much competition as a three-player or four-player market. Still, returns are unequal and uneven.

Some will point to the dominance of two firms, but that would simply confirm that the wireless market is a typical market, with a Pareto distribution. If one looks at developer interest in creating apps for smartphones, the distribution of interest is a classic Pareto distribution, with the most interest clustered around just a few devices, and then dropping off in a classic "long tail" distribution.

In fact, outsized returns for two firms with outsized market share are the normal and expected state of affairs in any market, especially one with high capital-investment barriers to entry, such as telecommunications. The point is that what we now see is roughly what we would expect to see, even in a robustly competitive market. The normal Pareto distribution would suggest something on the order of 80 percent of revenue, profit or market share being held by just two firms.

Friday, December 4, 2009

No Bandwidth Hogs?

Some would argue there is no "exaflood" and no such thing as a "bandwidth hog." 

I have no more detailed data from any Internet service provider than anybody else does, so I doubt anybody can prove or disprove the thesis definitively. But I also have no reason to think the usage curve will be anything other than a Pareto distribution, since so many common distributions in the physical and business world conform to such a distribution.

Vilfredo Pareto, an Italian economist, was studying the distribution of wealth in 1906. What he found was a distribution most people would commonly understand as the "80/20 rule," where a disproportionate share of results comes from 20 percent of actions. The Pareto distribution has been found widely in the physical and human worlds. It applies, for example, to the sizes of human settlements (few cities, many hamlets and villages). It fits the file sizes of Internet traffic (many smaller files, few larger ones).

It describes the distribution of oil reserves (a few large fields, many small fields) and of jobs assigned to supercomputers (a few large ones, many small ones). It describes the price returns on individual stocks. It likely holds for total returns from stock investments over a span of several years, as most observers point out that most of the gain, and most of the loss, in a typical portfolio comes from changes on just a few days a year.

The Pareto distribution is what one finds when examining the sizes of sand particles or meteorites, the number of species per genus, areas burnt in forest fires, and casualty losses (general liability, commercial auto, and workers' compensation).

The Pareto distribution also fits sales of music from online music stores and mass market retailer market share. The viewership of a single video over time fits the Pareto curve. Pareto describes the distribution of social networking sites. It describes the readership of books and the lifecycle value of telecom customers.

So knowing nothing more than that the Pareto distribution is so widely represented in the physical world and in business, I would expect to see the same sort of distribution in bandwidth consumption. As applied to users of bandwidth, Pareto would predict that a small number of users do in fact consume a disproportionate share of bandwidth.
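
To make that prediction concrete, here is a minimal simulation. The subscriber count and shape parameter are arbitrary illustrative assumptions (a shape near 1.16 yields roughly the classic 80/20 split), not measurements from any Internet service provider.

import numpy as np

# Arbitrary illustrative assumptions: 10,000 subscribers, Pareto shape 1.16.
rng = np.random.default_rng(0)
usage_gb = rng.pareto(a=1.16, size=10_000) + 1   # monthly usage per subscriber

usage_sorted = np.sort(usage_gb)[::-1]           # heaviest users first
top_20_percent = usage_sorted[: len(usage_sorted) // 5]

share = top_20_percent.sum() / usage_sorted.sum()
print(f"The heaviest 20% of users account for about {share:.0%} of total traffic")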

I certainly can't say for sure, but I would be highly surprised if a Pareto distribution did not in fact describe bandwidth consumption quite closely.

Saturday, March 28, 2009

Long Tail Yes, But Perhaps Not What You Were Expecting

In recent years much has been made of the implications of the "long tail" theorem, the notion that digital technology, digital goods and the Internet make possible a vast shift of commerce from the few large firms in any category to many hundreds to thousands of other firms.

Search market share indicates that the basic underlying theorem, the Pareto Principle, commonly understood as the "80/20" rule, does indeed operate.

But not in the ways some might predict. There is a search long tail, with four providers at the head of the curve, and then several score other smaller providers forming the tail.

Unfortunately, it does not appear that market share is much different from what one might predict for physical goods. In search, as elsewhere in life, 20 percent of providers have 80 percent of the market share. In this case, a few percent of providers have 99 percent of the share.

"Tokens" are the New "FLOPS," "MIPS" or "Gbps"

Modern computing has some virtually-universal reference metrics. For Gemini 1.5 and other large language models, tokens are a basic measure...