Monday, December 5, 2011

What is Happening in the U.S. VoIP Market?


It’s hard to tell precisely what is happening in the U.S. residential VoIP market. According to the most recent Federal Communications Commission data, there were 32 million VoIP subscriptions in service at the end of 2010, representing overall growth of about 22 percent.

The 149 million wireline retail local telephone service connections in December 2010 included 40 percent residential switched access lines, 38 percent business switched access lines, 18 percent residential VoIP subscriptions, and 3 percent business VoIP subscriptions.

The FCC data suggest that VoIP sold in bundles represents about 84 percent of all VoIP subscriptions, and that cable modem connections supply about 81 percent of those bundled lines. In other words, cable operators account for the large majority of VoIP subscriptions sold in the U.S. market.
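To see what those shares imply in absolute terms, here is a minimal back-of-the-envelope sketch (my own calculation, not something the FCC publishes in this form; the cable share of all VoIP computed at the end simply combines the two percentages above):

    # Applying the reported shares to the 149 million wireline connections (Dec. 2010)
    total_connections_m = 149.0  # million wireline retail local telephone connections

    shares = {
        "residential switched access": 0.40,
        "business switched access": 0.38,
        "residential VoIP": 0.18,
        "business VoIP": 0.03,
    }

    for category, share in shares.items():
        print(f"{category}: ~{total_connections_m * share:.1f} million")

    voip_total = total_connections_m * (shares["residential VoIP"] + shares["business VoIP"])
    print(f"implied VoIP total: ~{voip_total:.1f} million")  # ~31.3 million, consistent with the ~32 million figure

    # Cable's implied share of all VoIP, if ~84% of VoIP is sold in bundles and cable
    # modems supply ~81% of those bundled lines (both shares taken from the text above)
    print(f"implied cable share of all VoIP: ~{0.84 * 0.81:.0%}")  # ~68%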

But third-quarter 2011 company-level data might suggest either that the adoption rate has slowed fairly dramatically in 2011, or that suppliers other than cable operators or telcos have suddenly begun adding more subscribers than ever before. The latter seems highly unlikely, based on what has been happening in the U.S. VoIP market so far.

Though telcos and independent VoIP providers have represented some VoIP market share up to this point, the FCC data show it is the cable operators who have been responsible for most of the sales and customer volume.

Company results from wireline voice service providers through the third quarter 2011 might suggest that demand is moderating, since most new VoIP subscriptions are sold by cable operators, and cable sales of VoIP clearly are slowing.

Legacy voice services offered by phone companies have continued to decline during 2011, while "digital voice" line growth from cable operators has slowed.

AT&T lost 10.5 percent of its wireline voice connections compared to the third quarter of 2010, Verizon lost 7.6 percent of its total wired voice lines, and CenturyLink reported losses that would total about 6.8 percent annually on a pro forma basis for the 12-month period ending September 2011.

Offsetting those losses are incremental new telco VoIP connections. AT&T's U-verse Voice connections increased by 119,000 sequentially, and by 648,000 over the past 12 months.

But the volume of activity in consumer VoIP has been driven by cable operators, and it now seems as though sluggish economic conditions or wireless substitution might be issues for cable VoIP services.

But there could be other factors at work. Perhaps few, if any, observers think telco voice share will dwindle away to “nothing.” For any number of reasons, including product bundles and customer preferences, the likely ultimate outcome is some reasonably stable market share structure. That means cable will reach some “natural” limit in voice, as telcos might reach some “natural” limit in video share.

It could be that cable operators are reaching the “natural” limits of demand for cable voice products. Comcast, the largest domestic cable operator, now has 9.2 million VoIP lines in service representing a 17.6 percent penetration rate of homes passed at the end of the third quarter 2011, up from a 16.1 percent penetration rate in the third quarter of 2010.

Time Warner Cable has 4.6 million voice customers, but added only 5,000 new VoIP lines in the third quarter of 2011.

That dramatic slowing suggests cable may be reaching a natural limit, and also that strong growth of wireless services now is as big a problem for cable operators as it has been for fixed-line telcos.

Wireless substitution continues to grow virtually every year, according to the CDC, which estimated that 29.7 percent of homes had only wireless telephones during the last half of 2010.

Is Competition in U.S. Telecom Now Over?

Reasonable people will differ about the potential implications of the Verizon spectrum deal with Comcast, Time Warner Cable and Bright House Networks, under which AWS spectrum owned by the cable companies is sold to Verizon Wireless, while Verizon and the cable companies agree to work together in some ways.



Some will argue the deal means the end of competition, which has been bolstered in the fixed line business by robust competition between telcos and cable companies.


Others might argue that the agreement by the cable companies to resell Verizon Wireless capacity, rather than Sprint or Clearwire service, likewise means less potential competition in the wireless business.



Others will simply point to nebulous language about “working together” that might lead to less competition. Those views could prove at least partially correct. But the deal might also have far less effect on competition than many fear.



For starters, cable companies essentially have relied on three different wireless partners for wireless service, first Sprint, then Clearwire and Sprint, and now Verizon Wireless. One might argue that the resale agreement with Verizon Wireless removes key potential cable contestants from future roles as new facilities-based wireless competitors.



But cable companies in the United States, for whatever reason, have not succeeded at any of their attempts to enter the wireless business since 1994. They are going to resell under their own brand names, using somebody else’s infrastructure, no matter whose wholesale assets they use.



Nor does the deal alter the nature of fixed network competition between Verizon and the cable companies. Anybody familiar with the typical tensions within single entities over wholesale and retail sales, whether fixed or wireless, knows the actual financial interests of those staffs are not aligned.



Lots of firms buy wholesale service from Verizon Wireless, and none of that prevents the firms from competing as hard as they can in the retail markets.

Will USPS Hurt Netflix?

The U.S. Postal Service is proposing, through the rulemaking process, to move First-Class Mail to a two-to-three-day delivery standard for contiguous U.S. destinations.

Some think that will hurt Netflix DVD by mail performance. Some pundits might be tempted to quip "more than Netflix already has done to itself?"

But there is a countervailing argument. One might argue that users who really "want what they want, when they want it," or who "want it now," already have shifted to online delivery.

Also, users who have DVD by mail plans can choose plans that allow them to have multiple discs out at once. Not too many users really will be able to watch so many discs that delivery actually becomes a big problem.

Slower delivery by the USPS won't help, of course. But it remains to be seen whether it harms the DVD by mail business, which Netflix wants to wean itself off of, in any case.

Most people I know who use the DVD by mail service only watch on weekends. They're just too busy the rest of the time.

Amazon Kindle Fire First Tablet to Challenge Apple iPad?

Amazon's Kindle Fire might be the first "tablet" to get serious traction, other than Apple's iPad. In fact, Evercore Partners' analyst Robert Cihra now estimates the Kindle Fire will represent half of all Android tablet sales in 2012.


Shipments of Android-based tablets are expected to jump from 19 million to 20 million units in 2011 to 44 million to 45 million units in 2012, Digitimes says. 


Some might quibble, arguing that the Kindle Fire is an e-reader, not a tablet. But you might remember Apple CEO Steve Jobs insisting 10-inch screens were a minimum requirement for tablets.


IHS iSuppli said Friday that the Kindle Fire is expected to take second place in the global media tablet business in the fourth quarter, behind Apple's iPad. 

Amazon will ship 3.9 million Kindle Fire tablets during the last three months of 2011, according to a preliminary projection from iSuppli.

Sunday, December 4, 2011

Small Cells Mean New Needs for Backhaul

Backhaul is a crucial enabler for mobile operators deploying Long Term Evolution networks. For starters, the amount of backhaul required is one to two orders of magnitude greater than for 2G and 3G networks. Also, to minimize overall network costs, mobile service providers are turning to small cell architectures that spot-deploy transmitters in high-traffic areas, rather than relying on use of additional frequencies and bandwidth.


Also, the deployment of small cells in dense urban environments means traditional methods of forwarding traffic from one macro-cell site to another, before being passed off to the core network, are not always possible. 


When it comes to small mobile cell sites, which will, by definition, cover small, primarily high-traffic areas, backhaul costs will have to scale to match the large number of sites and the relatively small number of customers served at any single site. That also means new techniques will be needed.
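To make that scaling argument concrete, here is a minimal sketch with wholly assumed, illustrative numbers (the site counts, per-site backhaul cost and subscriber base are not drawn from any operator data): unless per-site backhaul cost falls sharply, backhaul cost per subscriber rises roughly in proportion to the number of sites.

    # Illustrative only: backhaul cost per subscriber for a macro layer versus a
    # dense small-cell layer serving the same (assumed) pool of subscribers.
    def backhaul_cost_per_subscriber(sites, cost_per_site_per_month, subscribers_served):
        """Monthly backhaul cost spread across the subscribers the sites serve."""
        return sites * cost_per_site_per_month / subscribers_served

    subscribers = 10_000  # assumed subscribers in a dense urban area

    # Macro layer: a handful of sites with fiber or licensed microwave backhaul
    macro = backhaul_cost_per_subscriber(5, 2_000, subscribers)

    # Small-cell layer: many more sites; if per-site cost stays the same,
    # per-subscriber backhaul cost balloons
    small_cell = backhaul_cost_per_subscriber(200, 2_000, subscribers)

    print(f"macro backhaul cost per subscriber: ${macro:.2f}/month")       # $1.00/month
    print(f"small-cell, same per-site cost:     ${small_cell:.2f}/month")  # $40.00/month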

That suggests wireless backhaul will be important, for cost reasons. There are three main categories of contending wireless backhaul solutions, many will note:
1. Line-of-sight (LOS) microwave systems typically operating in the 10 GHz to 42 GHz bands.
2. Millimeter-wave LOS solutions that operate in the 60 GHz band or in the 70/80 GHz "E-band."
3. Non-line-of-sight (NLOS) solutions using sub-6 GHz licensed TDD spectrum.

In some cases, fixed network solutions might also be affordable, though it is unlikely that a fiber connection will often fall into that category.


For that reason, a new approach to small-cell backhaul is required to bring down the per-megabit costs. Small cells are forcing vendors to rethink wireless backhaul for an environment where most cell locations are not in line of sight with each other or with aggregation points. That means the traditional approach of relaying traffic from one tower to another before handing off to the fixed backhaul network is not possible.


Also, new levels of cost optimization are needed, as the total cost of deploying dense small cell networks would be excessive, compared to other bandwidth approaches, without new, lower cost parameters.





Saturday, December 3, 2011

V.me, Visa Mobile Wallet, to Launch in 2012

The mobile commerce ecosystem is confusing because so many of the participants occupy uncertain roles. Google, PayPal, Isis and Visa or MasterCard might be seen as partners and suppliers or, in some instances, as competitors. In a growing number of cases, major contestants are both partners and competitors. Nor are the roles likely to remain as they appear today.

Today, Visa is a transaction network, but going forward the company will provide real-time offers, alerts and other tools to consumers (see http://www.myvisaoffers.com/). And though Visa works with Google, PayPal and Isis, for example, that doesn't mean Visa might not emerge as a mobile wallet provider in its own right. Indeed, it appears that is precisely what Visa has in mind.


Visa earlier in 2011 announced a digital wallet solution which is designed to provide "a secure cross-channel digital wallet and a range of customized mobile payments services that address the specific requirements of geographic markets around the world."



In early December 2011, Visa announced it had secured the www.v.me uniform resource locator for "V.me by Visa," Visa's digital wallet and global acceptance mark. V.me by Visa will be a simple and secure way to pay online as well as in person, with PC, tablet, and mobile devices using Visa and non-Visa accounts, Visa says.
The digital wallet will store Visa and non-Visa payment accounts, support NFC payments and deliver a wide range of transaction services including e-commerce, mobile commerce, micropayments, social networks and person-to-person payments.
The service is built on the VisaNet processing network, existing credit, debit, prepaid and commercial product platforms as well as new capabilities Visa has acquired through its CyberSource, Authorize.net and PlaySpan subsidiaries.
The digital wallet initially was scheduled to go live in the United States and Canada late in 2011, though it appears the launch now will occur in 2012.


Visa has already signed up fourteen financial services players for the launch, including:
  • Barclaycard US
  • BB&T Corporation
  • Card Services for Credit Unions (CSCU)
  • ICBA Bancard
  • First Financial Bank of Ohio
  • Nordstrom fsb
  • Pentagon Federal Credit Union
  • PNC Bank
  • PSCU Financial Services
  • Regions Bank
  • Royal Bank of Canada
  • Scotiabank
  • TD Bank Group (US and Canada)
  • US Bank
Separately, Visa and Gap have teamed up for real-time coupons under a program dubbed "Gap Mobile For You." With its network, Visa can see relevance (it knows where you are and what you bought) as well as fulfillment of an offer.

Is Mobile Commerce at a Tipping Point?

Mobile commerce is in a hype cycle, with high expectations for growth. The issue is whether it is at a tipping point, the point of critical mass where an adoption inflection point occurs, and "overnight," it seems, a new trend gets established. 

You should expect to hear lots of speculation about mobile commerce, mobile payments or mobile wallet efforts reaching a critical mass or tipping point in 2012, as it is the time of year for attention-grabbing predictions.

You should remain circumspect. Something big is coming. But it still is "coming," it is not going to be "here" in 2012. 

The 2011 KPMG Mobile Payments Outlook, based on a survey of nearly 1,000 executives primarily in the financial services, technology, telecommunications, and retail industries globally, found that 83 percent of the respondents believe that mobile payments will be mainstream within four years (by 2015).

In fact, 46 percent believe mobile payments will be mainstream within two years.

But there is room to disagree about the accuracy of those projections. One might argue that forecasts of this sort are notoriously unreliable, with respondents overestimating near term prospects.

Analysts at Gartner, for example, use a model in which expectations for significant new technologies run in a predictable cycle. What the cycle suggests is that expectations nearly always (always, according to the model) run ahead of marketplace acceptance.

What the Gartner hype cycle suggests is that expectations for mobile payments using near field communications are at a point where we can expect five to 10 years to elapse until NFC actually begins to make serious inroads as an adopted mainstream technology. The emphasis probably is important to note: “begins.”

But KPMG analysts take the opposite view, arguing that respondents are too pessimistic. “We believe that exploding smart phone growth and myriad opportunities will grow mobile payments at a much faster rate than our respondents anticipate,” said Gary Matuszak, KPMG Global Chair of the Technology, Communication and Entertainment practice.

“While KPMG believes that these forms of mobile payment will all gain some traction, our view is that mobile wallet is one of the most exciting and promising payment opportunities,” analysts say.

“Mobile wallet provides the momentum to move beyond payments to participate in the entire chain of mobile commerce, from consideration and brand awareness to purchase, after-sales loyalty and care,” said Tudor Aw, Technology Sector Head, KPMG Europe.

If Gartner analysts are right about the near field communications "hype cycle," we should soon see some public "disillusionment" expressed about near term prospects for NFC. The reason is that Gartner now sees NFC at the "top" of its hype cycle, the point at which overly-optimistic projections face the reality of an extended period of development, before something "useful" actually emerges.

Internet TV, NFC payment and private cloud computing all are at what Gartner calls the "Peak of Inflated Expectations," which is always followed by a period where the hype is viewed as outrunning the actual market. That suggests NFC soon will enter a phase where expectations are more measured.

In fact, Gartner now expects it will take five to 10 years before NFC is in widespread and mainstream use. Gartner's latest expectation likewise is that cloud computing and machine-to-machine applications will not be mainstream for another five to 10 years as well.

Consider some of the issues that will have to be settled before mobile commerce can reach a tipping point. First, the communication methods need to settle, whether NFC or other approaches are considered. 

Carrier billing could play a crucial role in how consumers start easing into the idea of mobile commerce, but mobile service providers will have to revise traditional pricing, as a payment scheme costing more than a credit card or debit card charge will not work, especially for small ticket purchases, where carrier billing seems most germane. 

Many of us believe "daily deals" are crucial, as a key source of new value for mobile wallet services that answer the question of "what's in it for me?" for both consumers and retailers. Right now, most mobile payment systems cannot provide simple answers for retailers and consumers.

Then there are all the privacy issues related to personalization and location-based targeting, which are certainly going to grow in importance.

Do Heavy Users Cause Congestion; Do Caps Work?

Capacity impact of peak-hour caps
A study of user behavior on one North American mid-tier Internet service provider’s network attempts to answer the question of whether the heaviest users really are responsible for peak-hour congestion, and whether data caps actually do much to manage peak-hour congestion.

It is an unquestioned fact that a small percentage of broadband users, on virtually any network, use vastly more data than typical users do. The top one percent of data consumers account for 20 percent of the overall consumption, for example, a fact the study by analyst Benoît Felten confirms.

But overall consumption is not the chief issue on any network. Rather, it is peak-hour usage that is the gating factor when dimensioning network capacity.

It is not the overall monthly data traffic volume by a subscriber, but when and where it is generated that is crucial, argues Monica Paolini of Senza Fili Consulting.  Operators would be better off with higher traffic volumes, as long as they are not during peak hours.

Let’s imagine that wireless subscribers have a plan with caps that apply to peak times only and unlimited access at other times, she says. During off-peak times, a green dot appears on the smartphone and subscribers know that their data usage does not count against their data allowance. In this case, we would expect overall higher data consumption, but spread more evenly through the day.

The chart shows what should happen in this scenario, with different rates of peak/off-peak substitution, assuming that the increase in non-peak traffic will be four-times as large as the decrease in peak traffic.

The base case assumes usage of 1 GByte per month (current average usage is around 500 MBytes, and at current traffic growth rates, average traffic per month will probably hit the 1 GByte mark in 2012).
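Here is a minimal sketch of that scenario (the 1 GByte base case and the four-times multiplier come from the description above; the 40 percent peak-hour share of traffic and the substitution rates are assumptions added for illustration). It shows peak load falling while total volume grows:

    base_monthly_gb = 1.0   # base case: 1 GByte per subscriber per month
    peak_share = 0.40       # assumed fraction of monthly traffic generated at peak hours
    multiplier = 4          # off-peak increase is four times the peak decrease, per the scenario

    for substitution in (0.0, 0.1, 0.2, 0.3):  # fraction of peak traffic shifted off-peak
        shifted = base_monthly_gb * peak_share * substitution
        peak = base_monthly_gb * peak_share - shifted
        off_peak = base_monthly_gb * (1 - peak_share) + multiplier * shifted
        print(f"substitution {substitution:.0%}: peak {peak:.2f} GB, total {peak + off_peak:.2f} GB per month")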

The first step, though, is to better understand actual usage, whether by typical users or power users. Analyst Benoît Felten, for example, has wondered for some time about the extent to which power users create out-sized stress on access networks.

Recently, Felten’s firm was able to analyze data from a mid-sized North American ISP to test his hypothesis that “data hogs do not cause unusual congestion” at peak hours, even though some users consume vastly more data than a typical user.

The study, Felten says, shows that data consumption, overall, is at best a “poor” proxy for bandwidth usage, despite the clear pattern that “heavy” users consume vastly more data than typical users.

While average daily data consumption over the period was 290 MBytes, the “very heavy” consumers used 9.6 GBytes per day. This roughly equates to data consumption of 8.7 GBytes and 288 GBytes per month, respectively. So the heaviest users consume more than 30 times the data of typical users.
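As a quick check of that arithmetic (assuming 30-day months and decimal gigabytes):

    typical_daily_mb = 290
    heavy_daily_gb = 9.6

    typical_monthly_gb = typical_daily_mb * 30 / 1000  # ~8.7 GBytes per month
    heavy_monthly_gb = heavy_daily_gb * 30              # 288 GBytes per month

    print(f"typical: ~{typical_monthly_gb:.1f} GB/month, heavy: ~{heavy_monthly_gb:.0f} GB/month")
    print(f"heavy users consume roughly {heavy_monthly_gb / typical_monthly_gb:.0f} times as much")  # ~33x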

But Felten argues that bandwidth usage outside of periods when the aggregation link is heavily loaded (which he defines as 75 percent load) has no impact on costs or other users.
Felten is right to focus on peak-hour congestion.  

“The results show that while the number of active users does not vary significantly between 8 AM and 1 AM, the average bandwidth usage does vary significantly, especially around late afternoon and evening,” he says.

This suggests that the increase in network load is not a result of more customers connecting at a given time, but a result of customers having a more intensive use of their connections during these hours.

The study does confirm that a small percentage of users dominate peak-hour usage. About six percent of all customers (and 7.5 percent of active customers) are among the top one percent of bandwidth users at one point or another during peak hours. The twist is that Felten’s analysis also does suggest that 80 percent of peak load is generated by the heaviest users over a billing cycle.
In other words, perhaps 20 percent of peak-hour demand is created by “typical” users, not the “bandwidth hogs.”

Oddly enough, though Felten and some other observers might say this “confirms” the thesis that heavy users do not cause peak-hour congestion, the data seem to contradict the theory.
In fact, some might argue Felten’s analysis mostly confirms the theory that the heaviest users over a billing cycle are the heaviest contributors to peak-hour congestion as well. Felten, however, argues that “the correlation between real-time bandwidth usage and data downloaded over time is weak.”

A reasonable argument can be made, though, that data caps don’t seem to address peak-hour congestion.

But many, including perhaps most economists, would argue that bandwidth consumption is a good like any other, susceptible to price and other policies that can shift demand.

Of course, many policy advocates would not be in favor of such mechanisms, which obviously could include peak-hour pricing that is higher than off-peak pricing.

Users might not prefer such approaches, either, as pricing predictability is a major plus for most users. Many, if not most, access providers might also prefer not to incur the overhead of billing on a differential basis.

But many would note the potential value of value-based pricing that can incorporate quality metrics, time-of-day priorities, off-peak pricing or other ways to create incentives for off-peak use and premium pricing for peak-hour use or peak-hour quality.

So are heavy users the problem? Felten seems to argue they are not. The data might suggest they are. The data do show there are indeed heavy users during peak hours, and 80 percent of them are the same people who use the most data over a billing cycle.

The issue might be viewed as determining whether “heavy users, at peak hours” are the same people as “heavy users, over a billing period.” That appears to be the case about 80 percent of the time.

Felten argues that bandwidth caps do not necessarily alleviate congestion problems, and he is right about that. Do data hogs cause congestion? If not, then it makes more sense to use other pricing and value mechanisms to shape demand, one might argue. The question might then be whether other schemes that would work are acceptable to end users and service providers.

Will Comcast, Time Warner Sell Clearwire Stake?

Original Clearwire Ownership
Now that Comcast, Time Warner Cable and Bright House Networks have sold their AWS spectrum to Verizon Wireless, announced they will stop selling Clearwire service and agreed to buy wholesale service from Verizon, one has to believe the cable companies also will sell their shares in Clearwire. 


Comcast and Time Warner say they will wind down their Clearwire business over the next six months, and plan to move their existing customers to other options. 


Not that the companies themselves, or rational observers, would claim that the services have been a wild success. 


Comcast appears to have about 30,000 customers using the mobile service, out of 17.8 million traditional broadband subscribers. Bright House, which had the option to resell 4G service, never actually did so.


Time Warner, Comcast and Bright House Networks were among the original investors in Clearwire. Over time, Sprint has slightly reduced its ownership to about 54 percent, while Intel has said it will sell all of its stake. 





Charter Boosts Speeds


In markets where Charter Communications has deployed DOCSIS 3.0 technology, about 95 percent of its service area, the company is increasing Charter Internet Express download speeds from 12 Mbps to up to 15 Mbps, and increasing upload speeds from 1 Mbps to up to 3 Mbps.

"Charter Internet Plus" download speeds are being increased from 18 Mbps to up to 30 Mbps, and upload speeds are being increased from 2 Mbps to up to 4 Mbps.

Those changes are being made with no increase in cost. In 2010, for example, cable companies outgained telcos about two to one for net additions (new customers less departing customers).



The speed boosts are the cable company's fourth speed increase in the last three years.

Facebook Collects Data from Any Visit to Pages with "Like" Buttons

Computing guru Richard Stallman, creator of the GNU Project and founder of the Free Software Foundation, points out a feature of Facebook that most of us do not know about.

Facebook does massive surveillance, he argues. If there is a "Like" button on a page, Facebook knows who visited that page and it can get the IP address of the computer visiting the page, even if the person is not a Facebook user.

Friday, December 2, 2011

Telefónica to Introduce Europe-Wide Mobile Wallet Service

Telefónica's new Digital unit and Giesecke & Devrient have reached an agreement to establish a single Europe-wide platform for Near Field Communication services, with Giesecke & Devrient providing the trusted service manager platform, including over-the-air transfer and personalization of NFC applications.


Telefónica's wallet application, to be introduced in Europe over the next few months, will also be made available to third-party service providers, including financial institutions, transit operators, and loyalty partners.

Microsoft Will Buy Netflix, LinkedIn In 2012, IDC Predicts

It's the time of year for predictions, and technology research firm IDC has a few predictions of its own, including the forecast that Microsoft will buy Netflix to give it a foothold in online video entertainment and LinkedIn to get into social networking. 


Those moves would be part of a flurry of mergers and acquisitions in 2012 by companies seeking to increase their presence in cloud computing, social networking and online content. The predictions about Microsoft and Netflix actually were also made by IDC 12 months ago.


Chief analyst Frank Gens said it makes even more sense now, given Netflix’s diminished market value and expected losses next year from growing content licensing bills.

“In 2012, part of Microsoft’s challenge is to counter what Apple and Amazon have done and what Google is building up, a really strong media and content marketplace,” Gens said.


By offering movies, music and other content, Apple, Amazon and Google are aiding their mobile device ecosystems, including tablets and smartphones. “Without a media and content cloud, the competitiveness of Microsoft’s mobile platforms could be greatly diminished,” Gens says.


In many ways, that observation illustrates the changing nature of the consumer electronics business as well. To an ever greater extent, the value of a device hinges on the content resources available to its users.

Market More Competitive if AT&T Gets T-Mobile USA?

Most people looking at the proposed AT&T purchase of T-Mobile USA might conclude that the deal would reduce competition in the U.S. mobile market. Others might argue it would be more efficient, if not more competitive.

Some people might argue that the apparent failure of the bid is "a potential missed opportunity for consumers to benefit from more carrier competition." Precisely why that might be the case is not explained.

One argument about the "limited state of competition" in the U.S. mobile market is that the existence of incompatible air interface standards (CDMA and GSM) means consumers cannot move their devices freely among the leading carriers, thus limiting competition. That argument is fine, as far as it goes. In that vein, one might also note that the carriers do not all use the same spectrum bands, either.

But that problem is going away, since all the U.S. mobile providers have settled on Long Term Evolution. Whether we will see a dramatically different competitive environment once LTE is firmly established, just because all users have LTE handsets, remains to be seen.

One countervailing argument to the "handset freedom" argument is complicated, but might explain why "competition" will not be dramatically different once all carriers have moved to a single air interface standard. Consumers could buy unlocked handsets and then use several carriers, even now, with the market using different air interfaces. But few consumers choose to pay full price for handsets, especially as we now are moving to a smart phone market where the full retail price of a new device can be $500.

Most consumers simply choose to limit their freedom by signing service contracts that reduce the cost of new handsets to no more than $200. After two years, they typically want to replace those older models with the latest new models in any case, so they once again have to decide whether paying full retail or exchanging a contract for a subsidized phone makes more sense.

To the extent that competition is limited, it is voluntary. Consumers would rather get the cheaper devices, even if the price is a service contract.
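A minimal sketch of that trade-off, with every price assumed purely for illustration rather than taken from any carrier's actual rate card, shows why the subsidized route looks attractive to most buyers:

    full_retail_price = 500  # assumed unsubsidized smartphone price
    subsidized_price = 200   # assumed on-contract price
    contract_months = 24

    # Assumed monthly plan prices; many postpaid plans cost about the same with or
    # without a subsidized device, which is why the subsidy looks "free" to the buyer.
    postpaid_plan = 70  # assumed contract plan, device subsidy recovered inside this price
    byod_plan = 60      # assumed no-contract plan for a customer bringing a full-price device

    contract_total = subsidized_price + postpaid_plan * contract_months  # 1880
    byod_total = full_retail_price + byod_plan * contract_months         # 1940

    print(f"two-year cost with subsidized handset: ${contract_total}")
    print(f"two-year cost with full-price handset: ${byod_total}")

In this illustrative case the buyer saves $300 up front, and even over the full two-year term the contract customer comes out slightly ahead, which is consistent with the idea that the limitation on "handset freedom" is largely voluntary.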

That is not to say there are no competitive issues, or potential competitive issues, in any part of the U.S. communications business. But device portability and air interfaces do not seem to be the biggest issues.

Do Data Hogs Cause Peak-Hour Congestion?

It is an unquestioned fact that a small percentage of broadband users, on virtually any network, use vastly more data than typical users do. The top one percent of data consumers account for 20 percent of the overall consumption, for example.


In the absence of mechanisms--or demonstrated end user demand--to price by value, rather than on a flat rate, most service providers rely on simple monthly data caps to attempt to regulate usage overall. 


But that doesn’t necessarily affect peak-hour usage, some will argue. One issue is that users vastly prefer “buckets of usage” with predictable recurring costs, to metered pricing. So peak-hour pricing would introduce some element of pricing uncertainty, which consumers presumably would not prefer. 


Presumably a better tactic would be creation of “additional fee” services that provide quality of service at peak hours, for users willing to pay. Some will object to such policies as creating a “two-tier” Internet. Others will simply say it offers consumers choice. 


But are heavy users the problem?


The question might seem silly. If the big problem for an access provider is peak hour congestion, then heavy users would seemingly have to be part of the analysis. But the question some would ask is “who are the heavy users, at peak hours?” That might be a different question than “who are the heavy users, over a billing period?”

Some argue that bandwidth caps do not necessarily alleviate congestion problems. Do data hogs cause congestion? If not, then it makes more sense to use other pricing and value mechanisms to shape demand.


A new analysis by analyst Benoît Felten suggests the answer is highly nuanced. Felten argues that about 78 percent of peak-hour congestion is caused by the heaviest users. 

On the Use and Misuse of Principles, Theorems and Concepts

When financial commentators compile lists of "potential black swans," they misunderstand the concept. As explained by Nassim Taleb ...