Saturday, May 18, 2013

Two Orders of Magnitude More Access Speed Within 10 Years? History Says "Yes"

If history is any guide, gigabit Internet access will not be at all rare in a decade, though how much demand there will be for 1-Gbps service might well hinge on retail pricing levels.

In August 2000, only 4.4 percent of U.S. households had a home broadband connection, while 41.5 percent of households had dial-up access. A decade later, dial-up subscribers had declined to 2.8 percent of households in 2010, while 68.2 percent of households subscribed to broadband service.

If you believe gigabit access is to today’s broadband as broadband was to dial-up access, and you believe the challenge of building gigabit networks roughly corresponds to the creation of broadband capability, a decade might be a reasonable estimate of how long it will take before 70 percent of U.S. homes can buy gigabit access service, and possibly 50 percent do so.

Consider that by June 2012 about 75 percent of U.S. households could buy a service of at least 50 Mbps, while half could buy service at 100 Mbps. So it took roughly a decade to put into place access nearly three orders of magnitude faster than the dial-up baseline.
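The arithmetic is easy to check. A quick sketch in Python, using nominal tier speeds and a 56-kbps dial-up baseline (real-world throughput varies):

```python
# Compare access tiers against a 56 kbps dial-up baseline.
import math

DIALUP_MBPS = 0.056  # 56 kbps expressed in Mbps

tiers_mbps = {"50 Mbps": 50, "100 Mbps": 100, "1 Gbps": 1000}

for name, mbps in tiers_mbps.items():
    ratio = mbps / DIALUP_MBPS
    print(f"{name}: {ratio:,.0f}x dial-up "
          f"(~{math.log10(ratio):.1f} orders of magnitude)")

# 50 Mbps:    893x dial-up (~3.0 orders of magnitude)
# 100 Mbps: 1,786x dial-up (~3.3 orders of magnitude)
# 1 Gbps:  17,857x dial-up (~4.3 orders of magnitude)
```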

The key distinction is between “availability” and “take rate.” Even though consumers are starting to buy faster access services, most seem to indicate, by their buying behavior, that 20 Mbps or 25 Mbps is “good enough,” when it is possible to buy 50 Mbps or 100 Mbps service.

In the U.K. market, for example, though service at 30 Mbps is available to at least 60 percent of homes, buy rates were, in mid-2012, only about seven percent (to say nothing of demand for 100 Mbps).

The European Union country with the highest penetration of such services was Sweden, at about 15 percent, in mid-2012.

To be sure, retail prices are an issue. With relatively few exceptions, U.S. consumers tend to buy services of up to 25 Mbps, and the retail price of faster services is probably the main reason.

That is the reason behind Google Fiber's disruptive pricing for gigabit access at $70 a month. That pricing umbrella implies far lower prices for 100-Mbps service than consumers can find at the moment.

And that is every bit as important as the headline gigabit speed. If a gigabit connection costs $70 a month, will a 100-Mbps connection cost $35 a month?

Friday, May 17, 2013

Fixed Networks are for "Capacity," Mobile Networks are for "Coverage"


These days, in many markets, people using smart phones are on fixed networks for Internet access more than on the mobile network.

In North America, as much as 82 percent of Internet traffic from smart phones occurs indoors, where users can take advantage of Wi-Fi instead of the mobile network, one study suggests.

In Western Europe, as much as 92 percent of Internet usage from smart phones occurs indoors.

So to a large extent, that means the fixed network provides “capacity,” while the mobile network provides “coverage,” a statement that describes the two ways a small cell can provide value for an urban mobile network as well.

For the most part, Wi-Fi offload happens in the office and the home. Some small cells will include Wi-Fi access, but the bulk of Internet activity still occurs indoors, not outdoors, where small cells will reinforce the mobile macrocell network.

Some tier one carriers have moved to create their own networks of public Wi-Fi hotspots, and many can serve customers from owned fixed networks as well. That makes the fixed network and public Wi-Fi a direct part of the mobile network.

In other cases, carriers simply allow their devices to use non-affiliated Wi-Fi networks, as when a mobile service provider allows a user to roam onto a fixed network owned by another service provider.

That is one more example of the loosely-coupled nature of the Internet ecosystem. A mobile provider can offload traffic to another carrier with which it has no formal business relationship. 

Thursday, May 16, 2013

Will TV White Spaces Be Important?


Whether TV white spaces spectrum is going to be a big deal or not might hinge on how much real spectrum is available in given markets, plus manufacturing volume to get retail prices of network elements down to a level where the spectrum has a reasonable business case.

At a high level, it isn’t going to help as much in urban areas, where interference issues are more constraining.

It might prove quite important in some rural areas, where much more bandwidth is available because there are not enough people to create incentives for many TV broadcasters to operate. In areas where few people live, there might be lots of bandwidth, but few potential users or customers. Every location is different.

While at least 6 MHz (one TV channel) is available throughout most of the United States, there are a few locations where much more spectrum is available. The white spaces band includes 150 MHz of spectrum in total, though most of it cannot be used in most locations.
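The arithmetic on total capacity is simple; a sketch, using the 6-MHz U.S. TV channel width:

```python
# Upper bound on TV white spaces capacity (illustrative arithmetic).
BAND_MHZ = 150      # total white spaces band cited above
CHANNEL_MHZ = 6     # width of one U.S. TV channel

max_channels = BAND_MHZ // CHANNEL_MHZ
print(f"Best case: {max_channels} channels ({BAND_MHZ} MHz)")
print(f"Floor in most of the U.S.: 1 channel ({CHANNEL_MHZ} MHz)")
```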


Other sources of lightly regulated or unlicensed spectrum might be made available in the future. And new technologies such as agile radio and ultra-wideband are available, but regulatory action is required to enable their use.

And though the general rule has been that spectrum is licensed to entities for specific purposes, unlicensed spectrum might be crucial.

"Unlicensed" Spectrum Doesn't Always Mean "You Can Use it" Without Paying


It sometimes is easy to forget that it isn’t as easy to become an ISP in some nations as in others. Consider the matter of “unlicensed spectrum,” for example.

“Unlicensed” spectrum exists. But in about 66 percent of nations, use of that spectrum is not really unlicensed. Based on responses from 75 countries, 33 percent of national regulators require a spectrum license to use the 2.4-GHz or 5-GHz “unlicensed” bands, a study found.

Another 33 percent of national regulators require an operating license, though not a spectrum license. About 33 percent do not require a license of any type. In a small fraction of cases (two percent), use is forbidden.

"Mix and Match" is one Advantage of Software Defined Networks


If you wanted to rip cost out of any device, app or network, what are some of the things you’d do? You’d remove parts. You’d probably reduce size.

ShoreTel, for example, sells docking stations for iPhones and iPads that allow the iOS devices to act as the CPU, while the docking station provides all the peripheral support.



And that's as good an example as any of how a network designer would try to wring cost out of a network.


You’d rely as much as possible on functionality that could be supplied by other common CPUs. 

You’d pick passive solutions, not active solutions, as often as possible. You’d simplify with an eye to improving manufacturing tasks.

You also would create a “mix and match” capability with respect to specific makes and models of network gear. You’d want to be able to use a variety of network elements, made by many different suppliers, interchangeably.

You’d create a network using common, commodity-priced devices as much as possible.

In other words, you would make devices, networks and apps "as dumb as possible," and as flexible and interchangeable as possible.

If you think about software defined networks, that’s an application of the principle. Not “everything” about SDN is “as dumb as possible;” only the data plane elements.

The corollaries are that such approaches create networks that also are “as flexible as possible” and “as affordable as possible.”

You would still want the control plane to be as “smart as possible,” and you could afford to make it so, since the key to much cost reduction is the ability to share a physical resource across a broad number of end users, subtending devices or nodes.

That is why the per-user or per-customer cost of an expensive headend is rather low, as a percentage of the total cost of a network. On the other hand, the total cost of smart CPUs (decoders) used by every end user or customer is so high because there is so little sharing possible: each customer needs one or more decoders.
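A quick sketch makes the asymmetry concrete (the dollar figures and subscriber count are assumptions for illustration, not any operator's actual costs):

```python
# Shared headend vs. dedicated decoders: per-subscriber economics.
# All figures are assumed, for illustration only.
HEADEND_COST = 10_000_000   # one facility, shared by everyone
DECODER_COST = 100          # per decoder; every customer needs one or more
SUBSCRIBERS = 1_000_000

print(f"Headend cost per subscriber: ${HEADEND_COST / SUBSCRIBERS:,.2f}")
print(f"Total decoder cost:          ${DECODER_COST * SUBSCRIBERS:,}")
# Headend cost per subscriber: $10.00
# Total decoder cost:          $100,000,000
```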

That was what drove cable operator Cablevision Systems to adopt a “network digital video recorder” strategy. By centralizing the CPU functions, the ability to share the cost of processing infrastructure was vastly improved.

The broader principle is that one proven way to reduce cost, increase flexibility and enhance functionality is to separate and centralize control plane (CPU) functions from the data plane functions that are widely distributed throughout a network.

That’s the whole point of software defined networks.
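A minimal sketch of that split, in Python (the class names and rule format are illustrative, not OpenFlow or any vendor's controller API): the controller holds all the routing intelligence, while switches merely cache and apply the rules they are given.

```python
# Control plane: one "smart," shared element that computes forwarding rules.
class Controller:
    def __init__(self):
        self.routes = {}                # destination prefix -> output port

    def set_route(self, prefix, port):
        self.routes[prefix] = port

    def rule_for(self, dest):
        # Longest-prefix match; a real controller would also apply
        # policy and traffic engineering here.
        matches = [p for p in self.routes if dest.startswith(p)]
        return self.routes[max(matches, key=len)] if matches else None

# Data plane: a "dumb" switch that just caches match -> action rules.
class Switch:
    def __init__(self, controller):
        self.controller = controller
        self.flow_table = {}            # dest -> port, installed on demand

    def forward(self, dest):
        if dest not in self.flow_table:
            # Table miss: ask the shared control plane, then cache the rule.
            self.flow_table[dest] = self.controller.rule_for(dest)
        return self.flow_table[dest]

ctrl = Controller()
ctrl.set_route("10.0.", 1)
ctrl.set_route("10.0.2.", 2)

sw = Switch(ctrl)
print(sw.forward("10.0.1.5"))           # port 1
print(sw.forward("10.0.2.9"))           # port 2 (longer prefix wins)
```

One controller can serve many such switches, which is exactly the resource-sharing economics described above.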


What Does Your Business Look Like if Access Bandwidth is Not a Constraint?

There is one thread that underlies thinking and business strategy at firms as disparate as Google, Netflix and Microsoft, namely Moore's Law. Even if communications does not scale in the same way as memory and processing, Moore’s Law underpins progress on the communications front, at least in terms of signal compression, the power of network elements and cost of those elements and systems built on those building blocks.  


As Intel CEO Paul Otellini tells the story, Moore’s Law also implied an inverse relationship between volume and price per unit. Over time, processing and memory predictably got more powerful and cheaper.


The implication for Intel was that it would have to shift from producing small numbers of components selling at high prices to a market where very large numbers of very cheap components were the context of the business. “Towards ultra-cheap” is one way to describe the progression of retail prices.
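A rough illustration of that trajectory (the starting price and the roughly two-year halving period are assumptions for the sketch, not Intel's actual figures):

```python
# If unit cost halves roughly every two years, a $100 component
# costs about $3 a decade later -- volume has to carry the business.
cost = 100.0
for year in range(0, 11, 2):
    print(f"year {year:2d}: ${cost:7.2f} per unit")
    cost /= 2
```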

You might argue that assumption also drove Microsoft’s decisions about its software business (“what does my business look like if computing hardware is very cheap?”), the confidence Netflix had that broadband would support high-quality streaming (“Will access bandwidth be where it must to support our streaming business?”) and the many decisions Google makes about the ability to support software-based businesses using advertising.

You might argue that the emergence of cloud computing is reshaping existing content and software businesses precisely because of the question “what would my business look like if access bandwidth were not a constraint?”

For Intel, the implications were a radical change in component pricing, reflected back into the way the whole business has to be organized.


Ubiquiti illustrates a related principle, namely the role of disruptive pricing in a market. Ubiquiti has operating expense in the single digits, as a percentage of revenue, where a traditional technology supplier has operating expense in the 30 percent to 60 percent range.


That allows Ubiquiti to sell at retail prices competitors cannot easily match.


BT Changes Mind About Branded Mobile Service


BT appears to have changed its mind about the retail mobile market. Having won 4G spectrum (2x15 MHz of FDD spectrum and 20 MHz of TDD spectrum at 2.6 GHz), BT had suggested it would not build a national retail network, but would use the 4G spectrum to augment its fixed network operations.

Now BT says it will launch its own retail 4G service. The thinking is that BT will source wholesale mobile connectivity from one of the U.K. mobile service providers to provide full mobile access, while using its own spectrum largely for fixed-location access.

That raises some interesting new questions. BT is not the first service provider to imagine using a mix of wholesale “mobile” access and “Wi-Fi access whenever possible.” Republic Wireless, for example, is using precisely that approach, offloading Internet access to Wi-Fi whenever possible.

But the new issue is the degree to which Wi-Fi roaming could allow an ISP to create an “untethered” but not fully mobile service offering, as cable operators basically are doing with their public hotspot networks, creating a national Wi-Fi roaming capability.

In BT’s case, wholesale mobile spectrum would allow users to use the Internet when they are in transit, with the expectation that most Internet use will happen when people are at home, at work, or within range of a public Wi-Fi hotspot.

That is why some believe small cells incorporating Wi-Fi will be a game changer for mobile service providers, easing heavily congested data pipes while linking together billions of devices into a single network architecture, according to IHS iSuppli.

Small cells--low-power base stations each supporting approximately 100 to 200 simultaneous users--will augment mobile coverage and capacity in dense urban areas.

That is the mirror image of the BT approach, which augments fixed coverage with a mobile overlay.

So where mobile operators will use Wi-Fi to offload mobile traffic, BT essentially will use mobile to augment and “upload” fixed traffic.

But both of those approaches blend “mobile” and “fixed” Internet access. The unknown is whether there could arise a market for Wi-Fi-only devices that take advantage of the growing availability of Wi-Fi, much as Wi-Fi-only tablets get used.

Already, in most developed nations, smart phone users are within reach of Wi-Fi that can serve as the primary Internet connection 80 percent to 95 percent of the time they use the Internet.

Global Telecom Revenue to Grow 3.4% in 2013

"The total worldwide telecom market grew by 3.2 percent during 2012, and IDC is forecasting growth of 3.4 percent during the 2013 time frame, with the market settling into a steady growth rate of about 3.2 percent during the forecast time frame," according to Courtney Munroe, GVP, Worldwide Telecommunications, Mobility, and Network Infrastructure, IDC.

But those service provider revenues will be unevenly distributed. What is not so clear from those global statistics is the actual pattern of growth and decline regionally.

Revenue growth, though slower than it had been in the first decade of the 21st century, will continue everywhere but Europe. 

The Asia Pacific region will lead growth. But Africa is growing faster than many think. 

Telecom retail revenue in Latin America will grow at a compound annual growth rate (CAGR) of 3.3 percent between 2012 and 2017, according to Analysys Mason.
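For reference, a compound annual growth rate compounds each year, so the five-year effect is a bit larger than 5 x 3.3 percent might suggest at a glance. A quick check, with revenue indexed to 100 in 2012:

```python
# What a 3.3 percent CAGR implies over 2012-2017 (indexed revenue).
start, cagr, years = 100.0, 0.033, 5
end = start * (1 + cagr) ** years
print(f"2017 index: {end:.1f}")   # ~117.6, about 17.6% total growth
```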


But the European telecom service market decreased for the third year in a row in 2011, by 1.5 percent, the European Telecommunications Network Operators Association reports. 

In the third quarter of 2012, European carrier revenue contracted, even as revenue grew in other regions such as China, the United States, India and South America.


Even in the United Kingdom and Germany, the markets with the brightest futures, STL Partners forecasts respective declines of 19 percent and 20 percent in mobile core services (voice, messaging and data) revenue by 2020.


Revenue in the French market will decline 34 percent by 2020. In Italy, revenue will drop 47 percent and in Spain revenue will drop 61 percent by 2020.


Overall, STL Partners anticipates a reduction of 36 percent, or €30 billion, in core mobile service revenues in those markets by 2020, and a loss of about €50 billion for Europe as a whole.


Europe's share of the global telecom market has been declining steadily in recent years, from 31 percent in 2005 to just over 25 percent in 2011, as the gap between global growth (3.2 percent in 2011) and growth in Europe widens.

Gigabit Network Coverage Almost Has to be Uneven, At First


It hasn't happened yet, but it is predictable that, at some point, concerns will be raised about the extent of coverage of gigabit access networks in the United States. One might argue that is a fair public policy concern, but also a concern that some might argue has to be secondary to promoting the building of gigabit networks as widely as possible.

Even within its own chosen cities, Google Fiber builds first in neighborhoods where the expressed demand is the highest.

Now AT&T, facing Google Fiber in Austin, Texas and Kansas City, Mo., believes there will be demand for gigabit or other similar very high speed networks in neighborhoods, if not whole areas of every city.

AT&T Chairman and Chief Executive Randall Stephenson says AT&T is not the only ISP that will want to provide gigabit or other very high speed service,  though perhaps in neighborhoods with many potential customers, rather than "everywhere."

Other projects, such as Gig.U, have roughly the same idea: that communities within cities, anchored by colleges and universities, are the way to get gigabit access networks up and running.

The point is that pushing forward will require deploying where that is possible: where there is demand. That will be an uneven process, almost by definition. And that is going to raise hackles, because communications is a political business.

There eventually will be complaints about universal service, or the communications equivalent of “redlining,” where whole neighborhoods might be deemed “low priority” or “no priority.” But that is just a problem we will have to face.

Given the uncertain business model and high costs of upgrading access networks for gigabit operation, we will have to push forward bit by bit, area by area, where the chances of sustainable success are highest.

The biggest single problem most would-be ISPs face when trying to provide low-cost, universal Internet access is a sustainable revenue model. Grants won’t do it. Permanent government support won’t do it. Good intentions won’t do it.

A self-sustaining revenue model of some sort is necessary. Though indirect mechanisms might be possible in some cases (ad-supported Wi-Fi hotspots, for example), in most cases actual end users will have to support the continued operation of the networks.

The second biggest problem is the inability to get government permission to operate (licensing, spectrum). In many countries, ISPs are required to get telecom licenses, or can use unlicensed spectrum only if they pay a licensing fee.

That adds expensive overhead for any set of entrepreneurs trying to bring Internet access to everybody, under difficult financial circumstances.

Even for well-heeled providers in the United States, gigabit networks might have to be spot deployed.

"The key is being able to do it in areas where you know there's going to be high demand, and people are willing to pay the premium to be able to do it," Stephenson said.

Stephenson suggested the ideal level of potential subscribers would occur when 25 percent to 35 percent of households in a neighborhood want it.

There are sure to be complaints that such a deployment process is unfair. But without an attitude of “build gigabit networks where you can,” we run the risk of slowing the availability of such networks anywhere.

Google Maps Goes "Personal"



The new version of Google Maps will personalize and customize map detail for each user. 

When users set "Home" and "Work" locations, star favorite places, write reviews and share with friends, every Google Map instance automatically will include such details within the mapping experience. 

The new "carousel" gathers all Google Maps imagery in one spot, enabling users to fly through cities, walk canyon trails, climb mountains, and even swim the oceans, Google says.

And on a WebGL-enabled browser, like Google Chrome, the carousel includes the Earth view, which directly integrates the 3D Google Earth imagery into maps.



Wednesday, May 15, 2013

Send Money to Anybody with an Email Address Using GMail

One of the adoption issues with Google Wallet or any other type of mobile payment system is getting a critical mass of users and merchants. One new helpful wrinkle for Google Wallet is a coming ability to send money to anybody with an email address (no need for them to use GMail). 

That isn't a direct mobile payment app, but certainly will have critical mass. 

AT&T Sees Gigabit Networks Where 25% of Households Want It


Google Fiber is having the intended effect, namely convincing other leading ISPs that gigabit networks are feasible and have demand.


AT&T, now facing Google Fiber in Austin, Texas and Kansas City, Mo., believes there will be demand for gigabit or other similar very high speed networks in neighborhoods, if not whole areas of every city.

AT&T Chairman and Chief Executive Randall Stephenson says AT&T is not the only ISP that will want to provide gigabit or other very high speed service,  though perhaps in neighborhoods with many potential customers, rather than "everywhere."

"The key is being able to do it in areas where you know there's going to be high demand, and people are willing to pay the premium to be able to do it," Stephenson said. 

Stephenson suggested the ideal level of potential subscribers would occur when 25 percent to 35 percent of households in a neighborhood want it. 

The real implications are not limited to the actual number of consumers able to buy, and buying, gigabit connections. The more important implication will be the number of consumers who buy 50 Mbps or 100 Mbps connections by 2020. 


By 2020, it is possible, perhaps even likely, that 100 Mbps will be a common consumer access speed.

In 2002, it is hard to remember, only about 10 percent of U.S. households were buying broadband service. A decade later, virtually all Internet-using households were buying broadband access service.

The point is that an order of magnitude increase, over about a decade, is doable.

In 2009, long before there was Google Fiber, Technology Futures predicted that, in 2015, about 20 percent of U.S. households would be buying access at 100 Mbps, about 20 percent at 50 Mbps, and something more than 20 percent would be buying service at about 24 Mbps.

It might be a bit of a stretch to hit that 2015 forecast, but the direction and momentum are clear enough, as implausible as it might have seemed just two years ago.

Even mobile networks could be offering breathtaking amounts of bandwidth in a couple of decades.

Streaming Displacing Linear Radio, Stitcher Survey Suggests

It probably will not surprise you to learn that a survey suggests younger U.S. residents prefer to listen or watch "on demand" rather than listen to the radio. 

Mobile streaming service Stitcher conducted an online survey of 2,000 adults and found that 57 percent believe that in five years, Americans will primarily listen to streaming radio options over traditional AM/FM radio.

The survey also found that U.S. residents aged 18 to 34 are far more likely than their older counterparts to watch movies and television, and listen to music, always or mostly on demand. Duh.

Streaming is displacing legacy modes of audio and video entertainment consumption virtually everywhere globally. 

Source: Strategy Analytics, global music streaming sales

Will Eventual Sprint Owner Try to Disrupt U.S. Mobile Market?


The outcome of the bidding for Sprint is not yet decided, but it seems fair to suggest that whether Dish Network or SoftBank wins Sprint, there is potential for a disruption of the U.S. mobile market. If Dish wins Sprint, the attack likely would take longer to develop.

Dish would immediately create bundles (Sprint voice and data with Dish video). But the big changes would come later, when the video portion of the bundle shifts from satellite to mobile delivery, and that would take time.

If SoftBank wins Sprint, the assault would come quicker. While SoftBank would aim to leverage new services and applications, as it did in the Japanese market, SoftBank also might launch a disruptive assault on retail pricing and packaging, as it also did in Japan.

How much the assault would resemble what Iliad’s “Free” has done in the French market is unclear. But Free’s attack, squarely on disruptive levels of value and retail pricing, has allowed it to grab nine percent market share in about 15 months.

Iliad offers users unlimited domestic calling and 3 gigabytes of data for EUR20 per month, prices that have proven attractive enough to entice nine percent of French consumers to change providers.

To be sure, European mobile revenue has been under pressure since at least 2007. In part, that is because mobile data revenue has so far failed to compensate for the sharp decline in mobile voice revenue, according to Wireless Intelligence research.

That study found mobile average revenue per user had fallen by 20 percent between 2007 and 2010, dropping from EUR25 in 2007 to EUR20 in 2010 on average.

A major reason is a decline in the average per-minute price for voice calls, which dropped from EUR0.16 to EUR0.14 in the EU27 mobile markets over the period. France has been particularly affected, one might argue, because of the new level of competition.

SFR suffered an 11 percent year-over-year revenue decline in the first quarter of 2013, as it faces a second year of price competition with Free, the upstart mobile operation owned by Iliad.

Orange recently reported an 8.1 percent fall in first-quarter 2013 revenue.  

In the 15 months since its launch, Free has secured around nine percent of the French mobile customer base.

Iliad Group, parent company of Free Mobile, said that sales in its mobile business had increased by 202 percent, to €294.5 million from €97.5 million, with the company adding 870,000 mobile subscribers during the first quarter, taking its total to 6.08 million.

EU27: Average Revenue Per User (ARPU) 2007-2010
Source: Wireless Intelligence

And, as competition in the French market has at least one of the three leading providers looking for some way to merge with another entity to strengthen its position, a SoftBank assault could spur more thinking about market structure. In France, the thinking is that four providers probably cannot survive, even if French regulators say that is a minimum number of providers necessary to preserve competition.

In the United States, where regulators and antitrust officials likewise have suggested that four is the minimum number of providers necessary to ensure robust competition, a disruptive price assault by SoftBank would challenge those notions.

As the Broadband Availability Gap study suggested, a three-competitor market reduces ISP average revenue per user by 28 percent, and take rates by 75 percent. The impact in the four-leader U.S. market arguably is even more severe.

Tuesday, May 14, 2013

A 3-Provider ISP Market Reduces ARPU 28%, Take Rates by 75%

Competition has implications. As the broadband availability gap study suggested, a three-competitor market reduces ISP average revenue per user by 28 percent, and take rates by 75 percent. 

That latter statistic is quite important, as it means any single facilities-based ISP strands as much as 75 percent of its invested capital.

The corollary is that the cost per customer has to reflect the cost of the stranded assets. If, for example, the cost of the network were $1,000 per location, at a 25 percent take rate, the cost per customer would be $4,000. 
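That arithmetic generalizes simply (a sketch; the $1,000 per-location cost is the example figure used above):

```python
# Network cost accrues per location passed, but revenue comes only
# from locations that actually subscribe.
def cost_per_customer(cost_per_location, take_rate):
    return cost_per_location / take_rate

for take_rate in (1.00, 0.50, 0.25):
    print(f"take rate {take_rate:4.0%}: "
          f"${cost_per_customer(1000, take_rate):,.0f} per customer")

# take rate 100%: $1,000 per customer
# take rate  50%: $2,000 per customer
# take rate  25%: $4,000 per customer
```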


That's one reason why the economics of wireless networks are better than those of fixed networks, especially under competitive conditions. There is much less stranded capital.


One concrete example is the French mobile market, where upstart “Free,” owned by Iliad, is forcing the other three legacy carriers to cut prices as they lose market share. Since the launch of Free, incumbents have lost revenue, profit margin and market share.

SFR suffered an 11 percent year-over-year revenue decline, as it faces a second year of price competition with Free. The three legacy providers (Orange, SFR and Bouygues) have seen average revenue per user in France fall to $36.25 during the last quarter of 2012, from $43.37 a year earlier, according to Informa.
