Tuesday, May 21, 2013

1-Gbps LTE? Yes, But You Need 40 MHz of Bandwidth

There is a simple answer to the question of why mobile service providers and would-be providers want more spectrum. As usage and access speeds continue to climb, there is little chance of boosting speeds further, in either the mobile or fixed wireless realm, without adding more spectrum.

For example, it is possible to deliver 1-Gbps mobile Internet access using Long Term Evolution, but that requires a 40-MHz block of spectrum, not the 10 MHz or 20 MHz channels now used by LTE providers.

Numerical superiority: LTE Advanced vs. LTE (source: 3GPP)
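The arithmetic behind that claim can be sketched as peak throughput = channel bandwidth times spectral efficiency. The efficiency figures below are illustrative assumptions, not 3GPP-specified values: roughly 7.5 bit/s/Hz for a Category 4 LTE device (150 Mbps in 20 MHz) and about 25 bit/s/Hz for an aggressive LTE-Advanced configuration with carrier aggregation and high-order MIMO.

```python
# Back-of-the-envelope: peak LTE throughput scales with channel bandwidth.
# Spectral-efficiency figures are illustrative assumptions, not measured values.

def peak_throughput_mbps(bandwidth_mhz: float, bps_per_hz: float) -> float:
    """Peak throughput (Mbps) = bandwidth (Hz) x spectral efficiency (bit/s/Hz)."""
    return bandwidth_mhz * 1e6 * bps_per_hz / 1e6

# A 20-MHz channel at Category 4 efficiency:
print(peak_throughput_mbps(20, 7.5))   # 150.0 Mbps

# The 1-Gbps claim implies ~25 bit/s/Hz across a 40-MHz block:
print(peak_throughput_mbps(40, 25.0))  # 1000.0 Mbps
```

The point of the sketch is that, at any fixed spectral efficiency, the only remaining lever is bandwidth, which is why the 1-Gbps figure requires a 40-MHz block rather than today's 10-MHz or 20-MHz channels.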



Monday, May 20, 2013

Dish Network Offers $2 Billion for LightSquared’s Spectrum

Dish Network reportedly has offered $2 billion for LightSquared's 60 MHz of spectrum.

LightSquared apparently has until May 31, 2013 to accept the offer, which was made May 15, 2013, Bloomberg reports.

Dish already has received Federal Communications Commission permission to use former mobile satellite service spectrum to create a terrestrial Long Term Evolution network.

Dish had acquired that spectrum for $3 billion from bankrupt satellite companies DBSD North America Inc. and TerreStar Networks.


As with most initiatives undertaken by Charlie Ergen, Dish Network CEO, there typically are a number of ways to monetize an asset. Ergen always has believed spectrum has value, whether to support an on-going business venture or simply as an asset to be sold. 

But most observers might agree that Dish Network is acting as though it has clear intentions of entering the mobile business, and is not simply "bluffing."

Google Hangouts Video on AT&T Getting Broader Support


Mobile service providers have had a complicated relationship with over-the-top applications, viewed either as displacing existing revenue-generating services (carrier voice services) or as imposing high loads on mobile networks (video apps and video conferencing apps).

That is one reason why, at least initially, use of Apple iPhone “FaceTime” was restricted to Wi-Fi access on the AT&T mobile network, for example.

To be sure, there are both public policy issues (can a person use a lawful application?) and network management issues (how do resource-intensive apps get access to the network?). In the past, there also have been business model issues (can a mobile service provider support unlimited use of video for a flat-rate price?).

Google Hangouts provided the most recent issue. Hangouts unifies Google messaging services, including video chats and conferencing. But AT&T indicated initially that video chats could be used only on Wi-Fi networks.

AT&T seems to have quickly clarified that policy, at least for some users. AT&T now allows mobile use of Hangouts video chats on Apple, Samsung and BlackBerry devices used on “Mobile Share” or tiered data plans (3G). Long Term Evolution support will be enabled by mid-June, AT&T says.

In the second half of 2013, AT&T will enable pre-loaded video chat apps that work on the mobile network for all customers, regardless of data plan or device; that work is expected to be complete by year end.

AT&T also says that, today, all of its customers can use any mobile video chat app they download from the Internet, such as Skype.

Smart Phone Shipments Will Pass Feature Phones in 2013


Global smart phone shipments will surpass shipments of basic and feature phones for the first time in 2013, according to NPD.

Global smart phone shipments are expected to reach 937 million units in 2013, compared to just 889 million units for basic phones and feature phones.

Between 2011 and 2016, smart phone shipments will grow at a compounded annual growth rate of 26 percent, to 1.45 billion units, which will account for 66 percent of the mobile phone market.
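As a sanity check on those figures, the 2011 base can be back-computed from NPD's stated endpoint and growth rate; the base below is derived from those numbers, not taken from the report itself.

```python
# Sanity check on the NPD figures: 1.45 billion units in 2016 at a stated
# 26 percent CAGR from 2011 implies the 2011 base shown below (derived, not
# reported), and the 2013-to-2016 leg grows more slowly than the headline rate.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values, `years` apart."""
    return (end / start) ** (1 / years) - 1

implied_2011_base = 1.45e9 / (1.26 ** 5)
print(round(implied_2011_base / 1e6))          # ~457 million units in 2011

# Growth from the 937-million 2013 figure to the 2016 forecast:
print(round(cagr(937e6, 1.45e9, 3) * 100, 1))  # ~15.7 percent per year
```

That slowdown is consistent with the article's point: the fastest unit growth came early, driven by entry-level devices in emerging markets.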

Emerging markets are driving most of the smart phone growth, NPD researchers say.  In these markets, entry-level smart phones priced below $200 are important.

China leads in the entry-level smart phone category, comprising 55 percent of shipments. China is also the largest market for smart phones as a whole, and the Asia-Pacific region will account for over 50 percent of smart phone shipments in 2013.

At the high end of the market, LTE-enabled smart phones will reach 23 percent market share in 2013, NPD DisplaySearch says.

Screen sizes are also changing. In 2013, more than half (57 percent) of smart phone displays will range between four and five inches, while screens larger than five inches will grow to 16 percent of the market.


How Big a Revenue Boost from LTE?


Whenever a next-generation mobile network replaces an older network, there typically is room for both product substitution that does not dramatically affect total revenues, and incremental revenue lift, initially from higher prices, and later from new services.


Juniper Research forecasts Long Term Evolution network subscribers will double from an estimated 105 million subscribers in 2013 to nearly 220 million in 2014.

What that means in terms of incremental revenue is less clear, though many service providers are using the LTE rollout as an opportunity to raise data plan prices. In many markets, 4G data plans will simply cannibalize 3G plans, with some incremental revenue lift if operators are able to charge a 4G pricing premium.

That might be more the case in developing markets, where 3G cost premiums over 2G rates were quite significant.

But market conditions might lessen the price premium possible in particular markets. In many markets, 4G tariffs had to be lowered, or usage buckets increased while prices remained constant.

And some competitors have simply chosen not to charge a premium for LTE access.

In some cases, consumers think the 4G prices are too high.


That is not to say new applications are unimportant. It might turn out that revenue lift occurs for indirect reasons, such as users consuming more mobile data as appetites for mobile video entertainment consumption continue to grow.

It also is possible more consumers will start using tethering features for their tablets and PCs, which likewise will increase consumption. The point is that, even in the absence of new apps, mobile service providers should see incremental revenue from 4G.

But the “gross revenue” figures we will be seeing will have to be weighed against the cannibalization of 3G data revenues.

“To some extent, 4G may not impact mobile innovation the way 3G did,” observes Dan Hays, PwC US Wireless Advisory Leader. “We may be more likely to see second order effects from 4G rather than new things enabled by the technology itself.”

In other words, there might not be as much application innovation as some believe, nor might the revenue lift be as significant as some believe.

“I believe 4G will enable operators to deliver a more consistent experience, more ubiquitously, at a lower cost, and allow them to make money and stay in business,” said Hays. That sounds a bit like the upside from fiber-to-home networks.

There is some revenue upside, particularly from video entertainment services. But much of the benefit comes from “future proofing” or lower operating and repair costs. Lower costs per bit is one advantage, for example.

Saturday, May 18, 2013

Two Orders of Magnitude More Access Speed Within 10 Years? History Says "Yes"

If history is any guide, gigabit Internet access will not be at all rare in a decade, though how much demand there is for 1-Gbps service might well hinge on retail pricing levels.

In August 2000, only 4.4 percent of U.S. households had a home broadband connection, while 41.5 percent of households had dial-up access. A decade later, dial-up subscribers had declined to 2.8 percent of households, while 68.2 percent of households subscribed to broadband service.

If you believe gigabit access is to today’s broadband as broadband was to dial-up access, and you believe the challenge of building gigabit networks roughly corresponds to the creation of broadband capability, a decade might be a reasonable estimate of how long it will take before 70 percent of U.S. homes can buy gigabit access service, and possibly 50 percent do so.

Consider that by June 2012 about 75 percent of U.S. households could buy a service of at least 50 Mbps, while half could buy service at 100 Mbps. So it took about a decade to put into place access at two orders of magnitude higher than the baseline (dial-up speeds).
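The "two orders of magnitude in a decade" pattern implies a compound growth rate of roughly 58 percent a year. A quick sketch, in which the 25-Mbps starting point and the rounded 1.585 multiplier are illustrative assumptions rather than forecasts:

```python
# Going up 100x in 10 years implies an annual growth multiple of 100^(1/10),
# roughly 1.585, i.e. about 58.5 percent per year.

def annual_growth_for(multiple: float, years: int) -> float:
    """Compound annual growth rate needed to hit `multiple` in `years`."""
    return multiple ** (1 / years) - 1

print(round(annual_growth_for(100, 10) * 100, 1))  # 58.5 percent per year

# Applied forward from a typical 25-Mbps purchase, at that same rate:
speed_mbps = 25.0
years = 0
while speed_mbps < 1000:
    speed_mbps *= 1.585
    years += 1
print(years)  # 9 -- a gigabit in roughly a decade
```

That is the historical pattern the post describes: the baseline moves two orders of magnitude about every ten years, even if most buyers lag the leading edge.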

The key distinction is between “availability” and “take rate.” Even though consumers are starting to buy faster access services, most seem to indicate, by their buying behavior, that 20 Mbps or 25 Mbps is “good enough,” when it is possible to buy 50 Mbps or 100 Mbps service.

In the U.K. market, for example, though service at 30 Mbps is available to at least 60 percent of homes, buy rates were only about seven percent in mid-2012 (to say nothing of demand for 100 Mbps).

The European Union country with the highest penetration of such services was Sweden, at about 15 percent, in mid-2012.

To be sure, retail prices are an issue. With relatively few exceptions, U.S. consumers tend to buy services up to 25 Mbps, and price for a gigabit service is probably the main reason.

That is the reason behind Google Fiber's disruptive pricing for gigabit access at $70 a month. That pricing umbrella implies far lower prices for 100 Mbps service than consumers can buy at the moment.

And that is every bit as important as the headline gigabit speed. If a gigabit connection costs $70 a month, will a 100-Mbps connection cost $35 a month?

Friday, May 17, 2013

Fixed Networks are for "Capacity," Mobile Networks are for "Coverage"


These days, in many markets, people using smart phones are on the fixed networks for Internet access, more than on the mobile network. 

In North America, as much as 82 percent of Internet traffic from smart phones occurs indoors, where users can take advantage of Wi-Fi instead of the mobile network, one analysis suggests.


In Western Europe, as much as 92 percent of Internet usage from smart phones occurs indoors.

So to a large extent, that means the fixed network provides “capacity,” while the mobile network provides “coverage,” a statement that describes the two ways a small cell can provide value for an urban mobile network as well.

For the most part, Wi-Fi offload happens in the office and the home. Some small cells will include Wi-Fi access, but the bulk of Internet activity still occurs indoors, not outdoors where small cells will reinforce the mobile macrocell network.

Some tier one carriers have moved to create their own networks of public Wi-Fi hotspots, and many can serve customers from owned fixed networks as well. That makes the fixed network and public Wi-Fi a direct part of the mobile network.

In other cases, carriers simply passively allow their devices to use non-affiliated Wi-Fi networks, as when a mobile service provider allows a user to roam onto a fixed network owned by another service provider. 

That is one more example of the loosely-coupled nature of the Internet ecosystem. A mobile provider can offload traffic to another carrier with which it has no formal business relationship. 

Thursday, May 16, 2013

Will TV White Spaces Be Important?


Whether TV white spaces spectrum is going to be a big deal or not might hinge on how much real spectrum is available in given markets, plus manufacturing volume to get retail prices of network elements down to a level where the spectrum has a reasonable business case.

At a high level, it isn’t going to help as much in urban areas, where interference issues are more constraining.

It might prove quite important in some rural areas, where there is a lot more available bandwidth: fewer people living in a region means fewer incentives for TV broadcasters to occupy the band. Where few people live, there might be lots of bandwidth, but also few potential users or customers. Every location is different.

While at least 6 megahertz is available throughout most of the United States, there are a few locations where there is much more spectrum available. Though most of the spectrum cannot be used in most locations, the white spaces band includes 150 MHz of spectrum in total.  


Other sources of lightly regulated or unlicensed spectrum might be made available in the future. New technologies such as agile radios and ultra-wideband also are available, but regulatory action is required to enable their use.

And though the general rule has been that spectrum is licensed to entities for specific purposes, unlicensed spectrum might be crucial.

"Unlicensed" Spectrum Doesn't Always Mean "You Can Use it" Without Paying


It sometimes is easy to forget that it isn’t as easy to become an ISP in some nations, as in others. Consider the matter of “unlicensed spectrum,” for example.

“Unlicensed” spectrum exists. But in about 66 percent of nations, use of that spectrum is not really license-free. Based on responses from 75 countries, 33 percent of national regulators require a license to use 2.4-GHz or 5-GHz “unlicensed” spectrum, a study found.

Another 33 percent of national regulators require obtaining an operating license, though not a spectrum license. About 33 percent do not require a license of any type. In a small fraction of cases (two percent) use is forbidden.

"Mix and Match" is one Advantage of Software Defined Networks


If you wanted to rip cost out of any device, app or network, what are some of the things you’d do? You’d remove parts. You’d probably reduce size.

Shoretel, for example, sells docking stations for iPhones and iPads that allow the iOS device to act as the CPU, while the docking station provides all the peripheral support.



And that's as good an example as any of how a network designer would try to wring cost out of a network.


You’d rely as much as possible on functionality that could be supplied by other common CPUs. 

You’d pick passive solutions, not active solutions, as often as possible. You’d simplify with an eye to improving manufacturing tasks.

You also would create a “mix and match” capability with respect to specific makes and models of network gear. You’d want to be able to use a variety of network elements, made by many different suppliers, interchangeably.

You’d create a network using common, commodity-priced devices as much as possible.

In other words, you would make devices, networks and apps "as dumb as possible," and as flexible and interchangeable as possible.

If you think about software defined networks, that’s an application of the principle. Not “everything” about SDN is “as dumb as possible;” only the data plane elements.

The corollaries are that such approaches create networks that also are “as flexible as possible” and “as affordable as possible.”

The control plane you would still want to be as “smart as possible,” and you would be able to afford to do so, since the key to much cost reduction is the ability to share a physical resource across a broad number of end users, subtending devices or nodes.

That is why the per-user or per-customer cost of an expensive headend is rather low, as a percentage of the total cost of a network. On the other hand, the total cost of smart CPUs (decoders) used by every end user or customer is so high because there is so little sharing possible: each customer needs one or more decoders.

That was what drove cable operator Cablevision Systems to adopt a “network digital video recorder” strategy. By centralizing the CPU functions, the ability to share the cost of processing infrastructure was vastly improved.
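The sharing arithmetic is simple enough to sketch. The dollar figures and subscriber counts below are hypothetical, chosen only to illustrate the principle, not drawn from Cablevision's actual costs.

```python
# Per-user economics of shared versus unshared network elements.
# All figures are hypothetical, chosen to illustrate the principle.

def per_user_cost(shared_cost: float, users: int, cpe_cost_per_user: float) -> float:
    """Shared cost amortized over all users, plus the unshared
    customer-premises cost each user bears alone."""
    return shared_cost / users + cpe_cost_per_user

# An expensive $10M headend shared across 100,000 subscribers adds only
# $100 per subscriber, while a $300 decoder in every home dominates cost:
print(per_user_cost(10_000_000, 100_000, 300))  # 400.0

# Moving DVR functions into the network shifts spend from the unshared
# column to the shared one; even a much larger central investment can win:
print(per_user_cost(25_000_000, 100_000, 50))   # 300.0
```

The second case is the network-DVR logic: a bigger central outlay, amortized across every subscriber, beats a smaller outlay repeated in every home.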

The broader principle is that one proven way to reduce cost, increase flexibility and enhance functionality is to separate and centralize control plane (CPU) functions from the data plane functions that are widely distributed throughout a network.

That’s the whole point of software defined networks.
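That control-plane/data-plane split can be sketched in a few lines. The class and method names here are purely illustrative, not any real SDN controller API (OpenFlow, for instance, defines its own wire protocol and message set).

```python
# A minimal sketch of the SDN principle: one "smart" controller computes
# forwarding rules; many "dumb" switches only match packets against the
# flow tables the controller installs. Names are illustrative only.

class DumbSwitch:
    """Data plane: no local intelligence, just a match-action table."""
    def __init__(self, name: str):
        self.name = name
        self.flow_table: dict[str, str] = {}  # destination -> output port

    def forward(self, dst: str) -> str:
        # In a real network, unknown destinations are punted to the controller.
        return self.flow_table.get(dst, "punt-to-controller")

class Controller:
    """Control plane: holds the network-wide view, programs every switch."""
    def __init__(self):
        self.switches: list[DumbSwitch] = []

    def attach(self, switch: DumbSwitch):
        self.switches.append(switch)

    def install_route(self, dst: str, port: str):
        # Any vendor's switch works interchangeably, because all the
        # intelligence lives here rather than in the boxes.
        for s in self.switches:
            s.flow_table[dst] = port

ctrl = Controller()
a, b = DumbSwitch("vendor-A"), DumbSwitch("vendor-B")
ctrl.attach(a)
ctrl.attach(b)
ctrl.install_route("10.0.0.5", "port-3")

print(a.forward("10.0.0.5"))  # port-3
print(b.forward("10.0.0.5"))  # port-3
print(b.forward("10.9.9.9"))  # punt-to-controller
```

Note the "mix and match" point from above: the two switches come from different notional vendors, yet the controller programs them identically.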


What Does Your Business Look Like if Access Bandwidth is Not a Constraint?

There is one thread that underlies thinking and business strategy at firms as disparate as Google, Netflix and Microsoft, namely Moore's Law. Even if communications does not scale in the same way as memory and processing, Moore’s Law underpins progress on the communications front, at least in terms of signal compression, the power of network elements and cost of those elements and systems built on those building blocks.  


As Intel CEO Paul Otellini tells the story, Moore’s Law also implied an inverse relationship between volume and price per unit. Over time, processing and memory got more powerful and cheaper at a predictable rate.


The implication for Intel was that it would have to shift from producing small numbers of components selling at high prices to a market where very large numbers of very cheap components were the context of the business. “Towards ultra-cheap” is one way to describe the progression of retail prices.

You might argue that assumption also drove Microsoft’s decisions about its software business (“what does my business look like if computing hardware is very cheap?”), the confidence Netflix had that broadband would support high-quality streaming (“Will access bandwidth be where it must to support our streaming business?”) and the many decisions Google makes about the ability to support software-based businesses using advertising.

You might argue that the emergence of cloud computing is reshaping existing content and software businesses precisely because of the question “what would my business look like if access bandwidth were not a constraint?”

For Intel, the implications were a radical change in component pricing, reflected back into the way the whole business has to be organized.


Ubiquiti illustrates a related principle, namely the role of disruptive pricing in a market. Ubiquiti has operating expense in the single digits, as a percentage of revenue, where a traditional technology supplier has operating expense in the 30 percent to 60 percent range.


That allows Ubiquiti to sell at retail prices competitors cannot easily match.


BT Changes Mind About Branded Mobile Service


BT appears to have changed its mind about the retail mobile market. Having won 4G spectrum (2x15 MHz of FDD and 20 MHz of TDD 2.6GHz spectrum), BT suggested it would not build a national retail network but use the 4G spectrum as a way to augment its fixed network operations.

Now BT says it will launch its own retail 4G network. The thinking is that BT will source wholesale mobile connectivity from one of the U.K. mobile service providers to provide full mobile access, while using its own spectrum largely for fixed or location access.

That raises some interesting new questions. BT is not the first service provider to imagine using a mix of wholesale “mobile” access and “Wi-Fi access whenever possible.” Republic Wireless, for example, is using precisely that approach, offloading Internet access to Wi-Fi whenever possible.

But the new issue is the degree to which Wi-Fi roaming could allow an ISP to create an “untethered” but not fully mobile service offering, as cable operators basically are doing with their public hotspot networks, creating a national Wi-Fi roaming capability.

In BT’s case, wholesale mobile spectrum would allow users to use the Internet when they are in transit, with the expectation that most Internet use will happen when people are at home, at work, or within range of a public Wi-Fi hotspot.

That is why some believe small cells incorporating Wi-Fi will be a game changer for mobile service providers, easing heavily congested data pipes while linking billions of devices into a single network architecture, according to IHS iSuppli.

Small cells--low-power base stations each supporting approximately 100 to 200 simultaneous users--will augment mobile coverage and capacity in dense urban areas.

That is the mirror image of the BT approach, which augments fixed coverage with a mobile overlay.

So where mobile operators will use Wi-Fi to offload mobile traffic, BT essentially will use mobile to augment and “upload” fixed traffic.

But both of those approaches blend “mobile” and “fixed” Internet access. The unknown is whether there could arise a market for Wi-Fi-only devices that take advantage of the growing availability of Wi-Fi, much as Wi-Fi-only tablets get used.

Already, in most developed nations, 80 percent to 95 percent of the time, smart phone users are in zones where Wi-Fi can be the primary Internet connection, when they use the Internet.

Global Telecom Revenue to Grow 3.4% in 2013

The total worldwide telecom market grew by 3.2 percent during 2012, and IDC is forecasting growth of 3.4 percent during 2013, "with the market settling into a steady growth rate of about 3.2 percent during the forecast time frame," according to Courtney Munroe, GVP, Worldwide Telecommunications, Mobility, and Network Infrastructure, IDC.

But those service provider revenues will be unevenly distributed. What is not so clear from those global statistics is the actual pattern of growth and decline regionally.

Revenue growth, though slower than it had been in the first decade of the 21st century, will continue everywhere but Europe. 

The Asia Pacific region will lead growth. But Africa is growing faster than many think. 

Telecom retail revenue in Latin America will grow at a compound annual growth rate (CAGR) of 3.3 percent between 2012 and 2017, according to Analysys Mason.


But the European telecom service market decreased for the third year in a row in 2011, by 1.5 percent, the European Telecommunications Network Operators Association reports. 

In the third quarter of 2012, European carrier revenue contracted, even as revenue grew in other regions such as China, the United States, India and South America.


Even in the United Kingdom and Germany, the markets with the brightest future, STL Partners forecasts respective declines of 19 percent and 20 percent in core mobile services (voice, messaging and data) revenue by 2020.


Revenue in the French market will decline 34 percent by 2020. In Italy, revenue will drop 47 percent and in Spain revenue will drop 61 percent by 2020.


Overall, STL Partners anticipates a reduction of 36 percent, or €30 billion, in core mobile service revenues in those markets by 2020, equating to a loss of about €50 billion for Europe as a whole.


Europe's share of the global telecom market has declined steadily in recent years, from 31 percent in 2005 to just over 25 percent in 2011, as the gap between global growth (3.2 percent in 2011) and growth in Europe widens.

Gigabit Network Coverage Almost Has to be Uneven, At First


It hasn't happened yet, but it is predictable that, at some point, concerns will be raised about the extent of coverage of gigabit access networks in the United States. That is a fair public policy concern, but arguably one that has to be secondary to promoting the building of gigabit networks as widely as possible.

Even within its own chosen cities, Google Fiber builds first in neighborhoods where the expressed demand is the highest.

Now AT&T, facing Google Fiber in Austin, Texas and Kansas City, Mo., believes there will be demand for gigabit or other similar very high speed networks in neighborhoods, if not whole areas of every city.

AT&T Chairman and Chief Executive Randall Stephenson says AT&T is not the only ISP that will want to provide gigabit or other very high speed service,  though perhaps in neighborhoods with many potential customers, rather than "everywhere."

Other projects, such as Gig.U, have roughly the same idea: that communities within cities, anchored by colleges and universities, are the way to get gigabit access networks up and running.

The point is that pushing forward will require deploying where that is possible: where there is demand. That will be an uneven process, almost by definition. And that is going to raise hackles, because communications is a political business.

There eventually will be complaints about universal service, or the communications equivalent of “redlining,” where whole neighborhoods might be deemed “low priority” or “no priority.” But that is just a problem we will have to face.

Given the uncertain business model and high costs of upgrading access networks for gigabit operation, we will have to push forward bit by bit, area by area, where the chances of sustainable success are highest.

The biggest single problem most would-be ISPs face when trying to provide  low cost, universal Internet access is a sustainable revenue model. Grants won’t do it. Permanent government support won’t do it. Good intentions won’t do it.

A self-sustaining revenue model of some sort is necessary. Though indirect mechanisms might be possible in some cases (ad-supported Wi-Fi hotspots, for example), in most cases actual end users will have to support the continued operation of the networks.

The second biggest problem is the difficulty of getting government permission (licensing, spectrum). In many countries, ISPs are required to get telecom licenses, or can use unlicensed spectrum only if they pay a licensing fee.

That adds expensive overhead for any set of entrepreneurs trying to bring Internet access to everybody, under difficult financial circumstances.

Even for well-heeled providers in the United States, gigabit networks might have to be spot deployed.

"The key is being able to do it in areas where you know there's going to be high demand, and people are willing to pay the premium to be able to do it," Stephenson said.

Stephenson suggested the ideal level of potential subscribers would occur when 25 percent to 35 percent of households in a neighborhood want it.

There are sure to be complaints that such a deployment process is unfair. But without an attitude of “build gigabit networks where you can,” we run the risk of slowing the availability of such networks anywhere.

Google Maps Goes "Personal"



The new version of Google Maps will personalize and customize map detail for each user. 

When users set "Home" and "Work" locations, star favorite places, write reviews and share with friends, every Google Map instance automatically will include such details within the mapping experience. 

The new "carousel" gathers all Google Maps imagery in one spot, enabling users to fly through cities, walk canyon trails, climb mountains, and even swim the oceans, Google says.

And on a WebGL-enabled browser, like Google Chrome, the carousel includes the Earth view, which directly integrates the 3D Google Earth imagery into maps.


