Tuesday, February 5, 2013

Smart Phones Drive Mobile Data Consumption, Globally

In 2012, global mobile data traffic grew more than 70 percent year over year, to 885 petabytes a month, according to Cisco.

Mobile data traffic growth varied by region, with the slowest growth experienced by Western Europe at 44 percent, and the highest growth rates experienced by Middle East and Africa (101 percent) and Asia Pacific (95 percent).

There are three key reasons for the lower mobile data traffic growth in Europe in 2012, Cisco says. Tiered mobile data packages are one reason, as most “unlimited” plans have been eliminated.

In Europe, there also has been a slowdown in the number of mobile-connected laptop net additions. The number of mobile-connected  laptops in Europe declined from 33.8 million at the end of 2011 to 32.6 million at the end of 2012.

Per-Device Usage, MBytes per Month

Device Type          2012     2017
Non-smart phone       6.8       31
M2M module             64      330
Smart phone           342    2,660
4G smart phone      1,302    5,114
Tablet                820    5,387
Laptop              2,503    5,731
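The growth these figures imply is steep. A quick sketch of the implied compound annual growth rates, assuming the 2012 and 2017 columns bracket a five-year window:

```python
# Implied 2012-2017 compound annual growth rate (CAGR) per device type,
# computed as (end / start) ** (1 / years) - 1, using the figures above.
usage_mb_per_month = {
    "Non-smart phone": (6.8, 31),
    "M2M module": (64, 330),
    "Smart phone": (342, 2660),
    "4G smart phone": (1302, 5114),
    "Tablet": (820, 5387),
    "Laptop": (2503, 5731),
}
YEARS = 5  # 2012 to 2017

for device, (start, end) in usage_mb_per_month.items():
    cagr = (end / start) ** (1 / YEARS) - 1
    print(f"{device}: {cagr:.0%} per year")
```

Smart phone usage, for example, grows almost eightfold, an implied rate of roughly 50 percent per year, while laptop usage grows far more slowly, at under 20 percent per year.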

In Europe, there also has been an increase in the amount of mobile traffic offloaded to the fixed network. Operators have encouraged the offload of traffic onto Wi-Fi networks. Tablet traffic
that might have migrated to mobile networks has largely remained on fixed networks, as well.

By 2017, global mobile data traffic will reach 11.2 exabytes per month, or a run rate of 134 exabytes annually.
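The annual "run rate" is simply the monthly figure times twelve:

```python
# Cisco's annual "run rate" is the monthly traffic figure annualized.
monthly_exabytes = 11.2              # forecast global traffic per month, 2017
annual_run_rate = monthly_exabytes * 12
print(annual_run_rate)               # 134.4, reported above as 134 exabytes
```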

Smart phones will be 68 percent of total mobile data traffic in 2017, compared to 44 percent in 2012. LTE 4G connections will be 10 percent of total mobile connections in 2017, and 45 percent of mobile data  traffic.


Global mobile network connection speeds doubled in 2012 and will increase sevenfold by 2017, reaching 3.9 Mbps.

As much as 46 percent of global mobile data traffic will be offloaded in 2017, up from 33 percent in 2012, Cisco forecasts.

By 2017, 66 percent of the world’s mobile data traffic will be video, up from 51 percent in 2012.

The Middle East and Africa will have the strongest mobile data traffic growth of any region at 104 percent compound annual growth rates, followed by Asia Pacific at 84 percent and Central and Eastern Europe at 83 percent.
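Compound annual growth at those rates adds up quickly. A sketch, assuming the 104 percent figure applies across the 2012-2017 forecast window (the window is an assumption here):

```python
# Traffic multiple implied by a 104 percent compound annual growth rate,
# assuming it runs for the five years from 2012 to 2017.
cagr = 1.04          # 104 percent per year
years = 5
multiple = (1 + cagr) ** years
print(f"{multiple:.0f}x traffic over {years} years")  # roughly 35x
```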

U.K. Regulator to Allow LTE in all Mobile Frequency Bands

Ofcom, the U.K. communications regulator, now is proposing to allow use of Long Term Evolution air interfaces in the existing 900 MHz, 1800 MHz and 2100 MHz bands to permit the deployment of 4G services.

The new LTE spectrum auctions will use the 800 MHz and 2.6 GHz bands.

The new rules would mean no restrictions on which air interfaces can be used in each frequency band. That could become an important issue if one of the four leading contenders were shut out, partly or completely, in the current LTE spectrum auctions. 


As always, spectrum auctions could have market-reshaping implications, either allowing new competitors to enter, or changing the strategic relationships between leading providers. But there is an important potential element: it is not clear there is enough spectrum for all four current leading mobile service providers to win new spectrum. 

In fact, it seems likely that only three of four can win the auctions for coveted 800-MHz spectrum best suited for national networks. 

That failure to win spectrum would put the loser at a severe disadvantage, compared to the three leaders that do win spectrum.

Ofcom also will propose allowing an increase of 100 percent (3 decibels) in the transmit power of radios in the 900 MHz frequency band for UMTS (3G) technology, as requested by Telefónica and Vodafone.
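The "3 decibels" equivalence is just the standard decibel expression of a power ratio; a minimal sketch:

```python
import math

# A 100 percent increase means the power ratio is 2; in decibels that is
# 10 * log10(ratio), which is why Ofcom quotes it as 3 dB.
ratio = 2.0
db = 10 * math.log10(ratio)
print(f"{db:.2f} dB")  # 3.01 dB, conventionally rounded to 3 dB
```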

Such regulation by function has been a staple of licensing in many countries, including the United States, which explains why Clearwire had to ask for Federal Communications Commission permission to use its mobile satellite spectrum to support a terrestrial LTE mobile network.

The changes would allow all the major mobile operators to make business decisions about whether to transition their current networks to LTE based solely on business considerations.

The decision also means that, should one of the four leading U.K. service providers fail to win LTE spectrum in the current auctions, that firm would still be able to offer LTE using its existing spectrum.

While not as easy as deploying a network using brand-new spectrum, the change in Ofcom rules would protect any one of the four major firms from being shut out of the LTE business.

$50 Smart Phones for Emerging Markets in 2013, Gartner Predicts

In 2013, the first $50 smart phone will appear in emerging markets, Gartner predicts, led by devices produced by China-based firms.

"Semiconductor vendors that serve the mobile handset market must have a product strategy to address the low-cost smartphone platform, with $50 as a target in 2013," said Mark Hung, research director at Gartner.

That will be one of several developments that many who have worked in the communications business might find frankly surprising. 

Most surprising of all, perhaps, is the "solving" of the problem of "giving telephone service to billions of people who never have made a phone call." These days, that is mostly a problem that is solved, or soon to be solved. 

The next problem is related to another problem, namely the issue of how to get computing devices and networks to people who have never used a PC or the Internet. Most believe that mobile broadband is the answer to the access problem. 

Smart phones are becoming one answer to the "affordable devices" issue. In fact, the arrival of the low-cost smart phone parallels the earlier effort to develop low cost PCs for users in emerging markets. 

The new element is the availability of the tablet as a form factor likely to make a big difference in the "low cost PC" market, which has been the object of some attention over the last decade under the "one laptop per child" and "one tablet per child" initiatives.

We probably will be surprised over the next decade or so by the extent to which broadband access and use of the Internet have blossomed, globally.

Virgin Media to Sell to Liberty Global?

Virgin Media is in takeover talks with U.S.-based conglomerate Liberty Global. 

In some ways, assets changing hands in the communications and media business, while interesting for the firms and people involved, do not change the strategic backdrop of the affected markets. In other cases, there are strategic implications. 

This deal would appear to be not so much strategic as tactical. Liberty Global long has invested in European cable TV assets. So Virgin Media would represent a bigger profile in the U.K. market, not out of line with what Liberty Global has invested in previously. 

The irresistible story line, though, is that a successful deal would pit Liberty Global's chairman, John Malone, head-to-head with Rupert Murdoch in the video entertainment business, as has been the case, from time to time, in the past. 

New "National Wi-Fi" Story Doesn't Make Sense

Every now and then, we all run into a story that doesn't make sense. That seems to be the case with the notion that the Federal Communications Commission is about to enable the building of new forms of national Wi-Fi service. It is true that the FCC proposes to set aside some spectrum formerly used by TV broadcasters for unlicensed use.

Such uses have in the past created markets for garage door openers and what we now call "Wi-Fi." But so far as anybody really can tell, the FCC has not called for, or said it would directly  or indirectly fund the construction of networks that use unlicensed spectrum.

It will simply make the frequencies available, and then private interests have to do the investing. Some refer to "white spaces" spectrum as "Super Wi-Fi." It is a catchy phrase. But nobody can yet tell whether that is the right analogy. Wi-Fi, after all, has been used as a local area communications protocol, not a "network access protocol."

And while it would be helpful, in an end user or Internet application provider sense, for new unlicensed spectrum to be made available, it would be more helpful for would-be network access providers to have additional spectrum resources. 

Wi-Fi, in the sense of local distribution, is in the same category of things as the use of Ethernet cables or other methods of forwarding packets inside a home, office or other area. Between the local distribution network and the "Internet" is an access network of some kind. And the bigger business issue is access, not local distribution. 

If "white spaces" could create a big new access channel, that would be big news. If used only for local distribution, indeed, as "Super Wi-Fi," that would probably not be so big a deal. 

Recent stories about Google and France Telecom talking about "terminating access" are other cases in point, where a story just doesn't make sense. 

In fact, the whole issue of Google paying access providers or content owners, both ways of redistributing profit in the Internet ecosystem, is a muddled matter. Given enough business or political pressure (such as threatened regulations), dominant and influential firms sometimes find they must make accommodations they would rather not.

So some would say Google now is "paying France Telecom" for access to Orange customers in Africa, something that would be quite a precedent for Google and any access provider. Others would say Google likewise is paying French content firms for the right to index their content. Google would say otherwise.

But the fact remains that firms sometimes have to bend. Google can rightly say it is not paying for access, only executing peering agreements or interconnection agreements. Google can rightly say it is helping French newspapers retool for a digital age. But France Telecom and French newspapers are going to be getting some revenue, for something, from Google, in ways that allow Google to say it is not paying for termination, or for the right to index content.

As with the case of "white spaces," the actual story is more nuanced than headlines would suggest. 

Dell Encounters a Changing Era, As Did IBM, Microsoft: Will Apple be Next?

There's a good reason for eras of computing, and for the scary fact that no leader in one era has led in the next era. Firms survive the shifts--IBM is the best example so far--but they do not lead in the same way they once did. 

Historically, what it has taken to succeed in each era has required different architectures, has had firms engaging with different customers, or in different ways with customers, and has had different amounts of integration with other parts of life. 

Some would say we have been through four eras, and are entering the fifth of five eras of computing, including mainframes, PCs and the Web, while we now are entering the "Device" era, which will be followed by something Robert Grossman calls the "Data" era.

Others might say we have been through four eras, including mainframes, minicomputers, PCs and now are in an era where cloud or mobile might better characterize the new era. 

The point is that, historically, these eras correspond to business leadership. It is therefore no knock on executive skill that firms such as Dell, HP, IBM and perhaps now even Apple have run into problems when eras change. 

Most technology historians would agree there was a mainframe era of computing, followed by the mini-computer and then PC or client-server era. Most would agree that each era of computing has been led by different companies.

IBM in the mainframe era; Digital Equipment Corp. in the mini-computer era; and Microsoft and Intel in the PC era (or Cisco, if one refers to it as the client-server era) are examples. Apple has been among the brightest names in the current era, however one wishes to describe it. But judging by market valuation, Apple has hit a bit of an air pocket.


But there is no doubt there has been a change over the past decade or so. Where in the late 1990s one might have said EMC, Oracle, Cisco and Sun Microsystems were the four horsemen of the Internet, leading the business, nobody would say that in 2013. 

These days, it is application firms such as Google, Amazon, Facebook, plus Apple, that fit into the typology. 

There has been a trend towards computing pervasiveness, as each era has succeeded the earlier era. Computing used to be in a "glass room." Then it could be done in a closet. With PCs computing moved to the desktop. Now, computing is in a purse or pocket. 

The role of software obviously has become more important over time. But, to this point, computing eras have never been defined by the key applications enabled. Perhaps we will one day see matters differently, but it would be a change to shift from "how" computing is done to "what computing does" to define the eras. 

We all sense that a new era is coming, and that the Internet, mobile devices and applications will be more important. But there is not any agreement on whether we have "arrived" or are still only approaching the new era. 

We certainly are leaving the PC era. That's why former Apple CEO Steve Jobs always insisted the iPad was not a PC. In fact, many would insist that it is the tablet's optimization for content consumption that makes it distinctive. 

We can't yet say that the next era of computing is defined by mobile devices, tablets, the Internet or cloud computing or even the fact that leadership is shifting more in the direction of applications and activities than computing appliances. But all of that hints at the shape of what might be coming. 


If history holds, someday even Google, Apple, Facebook and Amazon will be seen as "former leaders." Despite the success those firms have enjoyed, there is still no precedent for a firm that leads in one era to lead in the next. 

And IBM has shown one way of surviving in an era a former leader cannot dominate. Dell wants to go the same route. But it might be fair to say that "surprise" is one common element when eras start to change. 

Michael Dell, about to execute a deal to take Dell private, said the "rise of tablets had been unexpected."  

"I didn't completely see that coming," he said.

Dell would be in good company. Bill Gates did not "get" the Internet, either. 

Monday, February 4, 2013

Three Breaks Ranks on LTE Pricing

Three UK says it will not charge a premium for customers using its Long Term Evolution 4G network, taking a different retail pricing policy than many other service providers that offer LTE only at higher effective prices. Three UK says all smart phone price plans will include LTE access at no extra charge. 

LTE service will be added to Three’s "Ultrafast" network later in 2013. Unlike some other U.K. mobile operators, it will be available across all existing and new price plans without customers needing to pay a premium fee, Three UK says. 

That's an example of how an upstart contestant in a competitive market can try to disrupt market pricing structures. 

Sunday, February 3, 2013

Low-Cost Apple iPhone is Coming

All protestations by Apple aside, Apple has to develop a lower-cost iPhone if it is to compete with arch-rival Samsung in developing markets, the next big battleground for smart phone suppliers. Apple might continue to deny it is working on such a device.

But analysts at investment firm Detwiler Fenton say Apple is working on a new product for the low end of the market that uses a Qualcomm Snapdragon processor. The device might even deliberately feature less robust graphics and video support, or other features standard on today's iPhones, to maintain distinctiveness from the rest of the iPhone line. 

100 Mbps Access Will be Common by 2020. Ubiquitous 1-Gbps Access Might Take 10 Years

Policymakers, policy advocates and many bandwidth-dependent interests are calling for either 100-Mbps or 1-Gbps Internet access as a “standard” U.S. reality by 2020 or so. Some will doubt that is feasible. As daunting as that objective sounds, history suggests the goal is achievable.

In fact, some relatively standard forecasting techniques suggest the 100 Mbps target is inevitable. Perhaps the only question is when the 1-Gbps speeds might be common.

Give it a decade. In 2002, it is hard to remember, only about 10 percent of U.S. households were buying broadband service. A decade later, virtually all Internet-using households were buying broadband access service.

Researchers at Technology Futures continue to suggest that 100 Mbps will be a common access speed for U.S. households by 2020, for example.

In 2009, Technology Futures predicted that, in 2015, about 20 percent of U.S. households would be buying access at 100 Mbps, about 20 percent at 50 Mbps, and something more than 20 percent would be buying service at about 24 Mbps.

That might have seemed a bold forecast back in 2009, but Technology Futures uses a rather common method of technology forecasting that has proven useful. In fact, Technology Futures has been relatively accurate about access speeds for a couple of decades, at least.

The 2009 forecast by Technology Futures furthermore seems to be a reasonable approximation of reality. Technology Futures had expected that roughly 20 percent of U.S. households would be buying 1.5 Mbps service by about 2010, another 20 percent would be buying 24 Mbps service, while 40 percent of U.S. households would be buying 6 Mbps service.

The Technology Futures estimates of 2009 seem to match other data reasonably well. An Akamai study suggested that typical U.S. access speeds were about 4 Mbps, on average, in 2010.

Separate tests by Ookla, cited by the Federal Communications Commission, show widely varying speeds in different cities, but running generally from 8 Mbps to 12 Mbps in 2010.

Recall the Technology Futures forecast that 40 percent of U.S. households would be buying services of about 6 Mbps, with 20 percent buying 24 Mbps and 20 percent buying services of about 1.5 Mbps. Average them all together and you wind up somewhere between 6 Mbps and 12 Mbps.
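"Average them all together" can be made concrete as a weighted average of the three tiers just cited (the remaining roughly 20 percent of households are not specified, so this is only a partial average):

```python
# Weighted average of the Technology Futures tiers cited above:
# (share of households, speed in Mbps) for the 80 percent specified.
tiers = [
    (0.40, 6.0),    # 40 percent at about 6 Mbps
    (0.20, 24.0),   # 20 percent at 24 Mbps
    (0.20, 1.5),    # 20 percent at about 1.5 Mbps
]
average_mbps = sum(share * mbps for share, mbps in tiers)
print(f"{average_mbps:.1f} Mbps")  # 7.5 Mbps, inside the 6-12 Mbps range
```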

But the forecast of 100 Mbps by 2020 requires a jump of roughly one order of magnitude from those typical speeds in less than a decade, and roughly two orders of magnitude to reach 1 Gbps.

You can count Netflix CEO Reed Hastings as among those who think the typical U.S. household will be buying quite a lot of access capacity by 2020. The difference is that where Technology Futures believes 100 Mbps would be typical in 2020, Hastings thinks 1 Gbps could be a reality.

Back when modems operated at 56 kbps, Netflix took a look at Moore’s Law and plotted what that would mean for bandwidth, over time.

“We took out our spreadsheets and we figured we’d get 14 megabits per second to the home by 2012, which turns out is about what we will get,” says Reed Hastings, Netflix CEO. “If you drag it out to 2021, we will all have a gigabit to the home.”

The difference between the Netflix expectation and that of Technology Futures probably can be accounted for by the fact that Moore’s Law applies to only a relatively small amount of access network cost. Physical costs other than semiconductors account for nearly all access network capital investment and operating cost, and none of those other cost elements actually follow Moore’s Law.
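The Netflix back-of-the-envelope can be sketched by assuming bandwidth doubles every 18 months, one common reading of Moore’s Law; the 18-month doubling period is an assumption here, not something stated in the quote above:

```python
# Extrapolate home bandwidth from the ~14 Mbps Netflix expected for 2012,
# assuming a Moore's Law-style doubling every 18 months.
mbps_2012 = 14.0
doublings = (2021 - 2012) / 1.5       # six doublings in nine years
mbps_2021 = mbps_2012 * 2 ** doublings
print(f"~{mbps_2021 / 1000:.1f} Gbps by 2021")  # ~0.9 Gbps, roughly a gigabit
```

The slower Technology Futures curve reflects the point made above: most access-network costs do not follow Moore’s Law.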

The point is that, whether government policies and incentives are in place or not, it is highly likely that much faster typical access speeds will be widely available by 2020 or 2025.

With most things broadband, a decade is plenty of time to bring surprising speed increases into common and typical use.

PC Won't be So "Personal" in Future

How people use PCs at home is changing, with most likely to shift to a "shared" device model, with the personal devices becoming the smart phone and tablet, one might suggest. As once was the case with additional access lines in the home being purchased for teenagers, fax machines or dial-up Internet access, a shift in demand might be occurring.

The change is that although most homes will keep a PC for content creation, on a shared basis, spending that once went for "personal" PCs might be shifting to tablets. That means the replacement PC market will shrink. 

“Tablets have dramatically changed the device landscape for PCs, not so much by ‘cannibalizing’ PC sales, but by causing PC users to shift consumption to tablets rather than replacing older PCs,” says Mikako Kitagawa, Gartner principal analyst.

That implies a market where most people will use "personal" tablets as the primary Internet appliance, while the shared PC gets used when people have to create content. That might also imply that the replacement PC market will shrink, as PCs will be retired and replaced by tablets over time, with only one PC in a home upgraded over time as the shared content creation device.

So PCs will tend to become less "personal," becoming a shared use device, more like a TV screen or microwave oven, in that sense. 

Saturday, February 2, 2013

Google, French Publishers Compromise on "Link Taxes"

Google will create a €60 million Digital Publishing Innovation Fund in France that is a compromise designed to avoid payment of "link taxes" to French publishers. The new fund will avoid setting a precedent whereby Google pays content owners to index their content. 

On the other hand, the deal funnels resources to French publishers. As part of the deal, Google also says it will work with French publishers to increase their online revenues using Google's advertising technology.

The compromise avoids putting Google in a position where it directly is paying content owners to index their content. On the other hand, French publishers will be compensated in other ways, including by potentially higher advertising revenues. 

The deal is significant because it shows the growing number of ways that Google has to adapt to growing regulatory oversight and commercial pressures by ecosystem partners that think application providers are building big businesses without adequate compensation to content developers or access providers. 

The compromise probably is a direction that will happen more often in the future, as ecosystem revenues are essentially transferred from Google to other partners, but in indirect ways that do not force Google to directly pay for either terminating access or copyright fees. 

94% of U.S. Homes Can Buy Mobile Broadband at 3 Mbps Speeds

Some 93.9 percent of U.S. homes can buy mobile Internet access at 3 Mbps or faster, compared to about 93 percent that can buy fixed network service at that speed, an NTIA analysis suggests. The latest NTIA analysis will be updated in another six months, and the NTIA says it still wants feedback on the accuracy of the maps supporting the data.

Some 34 percent of homes have access to fixed wireless networks offering service at 3 Mbps. When considering that figure, keep in mind that fixed wireless networks do not operate as ubiquitously as DSL and cable modem networks do. The NTIA data only suggests that about a third of U.S. households can buy service at 3 Mbps from a fixed wireless provider.

That scenario does not change for speeds of at least 6 Mbps. As you would guess, fixed networks using optical fiber or cable modems have broad coverage at 6 Mbps or higher speeds. Some 86 percent of locations can buy cable high speed access at 6 Mbps or faster.

About 64 percent of digital subscriber line locations are able to get 6 Mbps service. About 78.6 percent of locations have access to mobile broadband of at least 6 Mbps.

Availability begins to diverge more at speeds of 25 Mbps. Only about 7.7 percent of U.S. homes have access to DSL at that rate. But 75.5 percent of homes can buy cable modem service operating at 25 Mbps.

About 4.7 percent of homes can buy fixed wireless service at 25 Mbps.

Friday, February 1, 2013

Tablets Generating More "Mobile Shopping" than Smart Phones

New research suggests some 22 percent of U.S. tablet-owning consumers spend $50 or more per month and nine percent spend $100 or more. That is much higher than spending levels by smart phone owners, ABI Research says. “Tablets are quickly becoming the go-to transaction screen within the home,” says ABI Research mobile devices senior practice director Jeff Orr.

Some will argue that “tablet commerce” really is not “mobile commerce,” a point well taken, as most tablets are used when people are not actually “mobile” but inside their homes or offices. On the other hand, perhaps a majority of mobile device usage likewise occurs when people are inside their homes or offices, so the definitions are a bit fuzzy.

The larger and notable point is that mobile and untethered devices are becoming a bigger factor in consumer “buying” and “shopping,” a fact that explains the huge interest on the part of application providers in mobile advertising, mobile promotion and mobile commerce.

Virtually nobody would argue that tablet commerce or mobile commerce has seriously affected retail stores. But few might be willing to argue that this always will be the case.

Logistics, such as price checking, using a coupon and location-based searches, consistently rank as the most common shopping activities, performed by more than 50 percent of tablet shoppers in the previous 90 days, ABI Research has found.

At the close of 2012, ABI Research estimates, nearly 200 million tablets will have shipped worldwide since 2009, and an additional one billion tablets are forecast to ship over the next five years. That growing installed base of users is certain to lead to higher commerce volume.

Mobile commerce already represents double-digit billions of dollars in transaction volume.

According to comScore e-commerce research, 10 percent of online retail dollars spent in the third quarter of 2012 were spent from users on mobile devices.

That might grow to 12 percent to 13 percent during the fourth quarter of 2012.

Make no mistake, neither “mobile shopping” nor “tablet shopping” is an especially large transaction category right now, compared either to total retail shopping or even online shopping.

But most observers think mobile is destined to become much bigger, for obvious reasons, among them prosaic issues such as the generally more difficult display advertising business on small screen devices.

That suggests commerce might be a bigger fit for mobile devices. 


Thursday, January 31, 2013

Spectrum Policy Innovations are Coming

If AT&T, Verizon and T-Mobile USA are actively working to explore how to share spectrum now used by the U.S. Department of Defense, that is a signal that the carriers believe there is a serious chance spectrum sharing could happen, even if carriers typically prefer to use only licensed spectrum.

The immediate focus is a proposed sharing of 95 MHz of spectrum currently used by DoD and other federal agencies, in the 1755 to 1850 MHz spectrum band.

Spectrum sharing, releasing more unlicensed spectrum and new spectrum auctions, plus reassignment of frequencies originally awarded for mobile satellite service are key ways regulators now are trying to make more spectrum available as a way of promoting mobile and wireless competition and innovation.


Since their introduction in 1994, the United States has conducted more than 70 spectrum auctions to assign thousands of wireless licenses.

But regulators also are working to increase the amount and ease of using unlicensed spectrum as well. The "white spaces" spectrum, and a new proposed sharing of 5-GHz spectrum are examples of some of the ways additional spectrum could be made available to existing and new service providers.

If three of the four largest U.S. mobile service providers are working in public on spectrum sharing in the 1755 MHz to 1850 MHz spectrum, it indicates they believe the spectrum will be made available.

What is the "Value" of the Fixed Access Network?

Studies of smart phone user behavior confirm what most of us might have concluded, namely that Wi-Fi has become a key access method for smart phone users, and provides the answer to a question some might now be asking about the respective roles of mobile and fixed access networks.

That there are synergies between mobile and fixed networks is incontestable. All forms of access, whether fixed, untethered or mobile, are essentially “tail circuits” that connect users to core networks.

What is harder to determine is precisely where those synergies exist, and how big the synergy might be, when considering the highest value provided by fixed access, as compared to mobile access.

That issue increasingly is important as most people, in virtually all markets, rely on smart phones, potentially raising the issue of mobile substitution for the fixed network. Fast mobile networks using Long Term Evolution also create, in a new way, a chance to substitute mobile access for Internet access that formerly would have made sense only on a fixed network.

In other words, the growing question is “what is the value of the fixed network?”

Support for video entertainment, and consumption of large amounts of bandwidth at low cost, to support multiple users, emerges as perhaps the defining “value” of a fixed access connection. The key issue is that, increasingly, most digital appliances used in the home or at work use Wi-Fi, which is a wireless tail for a fixed network.


Android smart phone users tracked for a year by NPD Connected Intelligence use between half a gigabyte and about one gigabyte a month of mobile network data. Apple iPhone users tend to use a bit more.

Though the data might reflect the smaller number of iPhone users in the sample, consumption tended to run between 0.75 Gbytes a month up to about two gigabytes a month. By December 2012, though, Apple iPhone users were consuming data at about the same rate as Android users.


U.K. Android users send and receive 78 percent of all their data over Wi-Fi networks, according to Nielsen, which tracked the data usage of about 1,500 Android users.

Data collected by Mobidia shows that Wi-Fi usage is close to ubiquitous in developed markets, where more than 90 percent of smart phone users also use Wi-Fi as a means of data connectivity. In Hong Kong and the Netherlands, use of Wi-Fi by smart phone users is over 98 percent.

Is Private Equity "Good" for the Housing Market?

Even many who support allowing market forces to work might question whether private equity involvement in the U.S. housing market “has bee...