Tuesday, February 5, 2013
New "National Wi-Fi" Story Doesn't Make Sense
Every now and then, we all run into a story that doesn't make sense. That seems to be the case with the notion that the Federal Communications Commission is about to enable the building of new forms of national Wi-Fi service. It is true that the FCC proposes to set aside some spectrum formerly used by TV broadcasters for unlicensed use.
Such uses have in the past created markets for garage door openers and what we now call "Wi-Fi." But so far as anybody really can tell, the FCC has not called for, or said it would directly or indirectly fund, the construction of networks that use unlicensed spectrum.
It will simply make the frequencies available, and then private interests have to do the investing. Some refer to "white spaces" spectrum as "Super Wi-Fi." It is a catchy phrase. But nobody can yet tell whether that is the right analogy. Wi-Fi, after all, has been used as a local area communications protocol, not a "network access protocol."
And while it would be helpful, in an end user or Internet application provider sense, for new unlicensed spectrum to be made available, it would be more helpful for would-be network access providers to have additional spectrum resources.
Wi-Fi, in the sense of local distribution, is in the same category of things as the use of Ethernet cables or other methods of forwarding packets inside a home, office or other area. Between the local distribution network and the "Internet" is an access network of some kind. And the bigger business issue is access, not local distribution.
If "white spaces" could create a big new access channel, that would be big news. If used only for local distribution, indeed, as "Super Wi-Fi," that would probably not be so big a deal.
Recent stories about Google and France Telecom talking about "terminating access" are other cases in point, where a story just doesn't make sense.
In fact, that whole issue of Google paying access providers or content owners, both ways of redistributing profit in the Internet ecosystem, are a muddled matter. Given enough business or political pressure (such as threatened regulations), dominant and influential firms sometimes find they must make accommodations they would rather not.
So some would say Google now is "paying France Telecom" for access to Orange customers in Africa, something that would be quite a precedent for Google and any access provider. Others would say Google likewise is paying French content firms for the right to index their content. Google would say otherwise.
But the fact remains that firms sometimes have to bend. Google can rightly say it is not paying for access, only executing peering agreements or interconnection agreements. Google can rightly say it is helping French newspapers retool for a digital age. But France Telecom and French newspapers are going to be getting some revenue, for something, from Google, in ways that allow Google to say it is not paying for termination, or for the right to index content.
As with the case of "white spaces," the actual story is more nuanced than headlines would suggest.
Dell Encounters a Changing Era, As Did IBM, Microsoft: Will Apple be Next?
There is a good reason computing has come in eras, and a scary fact: no leader in one era has led in the next. Firms survive the shifts--IBM is the best example so far--but they do not lead in the same way they once did.
Historically, what it has taken to succeed in each era has required different architectures, has had firms engaging with different customers, or in different ways with customers, and has had different amounts of integration with other parts of life.
Some would say we have been through four eras, and are entering the fifth of five eras of computing, the earlier eras including mainframes, PCs and the Web; we now are entering the "Device" era, which will be followed by what Robert Grossman calls the "Data" era.
Others might say we have been through four eras, including mainframes, minicomputers, PCs and now are in an era where cloud or mobile might better characterize the new era.
The point is that, historically, these eras correspond to business leadership. It is therefore no knock on executive skill that firms such as Dell, HP, IBM and perhaps now even Apple have run into problems when eras change.
Most technology historians would agree there was a mainframe era of computing, followed by the mini-computer and then PC or client-server era. Most would agree that each era of computing has been led by different companies.
IBM in the mainframe era, Digital Equipment Corp. in the mini-computer era, and Microsoft and Intel in the PC era (or Cisco, if one prefers to call it the client-server era) are examples. Apple has been among the brightest names in the current era, however one wishes to describe it. But judging by market valuation, Apple has hit a bit of an air pocket.
But there is no doubt there has been a change over the past decade or so. Where in the late 1990s one might have said EMC, Oracle, Cisco and Sun Microsystems were the four horsemen of the Internet, leading the business, nobody would say that in 2013.
These days, it is application firms such as Google, Amazon, Facebook, plus Apple, that fit into the typology.
There has been a trend towards computing pervasiveness, as each era has succeeded the earlier era. Computing used to be in a "glass room." Then it could be done in a closet. With PCs computing moved to the desktop. Now, computing is in a purse or pocket.
The role of software obviously has become more important over time. But, to this point, computing eras have never been defined by the key applications enabled. Perhaps we will one day see matters differently, but it would be a change to shift from "how" computing is done to "what computing does" to define the eras.
We all sense that a new era is coming, and that the Internet, mobile devices and applications will be more important. But there is not any agreement on whether we have "arrived" or are still only approaching the new era.
We certainly are leaving the PC era. That's why former Apple CEO Steve Jobs always insisted the iPad was not a PC. In fact, many would insist that it is the tablet's optimization for content consumption that makes it distinctive.
We can't yet say that the next era of computing is defined by mobile devices, tablets, the Internet or cloud computing or even the fact that leadership is shifting more in the direction of applications and activities than computing appliances. But all of that hints at the shape of what might be coming.
If history holds, someday even Google, Apple, Facebook and Amazon will be seen as "former leaders." Despite the success those firms have enjoyed, there is still no precedent for a firm that leads in one era to lead in the next.
Michael Dell, about to execute a deal to take Dell private, said the "rise of tablets had been unexpected."
"I didn't completely see that coming," he said.
Dell would be in good company. Bill Gates did not "get" the Internet, either.
And IBM has shown one way of surviving in an era a former leader cannot dominate. Dell wants to go the same route. But it might be fair to say that "surprise" is one common element when eras start to change.
Monday, February 4, 2013
Three Breaks Ranks on LTE Pricing
Three UK says it will not charge a premium for customers using its Long Term Evolution 4G network, taking a different retail pricing policy than many other service providers that offer LTE only at higher effective prices. Three UK says all smart phone price plans will include LTE access at no extra charge.
LTE service will be added to Three’s "Ultrafast" network later in 2013. Unlike some other U.K. mobile operators, it will be available across all existing and new price plans without customers needing to pay a premium fee, Three UK says.
That's an example of how an upstart contestant in a competitive market can try to disrupt market pricing structures.
Sunday, February 3, 2013
Low-Cost Apple iPhone is Coming
All protestations aside, Apple has to develop a lower-cost iPhone if it is to compete with arch-rival Samsung in developing markets, the next big battleground for smart phone suppliers. Apple might continue to deny it is working on such a device.
But analysts at investment firm Detwiler Fenton say Apple is working on a new product for the low end of the market that uses a Qualcomm Snapdragon processor. The device might even deliberately feature less robust graphics and video support, or other features standard on today's iPhones, to maintain distinctiveness from the rest of the iPhone line.
100 Mbps Access Will be Common by 2020. Ubiquitous 1-Gbps Access Might Take 10 Years
Policymakers, policy advocates and many bandwidth-dependent interests are calling for either 100-Mbps or 1-Gbps Internet access as a “standard” U.S. reality by 2020 or so. Some will doubt that is feasible. As daunting as that objective sounds, history suggests the goal is achievable.
In fact, some relatively standard forecasting techniques suggest the 100 Mbps target is inevitable. Perhaps the only question is when the 1-Gbps speeds might be common.
Give it a decade. It is hard to remember, but in 2002 only about 10 percent of U.S. households were buying broadband service. A decade later, virtually all Internet-using households were buying broadband access service.
Researchers at Technology Futures continue to suggest that 100 Mbps will be a common access speed for U.S. households by 2020, for example.
In 2009, Technology Futures predicted that, in 2015, about 20 percent of U.S. households would be buying access at 100 Mbps, about 20 percent at 50 Mbps, and something more than 20 percent would be buying service at about 24 Mbps.
That might have seemed a bold forecast back in 2009, but Technology Futures uses a rather common method of technology forecasting that has proven useful. In fact, Technology Futures has been relatively accurate about access speeds for a couple of decades, at least.
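To illustrate the kind of method involved, the sketch below shows a simple logistic substitution curve of the sort often used in such forecasts. It is a minimal illustration only; the midpoint years and steepness are assumptions chosen for the example, not Technology Futures' fitted parameters.

```python
import math

def tier_share(year, midpoint_year, steepness=0.6):
    """Share of households buying at least a given speed tier, assuming
    adoption follows a simple logistic substitution curve. The parameters
    here are illustrative assumptions, not fitted values."""
    return 1.0 / (1.0 + math.exp(-steepness * (year - midpoint_year)))

# Hypothetical midpoints: the year each tier reaches 50 percent of households.
tier_midpoints = {
    "at least 24 Mbps": 2012,
    "at least 50 Mbps": 2016,
    "at least 100 Mbps": 2020,
}

for year in (2010, 2015, 2020):
    summary = ", ".join(
        f"{tier}: {tier_share(year, midpoint):.0%}"
        for tier, midpoint in tier_midpoints.items()
    )
    print(year, "->", summary)
```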
The 2009 forecast by Technology Futures furthermore seems to be a reasonable approximation of reality. Technology Futures had expected that roughly 20 percent of U.S. households would be buying 1.5 Mbps service by about 2010, another 20 percent would be buying 24 Mbps service, while 40 percent of U.S. households would be buying 6 Mbps service.
The Technology Futures estimates of 2009 seem to match other data reasonably well. An Akamai study suggested that typical U.S. access speeds were about 4 Mbps, on average, in 2010.
Separate tests by Ookla, cited by the Federal Communications Commission, show widely varying speeds in different cities, but running generally from 8 Mbps to 12 Mbps in 2010.
Recall the Technology Futures forecast that 40 percent of U.S. households would be buying services of about 6 Mbps, with 20 percent buying 24 Mbps and 20 percent buying services of about 1.5 Mbps. Average them all together and you wind up somewhere between 6 Mbps and 12 Mbps.
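As a rough check of that arithmetic, the snippet below computes the weighted average over just the three tiers Technology Futures named; the remaining households are ignored, which is why this is only an approximation.

```python
# Technology Futures' rough circa-2010 distribution, per the forecast cited above.
# The shares do not sum to 1; households outside these tiers are ignored here.
tiers_mbps = {1.5: 0.20, 6.0: 0.40, 24.0: 0.20}

weighted_sum = sum(speed * share for speed, share in tiers_mbps.items())
covered_share = sum(tiers_mbps.values())

print(f"Average over covered households: {weighted_sum / covered_share:.1f} Mbps")
# Prints roughly 9.4 Mbps, consistent with the 6 Mbps to 12 Mbps range above.
```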
But the forecast of 100 Mbps by 2020 requires roughly an order of magnitude increase over those averages in less than a decade, and about two orders of magnitude to reach 1 Gbps.
You can count Netflix CEO Reed Hastings as among those who think the typical U.S. household will be buying quite a lot of access capacity by 2020. The difference is that where Technology Futures believes 100 Mbps would be typical in 2020, Hastings thinks 1 Gbps could be a reality.
Back when modems operated at 56 kbps, Netflix took a look at Moore’s Law and plotted what that would mean for bandwidth, over time.
“We took out our spreadsheets and we figured we’d get 14 megabits per second to the home by 2012, which turns out is about what we will get,” says Reed Hastings, Netflix CEO. “If you drag it out to 2021, we will all have a gigabit to the home.”
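That spreadsheet exercise is easy to approximate. The sketch below is only an illustration under assumed parameters (a 56 kbps starting point around 1999 and bandwidth doubling roughly every 18 months); it is not Netflix's actual model.

```python
def projected_kbps(year, base_year=1999, base_kbps=56.0, doubling_years=1.5):
    """Extrapolate access bandwidth assuming it doubles every `doubling_years`.
    The base year, starting speed and doubling period are illustrative
    assumptions, not figures taken from Netflix."""
    doublings = (year - base_year) / doubling_years
    return base_kbps * 2 ** doublings

for year in (2012, 2021):
    print(f"{year}: roughly {projected_kbps(year) / 1000:,.0f} Mbps")

# Yields figures in the low tens of Mbps for 2012 and on the order of a
# gigabit for 2021, broadly in line with the numbers Hastings cites.
```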
The difference between the Netflix expectation and that of Technology Futures probably can be accounted for by the fact that Moore’s Law applies to only a relatively small amount of access network cost. Physical costs other than semiconductors account for nearly all access network capital investment and operating cost, and none of those other cost elements actually follow Moore’s Law.
The point is that, whether government policies and incentives are in place or not, it is highly likely that 100-Mbps access will be relatively widely available by 2020 or 2025.
With most things broadband, a decade is plenty of time to bring surprising speed increases into common and typical use.
PC Won't be So "Personal" in Future
How people use PCs at home is changing: most households likely will shift to a "shared" device model, with the smart phone and tablet becoming the truly personal devices. As once was the case with additional access lines purchased for teenagers, fax machines or dial-up Internet access, a shift in demand might be occurring.
The change is that although most homes will keep a PC for content creation, on a shared basis, spending that once went for "personal" PCs might be shifting to tablets. That means the replacement PC market will shrink.
“Tablets have dramatically changed the device landscape for PCs, not so much by ‘cannibalizing’ PC sales, but by causing PC users to shift consumption to tablets rather than replacing older PCs,” says Mikako Kitagawa, Gartner principal analyst.
That implies a market where most people will use "personal" tablets as the primary Internet appliance, while the shared PC gets used when people have to create content. That might also imply that the replacement PC market will shrink, as PCs will be retired and replaced by tablets over time, with only one PC in a home upgraded over time as the shared content creation device.
So PCs will tend to become less "personal," becoming a shared use device, more like a TV screen or microwave oven, in that sense.
Saturday, February 2, 2013
Google, French Publishers Compromise on "Link Taxes"
Google will create a €60 million Digital Publishing Innovation Fund in France, a compromise designed to avoid paying "link taxes" to French publishers. The new fund will avoid setting a precedent whereby Google pays content owners to index their content.
On the other hand, the deal funnels resources to French publishers. As part of the deal, Google also says it will work with French publishers to increase their online revenues using Google's advertising technology.
The compromise avoids putting Google in a position where it directly is paying content owners to index their content. On the other hand, French publishers will be compensated in other ways, including by potentially higher advertising revenues.
The deal is significant because it shows the growing number of ways that Google has to adapt to growing regulatory oversight and commercial pressures by ecosystem partners that think application providers are building big businesses without adequate compensation to content developers or access providers.
The compromise probably is a direction that will happen more often in the future, as ecosystem revenues are essentially transferred from Google to other partners, but in indirect ways that do not force Google to directly pay for either terminating access or copyright fees.
94% of U.S. Homes Can Buy Mobile Broadband at 3 Mbps Speeds
Some 93.9 percent of mobile Internet access subscribers in the United States have access at 3 Mbps or faster, compared to about 93 percent of fixed network subscribers, an NTIA analysis suggests. The latest NTIA analysis will be updated in another six months, and the NTIA says it still wants feedback on the accuracy of the maps supporting the data.
Some 34 percent of homes have access to fixed wireless networks offering access at 3 Mbps. When considering that figure, keep in mind that fixed wireless does not operate as ubiquitously as do DSL and cable modem networks. The NTIA data only suggests that about a third of U.S. households can buy service at 3 Mbps from a fixed wireless provider.
That scenario does not change for speeds of at least 6 Mbps. As you would guess, fixed networks using optical fiber or cable modems have broad coverage at 6 Mbps or higher speeds. Some 86 percent of locations can buy cable high speed access at 6 Mbps or faster.
About 64 percent of digital subscriber line locations are able to get 6 Mbps service. About 78.6 percent of locations have access to mobile broadband of at least 6 Mbps.
Availability begins to diverge more at speeds of 25 Mbps. Only about 7.7 percent of U.S. homes have access to DSL at that rate. But 75.5 percent of homes can buy cable modem service operating at 25 Mbps.
About 4.7 percent of homes can buy fixed wireless service at 25 Mbps.
Friday, February 1, 2013
Tablets Generating More "Mobile Shopping" than Smart Phones
New research suggests some 22 percent of U.S. tablet-owning consumers spend $50 or more per month and nine percent spend $100 or more. That is much higher than spending levels by smart phone owners, ABI Research says. “Tablets are quickly becoming the go-to transaction screen within the home,” says ABI Research mobile devices senior practice director Jeff Orr.
Some will argue that “tablet commerce” really is not “mobile commerce,” a point well taken, as most tablets are used when people are not actually “mobile” but inside their homes or offices. On the other hand, perhaps a majority of mobile device usage likewise occurs when people are inside their homes or offices, so the definitions are a bit fuzzy.
The larger and notable point is that mobile and untethered devices are becoming a bigger factor in consumer “buying” and “shopping,” a fact that explains the huge interest on the part of application providers in mobile advertising, mobile promotion and mobile commerce.
Virtually nobody would argue that tablet commerce or mobile commerce has seriously affected retail stores. But few might be willing to argue that this always will be the case.
Logistics activities, such as price checking, using a coupon and location-based searches, consistently rank as the most common activities performed while shopping by more than 50 percent of tablet shoppers in the previous 90 days, ABI Research has found.
At the close of 2012, ABI Research estimates, nearly 200 million tablets will have shipped worldwide since 2009, and an additional one billion tablets are forecast to ship over the next five years. That growing installed base of users is certain to lead to higher commerce volume.
Mobile commerce already represents double digit billions worth of transaction volume.
According to comScore e-commerce research, 10 percent of online retail dollars spent in the third quarter of 2012 were spent from users on mobile devices.
That might grow to 12 percent to 13 percent during the fourth quarter of 2012.
Make no mistake, neither “mobile shopping” nor “tablet shopping” is an especially large transaction category right now, compared either to total retail shopping or even online shopping.
But most observers think mobile is destined to become much bigger, for obvious reasons, among them prosaic issues such as the generally more difficult display advertising business on small screen devices.
That suggests commerce might be a bigger fit for mobile devices.
Thursday, January 31, 2013
Spectrum Policy Innovations are Coming
If AT&T, Verizon and T-Mobile USA are actively working to explore how to share spectrum now used by the U.S. Department of Defense, that is a signal that the carriers believe there is a serious chance spectrum sharing could happen, even if the carriers typically prefer to use only licensed spectrum.
The immediate focus is a proposed sharing of 95 MHz of spectrum currently used by DoD and other federal agencies, in the 1755 to 1850 MHz spectrum band.
Spectrum sharing, releasing more unlicensed spectrum and new spectrum auctions, plus reassignment of frequencies originally awarded for mobile satellite service are key ways regulators now are trying to make more spectrum available as a way of promoting mobile and wireless competition and innovation.
Since spectrum auctions were introduced in 1994, the United States has conducted more than 70 of them, assigning thousands of wireless licenses.
But regulators also are working to increase the amount and ease of using unlicensed spectrum as well. The "white spaces" spectrum, and a new proposed sharing of 5-GHz spectrum are examples of some of the ways additional spectrum could be made available to existing and new service providers.
If three of the four largest U.S. mobile service providers are working in public on spectrum sharing in the 1755 MHz to 1850 MHz spectrum, it indicates they believe the spectrum will be made available.
What is the "Value" of the Fixed Access Network?
Studies of smart phone user behavior confirm what most of us might have concluded, namely that Wi-Fi has become a key access method for smart phone users, and provide an answer to a question some might now be asking about the respective roles of mobile and fixed access networks.
That there are synergies between mobile and fixed networks is incontestable. All forms of access, whether fixed, untethered or mobile, are essentially “tail circuits” that connect users to core networks.
What is harder to determine is precisely where those synergies exist, and how big the synergy might be, when considering the highest value provided by fixed access, as compared to mobile access.
That issue increasingly is important as most people, in virtually all markets, rely on smart phones, potentially raising the issue of mobile substitution for the fixed network, and as fast mobile networks using Long Term Evolution create, in a new way, a chance to substitute mobile networks for Internet access that formerly would really have made sense only on a fixed network.
In other words, the growing question is “what is the value of the fixed network.”
Support for video entertainment, and consumption of large amounts of bandwidth at low cost, to support multiple users, emerges as perhaps the defining “value” of a fixed access connection. The key issue is that, increasingly, most digital appliances used in the home or at work use Wi-Fi, which is a wireless tail for a fixed network.
Android smart phone users tracked for a year by NPD Connected Intelligence use between half a gigabyte and about one gigabyte a month of mobile network data. Apple iPhone users tend to use a bit more.
Though the data might reflect the smaller number of iPhone users in the sample, consumption tended to run between 0.75 gigabytes and about two gigabytes a month. By December 2012, though, Apple iPhone users were consuming data at about the same rate as Android users.
U.K. Android users send and receive 78 percent of all their data over Wi-Fi networks, according to Nielsen, which also tracked the data usage of about 1,500 Android users.
Nielsen’s analysis suggests as much as 78 percent of all data consumed by users travels over a Wi-Fi connection of some sort.
Data collected by Mobidia shows that Wi-Fi usage is close to ubiquitous in developed markets, where more than 90 percent of smart phone users also use Wi-Fi as a means of data connectivity. In Hong Kong and the Netherlands, use of Wi-Fi by smart phone users is over 98 percent.
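A back-of-the-envelope calculation shows why those percentages matter for the fixed network. The sketch below assumes a user consuming about 1 GB a month over the mobile network, in line with the NPD figures above, and takes the roughly 78 percent Wi-Fi share at face value; the figures are illustrative only.

```python
# Illustrative assumptions: about 1 GB a month over the mobile network, and
# about 78 percent of all data carried over Wi-Fi, which typically rides a
# fixed access network.
mobile_gb_per_month = 1.0
wifi_share = 0.78

total_gb = mobile_gb_per_month / (1 - wifi_share)  # mobile is the 22 percent remainder
wifi_gb = total_gb * wifi_share

print(f"Implied total consumption: {total_gb:.1f} GB a month")
print(f"Carried over Wi-Fi (and thus, mostly, a fixed network): {wifi_gb:.1f} GB a month")
```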
A Few Tips for Increasing Your Influence at the FCC
Sharon Gillett, former chief of the FCC's Wireline Competition Bureau, talks about
- some of the FCC major activities affecting broadband that communities can participate in and/or influence;
- the typical process for moving from policy ideas to actual programs;
- how to work the public comment period; and
- ways in which communities and small or regional ISPs and telcos may partner to influence FCC policy and programs.
Wednesday, January 30, 2013
Will U.K. Mobile Market Change after LTE Auctions?
It is of course axiomatic that without access to spectrum, no entity can be in the mobile service provider business. That access can be through owned or leased spectrum, but fundamentally, spectrum access is necessary. That naturally raises the question of whether “winning” fourth generation Long Term Evolution spectrum is “necessary” for a firm to be a market leader in mobile services, in the future.
Some might say so. “The importance of this spectrum auction in shaping the future of the U.K. wireless market cannot be understated,” said Daniel Gleeson, mobile analyst at IHS iSuppli. “Access to spectrum is the main barrier to entry for any company looking to build a new wireless network.”
It is true that seven companies are bidding for spectrum: the country’s four existing mobile operators along with three new players. “With only three companies likely to win spectrum, at least one of the United Kingdom’s existing operators is likely to lose out,” said Gleeson.
The four existing players that have entered the auction are EE, O2, Vodafone and Three. The three new entrants are BT, PCCW and MLL Telecom.
Other European spectrum auctions have only seen a maximum of three operators win 800 MHz spectrum. The United Kingdom could follow this pattern, yielding three winners and four losers, IHS iSuppli says.
Among the existing mobile operators, the companies with the most to lose are O2 and Vodafone, which presently do not have 4G spectrum, IHS iSuppli said.
Not securing 800 MHz licenses would be a disaster for O2 or Vodafone, some might argue, even if both firms were to win spectrum at 2.6 GHz. The reason is that 800 MHz is viewed as essential for rural coverage, while the 2.6 GHz spectrum is seen as best suited to urban coverage.
Some might argue that the more likely outcome is that the fourth provider will wind up leasing spectrum from one of the other three providers, so the result might not be catastrophic. Still, owning spectrum arguably is safer than leasing spectrum.
But that analysis assumes the prices paid by the winners are reasonable, in light of the incremental revenue opportunities. Europe’s mobile service providers know well the dangers of overpaying for spectrum, as was the case when the 3G auctions were held.
Operators overpaid for that spectrum, causing years of financial distress that also threatened bankruptcy for a few.
So it is possible the U.K. 4G auctions could rearrange business plans, perhaps in unexpected ways. Depending on the outcome, one or two of the leading four providers in the U.K. mobile market might find themselves more limited in terms of national coverage.
One or more of the “winners” might find themselves in more favorable positions, in terms of quality and quantity of spectrum. The auction, by itself, will not immediately change the market share situation. But it could begin a process that does change the market.
DT Delays joyn Launch
Deutsche Telekom apparently has delayed its launch of the “joyn” messaging service. Joyn originally was scheduled to launch in December 2012, but DT apparently has run into implementation issues.
Joyn, the GSMA-backed effort to create a carrier over-the-top messaging service, will allow DT customers to chat and send files free of charge on all smart phone tariffs, without incurring incremental data usage charges, for all customers who have a calling plan with flat-rate data usage or text messaging plans.
Some have questioned whether joyn really will be able to compete with WhatsApp and other over the top messaging services, but the retail packaging plan DT has chosen is intended to make joyn usage an amenity for users who already are paying what DT considers to be reasonable amounts of money for voice and messaging usage.
Smart phone adoption is driving mobile service provider mobile broadband revenue. But smart phones also are cannibalizing service provider voice and messaging revenue.
In 2012 the increase in smart phone penetration will cause voice and messaging revenue erosion of 3.9 percent in Western Europe and 1.6 percent erosion in Eastern Europe, according to Informa Telecoms & Media.
In fact, every increase of 10 percentage points in smart phone penetration in a given market costs Western European operators a 0.5 percent loss of voice and messaging revenue, according to Informa calculations.
Joyn is a service made possible by the “Rich Communication Suite,” essentially messaging applications built on IP Multimedia Subsystem (IMS) standards.
France Telecom LTE Will Cost More than 3G Service
France Telecom will raise the prices of some of its mobile offers in France when it launches faster fourth-generation Long Term Evolution mobile networks later in 2013, according to Gervais Pellissier, France Telecom CFO.
France Telecom had done so in the U.K. market when it launched LTE services, boosting plan prices by about six to 10 pounds.
France Telecom plans to launch 4G LTE in France in April 2013.