Tuesday, March 30, 2010
Tiered Mobile Broadband Pricing "Inevitable"
Tiered pricing--where higher usage results in higher prices--is inevitable, say analysts at Coda Research Consultancy, driven by growth of U.S. mobile data consumption toward 327TB per month in 2015.
With a compound annual growth rate of 117 percent, tiered pricing for mobile internet access will become unavoidable, the company predicts. Most of that increase will come from video, which is growing at a 138 percent CAGR to reach 224TB per month in 2015. At that point, mobile video will represent two thirds of mobile handset data traffic.
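For readers who want to check the arithmetic, here is a minimal sketch of how a compound-annual-growth projection of this sort works. The 2010 baselines are simply back-calculated from the 2015 figures cited above, and constant growth is assumed, so the intermediate numbers are illustrative rather than Coda's published data.

```python
# Minimal sketch: back out an implied 2010 baseline from the 2015 figures
# cited above, then project year by year at a constant CAGR. The baselines
# are derived for illustration only, not Coda Research Consultancy's data.

def implied_baseline(target, cagr, years):
    """Starting-year value implied by a target value and a CAGR."""
    return target / ((1 + cagr) ** years)

def project(baseline, cagr, years):
    """Year-by-year values at a constant compound annual growth rate."""
    return [baseline * ((1 + cagr) ** y) for y in range(years + 1)]

YEARS = 5  # 2010 through 2015

total_2010 = implied_baseline(327, 1.17, YEARS)   # total traffic, TB/month
video_2010 = implied_baseline(224, 1.38, YEARS)   # video traffic, TB/month

total = project(total_2010, 1.17, YEARS)
video = project(video_2010, 1.38, YEARS)

for year, (t, v) in enumerate(zip(total, video), start=2010):
    print(f"{year}: total ~{t:,.0f} TB/month, video ~{v:,.0f} TB/month "
          f"({v / t:.0%} of total)")
```

The 2015 line of output reproduces the roughly two-thirds video share the forecast implies (224TB of 327TB).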
The key problem, though, is peak demand at some cell sites, as already is the case.
“As carrier networks now stand, network utilization will reach 100 percent in 2012 during peak times," says Steve Smith, Coda Research Consultancy co-founder. That is going to mean actual blocking of access during peak hours, much as users on older fixed networks once experienced occasional "fast busy" signals that indicated no circuits were available for use.
Use of pricing mechanisms will help, as it always does, by allowing consumers to make choices about their consumption. Many object that tiered pricing will face huge opposition from consumers conditioned to "unlimited" usage.
I suspect that will prove wrong. Buckets of usage already have been accepted by consumers who understand they can pay less for lower buckets of use, or more money for higher or unlimited use.
What users manifestly do not like is unpredictability: uncertainty about how high their bills will be at the end of the month. So long as consumers have accurate ways to measure their own usage, and an ability to adjust their plans as needed without penalty, they will adapt easily to buckets of broadband usage.
In fact, consumers may well appreciate being able to decide for themselves whether they want to pay more to get more, or can simply adjust their usage at certain times of day, or at some places, or delay using some applications, in exchange for lower prices.
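To make the idea of usage buckets concrete, here is a small, purely hypothetical sketch of how a bucket plan with overage charges might be billed and compared. The tier names, allowances and prices are invented for illustration and are not any carrier's actual tariffs.

```python
# Hypothetical tiered ("bucket") mobile data plans. All allowances and
# prices are invented for illustration; they are not real carrier tariffs.

PLANS = {
    "small":  {"monthly_fee": 15.00, "bucket_gb": 0.2, "overage_per_gb": 50.00},
    "medium": {"monthly_fee": 25.00, "bucket_gb": 2.0, "overage_per_gb": 10.00},
    "large":  {"monthly_fee": 45.00, "bucket_gb": 5.0, "overage_per_gb": 10.00},
}

def monthly_bill(plan_name, usage_gb):
    """Flat fee for the bucket, plus per-gigabyte overage beyond it."""
    plan = PLANS[plan_name]
    overage_gb = max(0.0, usage_gb - plan["bucket_gb"])
    return plan["monthly_fee"] + overage_gb * plan["overage_per_gb"]

def cheapest_plan(usage_gb):
    """Given expected usage, pick the bucket with the lowest bill."""
    return min(PLANS, key=lambda name: monthly_bill(name, usage_gb))

usage = 1.4  # expected gigabytes per month
print(f"At {usage} GB/month, cheapest plan: {cheapest_plan(usage)}")
for name in PLANS:
    print(f"  {name}: ${monthly_bill(name, usage):.2f}")
```

The point is the one made above: once a consumer can see what each bucket would cost at his or her expected usage, the bill becomes predictable and plan choice becomes a straightforward comparison.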
Mobile video users will grow at about a 34 percent CAGR, to reach 95 million users in the U.S. market in 2015. Use of mobile social networking will grow at a 21 percent CAGR to 2015.
Non-text-messaging-derived data revenues will climb at a 17 percent CAGR, and will comprise 87 percent of all data revenues in 2015, says Coda.
Labels:
buckets,
consumer behavior,
mobile pricing
Gary Kim has been a digital infra analyst and journalist for more than 30 years, covering the business impact of technology, pre- and post-internet. He sees a similar evolution coming with AI. General-purpose technologies do not come along very often, but when they do, they change life, economies and industries.
Monday, March 29, 2010
IBM Likes M2M or "Internet of Things" Potential
IBM, among others, is bullish on the potential for using mobile broadband networks for all sorts of useful things other than Web surfing or voice calls from mobile phones.
Labels:
M2M
Operator App Stores Get More Traction Than You Might Think
Though many observers, including many service provider executives, might be skeptical about the long-term viability of operator-sponsored mobile application stores, a new study by Nielsen suggests consumers are favorably impressed with operator app stores, as compared to handset stores such as the Apple App Store.
Many observers believe device app stores will ultimately gain favor, but a new Nielsen survey finds ongoing loyalty to carrier stores. As of the end of 2009, half of all application users were accessing carrier app stores, according to Nielsen’s new App Playbook service.
That is not to say the Apple App Store has lost any luster in the United States. The relatively new BlackBerry App World was the second most popular app store, in part because of BlackBerry’s industry-leading installed base.
But carrier application stores were not as far behind as some might think. For comparison, about 84 percent of respondents said they were satisfied with the Apple App Store, while 81 percent said they were happy with the Android Market.
Some 59 percent of respondents said they were satisfied with the BlackBerry App World. About 56 percent reported satisfaction with the Windows Marketplace.
Most mobile carrier stores compare favorably with BlackBerry. About 64 percent of respondents were satisfied with the AT&T Application Store, while 65 percent said they were satisfied with the Sprint Application Store.
Some 66 percent said they were happy with the T-Mobile Application Store and 62 percent reported they were satisfied with the Verizon Application Store.
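The reported figures can be lined up directly. The short sketch below simply groups the satisfaction percentages quoted above by store type and averages them, which shows how small the gap between the carrier stores and the BlackBerry and Windows stores really is.

```python
# Satisfaction figures as reported above (percent of respondents satisfied).
handset_stores = {
    "Apple App Store": 84,
    "Android Market": 81,
    "BlackBerry App World": 59,
    "Windows Marketplace": 56,
}
carrier_stores = {
    "AT&T Application Store": 64,
    "Sprint Application Store": 65,
    "T-Mobile Application Store": 66,
    "Verizon Application Store": 62,
}

def average(scores):
    return sum(scores.values()) / len(scores)

print(f"Handset/OS stores average: {average(handset_stores):.0f}%")
print(f"Carrier stores average:    {average(carrier_stores):.0f}%")
print(f"Carrier stores vs. BlackBerry App World: "
      f"{average(carrier_stores) - handset_stores['BlackBerry App World']:+.0f} points")
```

On these numbers, the average carrier store actually scores about five points higher than BlackBerry App World.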
Nielsen’s App Playbook surveys more than 4,000 application downloaders in the United States every six months about their mobile application usage.
more detail
Ofcom Wants Better Consumer Information About Broadband Access Speeds
Ofcom, the U.K. communications regulator, is not happy with the accuracy of information provided to consumers about their real-world broadband access services, and wants to revise the reporting process so that better information is provided.
Mystery shoppers commissioned by Ofcom have found that 15 percent of the time, "potential customers" were not given an estimate of their access line speed, and 42 percent were only given one after prompting the sales agent near the end of the sales process.
The accuracy of the information also varied. In some cases, shoppers were quoted double the line speed that another provider quoted for the same line and technology, and sometimes got different answers over the phone than from the same provider's website. The majority of quoted line speeds also did not match (within 1 Mbps) the speeds given by the BT Wholesale line checker.
There is no question but that "best effort" broadband services are difficult to accurately predict or describe. It is true that users will sometimes experience bursts that correspond with the advertised "up to" speed. Most of the time, actual experienced sustained rates are lower, because of contention ratios and actual end user volume.
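A rough illustration of why sustained rates fall short of the advertised figure: divide the shared capacity by the number of users actually contending for it at the busy hour. The contention ratio and activity factor below are assumptions chosen for illustration, not any ISP's actual provisioning.

```python
# Back-of-envelope: why "best effort" sustained rates fall short of the
# advertised "up to" speed. All inputs are illustrative assumptions.

advertised_mbps = 24.0      # "up to" line rate
contention_ratio = 50       # 50 subscribers sharing each unit of backhaul
active_fraction = 0.2       # share of subscribers busy at the peak hour

# Capacity provisioned per subscriber behind the shared link.
provisioned_per_sub = advertised_mbps / contention_ratio

# What an active user can expect at the busy hour, if the shared capacity
# is split evenly among the users actually transferring data at that moment.
busy_hour_mbps = advertised_mbps / (contention_ratio * active_fraction)

print(f"Provisioned per subscriber: {provisioned_per_sub:.2f} Mbps")
print(f"Typical busy-hour throughput: {busy_hour_mbps:.1f} Mbps "
      f"(vs. 'up to' {advertised_mbps:.0f} Mbps)")
```

With those assumptions, a 24 Mbps "up to" line delivers something closer to 2.4 Mbps of sustained busy-hour throughput, which is exactly the sort of gap Ofcom wants explained to shoppers.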
So Ofcom proposes that ISPs provide speed estimates based on line length, line capacitance and line attenuation, measures that should provide a better approximation of typical download speeds.
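As an illustration of what a line-based estimate might look like, the sketch below maps measured downstream attenuation to an approximate speed band using a simple lookup table. The thresholds and speeds are placeholders invented for this example; they are not Ofcom's, BT Wholesale's or any ISP's actual estimation model.

```python
# Illustrative only: a toy estimator mapping DSL line attenuation to an
# approximate downstream speed band. The table values are placeholders,
# not Ofcom, BT Wholesale or any ISP's actual estimation model.

ATTENUATION_BANDS = [
    # (max downstream attenuation in dB, approximate downstream Mbps)
    (20, 16.0),
    (35, 10.0),
    (50, 5.0),
    (65, 2.0),
]

def estimate_downstream_mbps(attenuation_db):
    """Return a rough speed estimate for a given downstream attenuation."""
    for max_db, mbps in ATTENUATION_BANDS:
        if attenuation_db <= max_db:
            return mbps
    return 0.5  # very long lines: little more than a basic rate

for db in (18, 42, 70):
    print(f"{db} dB attenuation -> roughly {estimate_downstream_mbps(db)} Mbps")
```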
Ofcom also wants to ensure that shoppers are given this information early in the sales process, particularly before payment details are taken or a service request is placed.
Ofcom also seeks to ensure that the factors that affect broadband speed are explained. Specifically, Ofcom wants shoppers to be told how network capacity, congestion on the Internet and traffic management policies could affect performance. Consumers also should be told that actual throughput speeds will be lower than advertised or theoretical line speeds.
U.K. consumers already have the right to be moved to a cheaper, lower-speed option when the plan they bought does not measure up. In cases where there is but one tier of service, Ofcom wants to allow consumers to leave their contracts without penalty.
Ofcom apparently will try to get such changes adopted voluntarily. If the changes are not agreed to, or not implemented, a regulatory review may follow, which could lead to formal regulation.
Such policies are not unreasonable consumer protection efforts. The problem is that formal guarantees are next to impossible so long as connections operate on a "best effort" basis.
Even on a quality-assured connection, which would have to be based on packet prioritization policies, throughput will vary throughout the day, based on overall contention for network resources, though far less than is the case on a "best effort" connection.
To closely match expected routine performance with an advertised top speed will require packet prioritization.
source
Labels:
broadband access,
consumer protection,
Ofcom
Mobile Presence-Based Services $6 Billion by 2012
The value of presence-based mobile services will increase to more than $6 billion by 2012, according to Juniper Research. Increasing smartphone penetration in developed markets, coupled with rising global usage of mobile instant messaging will help to drive the trend, says John Levett, Juniper Research analyst.
Juniper thinks the key drivers will include presence-based text message alerts and services; geolocation applications that allow people to collaborate, share location details and take advantage of local knowledge; and social Web applications including social networking, user-generated content, blogs and dating apps.
Up to this point, revenues from presence-based services have been derived almost exclusively from operator-billed mobile IM accounts. That activity faces two contradictory trends, one might argue.
On one hand, mobile IM will tend to fare better as end user adoption of 3G or 4G services increases. Broader adoption of 3G and 4G should therefore lead to heavier use of mobile IM, which should drive higher revenues. Mobile Web applications such as IM work best, and therefore encourage use, on faster data networks.
On the other hand, operator-billed IM revenues often are based on user inability to easily use over-the-top VoIP and IM applications that do not drive operator revenues. Over time, access to such open applications will deprive operators of the ability to profit from captive IM application access.
Juniper believes there is a way to thread the needle, as mobile broadband becomes a standard service for most developed-market customers and as operators move to embrace mobile VoIP in ways that include them in the revenue streams created by some over-the-top providers.
source
Labels:
IM,
Juniper Research,
mobile IM,
presence
Sunday, March 28, 2010
Is Another National LTE Network Needed?
Do businesses and consumers in the United States need one more fourth-generation nationwide wireless network, beyond the existing Clearwire network, the soon-to-be-built Verizon and AT&T networks, and the regional networks being created by regional mobile providers and cable companies, not to mention high-speed 3G networks running at top speeds of 22 Mbps?
Though no firm answer can be given to that question, we might find out relatively soon whether investors think there is a need for another facilities-based 4G network of national coverage.
Harbinger Capital, which recently merged with SkyTerra, proposes to build a fully integrated satellite-terrestrial network to serve North American mobile users, with a national 4G terrestrial network covering 260 million people by the end of 2013.
The planned network would launch before the third quarter of 2011 and cover nine million people, with trials set initially for Denver and Phoenix. Subsequent milestones call for coverage of 100 million people by the end of 2012, 145 million by the end of 2013 and at least 260 million people in the United States by the end of 2015. Harbinger told the FCC that all major markets will be built out by the end of the second quarter of 2013.
The original thinking has been that wireless services within a number of vertical markets that are highly dependent upon the ubiquitous coverage and redundancy to be provided by its satellite network would be the core of the business strategy. But Harbinger might think there is a market broader than that as well.
Harbinger actually is required by the Federal Communications Commission to provide wholesale access to third parties, and also to restrict total Verizon Wireless and AT&T traffic to no more than 25 percent of total, to provide more competition in the market.
The big issue is whether there is substantial need for additional spectrum at this point. One might argue that industry requests, as well as FCC proposals, for allocation of an additional 500 megahertz of spectrum for mobile broadband are clear evidence of need.
But there are other issues of market structure and competition. Assuming hundreds of megahertz of new spectrum can eventually be reallocated, most observers think the buyers of such spectrum would be the largest mobile providers, such as AT&T and Verizon.
The Harbinger network, by definition, would largely be a platform for other providers, as it would operate as a wholesale provider.
The key business issue, though, is whether there actually is sufficient demand for another national 4G terrestrial network. Sprint and Clearwire both have relatively lavish amounts of spectrum already, and both have shown a willingness to sell wholesale capacity.
One might argue the key differentiator would be the satellite roaming features that would be available on handsets that normally default to the terrestrial network. But the bigger test will be of investor sentiment, as Harbinger will have to raise billions to build the new terrestrial network.
The 36,000 base stations that Harbinger plans to use, along with the tower sites, backhaul and other gear associated with a terrestrial network, will require billions of dollars of investment.
Analyst Chris King at Stifel Nicolaus estimates that Verizon’s LTE network will cost about $5 billion to deploy. Clearwire has also spent billions on its network, with analyst estimates ranging from $3 billion to about $6 billion. There is no particular reason to think the ubiquitous terrestrial network Harbinger expects to build would cost less.
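Using the figures cited here, a crude back-of-envelope conveys the scale involved: spreading the quoted network-cost estimates over Harbinger's planned 36,000 base stations implies something on the order of $80,000 to $170,000 per site. The arithmetic below is illustrative only, not an engineering cost model, since those totals describe other operators' networks.

```python
# Crude back-of-envelope using the figures cited above: spread the quoted
# network-cost estimates over Harbinger's planned 36,000 base stations.
# Illustrative arithmetic only, not an engineering cost model.

base_stations = 36_000
network_cost_estimates = {
    "Verizon LTE (Stifel Nicolaus estimate)": 5_000_000_000,
    "Clearwire, low end of analyst range": 3_000_000_000,
    "Clearwire, high end of analyst range": 6_000_000_000,
}

for label, total_cost in network_cost_estimates.items():
    per_site = total_cost / base_stations
    print(f"{label}: ${total_cost / 1e9:.0f} billion total, "
          f"roughly ${per_site:,.0f} per site if spread over "
          f"{base_stations:,} sites")
```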
Investors will have to be found first, before there is a chance to test the thesis that another facilities-based 4G network is needed.
Tesco Abandons VoIP Market
U.K. retailer Tesco, which began selling consumer VoIP service in 2006, now is pulling the plug, though it will continue to sell mobile service. Without reading more into the news than is warranted, the move illustrates that consumer VoIP might be less of an innovation than some had hoped, and certainly is a less robust business than anticipated, especially compared with mobile service, at least for the moment.
That is not to say other competitors, with different assets, cannot fare better. But the April 27, 2010 shutoff at least suggests that the "VoIP" market has not proven to be the lucrative business Tesco once believed it would be, given its ability to support and market the service, as well as the evolution of end user demand, which arguably has tipped in the direction of mobility.
Earlier in the last decade, there was much more apparent optimism that fixed-line VoIP would "change telephony forever," creating significant new opportunities for non-traditional providers.
One might argue that VoIP's primary impact has been to accelerate voice price erosion, without creating a significant new market, though it has been the way cable operators have taken market share from telcos.
Tesco says "trends in technology have moved forward since we launched Internet phone so that this is no longer a sustainable service". One might infer that means mobility now is the "hot" service.
"Tesco Internet Phone" was basically a Skype-style PC offering, though the supermarket did offer a Vonage-style terminal adapter version as well.
That is not to say further innovation in voice services is impossible, or in fact unlikely. There will be advances. The issue is whether the scale, impact and economic importance of such voice innovations is going to approach the advances being made in mobility, broadband, Internet and Web services.
related article
Labels:
business strategy,
consumer demand,
Tesco,
VoIP