Chetan Sharma Consulting forecasts that if left unchecked, the costs of delivering mobile data will likely outstrip incremental revenues by the second half of 2011 in the U.S. market and become unsustainable by 2013.
The rapid growth in mobile data costs has prompted operators to look at more sophisticated network congestion management strategies that fall into four categories: policy control, data traffic offload, infrastructure investment, and network optimization.
Shifting data traffic off a congested mobile network and onto another access technology fundamentally changes the economics of delivering that data. Offload is being implemented by operators globally, including offload to Wi-Fi and offload to femtocells.
Operators deploying a mixed multi-access offload strategy can expect savings in the range of 20 to 25 per cent per year. In the US market, operators will save between $30 and $40 billion per annum by 2013 through an offload strategy alone.
More-efficient new networks will help as well. Infrastructure evolution to 3.5G (HSPA) and 4G (LTE) lowers the cost per bit for data throughput on the network, thereby reducing overall costs.
Chetan Sharma Consulting forecasts that evolving to HSPA and LTE will result in cost savings of just under 20 per cent or almost $25 billion per year in the U.S. market by 2013.
Network optimization, through techniques such as compression and caching, also adds incremental savings by reducing the total number of bits traversing the network. Typically, Sharma reports, operators can generate savings of five to 10 per cent by 2013 through this strategy.
Anecdotally, operators have reported that 80 per cent of the traffic in urban centers is being generated by 10 per cent of the cell sites. So policy control (governing how, when and under which circumstances subscribers can access networks) can contribute annual cost savings of over 10 per cent, equating to more than $15 billion in annual cost reduction by 2013 in the U.S. market, Chetan Sharma says.
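Taken together, the four levers can be combined in a rough back-of-envelope calculation. The sketch below is illustrative only: the percentages are the article's figures, but treating the levers as independent multiplicative reductions on a common cost base is an assumption for illustration.

```python
# Rough illustration of how the four cost-reduction levers might combine.
# Percentages are drawn from the Chetan Sharma figures above; treating them
# as independent multiplicative reductions is an assumption, not the study's method.

def combined_savings(reductions):
    """Return the overall fractional saving if each reduction applies
    independently to the remaining cost base."""
    remaining = 1.0
    for r in reductions:
        remaining *= (1.0 - r)
    return 1.0 - remaining

levers = {
    "offload": 0.225,        # midpoint of the 20-25% range
    "infrastructure": 0.20,  # HSPA/LTE evolution, just under 20%
    "optimization": 0.075,   # midpoint of the 5-10% range
    "policy": 0.10,          # policy control, over 10%
}

total = combined_savings(levers.values())
print(f"Combined reduction: {total:.1%}")  # roughly 48% of baseline cost
```

Note that multiplicative combination yields less than the simple sum of the individual percentages, since each lever applies to a smaller remaining cost base.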
But cost reduction is only one side of the equation. Tiered and usage-based pricing also is required. Such policies need not be heavy-handed, top-down service provider rules, but rather flexible, dynamic and personalised pricing models that reflect subscribers' preferences and context.
Taken as a whole, all the optimization techniques and new pricing models will be needed as the mobile business changes from a voice revenue model to a "bandwidth-based" business.
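What might flexible, usage-based pricing look like in practice? A minimal sketch, assuming a simple tier table; the tier boundaries and prices here are invented for illustration and not drawn from any operator's actual plan:

```python
# Hypothetical usage-based pricing tiers -- the boundaries and prices
# are invented for illustration, not any real operator's plan.

TIERS = [
    (0.2, 10.00),  # up to 200 MB
    (2.0, 25.00),  # up to 2 GB
    (5.0, 40.00),  # up to 5 GB
]
OVERAGE_PER_GB = 10.00  # charge per GB beyond the top tier

def monthly_charge(usage_gb):
    """Map a month's data usage (in GB) to a charge under the tier table."""
    for cap, price in TIERS:
        if usage_gb <= cap:
            return price
    top_cap, top_price = TIERS[-1]
    return top_price + (usage_gb - top_cap) * OVERAGE_PER_GB

print(monthly_charge(1.5))  # 25.0
print(monthly_charge(7.0))  # 40 + 2 * 10 = 60.0
```

A personalised model would go further, adjusting the tier table per subscriber or by time of day, but the basic mapping from usage to price is the same.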
Tuesday, February 9, 2010
Multiple Tools Needed to Preserve Mobile Bandwidth
Labels: broadband, business model, mobile, mobile broadband, policy
Gary Kim has been a digital infra analyst and journalist for more than 30 years, covering the business impact of technology, pre- and post-internet. He sees a similar evolution coming with AI. General-purpose technologies do not come along very often, but when they do, they change life, economies and industries.
Which Growth Pattern Emerges as Recession Ends?
Many economists and market watchers think consumers eventually will return to pre-recession spending patterns, and to the growth trend of the two decades before the downturn.
Others warn that growth is more likely to revert to the pattern of 1945 through the 1970s, when annual growth in consumer spending was much more restrained.
So the question for many might be, which view is right? For application and service providers, the question might not be as germane. The reason is that consumer spending on network-delivered services and applications was stable over the entire period, and in fact has shown a slow, steady growth.
In other words, people are shifting more of their available entertainment budget to network-based products. Communications spending likewise has slowly grown its percentage of overall discretionary spending, not fluctuating wildly from one year to the next.
Of course, lots of other background factors have changed. There are more products, more applications, more services and providers to choose from.
The value of many products has taken on an increasing "network services" character as well. Consider the value of a PC without Internet access, for example.
The point is that whichever forecast proves correct--either a return to the growth trend of the past two decades, or a reversion to the lower spending growth of 1945 to 1979--network-based products are likely to continue a slow, steady, upward growth trend. That may not be true for other industries.
Monday, February 8, 2010
The "Problem" With Nexus One is the Retail Packaging, Not the Phone
By some accounts, the Google Nexus One phone has not sold as many units as some might have hoped. Flurry, a mobile analytics firm, estimates that 20,000 Nexus Ones were sold in the first week. That tracks poorly compared to the myTouch3G, which sold up to 60,000, and the Motorola Droid, which sold 250,000 in the first week.
Some people really like the idea of "unlocked" phones, despite the full retail price, as the price of gaining freedom to use "any" carrier (in the U.S. market, two of the four major carriers). But so far, most U.S. consumers seem to prefer the old "closed" model, where they get discounts on devices in exchange for contracts.
Beyond that, there is the clumsy customer support process. Users can email Google and get an answer within 48 hours. I don't know about you, but if any service provider took that long to get back to me when I had a problem, it would not be my service provider much longer than that. I could easily find a replacement provider within two days.
But that's the problem with Google's current model: a customer contacts Google, and hopes the problem is not something the carrier (T-Mobile) or HTC (the device manufacturer) has to fix.
That's no slam on the device. But the customer interface is wrong. People are used to buying from one retailer that "owns" the customer service responsibility. And people will not be happy with two fees for early cancellation of a contract--an early termination fee charged by T-Mobile USA and a separate restocking fee levied by Google.
Ignoring the amount of the fee and the logic, that's just going to make people mad. People generally understand the early termination fee. But they don't expect to pay twice.
Unlocked phones have sold better in Europe, but there is a huge difference between the U.S. market and Europe. In Europe, when one buys an unlocked device at full price, it really does work on all networks. In the United States, Verizon and Sprint use the CDMA air interface while AT&T and T-Mobile use the GSM air interface.
So an unlocked phone will only work on half of those networks. Under such conditions, the value of an unlocked phone is dramatically reduced. But most consumers don't really care about air interface or "locking."
They are used to a retail relationship where they know who owns the product and process. And there still is not much evidence to indicate the value of an unlocked, full retail device is more important than the comfort of knowing who is responsible when something doesn't work properly.
Despite the generally-accepted wisdom that "open" ecosystems innovate faster (which is true), that doesn't mean customer experience is better. As Apple has shown time and again, a closed, tightly-integrated approach can produce a much-better experience and lots of innovation at the same time.
So far, it doesn't appear the unlocked Nexus One model is doing that.
Labels: Android, customer experience, Google, marketing, Nexus One, user experience
How PC Usage is Different from Mobile
To state the obvious, users behave differently on their mobile devices than they do on their PCs, which ought to have implications for a world where perhaps half to two thirds of all Web and Internet access is from a mobile device.
A study of 16 information workers over a period of time illustrates some of the differences (again, keeping in mind that habits likely continue to evolve).
Mobile service providers, for example, know there is a huge difference between users on PCs and smartphone users.
Namely, PC users consume lots more data. And that is what the study conducted by Microsoft and the University of Washington also noted. The other obvious observation was that phones are used for voice and text messaging. PCs can be used for those applications, but in this study of office workers, that was not the case.
And productivity applications, though important for desktop use, were not the focus on mobiles, where "maps" seem to be more important, as you might expect. Users relied on both devices for email and Web access. Beyond that, the usage profiles were different.
Aside from the sheer difference in volume, understandable given the "on the go" nature of a mobile phone, users did different things on their mobiles. One might hypothesize that mobile device input-output limitations and time constraints (people are on the go) account for much of the difference in behavior. Heavy document or file interactions are not prevalent on mobiles.
That doesn't mean people will stop doing things at their desks that require full PC support. It does suggest that as use of mobiles becomes a bigger driver of Internet usage, the key applications will change. Mobiles are "becoming PCs," but that does not mean they will be used the same way, at all. The Microsoft study simply confirms that fact.
Labels: mobile PC, mobile phone, PC, smartphone
Mobile Broadband Will Need a New Business Model
One way or the other, something has got to change in the mobile business as voice ceases to be the industry revenue driver. Today mobile service providers get 86 percent of their revenue from low-bandwidth applications like voice and text. But that will keep changing in predictable ways.
Namely, most capacity requirements will be driven by low-margin data rather than high-margin voice and text. Over the long term, it is irrational not to price services in closer relationship to cost, attributing more revenue directly to the data services that are driving capital investment.
That doesn't mean every single service or application necessarily has to be priced in relationship to cost. Loss leaders at supermarkets, promotional DVD prices at Target and other promotional pricing happens all the time, in every business. Some products have high margin, others low or even negative margins.
The point is that current retail pricing will get more irrational as data demand grows, and that something will have to be done about it.
Carriers are investing in new capacity, but that alone will not be enough to bring revenue and capacity into balance. By 2013, virtually all traffic load will be driven by broadband data of one sort or another, especially video. That means, over time, new ways of charging for network usage will have to be created.
Like it or not, network management is going to be necessary, plus traffic offload and policy management. The issue, in part, is that demand is unevenly distributed. Even at peak hours of congestion, only a minor percentage of cell sites actually account for most of the congestion. To speak of congestion management at the "whole network" level misses the issue.
The key issue is peak-hour congestion at perhaps 10 percent to 15 percent of sites. Put another way, even at peak congestion, 85 to 90 percent of sites do not experience difficulty. That means it might be necessary to use different policies at a small number of physical sites, not the entire network, even at peak hours.
So even if traffic shaping, bit priority policies and other tools are not generally required at every site, for every application or user, there will be a need to do so at some sites, some of the time.
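The operational takeaway, that only a minority of sites need stricter peak-hour policies, reduces to a simple filter over per-site load data. A sketch with made-up site loads and a hypothetical capacity threshold:

```python
# Identify the minority of cell sites that need peak-hour policy controls.
# Site loads and the capacity threshold are made-up numbers for illustration.

def congested_sites(peak_load_by_site, capacity):
    """Return the IDs of sites whose peak-hour load exceeds capacity,
    sorted with the most congested first."""
    over = {s: load for s, load in peak_load_by_site.items() if load > capacity}
    return sorted(over, key=over.get, reverse=True)

loads = {"site-A": 95, "site-B": 40, "site-C": 120, "site-D": 55,
         "site-E": 160, "site-F": 30, "site-G": 48, "site-H": 62,
         "site-I": 35, "site-J": 44}

hot = congested_sites(loads, capacity=100)
share = len(hot) / len(loads)
print(hot)                        # ['site-E', 'site-C']
print(f"{share:.0%} of sites")    # 20% of sites
```

Only the flagged sites would be candidates for traffic shaping or bit-priority policies at peak hours; the rest of the network is left alone.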
Apple and RIM Are Winners in Handset Market, Profit-Wise
The Apple iPhone might not be the only reason the mobile handset market has changed over the past several years, but it is a major influence, according to a new analysis by analysts at Deutsche Bank.
In 2006, before the iPhone was available, Nokia had nearly half--47 percent--of industry profits. By the end of 2010, it will have 25 percent.
In 2006, Sony Ericsson had 11 percent share. By the end of 2010 it will have a negative one percent operating profit.
Motorola had 18 percent share in 2006 and will have declined to about a negative one percent by the end of 2010.
By the end of 2010 Apple will have an estimated 37 percent share, while Research in Motion, which had four percent share in 2006, will have grown to 16 percent.
Most of the other suppliers will have remained about where they were in 2006, except for Lucky Goldstar, which will have grown from one percent to six percent.
Keep in mind, these figures reflect profits, not handset share.
Labels: Apple, HTC, LGE, Lucky Goldstar, Motorola, Nokia, Palm, RIM, Samsung, Sony Ericsson
Sunday, February 7, 2010
Conferencing Now Part of UC, Study Finds
UC is often thought of as a broad solution set including a unified directory, unified messaging, a single number (find me, follow me), presence awareness and the ability to track all forms of communication, say Josie Sephton and Dale Vile, Freeform Dynamics researchers.
What seems to have changed lately is the increased role conferencing solutions seem to be playing as parts of an integrated UC solution. Among lead adopters, audio conferencing is viewed as a mandatory feature by more than 70 percent of information technology executives surveyed by Freeform Dynamics.
More than 40 percent of all respondents said that audio conferencing is mandatory.
Nearly 20 percent of the most-aggressive UC adopters say video calling is mandatory, while more than 65 percent say that feature is "desirable." So far, fewer than 10 percent of all respondents say video calling is mandatory.
About 25 percent of early UC adopters say video conferencing is a mandatory UC feature, and about 55 percent of early adopters say Web conferencing is a mandatory UC feature.
Instant messaging is seen by more than 80 percent of early adopters as a mandatory feature. Nearly 40 percent of all enterprise IT executives say IM is necessary.