Thursday, February 11, 2010
$1 TV Episodes for iPad?
Apple could begin selling U.S. television shows for $1 on the iPad, half of what it currently charges on its iTunes digital media store, the Financial Times reports. If Apple does so, it would mean at least some U.S. content owners have decided to gamble on offsetting lower retail price points with higher sales volume.
As powerful as "free" might be for many products, $1 likewise has proven to be an enormously successful price point for mobile application store downloads, for example. Redbox DVD rentals also are priced at $1, and that price point has been gaining traction.
Apple has been selling TV episodes for about $2 each on its iTunes store, while high-definition fare that displays well on a TV set sells for $3 an episode.
Video entertainment has been a big part of thinking about what new market the iPad might be able to create, between the smartphone and the notebook or netbook PC.
Apple also has been in discussions with content owners about a "best of TV" subscription service, perhaps offered at about $30 a month, that would create a new niche in the market as well: more than one-off downloads and streaming, but less than the full channel lineup that customers can buy from cable, satellite or telco providers.
The trick, of course, is to create a new niche that does not automatically cannibalize the value of other existing channels. That is likely one reason Apple has not tried to create a subscription TV service for its Apple TV device.
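The price-cut gamble reduces to simple break-even arithmetic: cutting an episode from $2 to $1 means unit sales must at least double just to keep revenue flat, before any per-unit costs are considered. A minimal sketch (prices are from the article; the function name is illustrative):

```python
def breakeven_multiplier(old_price: float, new_price: float) -> float:
    """How many times more units must sell at new_price to match
    revenue at old_price, ignoring per-unit delivery costs."""
    return old_price / new_price

# Standard-definition episode cut from $2 to $1: sales must double
print(breakeven_multiplier(2.00, 1.00))  # 2.0
```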
Labels:
Apple,
iPad,
iTunes,
online video,
Redbox
Gary Kim was cited as a global "Power Mobile Influencer" by Forbes, ranked second in the world for coverage of the mobile business, and as a "top 10" telecom analyst. He is a member of Mensa, the international organization for people with IQs in the top two percent.
Wednesday, February 10, 2010
Send SMS Messages to Multiple Recipients Using Google Voice
One of the differences between email and texting, aside from the SMS character limitation, is sending a single message to multiple recipients. Google Voice now allows such multi-party text messages.
Users just click on the SMS button at the top of their Google Voice inboxes, enter names or numbers (separated by commas) in the "To" field, write messages and click "send."
Replies from each recipient are threaded into separate conversations, so users can keep track of them in their Google Voice inboxes. To prevent spam, Google sets a maximum of five recipients per message.
It's useful.
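Google's five-recipient cap means a longer distribution list has to be split into batches. A hedged sketch of that client-side batching logic (the cap comes from the post; the helper is hypothetical, not a Google Voice API):

```python
MAX_RECIPIENTS = 5  # Google Voice cap per message, per the post

def batch_recipients(recipients: list[str], cap: int = MAX_RECIPIENTS) -> list[list[str]]:
    """Split a recipient list into groups no larger than the cap."""
    return [recipients[i:i + cap] for i in range(0, len(recipients), cap)]

# A 12-person list becomes three messages: two of 5 and one of 2
batches = batch_recipients([f"555-010{n}" for n in range(12)])
print([len(b) for b in batches])  # [5, 5, 2]
```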
Labels:
Google Voice,
SMS,
text messaging
Google to Build Fiber to Home Test Networks
In the latest version of its Internet access demonstration projects, Google says it will build some wholesale fiber-to-the-home networks. James Kelly, Google project manager, says Google is looking for communities interested in becoming trial sites, and will be accepting requests for information until March 26, 2010.
Initial plans call for building FTTH facilities serving 50,000 people, perhaps as many as 500,000. That's a bit indefinite. Assume a typical household has about 2.5 people in it and one could see perhaps 20,000 to 200,000 homes connected.
But there are lots of unanswered questions. It isn't clear whether Google means "a network passing X number of homes" or "a network serving X number of homes." Those are very different sorts of numbers if Google builds anyplace where a strong cable operator and a strong telco provider already are in business.
As a demonstration project, Google would learn as much serving 20,000 potential customers as 200,000. Frankly, there is very little that is unknown about the cost of building a fiber-to-the-home network. What Google might be interested in is the business case for a wholesale broadband network that didn't have to supply voice and traditional video services.
But again, there is very little that is unknown about that business case, either. We know the cost to build the network and operate it as an ISP. You can derive commercial rates by looking at what is charged in local, regional or national markets, multiply by the take rate of homes passed, and you'd have your scenario without digging a single trench.
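That back-of-envelope scenario is easy to run. A sketch using the article's 2.5 people per household, with illustrative values for take rate and wholesale pricing (both hypothetical, not figures Google has published):

```python
def ftth_scenario(people_served: int,
                  people_per_home: float = 2.5,   # assumption from the post
                  take_rate: float = 0.30,        # hypothetical
                  monthly_wholesale: float = 30.0):  # hypothetical $/line
    """Rough annual wholesale revenue for an FTTH build."""
    homes_passed = people_served / people_per_home
    subscribers = homes_passed * take_rate
    annual_revenue = subscribers * monthly_wholesale * 12
    return homes_passed, subscribers, annual_revenue

# Low end of Google's stated range: 50,000 people served
homes, subs, revenue = ftth_scenario(50_000)
print(f"{homes:,.0f} homes passed, {subs:,.0f} subscribers, ${revenue:,.0f}/yr")
```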
Kelly says "we are going to try out new ways to build and operate fiber networks and share what we learn with the world."
There likely is not much to be uncovered about the process of building FTTH, so Google likely means sharing what it learns about the business model for wholesale networks, which a few have experimented with in the U.S. market.
Broadweave, which operates in Provo, Utah, is an example, though it is unclear whether the firm will continue to allow wholesale customers to use its network. The current thinking seems to be to operate as a typical retail triple-play provider, rather than as a wholesale provider of access to third parties who may wish to do so.
Broadweave Networks took over a project originally started as a municipal FTTH network, paying $40.6 million for the assets, and struggled in 2009. The company drastically slowed its growth among residential customers to save on the high cost of new sign-ups in the spring of 2009, reports the Daily Herald.
Broadweave began asking prospective customers to foot part of the $1,000 installation bill, in an effort to discourage new customers and effectively slow growth while the company grappled with ongoing financial concerns.
The company instead decided to focus on commercial accounts and reactivation of former customers who already have optical drops in place.
The flow of new customers reportedly was slowed from hundreds a month to "tens or dozens" a month.
Google might attempt to prove the thesis that an FTTH wholesaler can make money strictly as a supplier of access services to third party partners who simply lease capacity on such networks. The big question always has been whether any such network actually is viable in markets where there are strong cable and telco competitors already in place.
Google previously has built limited municipal Wi-Fi networks as well. It isn't clear what Google believes it might have learned from those experiments.
In truth, the gambit most likely is simply another tool in Google's lobbying over the setting of national broadband policy, and not much more.
Broadband "Overshoot": How Big an Issue?
Fixed network operators, including cable operators and telcos, face two different problems when considering upgrades of their broadband access networks. One danger is not investing enough to keep capacity in line with the level of market demand. The opposite problem is investing too much.
In fact, that is a scenario at least some mobile competitors hope might be the case. "Fixed line broadband will overshoot the performance needs of the market, resulting in increasing data cord cutting as individuals, families, and businesses appreciate the value of mobility more than the value of excess bandwidth," says Russ McGuire, Sprint Nextel VP.
That's not an unexpected view, coming from a wireless-only company building high-capacity wireless networks, but it is worth considering. Mobile broadband might, or might not, be a reasonable substitute for users who really want to watch lots of video on their PCs, smartphones or other mobile devices.
But mobile is likely to be quite a reasonable alternative for users who don't stream or download much video, especially if they are fairly mobile, either locally or over larger distances.
Labels:
broadband,
mobile broadband
Video and Web Drive Mobile Bandwidth Consumption
Mobile bandwidth demand already is driven by video and Web access, a new analysis by Allot shows. And though peer-to-peer applications were the cause of bandwidth fears several years ago, most video activity now occurs using HTTP, meaning it is now part of the Web browser experience.
As is true for backbone networks and fixed networks, voice, instant messaging, email and all other apps besides video and Web applications are a negligible driver of bandwidth consumption.
That doesn't mean revenue reflects bandwidth use. Revenue still is inordinately driven by voice and texting. Over time, that will change. If broadband is what is driving use of the network, then broadband has to become the mainstay of the revenue model as well.
Labels:
business model,
marketing,
mobile broadband
Survey Finds Shockingly Low UC Adoption
Maybe it's just me, but after decades of the industry talking about, and delivering, unified messaging features, and after more than a decade of pushing other features such as unified directories, find me-follow me and other "unified" communications features, it still does not appear that all that many organizations really are using them.
Or so it would appear after a survey of 544 information technology professionals in the United States and United Kingdom by Freeform Dynamics.
The study suggests there currently is what some of us might call "shockingly low" adoption of unified communications. You might have thought otherwise, given the shift to new terminology such as "unified communications and collaboration." That might suggest saturation of UC, and a need for UCC.
The Freeform Dynamics study might indicate something else: perhaps customers are not so enamored of the UC solutions they have been offered. Suppliers can react in a couple of ways. Maybe customers and prospects simply do not understand the value, in which case marketing and education should do the trick.
The other tack is to humbly acknowledge that the solutions we have been offering do not add enough value, do not offer additional value at the right price points, or that there are unarticulated problems we have not addressed.
The Freeform Dynamics study might suggest that the industry has not yet found the "killer app" that makes UCC or UC intuitively valuable to most prospects and buyers.
Labels:
collaboration,
UC,
unified messaging
IT Professionals Don't Think Much of Enterprise Communications, Study Suggests
In a recent survey of 544 information technology professionals, Freeform Dynamics discovered that relatively few U.K. and U.S. IT professionals are satisfied that their communications capabilities are highly efficient and effective.
Except for firms with fewer than 10 employees, less than 20 percent of respondents indicated their communications capabilities were, in fact, very well suited to current business requirements.
You may take that as a good sign that much upside continues to exist for unified and advanced communications that IT professionals believe really help their organizations perform more effectively.
But you might also take it as a sign that the industry, collectively, has done a poor job of creating and delivering on solutions that IT professionals believe are well suited to business requirements. Either way, the Freeform Dynamics study suggests there is much opportunity to provide solutions that actually are perceived to deliver value.
One is tempted to say we haven't done a very good job with unified communications, but it might be worse than that. We might not have done such a great job with communications, period.
Labels:
UC,
unified communications
Tuesday, February 9, 2010
Wi-Fi Now Crucial for Mobile Networks
A new study by Coda Research Consultancy predicts that Wi-Fi-enabled mobile handset penetration in the United States will grow at a 25 percent compound annual growth rate between 2009 and 2015. Most of that growth will come as smartphone sales pick up, and the Wi-Fi capability will be crucial for mobile service providers attempting to maintain high-quality service. Since much data demand is created by smartphone users, networks can offload quite a lot of that traffic onto fixed networks by way of Wi-Fi.
It's a "win-win" situation. Users often will discover their devices perform faster on Wi-Fi, while mobile service providers can conserve capital investment. Some users will find Wi-Fi helps them manage their bandwidth caps. Also, Wi-Fi-equipped smartphones will make fixed connections at home more valuable as well.
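A 25 percent compound annual growth rate compounds quickly. A quick check of the multiplier over the 2009-to-2015 forecast window (the rate is from the study; the function is just illustrative):

```python
def growth_multiple(cagr: float, years: int) -> float:
    """Total growth multiple implied by a compound annual growth rate."""
    return (1 + cagr) ** years

# 25% CAGR over the six years from 2009 to 2015
print(round(growth_multiple(0.25, 6), 2))  # 3.81 -> nearly 4x the 2009 base
```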
Labels:
smart phone,
Wi-Fi
User Behavior Changes Mobile Device Design Priorities
Smartphones get used for work purposes, to be sure, but what really seems to make mobile Web and Internet access behavior different from PC behavior are the things people do on their mobiles. And the Apple iPhone, as much as anything else, points to where we are going.
It isn't so much that users increasingly listen to music, play games, use social networking sites and send instant messages on their mobiles. Users can do those things on their PCs as well.
They use the Web, catch up on news or watch videos on both mobile and fixed PC platforms. But there seems little doubt that, for most people, it is personal and entertainment apps that increasingly are important, not keeping up with work activities.
We used to describe this behavior as requiring smartphones that balance work and personal life. These days, the emphasis for device design seems deliberately skewed to personal usage modes. That isn't to discount continuing use of smartphones for work purposes. But it is to note that device design has moved well beyond "productivity."
In fact, design priorities seem almost to have flipped. Where it once was important to handle email and calendar well, it now seems important to handle Web, music and navigation applications well, while also supporting email and calendar functions.
Labels:
iPhone,
smart phone
Multiple Tools Needed to Preserve Mobile Bandwidth
Chetan Sharma Consulting forecasts that if left unchecked, the costs of delivering mobile data will likely outstrip incremental revenues by the second half of 2011 in the U.S. market and become unsustainable by 2013.
The rapid growth in mobile data costs has prompted operators to look at more sophisticated network congestion management strategies that fall into four categories: policy control, data traffic offload, infrastructure investment, and network optimization.
Shifting data traffic off a congested mobile network and onto another access technology fundamentally changes the economics of delivering that data. Offload is being implemented by operators globally, including offload to Wi-Fi and offload to femtocells.
Operators deploying a mixed multi-access offload strategy can expect savings in the range of 20 to 25 percent per year. In the U.S. market, operators will save between $30 billion and $40 billion per annum by 2013 through an offload strategy alone.
More-efficient new networks will help as well. Infrastructure evolution to 3.5G (HSPA) and 4G (LTE) lowers the cost per bit for data throughput on the network, thereby reducing overall costs.
Chetan Sharma Consulting forecasts that evolving to HSPA and LTE will result in cost savings of just under 20 percent, or almost $25 billion per year, in the U.S. market by 2013.
Network optimization, through techniques such as compression and caching, also adds incremental savings by reducing the total number of bits traversing the network. Typically, Sharma reports, operators can generate savings of five to 10 percent by 2013 through this strategy.
Anecdotally, operators have reported that 80 percent of the traffic in urban centers is being generated by 10 percent of the cell sites. So policy control (how, when and under which circumstances subscribers can access networks) can contribute annual cost savings of over 10 percent, equating to over $15 billion in annual cost reduction by 2013 in the U.S. market, Chetan Sharma says.
But cost reduction is only one side of the equation. Tiered and usage-based pricing also is required. Such policies need not be heavy-handed, top-down service provider rules, but rather flexible, dynamic and personalized pricing models that reflect subscribers' preferences and context.
Taken as a whole, all the optimization techniques and new pricing models will be needed as the whole mobile business changes from a voice revenue model to a "bandwidth-based" business.
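Sharma's dollar figures can be roughly totaled. A sketch summing the midpoints of the reported annual U.S. savings by 2013 (the optimization strategy is reported only as a 5-to-10-percent saving, so it is omitted; treating each range by its midpoint is my assumption, not Sharma's):

```python
# Reported annual U.S. cost savings by 2013, in $ billions (from the post)
savings_by_strategy = {
    "offload": (30, 40),         # Wi-Fi and femtocell offload
    "infrastructure": (25, 25),  # HSPA/LTE evolution, "almost $25 billion"
    "policy_control": (15, 15),  # "over $15 billion"
}

def total_midpoint(ranges: dict) -> float:
    """Sum the midpoint of each strategy's reported savings range."""
    return sum((lo + hi) / 2 for lo, hi in ranges.values())

print(total_midpoint(savings_by_strategy))  # 75.0 ($ billions per year)
```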
Labels:
broadband,
business model,
mobile,
mobile broadband,
policy
Which Growth Pattern Emerges as Recession Ends?
Many economists and market watchers think consumers eventually will return to the spending patterns that existed prior to the recent recession, resuming the growth trend of the 20 years before it.
Others warn that growth patterns are more likely to revert to those of the years 1945 to 1979, when annual growth in consumer spending was much more restrained.
So the question for many might be, which view is right? For application and service providers, the question might not be as germane. The reason is that consumer spending on network-delivered services and applications was stable over the entire period, and in fact has shown a slow, steady growth.
In other words, people are shifting more of their available entertainment budget to network-based products. Communications spending likewise has slowly grown its percentage of overall discretionary spending, not fluctuating wildly from one year to the next.
Of course, lots of other background factors have changed. There are more products, more applications, more services and providers to choose from.
The value of many products has taken on an increasing "network services" character as well. Consider the value of a PC without Internet access, for example.
The point is that whichever forecast proves correct (a return to the growth trend of the past two decades, or a reversion to the lower spending growth of the years 1945 to 1979), network-based products are likely to continue a slow, steady, upward growth trend. That may not be true for other industries.
Monday, February 8, 2010
The "Problem" With Nexus One is the Retail Packaging, Not the Phone
By some accounts, the Google Nexus One phone has not sold as many units as some might have hoped. Flurry, a mobile analytics firm, estimates that 20,000 Nexus Ones were sold in the first week. That tracks poorly compared to the myTouch3G, which sold up to 60,000, and the Motorola Droid, which sold 250,000 in the first week.
Some people really like the idea of "unlocked" phones, despite the full retail price, as the price of gaining freedom to use "any" carrier (in the U.S. market two of four major carriers). But so far, most U.S. consumers seem to prefer the old "closed" model, where they get discounts on devices in exchange for contracts.
Beyond that, there is the clumsy customer support process. Users can email Google and get an answer within 48 hours. I don't know about you, but if any service provider took that long to get back to me when I had a problem, it would not be my service provider much longer. I could easily find a replacement provider within two days.
And that is the problem with Google's current model: a customer contacts Google and hopes the problem is not something the carrier (T-Mobile) or HTC (the device manufacturer) has to fix.
That's no slam on the device. But the customer interface is wrong. People are used to buying from one retailer that "owns" the customer service responsibility. And people will not be happy with two termination fees for early cancellation of a contract--one charged by T-Mobile USA and a separate restocking fee levied by Google.
Whatever the amounts or the logic behind them, that is just going to make people mad. People generally understand the early termination fee, but they don't expect to pay twice.
Unlocked phones have sold better in Europe, but there is a huge difference between the U.S. market and Europe. In Europe, when one buys an unlocked device at full price, it really does work on all networks. In the United States, Verizon and Sprint use the CDMA air interface while AT&T and T-Mobile use the GSM air interface.
So an unlocked phone will only work on half of those networks. Under such conditions, the value of an unlocked phone is dramatically reduced. But most consumers don't really care about air interface or "locking."
They are used to a retail relationship where they know who owns the product and process. And there still is not much evidence to indicate the value of an unlocked, full retail device is more important than the comfort of knowing who is responsible when something doesn't work properly.
Despite the generally accepted wisdom that "open" ecosystems innovate faster (which is true), that doesn't mean the customer experience is better. As Apple has shown time and again, a closed, tightly integrated approach can produce a much better experience and lots of innovation at the same time.
So far, it doesn't appear the unlocked Nexus One model is doing that.
Labels:
Android,
customer experience,
Google,
marketing,
Nexus One,
user experience
How PC Usage is Different from Mobile
To state the obvious, users behave differently on their mobile devices than they do on their PCs, which ought to have implications for a world where perhaps half to two thirds of all Web and Internet access is from a mobile device.
A study of 16 information workers over a period of time illustrates some of the differences (again, keeping in mind that habits likely continue to evolve).
Mobile service providers, for example, know there is a huge difference between users on PCs and smartphone users.
Namely, PC users consume lots more data. And that is what the study conducted by Microsoft and the University of Washington also noted. The other obvious observation was that phones are used for voice and text messaging. PCs can be used for those applications, but in this study of office workers, that was not the case.
And productivity applications, though important for desktop use, were not the focus on mobiles, where "maps" seem to be more important, as you might expect. Users relied on both devices for email and Web access. Beyond that, the usage profiles were different.
Aside from the sheer difference in volume, understandable given the "on the go" nature of a mobile phone, users did different things on their mobiles. One might hypothesize that mobile device input-output limitations and time constraints (people are on the go) account for much of the difference in behavior. Heavy document or file interactions are not prevalent on mobiles.
That doesn't mean people will stop doing things at their desks that require full PC support. It does suggest that as use of mobiles becomes a bigger driver of Internet usage, the key applications will change. Mobiles are "becoming PCs," but that does not mean they will be used the same way, at all. The Microsoft study simply confirms that fact.
Labels:
mobile PC,
mobile phone,
PC,
smartphone
Mobile Broadband Will Need a New Business Model
One way or the other, something has got to change in the mobile business as voice ceases to be the industry revenue driver. Today mobile service providers get 86 percent of their revenue from low-bandwidth applications like voice and text. But that will keep changing in predictable ways.
Namely, most capacity requirements will be driven by low-margin data rather than high-margin voice and text. Over the long term, it is irrational not to price services in closer relationship to cost, which means attributing more revenue directly to the data services that are driving capital investment.
That doesn't mean every single service or application necessarily has to be priced in relationship to cost. Loss leaders at supermarkets, promotional DVD prices at Target and other promotional pricing happen all the time, in every business. Some products have high margins; others have low or even negative margins.
The point is that current retail pricing will get more irrational as data demand grows, and that something will have to be done about it.
Carriers are investing in new capacity, but that alone will not be enough to bring revenue and capacity into balance. By 2013, virtually all traffic load will be driven by broadband data of one sort or another, especially video. That means, over time, new ways of charging for network usage will have to be created.
Like it or not, network management is going to be necessary, plus traffic offload and policy management. The issue, in part, is that demand is unevenly distributed. Even at peak hours, only a minor percentage of cell sites actually account for most of the congestion. Talking about congestion management at the "whole network" level misses the issue.
The key issue is peak-hour congestion at perhaps 10 percent to 15 percent of sites. Put another way, even at peak congestion, 85 to 90 percent of sites do not experience difficulty. That means it might be necessary to use different policies at a small number of physical sites, not the entire network, even at peak hours.
So even if traffic shaping, bit-priority policies and other tools are not generally required at every site, for every application or user, they will be needed at some sites, some of the time.
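That selective approach can be sketched in a few lines. This is a minimal illustration with entirely hypothetical site names, utilization numbers and threshold; real policy systems key off live measurements, not a static snapshot:

```python
# Minimal sketch: even at peak hours, congestion concentrates at a small
# share of cell sites, so traffic-shaping policies can be applied only
# where they are needed rather than network-wide.

def congested_sites(peak_load_by_site, threshold=0.90):
    """Return IDs of sites whose peak-hour utilization exceeds the threshold."""
    return [site for site, load in sorted(peak_load_by_site.items())
            if load > threshold]

# Hypothetical peak-hour utilization (fraction of capacity) for ten sites.
peak_load = {
    "site-01": 0.35, "site-02": 0.41, "site-03": 0.97,
    "site-04": 0.52, "site-05": 0.60, "site-06": 0.95,
    "site-07": 0.44, "site-08": 0.38, "site-09": 0.55,
    "site-10": 0.49,
}

hot = congested_sites(peak_load)
print(f"Apply policy management at {len(hot)} of {len(peak_load)} sites: {hot}")
```

In this made-up snapshot only two of ten sites cross the threshold, which is the point: the policy tools run at a handful of hot sites, not across the whole network.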
Apple and RIM Are Winners in Handset Market, Profit-Wise
The Apple iPhone might not be the only reason the mobile handset market has changed over the past several years, but it is a major influence, according to a new analysis by analysts at Deutsche Bank.
In 2006, before the iPhone was available, Nokia had nearly half--47 percent--of industry profits. By the end of 2010, it will have 25 percent.
In 2006, Sony Ericsson had 11 percent share. By the end of 2010 it will have a negative one percent operating profit.
Motorola had 18 percent share in 2006 and will have declined to about a negative one percent by the end of 2010.
By the end of 2010 Apple will have an estimated 37 percent share, while Research in Motion, which had four percent share in 2006, will have grown to 16 percent.
Most of the other suppliers will have remained about where they were in 2006, except for Lucky Goldstar, which will have grown from one percent to six percent.
Keep in mind, these figures reflect profits, not handset share.
Labels:
Apple,
HTC,
LGE,
Lucky Goldstar,
Motorola,
Nokia,
Palm,
RIM,
Samsung,
Sony Ericsson