It is never hard to find a communications executive, or company, worried about what Google might do next, or where Google might compete directly with service providers.
In an abstract sense, Google evokes fear because of its ability to innovate quickly, its ability to surround a competitor or a business, its cash and its ambition, and, only as a byproduct, its ability to make itself, rather than the ISP dumb pipe, the "thing" consumers bond to.
Of course, the perceived "threat" is very concrete to some contestants in the communications ecosystem.
The Nexus One handset means Google is a device supplier. The Android operating system makes Google a major mobile software supplier to many other original equipment manufacturers, and gives Google influence over the pace and direction of mobile OS development. Google Play is among the handful of truly significant app stores.
Google's investment in Clearwire, and past willingness to bid on 700-MHz Long Term Evolution spectrum, mean Google potentially could be viewed as a future access provider in its own right.
Google Fiber, even if only at one location, has become a reality. And some might point to Google Voice, or Google's ownership of dark fiber as other ways Google already is a part of the industry ecosystem.
Google also was on the other side of the net neutrality argument from most service providers.
And Google Wallet does compete directly with Isis, the mobile wallet backed by AT&T, Verizon Wireless and T-Mobile USA.
Google continues to experiment with mobile and untethered access networks. You might argue Google still has a clear and simple business objective. It makes its money from Internet advertising. People who do not have the Internet cannot become prospects. So Google wants everybody to have Internet access.
Ad inventory also hinges on access speed. Faster page loading, for example, means Google can display more inventory, and experience shows that faster speeds lead to higher usage, and hence still more opportunity to show inventory.
But some would argue that despite its potential future competition with communication service providers, Google has other bigger challenges to face, including Facebook, other search providers, Microsoft and Apple.
And some might even argue that, given the growing importance of mobile "search," commerce is becoming more important. And the firm most significant in that area is Amazon.
That isn't to say there is no scenario under which Google might entertain becoming a mobile service provider, in some way. But that is more attractive to Google only if the biggest ISPs fail to upgrade their networks or extend coverage.
In the meantime, Google faces greater immediate challenges from the likes of Apple and Facebook, as well as future challenges from the likes of Amazon.
Thursday, January 24, 2013
Who Does Google Really Compete With?
Gary Kim was cited as a global "Power Mobile Influencer" by Forbes, ranked second in the world for coverage of the mobile business, and as a "top 10" telecom analyst. He is a member of Mensa, the international organization for people with IQs in the top two percent.
Mosaic, First Real Web Browser, Turns 20
While not the first graphical web browser, it was the most popular one of its time and still the model for today's browsers, many would say.
Unlocking Your U.S. Phone Becomes a Crime Jan. 26, 2013
I'll bet most of us did not see this coming: on Jan. 26, 2013, it becomes a federal crime for the buyer of a smart phone to unlock it, at least before the expiration of the contract, if there is one.
The change is because a smart phone operating system is considered copyrighted material under provisions of the Digital Millennium Copyright Act (DMCA).
In October 2012, the Librarian of Congress, who can determine exemptions to the DMCA, decided that unlocking mobile phones would no longer be allowed.
The rule apparently does not apply to devices sold unlocked in the first place, such as full retail price devices, or perhaps any smart phone sold to a user unlocked by the carrier itself.
Apparently, it will continue to be illegal to unlock a tablet or game console.
The legal foundation is that users only license, and do not own, the software on their devices. Some might be shocked to learn that the same legal principle underlies a "purchased" library of songs, as well.
Actually, it is unclear whether a user "owns" the songs in an iTunes library, or merely borrows them. In other words, what used to be a product now is, in effect, a subscription.
AT&T Sells Record Number of Smart Phones in 4Q 2012
AT&T Wireless sold a record 10.2 million smart phones in the fourth quarter of 2012, which AT&T says is the most ever sold by any U.S. service provider in a single quarter. In fact, 89 percent of postpaid phone sales were smart phones.
AT&T also reported that wireless revenues grew 5.7 percent year over year, while wireless service revenues grew 4.2 percent.
AT&T had 780,000 wireless postpaid net adds, the largest increase in three years, along with a 1.1 million increase in total wireless subscribers.
But that growth also came at a price. Since AT&T subsidizes smart phones, and since so many customers purchased Apple iPhones that have the highest subsidy costs in the business, earnings took a hit. In fact, 84 percent of all smart phone sales were iPhones.
Verizon, for its part, added a "highest-ever" 2.1 million net new wireless contract customers in the fourth quarter, outpacing AT&T's growth.
As expected, Verizon profit margins on mobile services dropped as smart phone subsidies grew.
And that is an issue: though service providers want to sell more data subscriptions, which mainly entails selling more smart phones, the device subsidies are a drag on earnings.
Google to Test Small Cell Network?
For some who worry that Google might someday decide to become an ISP in a bigger way, using either mobile or fiber to the home approaches, here is one more development to stoke concern.
Google filed an application at the Federal Communications Commission seeking permission to test an experimental radio system near its Mountain View, Calif. campus, using as many as 50 base stations and 200 end user devices.
The base stations will be both indoors and outdoors, using a “small cell” design. Indoor sites will have a range of 100 meters to 200 meters, while outdoor cells will have a range of 500 meters to 1000 meters.
Only Google knows what it is testing here. But the specific frequencies requested are 2524 MHz to 2546 MHz and 2567 MHz to 2625 MHz.
These are bands allocated to the “Educational Broadband Service” (EBS) and the “Broadband Radio Service” (BRS), which are used by Clearwire for its mobile broadband service.
Google has tested a variety of networks and network elements in the past, so the latest effort is not unusual. But Google has suggested spectrum in the 3.55 GHz to 3.65 GHz band could be used as part of a shared small cell service.
Google also in the past has asked for permission to test unlicensed devices in the 2.4 GHz band, 5 GHz band, and the 76-77 GHz band, as well as white spaces.
Aside from Google Fiber, Google also has invested in municipal Wi-Fi tests, invested in Clearwire, sponsored airport Wi-Fi and, in 2007, promised a minimum bid for 700-MHz mobile spectrum.
The point is that Google remains vitally interested in new ways to expand Internet access, especially high-bandwidth, low cost access.
Google says the initial base station deployment will occur inside 1210 Charleston Road, Mountain View (and possibly 1200 and 1220 Charleston Road), and consist of five to 10 base stations (mounted on ceilings, or walls next to the ceiling, six to eight meters above ground), and up to 40 user devices.
Three base stations will employ directional antennas (dual-slant, two-way multiple input/multiple output, with 17 dBi maximum antenna gain), mounted on walls and directed toward the building interior. Of these, base station one will have a beam width of 65 degrees, a 45-degree horizontal orientation and a -4 degree vertical orientation; base station two, a beam width of 90 degrees, a 315-degree horizontal orientation and a -4 degree vertical orientation; and base station three, a beam width of 65 degrees, a 210-degree horizontal orientation and a -4 degree vertical orientation.
Subsequent deployments will occur on building rooftops at 1200, 1210, or 1220 Charleston Road, or possibly other buildings located on the Google campus.
Omni-directional antennas will be mounted either on external building walls at roof height, or on antenna masts above rooftops (extending no more than six meters above the rooftop).
Directional antennas may be used. No building on the campus is higher than 25 meters above ground. Google plans to test up to 50 base stations and 200 user devices during the requested experimental license term, and requests authority to deploy in these quantities.
Each indoor base station will have a radius of approximately 100 meters to 200 meters. Each outdoor base station will have a radius of approximately 500 meters to 1000 meters.
Wednesday, January 23, 2013
Mobile Broadband Now Shapes Global "Speed" Metrics
A 2010 study by Ofcom, the U.K. communications regulator, found fixed network speeds were about four times faster than mobile speeds. The difference in page loading speeds was more dramatic. Fixed network web pages loaded 17 times faster than the mobile pages.
But simple logic also suggests that measures of broadband speed are becoming quite a bit more nuanced than in the past, as the “typical” form of broadband access becomes a “mobile” connection, not a fixed line connection.
That does not mean the absolute volume of data consumption is related in a linear way to the number of subscribers, only that “typical access speed” is harder to describe than once was the case. Some 84 percent of smart phone users say they use their smart phones to access the Internet, for example.

By the end of 2011, total global mobile subscriptions reached nearly six billion, corresponding to a global penetration of 86 percent, according to the International Telecommunication Union.
Growth was driven by developing countries, which accounted for more than 80 percent of the 660 million new mobile subscriptions added in 2011. That is significant. To begin with, mobile connections typically run slower than fixed connections, and developing market connections tend to run slower than connections in developed markets.
That might explain why, in the third quarter of 2012, the global average connection speed declined 6.8 percent to 2.8 Mbps, and the global average peak connection speed declined 1.4 percent to 15.9 Mbps, says Akamai.
According to Ericsson, mobile data use has grown exponentially since about 2008.
Google Fiber "Is Not a Hobby"
Patrick Pichette, Google’s CFO, said on Google’s recent earnings call that Google Fiber is not a hobby. That could mean lots of things, so ISPs should not necessarily make assumptions about what the statement means.
“We really think that we should be making good business with this opportunity and we are going to continue to look at the possibility of expanding, but right now we just got to nail [it] because we are in the early days,” said Pichette.
“Not a hobby” could mean that Google does not intend to lose money on the venture, and is not simply spending money on a “hero” initiative that is not intended to directly sustain itself over the long term.
Contrast that with Apple’s statements some years ago that Apple TV was, in fact, a hobby, implying that commercial impact was not expected.
But “not a hobby” would unsettle other large ISPs much more if it implied Google was seriously entertaining the notion of becoming an ISP in its own right, on a bigger scale.
Those sorts of fears have been expressed in the past, about Google “becoming a telco.” But Google has become a handset supplier, on a limited scale. Google Voice does earn communications revenue. Google Docs does compete with Microsoft’s “Office” suite.
Google does operate a large global backbone network. Likewise, there are, from time to time, discussions of whether Google (or other big application providers) want to become mobile service providers.
And even at the recent Pacific Telecommunications Council meeting, at least a few attendees I spoke with did express concern that Google might in fact be considering a wider and more significant entry into either the local access or backbone transport markets.
In other words, there remains considerable unease about what Google might decide to do, in the communications business.
The concern might be overblown. But there is no doubt about what Google would prefer, and that is higher speeds for most end users and more investment in access networks by the leading ISPs to enable that.
Google’s challenge to leading ISPs is clear enough.
In a highly-competitive market, the low-cost provider tends, over time, to win. That is true with respect to large tier one telcos competing with large tier one cable operators, for example. You might argue that cable gains in high-speed access and fixed network market share provide a clear example.
Some now would argue that ISPs--both fixed network and mobile ISPs--need to match Google’s own costs, on a cents-per-gigabyte or dollars-per-gigabyte basis. How well that can be done, and whether it can be done at all, is the question.
But Google has affected service provider thinking before. Remember several years ago when executives started to routinely say they had to “innovate at Google speed?” Doubtless, most would say no telco really is able to innovate that fast. But it might be argued that service providers do now innovate faster than before.
So it might not be unreasonable to argue that if Google continues to demonstrate new cost models for very high speed access, service providers will respond.
Shifting to costs equivalent to Google’s costs might be a daunting prospect, but less daunting than what could happen if legacy revenue streams erode faster than new revenue replacements can be created.
It is one thing to argue that telcos, for example, need to incrementally reduce current operating costs. But that argument also hinges on a crucial assumption, namely that current revenue continues to grow on a relatively stable basis, while revenue losses from legacy products do not accelerate in a destabilizing way.
Some might argue that the risk of unexpected revenue trend deterioration is greater than most now assume. In that case, one way or the other, service providers will have to make further adjustments. That is one reason why Google hints that it might expand the Google Fiber program.
Mobile Now Shapes "Average" Internet Access Speeds
What will dramatically-higher mobile broadband and mobile data plan adoption mean for global “average” Internet access speeds? The question already is starting to matter.
By the end of 2011, total global mobile subscriptions reached nearly six billion, corresponding to a global penetration of 86 percent, according to the International Telecommunication Union.
Growth was driven by developing countries, which accounted for more than 80 percent of the 660 million new mobile subscriptions added in 2011.
If one assumes a typical mobile connection supports lower speed than a fixed network broadband connection, rapidly growing mobile Internet access will have a huge impact on “average” access speeds.
By end 2011, there were more than one billion mobile broadband subscriptions worldwide. More important is the rate of change: mobile broadband grew at a 40 percent annual rate in 2011. That rate will slow over time, of course, but at such rates the base of users doubles in less than three years.
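The doubling claim is simple compound-growth arithmetic: a quantity growing at a constant annual rate r doubles after ln(2)/ln(1+r) years. A minimal sketch in Python:

```python
import math

def doubling_time(annual_growth_rate: float) -> float:
    """Years for a quantity to double at a constant annual growth rate."""
    # Solve (1 + r) ** t = 2 for t.
    return math.log(2) / math.log(1 + annual_growth_rate)

# At the 40 percent annual growth rate cited for mobile broadband in 2011:
print(f"{doubling_time(0.40):.2f}")  # about 2.06 years, well under three
```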
Also, compare mobile broadband to fixed network broadband subscriptions. At the end of 2011, there were 590 million fixed broadband subscriptions worldwide. In other words, there were nearly twice as many mobile broadband users as fixed network broadband users by the end of 2011.
Furthermore, fixed network broadband growth in developed countries was slowing (a five percent increase in 2011), where developing countries continue to experience high growth (18 percent in 2011).
As you might guess, fixed network broadband penetration remains low in some regions, such as Africa and the Arab states, with 0.2 percent and two percent adoption, respectively, by the end of 2011.
Also, in 2011, 30 million fixed broadband subscriptions were added in China alone, representing about half of the total fixed network subscriptions added worldwide, while fixed broadband penetration reached 12 percent in China.
One should therefore assume that comparing future “average” or “typical” broadband speeds to past data will be misleading. We might already be seeing that sort of impact.
In the third quarter of 2012, the global average connection speed declined 6.8 percent to 2.8 Mbps, and the global average peak connection speed declined 1.4 percent to 15.9 Mbps, says Akamai.
That statistic likely reflects the growing use of mobile networks. Since access from mobile devices far outstrips access from fixed network connections globally, and since mobile network top speeds are generally lower than those of fixed networks, a growing volume of mobile connections will affect the overall “average speed.”
In 2010, global mobile penetration was nearing 80 percent. Early in 2012, global mobile penetration reached 85 percent.
All of that means “average” statistics about broadband access speeds will have to be considered in a more nuanced way from this point forward. As “most” Internet access happens from mobile devices, the “average” connection speed, either peak or average, is going to reflect the “slower” mobile speeds, compared to fixed network connections.
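The drag that a growing share of slower mobile connections puts on the blended "average" can be sketched as a simple weighted average. The speeds and shares below are purely illustrative, not measured values:

```python
def blended_average(fixed_speed_mbps, mobile_speed_mbps, mobile_share):
    """Weighted average speed across fixed and mobile connections."""
    return mobile_share * mobile_speed_mbps + (1 - mobile_share) * fixed_speed_mbps

# Suppose, hypothetically, fixed connections average 8 Mbps and mobile 2 Mbps.
# As mobile grows from one third to two thirds of all connections, the
# blended average falls even though neither network got any slower.
print(round(blended_average(8.0, 2.0, 1 / 3), 2))  # 6.0
print(round(blended_average(8.0, 2.0, 2 / 3), 2))  # 4.0
```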
Gary Kim was cited as a global "Power Mobile Influencer" by Forbes, ranked second in the world for coverage of the mobile business, and as a "top 10" telecom analyst. He is a member of Mensa, the international organization for people with IQs in the top two percent.
Tuesday, January 22, 2013
200 Million Global LTE Subscribers in 2013, One Billion by 2016
That represents a compound annual growth rate of about 139 percent.
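The implied compound annual growth rate is quite sensitive to the base year and starting count assumed; the quoted figure presumably measures from an earlier, smaller base than the 2013 headline number. As a minimal sketch of the calculation itself, using only the subscriber counts in the headline:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by two end-point values."""
    return (end / start) ** (1 / years) - 1

# 200 million LTE subscribers in 2013 reaching one billion by 2016
print(f"{cagr(200e6, 1e9, 3):.0%}")  # roughly 71% per year over that span
```

Measured from a smaller pre-2013 base, the same formula produces a much higher rate, which is why headline CAGR figures always need a stated base year.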
What are the Right Metrics to Measure Service Provider Success?
“The financial community does not measure our industry correctly,” says Norman Fekrat, former VP and partner at IBM Global Business Services. And there will be consequences once analysts finally figure out that the current metrics, from the legacy voice business, do not accurately describe the actual financial results being generated by telcos globally.
The number of subscribers once was a meaningful metric. Because “subscribers” was useful, so was the concept of “churn,” reflecting a service provider’s ability to keep its customers.
These days, “revenue generating units” are reported by many service providers, because that simply makes more sense. Average revenue per user likewise made perfect sense in a long era where “subscribers” and “lines” were accurate and useful ways to measure business health.
Fekrat argues that the current metrics actually do not capture financial performance in ways that will matter as all services wind up as IP-bandwidth-based apps. In the era to come, where the fundamental network resource consumed by any app is “gigabytes,” profit will have to be measured, per service or application, in relationship to use of the network.
In a voice-centric business model, additional usage did not really affect “cost,” in terms of use of the network resource. In a bandwidth-centric model, by contrast, the profit of a video entertainment service would have to be evaluated not only in terms of revenue, but also in terms of consumption of network resources.
The same would hold for voice, messaging, web surfing or any other application using the network. Part of the reason for Fekrat’s concern is that, just to keep profit margins where they currently are, assuming growing consumption of bandwidth, cost per gigabyte has to decline about 70 percent to 90 percent every three to four years.
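The required decline follows from compounding traffic growth against roughly flat revenue: if usage multiplies while total network cost must stay flat, cost per gigabyte has to fall by the inverse of the usage multiple. A minimal sketch, where the annual traffic growth rates are illustrative assumptions, not figures from Fekrat:

```python
def required_unit_cost_decline(annual_usage_growth, years):
    """Fraction by which cost per gigabyte must fall over `years` to hold
    total network cost flat while usage compounds at `annual_usage_growth`."""
    usage_multiple = (1 + annual_usage_growth) ** years
    return 1 - 1 / usage_multiple

# 40% annual traffic growth over three years, 50% over four years
print(f"{required_unit_cost_decline(0.40, 3):.0%}")  # about 64%
print(f"{required_unit_cost_decline(0.50, 4):.0%}")  # about 80%
```

Growth assumptions in the 40 percent to 50 percent range land near the lower end of the 70 percent to 90 percent figure; faster traffic growth pushes the required decline toward the upper end.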
Some of that cost reduction might already be happening, at least for buyers able to buy in some volume. Fekrat assumes wholesale capacity prices of about $4 to $5 per gigabyte. Some buyers or sellers might argue prices, on some routes, already are in the two cents to three cents per gigabyte range.
In other words, if the cost per gigabyte per service argument is valid, at least for the capacity part of the business, costs might already be falling fast enough to make operating, capital and other overhead costs more significant than network costs, at least where the core networks are concerned. Access networks might be a different matter, since traditional cost analysis might attribute as much as 90 percent of end-to-end cost to the access networks on either side of any session.
And Fekrat has one benchmark in mind: service provider network costs must, over time, match those of Google. That’s a very tall order, but wise advice, if you assume that, in a competitive market, over the long term, the lowest cost network wins.
"Smart M2M" and "Smart ARPU"
It is by no means clear what a “smart” pipe strategy really is, compared to a “dumb pipe” or capacity play, in the retail telecommunications business. To be sure, it is obvious why communications executives find the term “dumb pipe” distasteful, as it implies “low value” or “low margin” or “low gross revenue.”
In truth, virtually all “smart pipe” strategies are built on largely “dumb pipe (best effort Internet access).” In that sense, all retail strategies now are a mix of “dumb pipe (best effort Internet access)” and applications (“smart pipe”). Any service provider selling video entertainment services or voice, for example, by definition is selling an application running on top of the pipe.
Some might say the National Broadband Network in Australia, or any other wholesale-only network services business is akin to a “dumb network” business strategy. But even there, when a wholesale voice service is sold, it is an application running on the network, not a true “dumb pipe” service.
That isn’t going to stop all sorts of service providers from selling or using “smart” as part of their retail branding strategy. Nor, in truth, is the notion incorrect. The point is that service providers all over the world are seriously engaged in a pursuit of new applications to create and sell that incorporate communications features enabled by their networks.
Telefónica Digital, for example, touts “Smart M2M,” a web-based platform for machine-to-machine (M2M) communications. How any active mobile device could provide communications for a sensor function without that being a “smart” activity is a subtle matter.
“Smart M2M” provides real time monitoring of traffic type, volume and current consumption, technical supervision of lines (maps of connected devices, advanced diagnostics) and localization, Telefónica Digital says.
The service includes fraud detection functionalities, including the ability to restrict communications between a list of given devices or the possibility to establish traffic caps.
NTT Docomo, for its part, now talks about “smart ARPU.”
Minoru Etoh, managing director with NTT Docomo, says Docomo now refers to new value added services including music and video on demand as “smart ARPU (average revenue per user),” which now accounts for about 10 percent of NTT Docomo revenue.
There already is only so much revenue service providers can earn from end users buying mobile broadband, said Etoh. Call that dumb pipe, best effort Internet access. But Docomo is pinning its future revenue growth on “smart ARPU” applications and services that are built on the assumption a customer is buying the “dumb” access services.
Google Extends Olive Branch to French Publishers
Google has offered French publishers about 50 million euros for the right to index their content. The problem Google (and other search engines) face is that the French government is threatening to pass new laws requiring such payments.
That news comes as Google reportedly also is paying Orange to "terminate" or deliver its content to Orange end users. That deal, in all likelihood, is not what it seems to be. Google operates one of the largest IP networks in the world, so that specific deal probably is not a payment to Orange to deliver traffic, but only a traditional carrier-to-carrier termination agreement.
The deal Google offered to the publishers includes the purchase of advertising space from Google, on paper and digital media, a commercial collaboration between the publishers and the search engine, and the use by the publishers of Google's advertising platform AdSense.
Media owners rejected the offer, saying they wanted annual income of about 70 to 100 million euros.
The pressure from French publishers shows a possible crack in the traditional business relationship between some large application providers and some large media and telecom interests. Both of those industries want more of a share of Internet ecosystem revenue, and such fees as Google supposedly is paying are one way of achieving those objectives.
Policy issues aside, the French media issue is significant, as Google now increasingly is faced with a choice: create new business and commercial deals with business partners, even when it would, in principle, rather not do so, or have regulators and legislators potentially force it to do so anyway, on terms Google will have no control over, or ability to influence.
NTT Docomo Chases "Smart ARPU"
No matter how much the term of art is criticized, "dumb pipe" is never far from the surface in the mobile or fixed network business. NTT Docomo, in fact, now uses the term "smart ARPU (average revenue per user)" to describe some of its new value-added services.
The very term implies that there is "dumb ARPU," namely vanilla mobile broadband, offering best effort only access. Nor is there complete agreement on the issue of whether differentiated end user quality of service is a potential source of such smart ARPU.
In fact, said Minoru Etoh, NTT Docomo managing director, offering best effort only access is operationally much simpler than offering tiers of service based on quality metrics. Many others of course believe it will be important to offer differentiated service, where it is possible.
Still, the problem with all the new services mobile operators are experimenting with is that there is not much agreement about what will wind up being a “big” revenue stream, and what might not. Nor is there complete agreement on where the biggest opportunities might lie.
That uncertainty was much in view at a session on the mobile business at the Pacific Telecommunications Council where Yijing Brentano, Sprint VP, expressed optimism about prospects for mobile advertising, as did David Schropfer, The Luciano Group partner.
On the other hand, Minoru Etoh, managing director with NTT Docomo, and a venture capitalist in the audience, disagreed. “I’ve seen hundreds of business plans based on advertising,” the VC said. But Schropfer argued that the type of advertising makes a difference. Traditional formats can change with mobile.
“You can change behavior if you offer a coupon to me while I am in the store,” Schropfer said.
There was less disagreement with the notion that machine-to-machine services would be a significant opportunity, but even there the magnitude of the opportunity is uncertain.
M2M will be important, but only as a business customer service, with mobile service providers selling to automobile manufacturers, said Etoh and Brentano. And Schropfer even classified much of mobile commerce as an M2M opportunity. “M2M is the crux of where mobile commerce is going,” said Schropfer.
But Etoh was not convinced about the timing, and was uncertain revenue or adoption would be significant, any time soon.
Etoh said Docomo now refers to new value added services including music and video on demand as “smart ARPU (average revenue per user),” which now accounts for about 10 percent of NTT Docomo revenue.
There was much more agreement that new revenue sources are essential, though, in large part because there already is only so much revenue service providers can earn from end users buying mobile broadband, said Etoh.
Etoh also warned that Wi-Fi offload might not provide as much benefit for capacity relief in dense urban areas as many now expect.
Despite the rage for mobile offload using Wi-Fi, Docomo has found that Wi-Fi offload doesn’t work in very dense areas, with smallish macrocells, because there is too much interference between the Wi-Fi sites.
Monday, January 21, 2013
Phablets Could be Big for Developing Markets
The controversy about phablets (some think it is a momentary fad, others think it is something more, and will grow) could have some implications for broadband usage in many parts of the developing world, irrespective of what it could mean for consumers who want a smart phone with a bigger screen.
And that implication is that users who already have demonstrated huge appetite for mobile phones, and will soon want to use the Internet on a more convenient screen, might gravitate to phablet devices as a sort of “best of both worlds” approach to devices.
We’ll have to wait and see, but the emergence first of smart phones and now tablets has begun to make concrete the notion that in many markets, the most-popular computer people use will be a mobile device of some type.
Sunday, January 20, 2013
What Comes After the PSTN?
Some would say it is misleading to talk of the “end of the public switched telephone network,” as that implies something more will happen than a technology replacement (IP for TDM), akin to the industry’s earlier evolutions from copper to optical fiber, or from analog to digital switching.
But even a self-proclaimed optimist such as Beckert notes that although “voice is not dead yet,” the end is coming for voice as a major revenue source.
Executives already know the answer, it is fair to say. The answer begins with broadband access, but only begins there. Much more will have to follow, and the outcome is uncertain at the moment. But it builds on broadband.
In a real sense, the decommissioning of the PSTN, though a big event, is something veterans of the mobile industry are well acquainted with. That, in fact, is the meaning of the current transition to “fourth generation” networks.
The first generation U.S. analog network was shut down in 2008, for example. The second generation TDM network will be shut down in 2017, according to AT&T.
Still, there is a reasonable sense that something more than mere generations of outside plant, switching technology or protocols is at play for the fixed network. In part that might be because it hasn’t happened before, as it has in the mobile business.
The other obvious difference is that the mobile ecosystem, which requires tighter integration of networks, devices and apps, arguably will have more protection from “dumb pipe” scenarios that worry fixed network executives.
An open-ended question rhetorically asked by TeleGeography VP Stephan Beckert at the Pacific Telecommunications Council illustrates thinking about “the end of the PSTN.”
“Does anyone have a post-PSTN business model?” he asked. The question came in the context of a presentation about the international voice market. Mobile executives would not understand the question, since it is akin to asking “does anyone have a post-analog business model?”
Granted, the post-analog mobile business model did not have to contend with the existence of the Internet. And though mobile service providers are starting to deal with over the top alternatives to carrier services, they have not faced nearly the pressures on the fixed network business.
But the transition to IP, and the diminution of voice as the key revenue driver, probably already has an “answer.” The answer obviously is broadband. One way or the other, fixed network service providers will base their revenue models on broadband access and as many valuable carrier applications and partner relationships using that network, as is possible.
Entertainment video is the second most important application, beyond high speed access. Beyond that, much remains to be seen. But in a simple sense, the post-PSTN business model already can be seen: broadband is the foundation service.
On the other hand, the precise timing of the decline of voice as a sizable revenue stream is hard to predict. Service providers have any number of retail packaging techniques that could extend the carrier voice revenue opportunity for some time, even if usage begins to dwindle significantly.
Think of the way voice now is bundled with broadband and video entertainment to encourage people to keep voice service in order to get discounts on all three services. That doesn’t necessarily mean people use the voice service much, but they pay for it.
Mega hits One Million Users in a Day
Mega, the new file sharing service from Kim Dotcom, has passed one million users, gained in a single day, Dotcom says. The launch, not a "re-launch" of Megaupload, still is about "content distribution," a business focus that got Megaupload into trouble over content piracy.
Dotcom says that will not be an issue for Mega. Content owners are certain not to be reassured. The service offers users 50 Gbytes of free content storage, and in that sense operates in a manner similar to Dropbox or SkyDrive.
The new twist is that Mega is described as a privacy play. Since all data is encrypted, the service can claim it has no idea what users are uploading, storing or sharing. The user terms of service specifically forbid upload of copyrighted material, but, by design, Mega won't know what content is uploaded.
It's just another example of friction between IP-based technology and legal frameworks, between what can be done and what is supposed to be done. Even the privacy angle has a dual character. The site protects user data, which many will say is a good thing, for obvious reasons. But that same privacy also cloaks criminal and other anti-social activities.
Saturday, January 19, 2013
Is Usage-Based Internet Access Inherently Unfair?
Though understandable, given the “no incremental cost” nature of much Internet content, information and applications, one might argue the way many think about the Internet is out of sync with the way they think about most other products they buy and use.
Most of the criticism about usage-based pricing is that it somehow is "unfair." Much of the criticism takes the form of complaints about ISPs somehow taking advantage of consumers. It is argued there is no need for metering, for example.
In other cases, some critics imply or allege that metered pricing is simply a way for ISPs to make more money from their customers.
Are usage-based charging mechanisms inherently unfair and detrimental to continued development of the Internet? Some think so. And there is Internet precedent for such thinking, to be sure. AOL found usage exploded when it, and other dial-up access providers, shifted from metered usage to flat fee pricing.
One might object that this encouraged use of the Internet but at the “expense” of increased direct costs for Internet access providers. So there is good reason to argue that directly metered use of Internet access might actually discourage people from using the Internet.
But that isn’t generally the way usage is rated, these days. Consumers generally understand and seem comfortable with “buckets of usage” that provide cost predictability, but also allow users to buy less or more access in line with their needs.
Usage based pricing might actually be a good thing for the overwhelming number of consumers, to the extent that lighter users pay less, heavier users pay more, and suppliers have accurate information about how much more capacity to add, where and when, which in turn ensures that investment is adequate to support anticipated growth of demand.
In fact, one might argue, the worst scenario is where usage and pricing are not related in some relatively direct way, as that distorts both demand and supply.
One frequently hears warnings about outsized growth of broadband access demand, the implication being that a crisis might develop if “something is not done.” Some predict that 1,000 times more mobile bandwidth will be needed by 2020, for example.
But both suppliers and consumers are rational about their bandwidth choices, when there is a clear link between consumption and out of pocket costs, and when consumers can act on that information.
Even if future supply were not an issue, it would still make sense to allow consumers to make choices about how much “Internet access” they really want to purchase, as that would send clear signals to suppliers about how much to invest in new capacity.
The problem with “unlimited” plans is that such retail pricing does not automatically send accurate supply and demand signals, and does not trigger the normal decision-making consumers always make when considering how much of any product to buy.
Nor do we often remember that demand for Internet access is dynamic, not static. Raise the price, and consumers will buy less, lower the price and they will buy more.
‘
To an extent, changes in device profiles also make a difference, as typical bandwidth consumption on a PC is far higher than on a smart phone or a tablet.
And users clearly are shifting Internet activities to smart phones and tablets. At some point, that could slow data consumption growth rates, even if, over time, bandwidth consumption grows.
Demand will grow, but probably less robustly than many forecasts predict. Mobile data consumption, even among smart phone users, is well below 1 Gbyte a month, according to Sandvine.

An analysis by the U.S. Federal Communications Commission suggested that, in the first half of 2009, the median fixed network (half used more, half used less) broadband user consumed almost two gigabytes of data per month. Mobile users consumed only hundreds of megabytes.
The 2009 study suggested that, overall, per-person usage is growing 30 percent to 35 percent per year. That doesn’t necessarily directly suggest how much an “account” or “home” might consumer, though.
The FCC study does not directly correlate a single person’s usage with the account details, as it is a “per-capita” measure. Such “per-person” measures are useful, but not entirely accurate if services are purchased “by location,” instead of “by person.”
n other words, a single user might have one access account, while a family might have three to five people sharing a single account.
As a rough metric, a typical 2.5-person household, sharing one account, might have consumed about six gigabytes a month, based on the 2009 data.
If the 30 percent annual growth rate remained intact through the end of 2012, that might imply 2014 median usage of about seven gigabytes per person, or 17.5 Gbytes per household account, using the 2.5 persons per home assumption.
Other 2010 estimates for current consumption were roughly in the same range as the 2009 FCC figures, adjusted for annual growth. Comcast said in December 2010 that a typical user consumed about two to four gigabytes a month, far below the 250 gigabyte cap for a Comcast residential account.
That would be right in line with the FCC’s base of two gigabytes, and a growth rate of 30 percent annually.
Actual data consumption for most users of fixed network broadband is not all that high, in other words. True, demand will grow. But so long as price signals can be sent, supply should satisfy demand.

Most of the criticism about usage-based pricing is that it somehow is "unfair." Much of the criticism takes the form of complaints about ISPs somehow taking advantage of consumers. It is argued there is no need for metering, for example.
In other cases, some critics imply or allege that metered pricing is simply a way for ISPs to make more money from their customers.
Are usage-based charging mechanisms inherently unfair and detrimental to continued development of the Internet? Some think so. And there is Internet precedent for such thinking, to be sure. AOL found usage exploded when it, and other dial-up access providers, shifted from metered usage to flat fee pricing.
One might object that this encouraged use of the Internet but at the “expense” of increased direct costs for Internet access providers. So there is good reason to argue that directly metered use of Internet access might actually discourage people from using the Internet.
But that isn’t generally the way usage is rated, these days. Consumers generally understand and seem comfortable with “buckets of usage” that provide cost predictability, but also allow users to buy less or more access in line with their needs.
Usage-based pricing might actually be a good thing for the overwhelming majority of consumers: lighter users pay less, heavier users pay more, and suppliers get accurate information about how much capacity to add, where and when. That, in turn, helps ensure investment is adequate to support anticipated demand growth.
In fact, one might argue, the worst scenario is one where usage and pricing are not related in some relatively direct way, since that distorts both demand and supply.
One frequently hears warnings about outsized growth of broadband access demand, the implication being that a crisis might develop if “something is not done.” Some predict that 1,000 times more mobile bandwidth will be needed by 2020, for example.
But both suppliers and consumers are rational about their bandwidth choices when there is a clear link between consumption and out-of-pocket costs, and when consumers can act on that information.
Even if future supply were not an issue, it would still make sense to allow consumers to make choices about how much “Internet access” they really want to purchase, as that would send clear signals to suppliers about how much to invest in new capacity.
The problem with “unlimited” plans is that such retail pricing does not automatically send accurate supply and demand signals, and does not trigger the normal decision-making consumers always make when considering how much of any product to buy.
Nor do we often remember that demand for Internet access is dynamic, not static. Raise the price and consumers will buy less; lower the price and they will buy more.
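The point that demand responds to price can be illustrated with a simple constant-elasticity demand curve. The elasticity value and the baseline price and quantity below are purely illustrative assumptions, not figures from the text:

```python
# Illustrative constant-elasticity demand: Q = A * (P / P0) ** elasticity.
# An elasticity of -0.5 (inelastic, chosen only for illustration) means
# demand falls when price rises, but less than proportionally.
BASE_PRICE = 50.0    # hypothetical monthly price, in dollars
BASE_QTY = 100.0     # hypothetical demand index at the base price
ELASTICITY = -0.5    # assumed price elasticity of demand

def demand(price):
    """Quantity demanded at a given price, relative to the baseline."""
    return BASE_QTY * (price / BASE_PRICE) ** ELASTICITY

# Raising the price lowers demand; lowering it raises demand.
assert demand(60) < demand(50) < demand(40)
```

The exponent controls how strongly consumers respond: a more negative elasticity would make the same price change move demand further.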
To an extent, changes in device profiles also make a difference, as typical bandwidth consumption on a PC is far higher than on a smart phone or a tablet.
And users clearly are shifting Internet activities to smart phones and tablets. At some point, that could slow data consumption growth rates, even if, over time, bandwidth consumption grows.
Demand will grow, but probably less robustly than many forecasts predict. Mobile data consumption, even among smart phone users, is well below 1 Gbyte a month, according to Sandvine.
An analysis by the U.S. Federal Communications Commission suggested that, in the first half of 2009, the median fixed network (half used more, half used less) broadband user consumed almost two gigabytes of data per month. Mobile users consumed only hundreds of megabytes.
The 2009 study suggested that, overall, per-person usage is growing 30 percent to 35 percent per year. That doesn’t necessarily directly suggest how much an “account” or “home” might consume, though.
The FCC study does not directly correlate a single person’s usage with the account details, as it is a “per-capita” measure. Such “per-person” measures are useful, but not entirely accurate if services are purchased “by location,” instead of “by person.”
In other words, a single user might have one access account, while a family might have three to five people sharing a single account.
As a rough metric, a typical 2.5-person household, sharing one account, might have consumed about six gigabytes a month, based on the 2009 data.
If the 30 percent annual growth rate remained intact, that might imply 2014 median usage of about seven gigabytes per person, or roughly 17.5 Gbytes per household account, using the 2.5 persons per home assumption.
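The compounding arithmetic above can be sketched directly. The 2-gigabyte starting point and 30 percent growth rate come from the FCC figures cited; the 2.5-person household is the rough assumption the text itself uses:

```python
# Project median monthly broadband usage from the 2009 FCC baseline,
# assuming a steady 30% annual growth rate (the low end of the
# FCC's 30%-35% estimate).
BASE_GB = 2.0     # median per-person usage, first half of 2009 (FCC)
GROWTH = 0.30     # assumed annual growth rate
PERSONS = 2.5     # rough persons-per-household assumption

def projected_usage(years_after_2009):
    """Per-person usage in gigabytes, compounded annually."""
    return BASE_GB * (1 + GROWTH) ** years_after_2009

per_person_2014 = projected_usage(5)            # about 7.4 GB
per_household_2014 = per_person_2014 * PERSONS  # about 18.6 GB

print(f"2014 per person:  {per_person_2014:.1f} GB")
print(f"2014 per account: {per_household_2014:.1f} GB")
```

Five years of 30 percent compounding turns 2 gigabytes into a bit over seven gigabytes per person, consistent with the rough figures in the text.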
Other 2010 estimates for current consumption were roughly in the same range as the 2009 FCC figures, adjusted for annual growth. Comcast said in December 2010 that a typical user consumed about two to four gigabytes a month, far below the 250 gigabyte cap for a Comcast residential account.
That would be right in line with the FCC’s base of two gigabytes, and a growth rate of 30 percent annually.
Actual data consumption for most users of fixed network broadband is not all that high, in other words. True, demand will grow. But so long as price signals can be sent, supply should satisfy demand.
Gary Kim was cited as a global "Power Mobile Influencer" by Forbes, ranked second in the world for coverage of the mobile business, and as a "top 10" telecom analyst. He is a member of Mensa, the international organization for people with IQs in the top two percent.
Why "Nobody" Worries About Phone Costs, Anymore
As recently as 2001, it was still possible to say, with a straight face, that “corporate phone bills are a budget buster.” A decade later, can it honestly be said that phone bills are a significant enterprise cost of doing business?
Possibly. Mobile calling now represents two-thirds of all business calling minutes in the United Kingdom, for example. So one might argue that it is not voice calling costs, but rather the cost of mobile subscriptions, that is a significant issue for enterprises.
Mobile data charges might be said to be the big current issue, but even there, costs per megabyte have dropped from about 46 cents per megabyte in 2008 to about three cents per megabyte by 2012. That’s more than an order of magnitude drop in just four to five years.
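That decline can be restated as an implied annual rate. The 46-cent and three-cent figures come from the text; treating the window as exactly four years is an assumption:

```python
# Implied annual price decline for mobile data, from the
# per-megabyte figures cited in the text (2008 vs. 2012).
start_price = 0.46   # dollars per MB in 2008
end_price = 0.03     # dollars per MB in 2012
years = 4            # assumed elapsed period

# Compound annual rate of decline: 1 - (end/start)^(1/years)
annual_decline = 1 - (end_price / start_price) ** (1 / years)
print(f"Implied annual price decline: {annual_decline:.0%}")
```

The result works out to roughly half the per-megabyte price disappearing each year over the period.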
But at least in developed markets, it is harder than ever to argue that communications costs, for landline voice, mobile voice, fixed network data or mobile data are a “big” cost item for most businesses or individuals. That doesn’t mean there are no problem areas, or that people will not complain.
Overall, whether measured as a percentage of total business costs or as a percentage of consumer discretionary spending, mobile and other communications are not a big driver of personal or business spending.
There are some problem areas, in particular the cost of trans-border mobile calls and trans-border mobile data. But high costs always create an incentive for over-the-top alternatives and spur regulatory action to force lower prices, and hence will eventually become less of an issue. In most other cases, communication costs simply are falling.
Whether for consumers or businesses, communication costs tend to be a low-single-digit percentage of operating costs or personal spending.
People still gripe, of course. People still complain about the cost of mobile phone service or broadband access. One rarely hears much about the cost of consumer fixed network phone service, in part because the incremental cost, in a triple play bundle, is relatively slight.
Without a doubt, people and organizations will continue to benefit from better features and lower prices. People still will gripe. But communication costs, for the most part, just aren’t a big cost driver for most businesses or a burdensome expense for most consumers in developed economies.
Friday, January 18, 2013
FCC's "Gigabit City Challenge"
Federal Communications Commission Chairman Julius Genachowski has called for at least one gigabit community in each of the 50 U.S. states by 2015, and suggests that broadband providers and state and municipal leaders figure out a way to make that happen, creating a critical mass of gigabit communities.
The FCC also plans to create a new online clearinghouse of best practices "to collect and disseminate information about how to lower the costs and increase the speed of broadband deployment nationwide, including to create gigabit communities."
The Gigabit City Challenge will of course face some obstacles. Some will say local governments, state governments and the FCC itself never have had the political appetite or power to compel massive "municipal broadband infrastructure" projects, and will face even tougher obstacles over the next couple of decades.
If a municipality really wants to build its own infrastructure on a wide scale, in markets where strong cable and telco operations already exist alongside mobile and satellite providers, there will be an obvious business model problem: the market probably cannot support a new provider.
"Overbuilding" generally has proven to be a difficult business proposition, historically.
One might suppose that someday, one dominant provider in many markets might decide it makes sense to build and operate a wholesale network of this sort. That likely would not be a cable operator, given that industry's historic resistance to such notions.
Telcos have been no more willing, historically, to trade away their right to use scarce infrastructure, either. Whether thinking might change some decades hence is hard to foresee.
The other problem would seem to be that, even if the political will and political power could be amalgamated, it is not so clear that a large municipality, or even a state, could afford the indebtedness required to underwrite a large gigabit network.
That might have been feasible some decades ago. It certainly will not be a reasonable option over the next couple of decades, and maybe never again.