Monday, March 11, 2013
Does Georgia Decision Signal a Turn of Sentiment for Municipal Broadband?
It might be way too early to say that sentiment about municipal broadband in U.S. state legislatures has shifted, but the defeat of a bill in the Georgia legislature that would have banned municipal broadband networks could indicate movement.
The bill reportedly would have outlawed municipal broadband networks where a private service supplier already offers service. The defeat would be a relatively rare reversal, as 19 states have some restrictions on municipal broadband, according to the Institute for Local Self-Reliance.
There are legitimate issues. Many would say government entities generally should not compete with private entities using tax and other advantages available only to a non-profit entity.
On the other hand, competition in the Internet service provider business is generally seen as promoting end user welfare.
And as a growing number of non-traditional access methods indicate, there actually are new models beyond the telco, cable, satellite and independent ISP approaches. The Fon initiative, for example, is showing that "user-contributed" access networks are feasible in some instances.
Perhaps the Georgia legislature is signaling something bigger, namely a willingness to allow more experimentation with broadband services, and with who can provide them.
For Service Providers, Tablets Might Not Matter, Video Does
Bell Labs predicts that, by 2020, consumers in the United States alone will consume seven hours of video each day, compared to 4.8 hours in 2012, and will increasingly consume this additional video on tablets, both at home and on the go. Those figures include consumption of standard linear TV, time-shifted video and “on demand” video, as well as use of video communications.
As you might guess, on-demand programming will be key. Some 70 percent of daily video consumption will come from on-demand sources, compared with 33 percent from “live” content. Overall, Internet video consumption will grow by a factor of 12, Bell Labs predicts.
The total time spent watching video likely will take the form of multi-tasking, so users might “watch” seven hours of video in five hours, including situations where a TV is on and a user is engaged in a video call as well.
The proportion of time spent watching video-on-demand services and web-based video will grow from 33 percent to 77 percent, meaning the relative share of viewing time for linear TV will drop from 66 percent to about 10 percent.
Some 10.5 percent of video-on-demand and 8.5 percent of over-the-top video consumption will occur at the peak hour of 8:00 p.m.
"Post-PC" Sales Trends
In 2012, global PC shipments dropped 3.7 percent, year over year, according to IDC.
IDC now expects 2013 PC shipments to decline by 1.3 percent as well. Microsoft and Intel had been hoping that the Windows 8 launch would provide sales momentum, but IDC says that failed to happen.
Christmas and holiday sales were disappointing, IDC says. Also, information technology budgets were tight in the second half of 2012. All of that contributed to a year-over-year decline of 8.3 percent in fourth quarter PC shipments, the most substantial decline recorded for a holiday quarter, IDC maintains.
Emerging market growth also is fading: 2012 was the first year that emerging markets saw a volume decline. IDC expects 2013 will see sales growth of less than one percent, continuing at about that rate through 2017.
In developed markets, 2013 will mark the third consecutive year of volume declines. IDC expects limited growth in 2014 and 2015 with PC sales declines in later years.
Sometimes HSPA+ is as Fast As Some LTE Networks
Though controversy about which networks can legitimately be called “fourth generation” has been an issue, some 3G networks can provide access speeds so comparable to 4G Long Term Evolution that most users could not tell the difference.
Though that may not always remain the case, at the moment some 3G services offer access speeds quite comparable to LTE, RootMetrics tests in the first half of 2012 suggested.
In those tests, Verizon delivered 77.4 percent of its downloads at speeds above 5 Mbps. Verizon also had the lowest percentage of tests in the “slow” bucket.
“Verizon was the most consistent carrier for delivering fast speeds and also the most consistent at avoiding the slowest speeds,” RootMetrics found.
But T-Mobile USA’s performance using an HSPA+42 network was quite strong.
Though AT&T edged ahead of T-Mobile USA, the gap between the two carriers was small, RootMetrics says. AT&T’s speeds were often closer to T-Mobile USA’s than to Verizon’s.
Compare, for instance, how often each of these three carriers delivered speeds above 5 Mbps: Where Verizon delivered speeds above 5 Mbps 77.4 percent of the time, AT&T did so in 48.1 percent of the tests.
T-Mobile USA surpassed 5 Mbps in 46.7 percent of the tests.
Also, the performance of Sprint and MetroPCS shows the importance of adding LTE service. Both Sprint and MetroPCS landed in the “slowest” bucket nearly 70 percent of the time.
On the other hand, Sprint proved much better at the top end of the tests. MetroPCS delivered speeds above 5 Mbps 0.9 percent of the time, while Sprint did so in 17.2 percent of the tests.
Sunday, March 10, 2013
AT&T to Buy 25% of Reliance Jio?
If true, the deal could signify both AT&T's interest in one of the biggest global markets and an indication that future prospects in its home market might not be so compelling, compared with moves offshore.
The firm has ambitions to become number one in the Indian telecom market.
The Indian company is the nation's second-largest wireless carrier with 105 million subscribers and a market capitalization around $7.4 billion.
Saturday, March 9, 2013
Mobile Data Offload Growing "Faster than Expected"
In 2012, Cisco estimates, mobile offload represented about 33 percent of total mobile traffic, on a global level. As recently as 2011, Cisco estimated mobile offload would comprise 22 percent in 2016.
In 2013, offload will grow to 38 percent of total data consumption.
By 2016, Cisco estimates about 46 percent of global mobile data traffic will be offloaded to fixed networks. That represents a “dramatic shift” of mobile traffic offloaded to fixed networks, Cisco says.
Some might speculate that offload could be a bigger factor. Offloading is even more pronounced in the United States, where mobile offload will account for 66 percent of total mobile traffic in 2017.
Mobile data offload has grown faster than expected at least in part because Internet service providers intentionally are encouraging users to do so. Mobile service providers do so to maintain capacity on their networks, while fixed network providers do so to create “wireless extensions” of their fixed access services.
Consumers have a vested interest in using mobile offload to avoid stressing their data plans, especially as video has become the dominant driver of data consumption.
As smart phones increasingly are used for content consumption, not talking or texting, the value of mobile offload is bound to grow.
And that ultimately could create some new opportunities for untethered devices and services that replicate 80 percent or more of the value of a smart phone but without the need for traditional voice or texting plans.
And though people have been speculating for more than a decade about whether dense Wi-Fi networks could “compete” with mobile networks, the possibility of doing so actually is growing as the primary applications shift to content consumption, while voice and texting become available as over the top apps, and the density of Wi-Fi nodes increases.
Friday, March 8, 2013
AT&T Says Unlocking Not a Big Deal
Some will see AT&T’s comments about device unlocking as posturing, but it is not entirely clear that device unlocking is something the leading mobile service providers could not live with or even support.
“AT&T’s policy is to unlock our customers’ devices if they’ve met the terms of their service agreements and we have the unlock code,” says Joan Marsh on the AT&T Policy Blog. “It’s a straightforward policy, and we aim to make the unlocking process as easy as possible.”
Federal law makes it unlawful to circumvent technological measures employed by copyright owners to protect their property, including software. Under the Digital Millennium Copyright Act (DMCA), the Librarian of Congress conducts a periodic review to determine whether or not users of copyrighted work – in this case device owners – will be adversely affected.
On October 28, 2012, as part of the periodic review, the Librarian issued a new ruling on the mobile handset exception which narrowed the unlocking exemption that it had previously granted.
Under the latest interpretation, the unlocking must be initiated by the owner of the device (not a bulk reseller), who also owns the copy of the software on the device; the device must have been purchased within a specific time window; the wireless carrier must have failed to act within a reasonable time period on a request to unlock the device; and the unlocking must be requested to permit connection to another carrier’s network.
The new interpretation “has very little impact on AT&T customers,” Marsh says.
“If we have the unlock code or can reasonably get it from the manufacturer, AT&T currently will unlock a device for any customer whose account has been active for at least sixty days; whose account is in good standing and has no unpaid balance; and who has fulfilled his or her service agreement commitment,” Marsh says.
“If the conditions are met we will unlock up to five devices per account per year,” says Marsh.
So “the Librarian’s ruling will not negatively impact any of AT&T’s customers,” says Marsh.
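Purely as an illustration, the conditions Marsh describes can be restated as a simple eligibility check. The sketch below is hypothetical; the field names and logic simply encode the policy as quoted above and are not anything AT&T publishes.

```python
from dataclasses import dataclass

@dataclass
class Account:
    days_active: int             # how long the account has been active
    in_good_standing: bool       # account is in good standing
    unpaid_balance: float        # dollars owed on the account
    contract_fulfilled: bool     # service agreement commitment met
    unlocks_this_year: int       # devices already unlocked this year
    unlock_code_available: bool  # carrier has, or can get, the unlock code

def eligible_for_unlock(acct: Account) -> bool:
    """Hypothetical restatement of the conditions quoted above."""
    return (
        acct.unlock_code_available
        and acct.days_active >= 60
        and acct.in_good_standing
        and acct.unpaid_balance == 0
        and acct.contract_fulfilled
        and acct.unlocks_this_year < 5
    )

# Example: account active 90 days, paid up, contract fulfilled, one prior unlock
print(eligible_for_unlock(Account(90, True, 0.0, True, 1, True)))  # True
```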
Some will say that is fine, but what they really want is unlocked phones at the start of a relationship with a service provider. Some might say that often is possible. Others might say the financial advantages are structured to make such practices nonsensical.
For example, if a user bringing an unlocked device has to pay the same monthly fees as a customer whose fees include a phone subsidy and a two-year contract, then there is no real financial break for supplying one’s own phone.
Unlocking, per se, seems not to be the issue. The ability of a user to buy an unlocked phone, use it “on any mobile network (consistent with air interface capabilities of the device),” and get the benefit of a lower monthly service plan seems to be the real issue.
Just a nice graphic for International Women's Day
The Value Driving Small Business to Mobile Payments
One issue proponents of many forms of mobile payment have had to grapple with is the question of "value" for the end user, since the success of any such venture hinges on making several classes of end users happy, all at the same time.
That ecosystem includes people who buy things, the retailers who sell things, the financial institutions providing the end user accounts and the processors who handle the transactions. Unless each segment sees clear value, it is tough to create the new business.
PaySimple, Intuit GoPayment, PayPal Here, Square and Flint Mobile are solving one key element of the "value" question for some retailers, especially smaller businesses that always have cash flow issues.
A primary benefit of retailer mobile payments systems is that sales are converted into cash inflows, within a day. That doesn't change the value proposition for shoppers, necessarily. Nor do such systems always and necessarily work to the advantage of banks or settlement brands.
But getting paid right away is enough value for small businesses to drive adoption fast. Up to this point, convincing many retailers to invest money in mobile payments has been a tough sell.
For small businesses, that increasingly is not the case. The cost of terminals is not much of an issue. The value does not have to hinge on "lower fees" for taking credit card or debit card payments. Just getting paid fast is the driver.
EC Mobile Antitrust Probe Ends
The EC has concluded that, since such standards work now is conducted by the GSMA and other standards bodies, there is no immediate problem.
To be sure, all standards ultimately benefit some contestants and market participants more than others, especially when a standard plays to one specific technology approach that becomes an "industry" standard.
That sort of "bias" cannot be completely eliminated. But the investigation points out how careful dominant service providers have to be when trying to develop new services and apps that require scale. Mobile payments and mobile wallets provide one other example.
U.S. Telecom Business (Overall) Is in Surprisingly Good Shape
If you prefer video, and have 40 minutes, you can hear a high-level discussion of the 2013 Telecommunications Industry Association “Market Review and Forecast.” It’s always useful.
You can watch here.
If you don’t have 40 minutes, here are a couple trends you might find noteworthy. The single biggest takeaway is that the U.S. business is outpacing most other regions, in terms of growth and revenue.
That does not mean every sector of the U.S. business is doing so well, only that the entire business, led by AT&T and Verizon, is doing surprisingly well. The trend over the last decade has been for the market share leaders to do far better than smaller providers. So scale clearly matters.
There are two other developments of note. Among them: U.S. mobile data revenues have surpassed voice for the first time. You knew it was coming, and it has finally happened.
In 2012, for the first time in the history of U.S. mobile communications, customers spent more money on mobile data services ($94.8 billion) than on mobile voice services ($92.4 billion), the Telecommunications Industry Association now says.
That primarily is significant for those in the service provider segment of the business.
At a high level, the key managerial task is growing the essential data revenues business, which represents the future, while maintaining voice revenues at the highest possible level, for as long as possible.
For many on the supplier side of the industry, the key trend is that U.S. service provider capital investment is accelerating, and will grow 34 percent over the 2013 to 2016 period.
In 2012, U.S. wireline spending was $39.1 billion, compared with $27 billion for wireless infrastructure. By 2016, wireline spending is expected to climb to $44.4 billion, while wireless will reach $38.4 billion.
If the prediction proves correct, industry suppliers are headed for four “fat” years, after the recent “lean” years.
The TIA reports that although overall global telecom industry revenue growth decelerated to seven percent, down three percentage points from 2011 levels, revenue growth actually accelerated in the U.S. market, from 5.9 percent in 2011 to 6.2 percent in 2012.
TIA predicts this trend will accelerate in the years ahead – with mobile data spend hitting $118.6 billion in 2013 (versus $86.4 billion for voice) and $184 billion by 2016 (versus $70.1 billion for voice).
Additionally, U.S. wireless penetration passed 100 percent in 2012, reaching 102.5 percent for the year. TIA predicts that wireless carriers will add 40.3 million subscribers over the next four years, for a penetration of 111.3 percent in 2016.
TIA predicts U.S. revenue growth rates of 7.1 percent in 2013 and 6.8 percent in 2014, while international markets will see rates of 7.9 and 6.5 percent, respectively.
Unlicensed Spectrum and "Fairness"
Most people, if asked, will tend to say that all competitions ought to be "fair." But what "fair" means, in practice, is harder to describe. A foot race should have all contestants running the same distance, for example. That sounds fair.
But what if the objective is to "normalize" a competition for a full range of contestants with highly-varied skills? In that case, a "handicap" system, as used in golf, might be necessary.
Something of the same conundrum might be said to exist for broadband access services. A notion of "fairness" might suggest that all licensees in a field abide by the same rules. But that rarely happens in our modern IP communications business.
Competitors always argue, and regulatory officials typically agree, that the former monopoly provider has such entrenched advantages that handcuffs need to be kept on the incumbent, at least until such time as the competitors have had a chance to become established.
But we also have the example of industries competing directly under "unequal" rules, such as cable TV and telcos, for example, where cable operators have no wholesale obligations and leading telcos do have such obligations. That is beginning to change, slowly.
But the philosophical issues remain highly charged. Many do not believe, for example, that non-profit entities should be able to compete with for-profit entities when the non-profits can use their tax advantages to do so.
A non-profit government entity should not, in this view, be able to compel purchase of products, or use its taxing authority to raise capital, borrow at favored rates, or employ other advantages no for-profit competitor can match.
That might be one attraction for wider availability of unlicensed spectrum: it can provide the basis for greater competition without raising those other competitive issues. That doesn't mean existing competitors will agree, only that some thorny issues are avoided if non-licensed spectrum is made available.
Millimeter waves, in the spectrum from 30 GHz to 300 GHz, traditionally have not been usable for communications beyond very short ranges (local distribution between a decoder and a TV, a mouse and a PC, a smart phone and a payment terminal).
Better coding and abundant cheap processing now make those frequencies usable, in some cases, for the first time.
For some of us, the question is whether any of those frequencies will be usable either for wireless backhaul or access purposes. The big problems lie with the physics. Waves at those frequencies just don't travel that far through air, limiting their effectiveness for network access.
As a figure of merit, assume that a 3 decibel gain represents 100 percent more signal strength, while a 3 dB loss cuts signal strength by 50 percent. As always is the case with free-space propagation, there is less attenuation at lower frequencies, so 30 GHz to 40 GHz looks interesting, from an access perspective.
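To make that figure of merit concrete, here is a small worked example, a sketch not tied to any particular radio, converting decibel values into linear power ratios and confirming that a 3 dB gain roughly doubles signal strength while a 3 dB loss roughly halves it.

```python
def db_to_power_ratio(db: float) -> float:
    """Convert a decibel value to a linear power ratio: ratio = 10^(dB/10)."""
    return 10 ** (db / 10)

for db in (3, -3, 10, -20):
    print(f"{db:+} dB -> power ratio {db_to_power_ratio(db):.2f}")
# +3 dB -> power ratio 2.00 (roughly double)
# -3 dB -> power ratio 0.50 (roughly half)
# +10 dB -> power ratio 10.00
# -20 dB -> power ratio 0.01
```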
The 80 GHz to 100 GHz range looks interesting as well, from an attenuation perspective. Because of the physics of radios, tiny antennae work well at these frequency ranges. There are line of sight and transmit power issues.
In many cases, the other issue is access to the core network (middle mile access). A robust local access network is only as good as the bandwidth and pricing of the connections to the backbone networks.
But maybe everything can align: unlicensed spectrum, smart people doing the algorithms, lots of people willing to share to build a big "public" or "community" network, a sustainable revenue model and adequate middle mile connections.
To be sure, incumbent service providers will not wish for such a scenario. But the problem probably is just hard enough, and small enough, in terms of revenue impact, to allow some room for experimentation.
And, one hopes, experimentation could lead to new ways of supplying access, without upsetting notions of fairness.
Thursday, March 7, 2013
Execs Think "New IP Services" Will Drive Revenue in 2013. Really?
There is something more than a little familiar about the way surveyed industry leaders are thinking about Long Term Evolution and 4G services. Namely, polled executives logically, but perhaps somewhat uncritically, seem to believe that “new IP-based services” will be a “main driver of revenue for operators in 2013.”
The SAP-sponsored survey of attendees at the GSMA Mobile World Congress found that 36 percent of respondents believe “improved data speed” was among the reasons to offer LTE, while 30 percent believed “offering new IP-based services” was a primary driver for launching LTE.
No rational person would deny the soundness of those opinions. But some might recall what people were saying about 3G, and recall that exactly the same things were said.
It isn’t that the hopes are unrealistic, or out of place. But it would also be fair to say that it is unlikely “new IP-based services” really will become a “main driver of revenue for operators in 2013.”
That, some of us might say, is completely wishful thinking, unless people start defining “old things as new things” so that legacy revenue is counted as “IP new services revenue.”
Some 58 percent of respondents believe that “new offerings, including rich communication services (RCS) and 4G/LTE and increased data usage by smartphone users, will drive operator revenue.”
And there you see the problem. Many service providers are going to offer RCS services at no charge. So there is no new revenue. For those who do plan to charge, one might predict, with no further information, that any such new revenue will be rather small in magnitude.
Defining “4G/LTE” as a “new IP service” really only substitutes 4G broadband access for 3G-based broadband access. You can call the 4G services “a new IP service” in the same way that service providers define 20 Mbps “standard access” and 50 Mbps or 100 Mbps as “gold service.”
Is that really a “new IP service,” or a different tier of the same service? Sure, there are different product codes, but you get the point.
About a third of respondents based in Europe and North America expected that “new services” such as video broadcast and movies on demand would be the key to generating revenue in 2013.
To be fair, a service or product is “new” for a particular service provider who hasn’t offered it before. But do you really consider entertainment video to be a “new IP application?” It might be an “existing IP app we haven’t sold before,” but it isn’t quite what some might have in mind.
But 40 percent of Asia-based respondents placed much greater significance on increased data usage by smart phone users.
Don’t get me wrong. It’s a good thing to make more revenue by selling faster broadband access. Personally, I hate having to use 3G when I am used to 4G.
But I wouldn’t say 4G is a “new IP service.” It’s a better version of an access service.
Will “new IP services” generating significant revenue emerge? Eventually. Will truly “new” services generate significant revenue in 2013? That is doubtful. People are having a hard time demonstrating something truly new.
Faster is better. Lower latency is better. Better experience is also worthwhile. But some might argue the really new things are off in the distance someplace.
New Ways Milliseconds Can Matter
Milliseconds long have mattered for high frequency traders, since a time lead of that magnitude can translate into hundreds of thousands of dollars of extra profit on a trade.
Such algorithmic trading is handled by computers, not live humans, so trading speeds are limited by processors, software and sometimes even the distance between two computers that are parties to executing a trade.
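As a rough, back-of-the-envelope illustration of why distance matters, and not a description of any particular trading link: light in optical fiber travels at roughly two thirds of its vacuum speed, so every additional 100 km of path adds on the order of half a millisecond of one-way delay.

```python
SPEED_OF_LIGHT_KM_S = 299_792   # vacuum speed of light, km per second
FIBER_VELOCITY_FACTOR = 0.67    # typical fraction of c in optical fiber

def one_way_delay_ms(distance_km: float) -> float:
    """One-way propagation delay over optical fiber, in milliseconds."""
    return distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_VELOCITY_FACTOR) * 1000

for km in (10, 100, 1000):
    print(f"{km:>5} km -> {one_way_delay_ms(km):.2f} ms one way")
# 10 km -> 0.05 ms, 100 km -> 0.50 ms, 1000 km -> 4.98 ms
```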
But Adobe now believes such differences of milliseconds might someday be important for digital marketing platforms as well. It is a bit of hyperbole at the moment.
But Adobe argues that such speed advantages may separate winners and losers, according to Brad Rencher, Adobe SVP.
Adobe is aiming to build a millisecond of a lead on the competition. It turns out that the millisecond is the one that occurs between the last piece of data a consumer “gives” a system and the content with which the system responds.
What happens in that millisecond? The system needs to correlate, manipulate, measure and analyze all the various pockets of data it has on the consumer and then choose, assemble and display the relevant content to her.
The content that these algorithms choose, and that the infrastructure renders, must encourage the consumer to take a preferred action, especially buying something.
The analogy is not quite perfect, since milliseconds are not relevant for human cognition or perception. But milliseconds might be a reasonable way of describing latency as computers experience it when crunching enormous quantities of data before presenting some sort of solution in a marketing context.
In other words, in the big data environment, when evaluating data and then assembling offers, for example, milliseconds might matter when trying to find meaningful signals about what people are doing right now, where they are, and what device they are using, and what they might want.
So milliseconds might matter if that data has to be correlated with what is known about the customer from all available sources of data (CRM, social) to create a more granular view of a particular person’s values, past behavior and buying preferences. Then the objective is to use that understanding to predict what content must be delivered, immediately and in context, to achieve a commercial objective.
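Here is a minimal sketch of that millisecond window, using hypothetical data and toy scoring logic rather than Adobe's actual system: assemble what is known about a visitor, score a few candidate offers, pick one and measure how many milliseconds the decision took.

```python
import time

# Hypothetical profile assembled from CRM, behavioral and social data
profile = {"device": "tablet", "recent_category": "shoes", "loyalty_tier": 2}

# Hypothetical candidate offers with simple relevance rules
offers = [
    {"id": "free-shipping", "category": "any",   "min_tier": 0},
    {"id": "shoe-discount", "category": "shoes", "min_tier": 1},
    {"id": "vip-preview",   "category": "any",   "min_tier": 3},
]

def score(offer, profile):
    """Toy relevance score: category match plus loyalty eligibility."""
    points = 0
    if offer["category"] in ("any", profile["recent_category"]):
        points += 1
    if profile["loyalty_tier"] >= offer["min_tier"]:
        points += 1
    return points

start = time.perf_counter()
best = max(offers, key=lambda o: score(o, profile))
elapsed_ms = (time.perf_counter() - start) * 1000

print(f"Chose '{best['id']}' in {elapsed_ms:.3f} ms")
```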
In that sense, milliseconds might matter, even if humans cannot apprehend delays that small.
That might also imply that milliseconds could matter, in new ways, for communication networks other than those supporting high frequency trading systems.
Kentucky Deregulates, AT&T Invests
Some observers do worry that a growing wave of deregulation regarding universal service obligations on dominant telcos is going to be bad for consumers. In Kentucky, for example, Senate bill SB88 repeals statewide service obligations on incumbent telcos regarding landline services, which formerly had to be provided statewide.
Under provisions of the new bill, incumbents must continue providing basic local exchange services to residences where those residences currently exist, and when such residences are located in areas with fewer than 5,000 housing units.
Incumbents do not have to provide basic local exchange services when the carrier offers an alternative telephone service, when there are at least two providers offering telephone services in the area or when there is at least one provider of broadband service that is capable of delivering telephone service.
It might not be coincidental that AT&T has announced it now will spend between $600 million and $800 million from 2013 to 2015 to support its current network capabilities and expand access to broadband services in Kentucky.
Perhaps observers should not worry so much. There appears to be no shortage of third party, independent service providers more than happy to compete for customers in Kentucky and elsewhere.
If the fear is that AT&T will withdraw from some markets, that only increases the market opportunity for other providers eager to fill the gap. One only has to spend time with wireless ISPs to understand that.