Thursday, February 16, 2017

Why New Forms of Spectrum Sharing Are Important

Spectrum sharing is highly unusual in the telecom business, with one exception: Wi-Fi. In the unlicensed Wi-Fi bands, users and service providers have been free to offer access, so long as interference is avoided, to the extent the standards allow. Basically, devices must limit output power, and interference therefore becomes a matter of how many devices want to use any particular block of spectrum at a particular place and time. 

Nevertheless, spectrum sharing is about to have much greater impact, as a growing range of countries are looking at how new forms of spectrum sharing can efficiently and effectively increase the amount of usable communications spectrum.

The difference is that the new forms of sharing will allow new users access, on a conditional basis, to spectrum already licensed to other users, typically government entities. The basic idea: when the licensed user has surplus bandwidth not used or lightly used, it will be possible for secondary users to receive licenses to use that same spectrum, so long as that use does not impinge on primary licensee usage. 

In at least some cases, as in the U.S. market, once primary and secondary licenses are issued, it might still be possible for available remaining spectrum to be used, opportunistically, by a third tier of users who have no guaranteed access rights. In such instances, access will operate as does Wi-Fi: best effort only, and only to the extent primary and secondary users have no immediate need for access. 

It is possible the first impact will be felt in a few licensed bands (3.5 GHz in the United States, 2.3 GHz in Europe), where regulators and users will essentially test the workability of such shared spectrum systems. Beyond that, it is likely that sharing also will happen in at least some new millimeter wave bands not presently used for communications purposes.

Spectrum matters because communications matters, and wireless and mobile communications now dominate all communications globally.

Spectrum sharing matters because communications spectrum is a scarce asset, and demand is growing very fast, both because billions of new Internet access users will come online, and because new Internet apps and devices consume vastly more bandwidth.

There is, for example, almost no uncommitted communications spectrum available in the sub-2-GHz range.

So flexibility and efficiency gains are to be welcomed. Traditionally, spectrum has been licensed to specific users, typically with limitations on what they can do with that spectrum, as well as technology-prescribed conditions.

That inflexibility is an issue when demand changes faster than regulation, which is to say nearly always.

Though there is an expectation that much spectrum in millimeter bands (3 GHz to 300 GHz) can be allocated for communications purposes, most of that spectrum will be severely “short range,” and hence best suited for indoor or small cell applications.

Global mobile data traffic grew 69 percent in 2014, and each succeeding mobile generation seems to grow consumption by an order of magnitude, according to Cisco estimates. Long Term Evolution (4G) devices consume an order of magnitude more data than a non-LTE device, for example.

Any smartphone tends to lead to consumption of 37 times the data of a feature phone, according to Cisco. And smartphones are becoming the standard global device. Where today 28 percent of customers use smartphones, that will grow to perhaps 52 percent by 2018.

Use of Internet access plans might reach 84 percent by 2020, according to Ericsson.

All of that means spectrum matters, as both the number of users and the amount of bandwidth those users consume will grow by an order of magnitude in five years.

The value of such spectrum is easy to illustrate. The value of licensed U.S. mobile spectrum has been estimated at $500 billion, for example, while U.S. Wi-Fi spectrum alone has been estimated to be worth $140 billion.

To be sure, spectrum sharing also introduces a new element of business model uncertainty, because spectrum sharing can replace a large measure of scarcity with a large measure of abundance.

And abundance means lower value for licensed spectrum, even as it increases the range of sustainable business models that can be built on spectrum.

Nearly all of the most-useful communications spectrum already has been allocated, and much spectrum is inefficiently used.

Today, the U.S. government, for example, holds almost 60 percent of radio spectrum, including over half (1,500 MHz) of the valuable 300 MHz to 3 GHz range useful for terrestrial wireless and mobile communications.

Much of that spectrum is lightly used or even not used. At a time when most observers believe people, organizations and businesses will need vastly more Internet and communications capacity, that is a waste of scarce resources.

So the thought naturally occurs: can those users continue to have communications functionality while allowing others to create commercial services? Traditionally, that would have required high cost and much time, neither desirable when Internet and communications demand changes so rapidly.

In fact, a 2012 National Telecommunications and Information Administration report found that moving federal users completely out of the 1755-1850 MHz band would cost approximately $18 billion and take 10 years.

And that is the reason spectrum sharing is so important. It holds the promise of communications abundance.

Spectrum sharing is a more-efficient way to maximize use of scarce resources, at less cost and delay than required to clear spectrum the old-fashioned way.

Spectrum Sharing Now is Commercially Feasible

Spectrum sharing now is practical because we are able to apply cheap and sophisticated signal processing to communications tasks. As a result, virtually all communications spectrum can be used more efficiently and effectively.

Cheap and sophisticated signal processing allows commercial use of millimeter wave spectrum (3 GHz to 300 GHz) for the first time. The same advances allow us to use existing spectrum more efficiently, moving beyond simple frequency or spatial separation.

Those methods work, but also create fallow resources. Since nobody but the licensee can use the capacity, when the licensee is not using spectrum, nobody else can use it, either. In some cases, as in the United Kingdom and United States, as little as 10 percent of spectrum gets used. In other cases, none of the capacity is used.

Two fundamental approaches now are feasible to allow many users to share capacity without causing interference to existing licensed users, but also vastly expanding the amount of capacity available to support communications and apps.

Devices themselves, or databases, are able to sense or predict where interference would occur, and then shift access operations to non-interfering frequencies or channels. Cognitive radio is an example of the former approach; databases an example of the latter approach.
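
As an illustration only, here is a minimal sketch of the database approach in Python. The registry records, protection radii and channel numbers are hypothetical; real systems (TV white space databases, for instance) apply far more detailed propagation and protection rules.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class IncumbentAssignment:
    channel: int          # channel index within the shared band (illustrative)
    x_km: float           # incumbent transmitter location on a simple grid
    y_km: float
    protection_km: float  # radius inside which secondary use is not allowed

def free_channels(database, x_km, y_km, all_channels):
    """Return channels a secondary device may use at (x_km, y_km),
    i.e. channels with no incumbent protection zone covering that point."""
    blocked = {
        a.channel
        for a in database
        if hypot(a.x_km - x_km, a.y_km - y_km) <= a.protection_km
    }
    return [ch for ch in all_channels if ch not in blocked]

# Illustrative use: two incumbents, and a device asking what it may transmit on.
db = [
    IncumbentAssignment(channel=1, x_km=0.0, y_km=0.0, protection_km=10.0),
    IncumbentAssignment(channel=3, x_km=50.0, y_km=0.0, protection_km=25.0),
]
print(free_channels(db, x_km=5.0, y_km=0.0, all_channels=[1, 2, 3, 4]))
# -> [2, 3, 4]: channel 1 is blocked at this location; channel 3's zone is far away.
```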

In other words, where we traditionally have used “command and control” methods–giving certain entities exclusive rights to use certain channels or blocks of spectrum–it is commercially feasible to use other methods that efficiently reclaim unused spectrum.

As Google Principal Wireless Architect Preston Marshall has noted, traditionally we had to isolate users, usages or technologies in order to protect against interference. Those are no longer the only choices.

Licensing traditionally has used frequency and spatial division (different frequencies or geographies) to prevent interference. Today, we can use sensing or databases to allow users to share any specific block of spectrum or channels, while still avoiding interference.

The implications are very clear: though physical spectrum is a scarce resource, we often use such resources inefficiently. As Ofcom has noted, in many cases licensed or unlicensed spectrum actually is used at about 10 percent of theoretical maximums.

Spectrum Sharing Can Take Many Forms

Spectrum sharing is the simultaneous usage of a specific radio frequency band in a specific geographical area by a number of independent entities.  Simply, it is the “cooperative use of common spectrum” by multiple users.

Spectrum sharing also can take many forms, coordinated and uncoordinated. Coordinated forms include:
  • capacity sharing between business entities (roaming, wholesale, pooling of assets)
  • TV white spaces (database determines what you may use, when and where)
  • spatial sharing between business entities (you use here, I use there)
  • priority sharing between entities (I have first rights, you have secondary rights); licensed shared access and authorized shared access are examples
  • license assisted access (bonding of mobile and Wi-Fi assets)
  • cognitive radio (devices determine how to avoid interference)

Uncoordinated forms of access historically are best illustrated by Wi-Fi.

The point is that spectrum sharing can take a number of forms, some confined to contracts and agreements between economic actors while others arguably are more profound.

One might argue that liberalized leasing or trading rules represent a simple case for spectrum sharing.

Forms of sharing that enable shared use of currently-licensed spectrum arguably are among the most innovative.

Somewhere in the middle is the use of cognitive radio or database approaches to allow shared use of new spectrum, whether licensed or license-exempt.

In some cases, sharing is a business arrangement between entities. Historically, mobile virtual network operator wholesale is a form of sharing. So too is “roaming,” in a sense. In other cases, mobile operators might agree to pool and share licensed spectrum assets.

The arguably more important forms of spectrum sharing use new technology to intensify the use of existing spectrum, such as licensed shared access (LSA), which allows many users to share a specific block of spectrum.

The concept is to free up capacity quickly by allowing commercial users access to currently-licensed spectrum on a secondary basis, while licensed users continue to retain priority use of their spectrum.

The advantage is that such sharing avoids the huge time and expense of relocating existing users so other users can move in.

So far, thinking has centered on such sharing at 2.3 GHz in some regions and 3.5 GHz in others.

Licensed shared access (LSA) and authorized shared access (ASA) illustrate the concept.
Such sharing allows licensed services to share spectrum in a band with new users without disrupting existing users, while still increasing the amount of spectrum available for other users.
The new form of licensing is under formal review in the United States and European Union, and will be addressed by the International Telecommunication Union.

This is important for a number of reasons, the most important reason being that it is less disruptive than moving users from their current bands to give access to new users. Not only does this approach save the significant costs for relocating users and their access gear from one frequency to another, it also creates new capacity much faster than any relocation approach requires.

Under the licensed shared access approach, additional users can use the spectrum (or part of the spectrum) in accordance with sharing rules that protect incumbents.

Such approaches almost always will require incentives for the incumbent users to permit sharing.
That might include direct payments from the new user or the regulator; payments to upgrade equipment or to take other costly actions that would facilitate sharing; or savings on fees paid to the regulator for underused spectrum.

In Europe, such sharing likely will emerge first in the 2.3 GHz band, to support mobile services. LSA is being worked on in France, Finland, Italy and the Netherlands.
The United States is developing an approach to sharing in the 3.5 GHz band as well. The U.S. approach envisions a three-tier model, with protected incumbent access, priority access (some interference protection) and general authorized access (opportunistic access without interference protection).
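
A rough sketch of how that three-tier priority logic can be expressed, assuming a hypothetical grant list and channel numbering; in practice the U.S. rules assign this arbitration to a spectrum access system with far more detailed protection criteria.

```python
from enum import IntEnum

class Tier(IntEnum):
    # Lower value = higher priority, following the three-tier U.S. 3.5 GHz model
    INCUMBENT = 0   # protected incumbent access
    PRIORITY = 1    # priority access (some interference protection)
    GENERAL = 2     # general authorized access (opportunistic, no protection)

def grant_request(active_grants, requested_channel, requester_tier):
    """Illustrative rule: grant a channel only if no equal- or higher-priority
    user already holds it; general authorized users receive no protection."""
    for tier, channel in active_grants:
        if channel == requested_channel and tier <= requester_tier:
            return False
    return True

def incumbent_reclaims(active_grants, channel):
    """When an incumbent reclaims a channel, lower-tier users must vacate it."""
    return [(t, ch) for (t, ch) in active_grants
            if ch != channel or t == Tier.INCUMBENT]

# Example: a priority licensee holds channel 7; a general authorized access
# request for channel 7 is denied, but a request for channel 8 is granted.
grants = [(Tier.PRIORITY, 7)]
print(grant_request(grants, 7, Tier.GENERAL))   # False
print(grant_request(grants, 8, Tier.GENERAL))   # True
```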

An Era of Abundance and Change

Spectrum sharing is one method by which vast amounts of new communications spectrum (hundreds of megahertz) can be made available faster and more affordably than would be the case if current users were relocated.

The big coming change is that abundant and affordable computing now makes possible both the use of spectrum that was commercially unusable in the past and shared use of spectrum that is inefficiently used at present, without relocating existing licensed users, which is both expensive and time consuming.

Spectrum is valuable. That will continue to be the case. Exclusive rights to use spectrum create the foundation for commercial applications and also confer business advantage.
But exclusivity increasingly will be challenged.

What will spectrum sharing mean for Internet service providers and consumers? How might industry dynamics and the supply of Internet access services change? Who wins, who loses?
Those are the sorts of spectrum sharing issues policymakers, ISPs and their suppliers must confront, and why Spectrum Futures, a forum for “whole ecosystem” consideration of those issues, exists.

Simply, spectrum sharing affects the future of telecommunications and all businesses built on the use of communications.

So spectrum sharing directly encourages and shapes whole business models: partly by increasing the amount of available spectrum, partly by reshaping the value of licensed spectrum and partly by creating space for new business models potentially built on either cheaper spectrum or new unlicensed spectrum.

To the extent that use of licensed spectrum has underpinned mobile service provider (and other provider) business models, increased reliance on shared spectrum and license-exempt spectrum will reshape the fortunes of whole industries.

“The norm for spectrum use should be sharing, not exclusivity,” according to the President’s  Council of Advisors on Science and Technology report.

Be clear on this matter: spectrum sharing can be viewed as good public policy because it more efficiently makes available lots more spectrum for Internet access, mobile communications and other applications humans, governments, companies and industries find useful.

Spectrum sharing also represents a revolution in spectrum policy, a challenge to business models based on spectrum scarcity and an opportunity for business models based on sustainability.
We might all readily agree that freeing up lots of new spectrum, unlicensed or at low cost, will be directly helpful to the project of getting billions of new people connected to the Internet, not to mention the future business of connecting sensors and devices to the Internet.

What also matters, though, is the sustainability of business models that support Internet access providers. To the extent that scarcity underpins business models, less scarcity might be a threat of some magnitude.

Conversely, to the extent that license-exempt access supports many other endeavors, more spectrum–licensed or unlicensed–creates additional possibilities across the ecosystem.
And as we already have seen with Wi-Fi offloading of mobile device Internet access, the implications are subtle and complex.

Mobile and fixed service providers now see license-exempt access as part of the access infrastructure, even if larger amounts of unlicensed communications spectrum also might be competitors to mobile access.

Conversely, access to unlicensed spectrum also underpins other business models, including models that might envision use of license-exempt spectrum to create substitutes for some “mobile” services, at some times and some places.

To this point, mobile phone services (among others) have exemplified the former; Wi-Fi the latter. But something very new is happening: licensed and unlicensed capacity are being used in new ways to support all sorts of business models.

As a general rule, we should assume that both licensed and license-exempt communications spectrum will be supported in the future. The issue is how much of each will be used.
Ofcom, the United Kingdom communications regulator, identifies three areas where spectrum sharing will be important:
  • indoor use, generally Wi-Fi
  • outdoor use, generally mobile
  • Internet of Things (IoT), on a variety of platforms
That might be too limited a list. But you get the idea: spectrum sharing is significant because it allows relatively rapid and affordable increases in communications spectrum, without the time and expense of relocating existing users.

Spectrum sharing is important because it also allows more efficient use of new bands of spectrum, often without the expense and overhead of command-and-control mechanisms.

Spectrum Sharing Matters

Spectrum sharing matters because communications spectrum is a scarce asset, and demand is growing very fast, both because billions of new Internet access users will come online, and because new Internet apps and devices consume vastly more bandwidth.

Even as national regulators release new blocks of spectrum for communications use, we also can use new technologies to improve the usage of valuable licensed communications spectrum, without the disruption of relocating existing users.

In one sense, spectrum is artificially scarce, the result of “command and control” licensing. In other words, when spectrum can only be used by one set of users, and those users do not use the assets, the capacity is “wasted.”

That might especially be the case for licensed government spectrum, where users do not have any economic incentives to maximize use of the asset.  

Efficient use of latent and already allocated bandwidth is possible and necessary. Consider present allocations of spectrum.

Total mobile spectrum allocations, and spectrum per subscriber, vary widely by market:
  • United States: 608 MHz (2.1 Hertz per subscriber)
  • France: 555 MHz (9.3 Hertz per subscriber)
  • Germany: 615 MHz (6.2 Hertz per subscriber)
  • Italy: 540 MHz
  • Japan: 500 MHz
  • Spain: 540 MHz (11.8 Hertz per subscriber)
  • India: 220 MHz (0.2 Hertz per subscriber)
India stands out because it is a market where voice connections on second generation networks dominate. As 3G and 4G networks come online, and more customers use mobile Internet access services, bandwidth needs will grow an order of magnitude initially.
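
The per-subscriber figures are simply total licensed mobile spectrum divided by subscriber count. The short sketch below back-calculates the implied subscriber bases from the numbers quoted above; treat the results as rough illustrations, not official counts.

```python
# Hertz-per-subscriber = total licensed mobile spectrum / subscriber count.
# Spectrum totals and per-subscriber figures are those quoted in the text;
# the subscriber counts are simply implied by dividing one by the other.
spectrum_mhz = {"US": 608, "France": 555, "Germany": 615, "Spain": 540, "India": 220}
hz_per_sub = {"US": 2.1, "France": 9.3, "Germany": 6.2, "Spain": 11.8, "India": 0.2}

for country, mhz in spectrum_mhz.items():
    subscribers = mhz * 1e6 / hz_per_sub[country]
    print(f"{country}: {mhz} MHz at {hz_per_sub[country]} Hz per subscriber "
          f"-> roughly {subscribers / 1e6:.0f} million subscriptions")
# The United States works out to roughly 290 million subscriptions,
# India to roughly 1,100 million (about 1.1 billion).
```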

Beyond that, the impact of smartphones, Internet access and changing application consumption is clear: bandwidth requirements continually increase. In fact, consumer mobile data consumption has grown at 57 percent annually.

At growth rates that high, a variety of remedies are necessary, but more efficient use of existing spectrum must be part of the solution. Spectrum sharing is key in that regard.

Wednesday, February 15, 2017

Huntsville Now Will Have 4 Gigabit Providers

With the activation of gigabit internet access by Mediacom in Huntsville, Ala., four providers now will offer gigabit service in various parts of the city: Mediacom, AT&T, Google Fiber and WoW.

As so often happens in competitive markets, there is no such thing as permanent competitive advantage. As Google Fiber started building gigabit markets, telcos, cable companies and some overbuilders responded in kind, though in most markets only the cable operator offers ubiquitous gigabit service across its whole footprint.

What remains to be seen is whether market share actually shifts after all the upgrades are completed and full marketing has had time to produce results. If CenturyLink’s experience is replicated, the biggest impact might not be significant shifts in market share. There might be some share shifts, but perhaps nothing too dramatic.

Instead, most of the suppliers might find that existing customers upgrade to higher-speed tiers, but perhaps not to full gigabit speeds. CenturyLink, for example, has found that after it begins marketing gigabit internet access, the biggest change is an upsurge of orders for 40 Mbps service.

Perhaps ironically, the impact of gigabit access competition in Huntsville might not be so much to shift market share as to incentivize customers to upgrade to services faster than what they had been buying, but short of full gigabit speeds. The revenue impact then depends on how ISPs adjust prices in the wake of gigabit tiers being added. Modest upward movement in average revenue per account is likely; big changes are not.

In "Winner Take All" Markets, Even "Winning" Can Mean "Losing"

Verizon has what some would consider ambitious goals for its digital advertising business, which can be summarized as becoming the choice for advertisers who do not want to use Facebook or Google.

That is why Verizon is buying Yahoo. Verizon will need much more scale to become a credible alternative to Facebook and Google, which together claim nearly 80 percent of all mobile advertising revenue.

It might not make much sense to argue that Verizon aims to become the number-three provider, for a number of reasons. Tier one providers never claim they are aiming to be “number three” in any big market. Also, what it might eventually mean to be the default choice of advertisers not preferring Google or Facebook is a bit unclear. Right now, it might only take five percent market share to be “number three” in mobile advertising. Verizon would never aim so low.

Would seven percent share be a reasonable expectation? Yes. Most stable markets, over time, tend to have a structure where, whatever the share of the number-one provider, number two has about half that share, while number three has half the share of the number-two provider.

Were it to complete the purchase of Yahoo, Verizon would have a shot at becoming the number-three provider, but Verizon in all likelihood would have to add even more assets to secure that position.

If, eventually, the leader of the mobile ad market has 40 percent share, one would then expect number two to have about 20 percent share, while number three has 10 percent share. Real-world markets always tend to diverge somewhat from the predicted pattern, but stable markets often take that shape.

But those patterns also suggest why it will be difficult for Verizon to do much better than achieve the number-three market share position. Many “digital” markets are widely considered to be “winner take all” affairs, where leadership can well mean market share of much more than 40 percent.

That seems to hold true for e-commerce, search, social media and other emerging markets (lodging apps, ride hailing).

So unless Facebook and Google really mess things up, it will be hard, under the best of conditions, for Verizon to challenge either for leadership in the mobile advertising platform business.


source: eMarketer

Old Lines are Blurring, Regulation Has to Follow

One persistent issue for communications and media regulators, policymakers and antitrust officials is that technology and business models have rearranged our notions of “who” service providers are, “what” a service is, and “where, when and how” such services are provided. That, in turn, forces a rethinking of “who” operates in a particular industry, how to draw industry boundaries and “what” needs regulating.

Those problems are going to evolve further as boundaries between communications, content, shopping and transaction processing continue to morph.

It remains true that regulations treat voice over IP differently, at times, than “carrier voice.” Cable TV companies are regulated differently than telcos or third-party internet service providers who also sell entertainment video or voice services. Google is an ISP as well as a mobile service and voice provider. Facebook provides messaging, Apple supplies video calling, while Amazon sells entertainment video.

And though spectrum usable for communications historically has been quite scarce, that might not be true in the future. Nor will the potential users of such spectrum--to create services and apps--be so limited as in the past.

As Amazon and Google voice-activated home appliances come to be used as speakerphones, as Apple emerges as a bigger player in content services and as Comcast and Charter enter the mobile business, the boundaries between industries will blur.

As we soon will see in the U.S. market, that is likely to pose problems for regulators asked to approve mergers where traditional application of industry concentration metrics is complicated.

The longer-term implications are that it might not be possible--or even necessary--to regulate in the older ways (by industry segment, by technology, by supplier, by business model). Interference protection, in the area of spectrum use, might become more vital than licensing restrictions.

It might not be easy  to measure industry concentration when apps and services are provided across traditional industry lines, using many technologies and business models, in different volumes.

Tuesday, February 14, 2017

OTT Provides Better Video Business Model Than Linear Video, CenturyLink Believes

Linear video always has been a tough business model for any small telco. That also seems to be true even for firms the size of AT&T, Verizon and CenturyLink, for several reasons.

Verizon earns modest amounts of total revenue from anything other than mobility services. Mobility generally contributes as much as 70 percent of total revenue, with fixed network business and consumer services representing 30 percent. AT&T earns more total revenue from its fixed network than does Verizon, but consumer video service revenue mostly is generated by the DirecTV operation, not fixed network video.

AT&T only had about five percent market share in the linear video subscription business prior to its acquisition of DirecTV. Now AT&T is the single largest linear video provider, but on the strength of satellite delivery, not fixed network market share.

So video services represent growing amounts of revenue, but not at a significant  level. “We don't look to video as a significant revenue and EBITDA contributor in 2017,” CenturyLink has said.

Also, all consumer revenues only contribute about 25 percent of CenturyLink total revenues, which mostly are earned selling services to business customers.


“If you look at our Prism product, as you know, we've talked about content costs have really gone out of sight the last couple of years,” said Glen Post, CenturyLink CEO. “If you look at the margins, sometimes actually negative margins,” he added.

In addition to the “cost of goods sold” problem, CenturyLink costs include truck rolls and installation capital and operating cost.  

“With over-the-top product, we don't have to make a truck roll,” said Post. “We have much wider availability due to lower bandwidth requirements of over-the-top.”

In other words, the business model for over the top TV is better than for linear video.

Why Telcos Do Not Sell Mobile PBX Services: Market is Way Too Small

Every now and then, someone advocates a “mobile PBX” service that would allow enterprises and other businesses and organizations to map organization phone numbers to personal cell phones, probably using a virtualized switching scheme that does away with the need to own a business phone system.

Some suggest mobile service providers are “dumb” for not offering such a service and capability. Maybe not. For starters, the incremental revenue opportunity might be relatively small, and the relative hassle and cost relatively high.

The global market for sales of business phone systems is probably in the neighborhood of $6.4 billion annually.

That is a fairly small market for all telcos globally, if one assumes that the actual revenue earned by a telco selling such a system would be some fraction of the sales price. Assume a 10-percent profit margin on the direct value of sold merchandise.

That means the global revenue opportunity (before overhead and sales costs) is about $640 million annually, and that is the worldwide total. It is far too small a business for even a single tier-one provider to want to tackle: the cost of setting up a global sales and service organization of that scope would be prohibitive.
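
The arithmetic behind that conclusion, using the figures above; the five-percent share in the last line is purely hypothetical, to show how small even an optimistic slice would be.

```python
# Back-of-envelope from the figures in the text: global business phone system
# sales of about $6.4 billion a year, with a telco keeping a 10 percent margin.
global_sales = 6.4e9   # annual business phone system sales, worldwide
telco_margin = 0.10    # assumed share of sale value a telco would capture

margin_pool = global_sales * telco_margin
print(f"Worldwide margin pool: ${margin_pool / 1e9:.2f} billion per year")  # $0.64 billion

# Even if a single operator somehow captured, say, 5 percent of that worldwide
# pool (a hypothetical share), the prize is tiny for a tier-one carrier.
print(f"5% share: ${margin_pool * 0.05 / 1e6:.0f} million per year")        # $32 million
```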

Some might argue that is not a problem, as a virtual PBX service would net recurring revenues of greater interest. But the problem there is the same issue that has likely prevented even more organizations from buying phone systems. After some point, buying multiple lines of service becomes more expensive than buying a business phone system, as suppliers of managed business phone services can readily attest.

If you want to know why mobile service providers have not moved sooner to create mobile PBX services, the business model explains why.

The market is simply too small to bother with. Much the same problem is encountered with any proposed mobile unified communications offer. The market simply is way too small for a telco to attempt to provide.



IoT Will Require Mobile Network Overlay

Much as some mobile industry executives might prefer a single network platform that supports all conceivable apps--low bandwidth and high bandwidth;  latency insensitive and latency dependent; low cost per bit and higher cost per bit; human and machine users; mobile and fixed--that does not seem likely to happen in the 5G era, Cisco believes.

It might well develop that low-bandwidth sensors, requiring unusually low connectivity prices, are best supported by an overlay IoT network, rather than some variant of either 4G or 5G networks.

Cisco sees specialized low power wide area (LPWA) networks growing from seven percent of device connections in 2016 to 31 percent by 2021, with mobile operators deploying such networks as an overlay.

Globally, M2M connections will grow from 780 million in 2016 to 3.3 billion by 2021, a 34-percent compound annual growth rate, Cisco forecasts.

 



Wearable devices might normally be considered the category of IoT appliances that use mobile network connections the most. Maybe not. By 2021, Cisco estimates there will be 929 million wearable devices globally, growing nearly threefold from 325 million in 2016 at a CAGR of 23 percent.

But only about seven  percent will have embedded cellular connectivity by 2021, up from three percent in 2016.


In 5G Era, Will Wi-Fi Offload Still be So Important? Probably Not

Offload of mobile device connections from the mobile network to Wi-Fi has been a generally-growing trend for some time, steadily increasing from 2G to 3G to 4G. But that might well change with 5G, where so much capacity, and lower costs, might provide clear incentives to remain connected on the mobile network, most of the time.

Some 31 percent of mobile device traffic was offloaded on 2G networks, 45 percent on 3G networks and 66 percent on 4G. That might well reverse in the 5G era, where Cisco suggests 48 percent of mobile device traffic will be offloaded. In some markets, where tariffs are encouraging, and capacity is not an issue, the offload percentage might be far lower than that.


In 2016, 63 percent of all traffic from mobile-connected devices was offloaded to the fixed network by means of Wi-Fi devices and femtocells each month, Cisco says.

Of all IP traffic (fixed and mobile) in 2021, 50 percent will be Wi-Fi, 30 percent will be wired and 20 percent  will be mobile.

Cisco argues that offload volume is determined by smartphone penetration, dual-mode share of handsets, percentage of home-based mobile Internet use, and percentage of dual-mode smartphone owners with Wi-Fi fixed Internet access at home.

Some of us might argue that, in addition to those issues, retail tariffs and network capacity will play a key role, perhaps even a decisive role. The reason is that consumers are rational when it comes to paying for access. If remaining on the mobile network provides a reasonable user experience at reasonable cost, the incentive to offload is reduced (even if it is very easy, and seamless, for devices to switch).

That noted, not all consumers will be on 5G connections in 2021, so incentives to offload will still exist.

For that reason, the amount of traffic offloaded from smartphones will be 64 percent by 2021, and the amount of traffic offloaded from tablets will be 72 percent.

Cisco notes that data caps and costs are issues. “Some have speculated that Wi-Fi offload will be less relevant after 4G networks are in place because of the faster speeds and more abundant bandwidth,” Cisco’s Visual Networking Index staff notes. “However, 4G networks have attracted high-usage devices such as advanced smartphones and tablets, and now 4G plans are subject to data caps similar to 3G plans.”

“For these reasons, Wi-Fi offload is higher on 4G networks than on lower-speed networks, now and in the future according to our projections,” Cisco staff say.

But the same report also suggests that, on 5G networks, Wi-Fi offloading will decrease. “As 5G is being introduced, plans will be generous with data caps and speeds will be high enough to encourage traffic to stay on the mobile network instead of being offloaded, so the offload percentage will be less than 50 percent,” they say.

Wi-Fi and mobile traffic both are growing faster than fixed traffic (traffic from devices connected to the network through Ethernet).

Fixed traffic will fall from 52 percent of total IP traffic in 2015 to 33 percent by 2020, as a result.



Video Drives 75% of Mobile Data Traffic

More than 75 percent of the world’s mobile data traffic will be video by 2021. Mobile video will grow about 900 percent between 2016 and 2021, accounting for 78 percent of total mobile data traffic by the end of the forecast period.

The average smartphone will generate 6.8 GB of traffic per month by 2021, a fourfold increase over the 2016 average of 1.6 GB per month. By 2021, aggregate smartphone traffic will be seven times greater than it is today, with a CAGR of 48 percent.
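
Those multiples follow directly from the stated growth rates; a quick check, using only the figures above and the standard compound-growth relationship.

```python
# An x percent CAGR over n years multiplies traffic by (1 + x) ** n.
def growth_multiple(cagr, years=5):
    return (1 + cagr) ** years

print(round(growth_multiple(0.48), 1))  # ~7.1x: 48% CAGR -> "seven times" aggregate smartphone traffic
print(round(6.8 / 1.6, 2))              # ~4.25x: per-device monthly usage, 1.6 GB -> 6.8 GB ("fourfold")
print(round(growth_multiple(0.34), 1))  # ~4.3x: 34% CAGR, close to the stated M2M growth, 780M -> 3.3B
```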

Asia Pacific will account for 47 percent of global mobile traffic by 2021, according to Cisco. But the Middle East and Africa will have the strongest mobile data traffic growth of any region with a 65 percent compound annual growth rate (CAGR). This region will be followed by Asia Pacific at 49 percent and Latin America at 45 percent CAGR.


China’s mobile traffic will surpass that of the United States by the end of 2017, reaching 1.9 exabytes per month versus 1.6 exabytes per month in the United States, says Cisco.

Average Mobile Speeds to Triple Over Next 5 Years

By 2021, more people will be using mobile phones (5.5 billion) than bank accounts (5.4 billion), running water (5.3 billion), or landline phones  (2.9 billion), according to the 11th annual Cisco Visual Networking Index.

Mobile data traffic will grow seven-fold over the next five years, as a direct result.

Mobile network connection speeds, on average, will increase from 6.8 Mbps in 2016 to 20.4 Mbps by 2021.

Machine-to-machine (M2M) connections will represent 29 percent (3.3 billion) of total mobile connections -- up from five percent (780 million) in 2016.

M2M will be the fastest growing mobile connection type as global Internet of Things (IoT) applications continue to gain traction in consumer and business environments, Cisco predicts.

Mobile video will increase 8.7-fold from 2016 to 2021 and will have the highest growth rate of any mobile application category, representing 78 percent of all mobile traffic by 2021.
In 2016, 60 percent of total mobile data traffic was offloaded; by 2021, 63 percent of total mobile data traffic will be offloaded, Cisco predicts.
In part, that might happen because, globally, total public Wi-Fi hotspots (including homespots) will grow six-fold from 2016 (94.0 million) to 2021 (541.6 million).
Wi-Fi traffic from both mobile devices and Wi-Fi-only devices together will account for almost half (49 percent) of total IP traffic by 2020, up from 42 percent in 2015.
What might be interesting is whether that reliance on Wi-Fi might actually decrease, in some regions, as tariffs and mobile network capabilities grow so much that users see less benefit to offloading. In the 3G era, offload made sense because user experience was better. In the 4G era, offloading makes sense mostly to avoid mobile data usage charges, as experience generally is better on the 4G network. In the 5G era, it is conceivable, at least in many areas, that no economic or performance incentive will exist for traffic offload.
source: Cisco

Three Leading U.S. Mobile Operators Launch New Promotions

When one major supplier--especially a service provider positioned as a premium brand-- announces new mobile promotional deals, in a highly-competitive market, it does not take long for the other contestants to respond.

In the U.S. market, Sprint launched new promotions. Then Verizon and T-Mobile US responded. Now the issue is whether AT&T will decide it has to move also, and what form that response might take, if AT&T concludes that its core postpaid, multi-line account base is at risk.

The most-recent wave of promotions has seen the return of some form of unlimited-use plan to all four leading mobile operators, zero rating of video and price discounting.

The Sprint unlimited plans offer unlimited mobile data, talk and text messaging for $50 per month for the first line (with automatic payment); two lines for just $90 a month; and the third, fourth and fifth lines supplied for free.

In other words, five lines cost just $90 a month. Some might note that, as always, the promotions are time limited, so the long term impact is hard to gauge. Some might argue that additional promotions tend to follow, until the suppliers reach whatever market share goals they have targeted.

The savings run through March 31, 2018. After the promotional period ends, customers will pay an additional $10 per month for the first line, the second line remains at $40 per month and each of lines three through five costs $30 per month.
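
Using the figures above, the promotional and post-promotional totals for a five-line family work out as follows.

```python
# Sprint figures quoted in the text: $50 first line, $40 second line,
# lines three through five free during the promotion; afterward the first
# line rises by $10 and lines three through five cost $30 each.
promo_total = 50 + 40 + 0 * 3
post_promo_total = (50 + 10) + 40 + 30 * 3

print(promo_total)       # 90  -> "five lines cost just $90 a month"
print(post_promo_total)  # 190 -> roughly what the same five lines cost once the promotion ends
```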

Verizon then launched new unlimited plans, something Verizon had eliminated in 2011. The Verizon Unlimited plan costs $80 a month, for unlimited data, talk and text, using paper-free billing and AutoPay features.

Both T-Mobile US and Sprint gained market share in the second half of 2016, presumably on the strength of aggressive “unlimited usage” promotions.

At the moment, all four leading U.S. carriers offer unlimited usage plans of one sort or another.

Just prior to the Verizon announcement, Sprint introduced a new family promotion, offering five lines of unlimited data for $90 per month excluding taxes and fees. Sprint's promotion lasts through March 31, 2017.

A current T-Mobile US unlimited plan costs $180 for five lines and $160 for four lines, including taxes and fees.

Verizon multi-line plans cost $45 per line for four lines. Some nevertheless are going to complain: after 22 GB of data usage on a line during any billing cycle, Verizon says it “may prioritize usage” in the event of network congestion. That “throttling” feature always is criticized in some quarters as a violation of the “unlimited” promise, but others simply see it as a “fair use” policy.

Also included are up to 500 MB per day of 4G LTE roaming in Mexico and Canada.

T-Mobile US announced the addition of high-definition (HD) video and 10 GB of T-Mobile hotspot access, at no extra charge, with monthly taxes and fees included, for customers on “T-Mobile One” plans.

T-Mobile US also introduced a new offer of two lines on T-Mobile ONE for $100 a month.  

The upgrades are available starting February 17, at no extra charge for customers on “T-Mobile One” service plans. Customers can simply activate their new features in the T-Mobile app or at my.t-mobile.com.

Previously, T-Mobile US had been offering unlimited video streaming--without usage charges on the customer’s mobile data plan--at standard definition. The latest move bumps up image quality, and also bandwidth consumption per minute of use.

Customers will get HD quality video streaming and up to 10GB of high-speed Mobile Hotspot data per month, so they can ‘tether’ a laptop or other device to access the Internet. And, after the included 10GB of high-speed data, customers still get unlimited 3G data through the end of the month.

As the latest move by T-Mobile US shows, unlimited data plans might have disruptive consequences.

Now that all four leading U.S. mobile service providers offer some form of unlimited usage plan, consumer behavior and service provider behavior become crucial. Will consumer usage increase, and by how much, where and when? Will service providers keep the unlimited offers prominent in their marketing efforts?

If mobile network usage profiles change, how will that affect quality of experience on the various networks? And what will mobile service providers have to do to maintain quality in the face of increased network demand? How much can they do, near term?

In other words, will network congestion suddenly become a much-bigger issue?

Monday, February 13, 2017

Who are the Gatekeepers in the Internet Ecosystem?

Regulators often think of "gatekeeper" power as an "access provider" issue. But some app providers arguably also share gatekeeper power within the internet ecosystem. If consumer protection and innovation are issues regulators can shape and affect, some might argue all the sources of potential gatekeeper power have to be addressed. At least, that is what Hal Singer argues.

East-West or North-South?

East-west and north-south are terms with specific meaning in the data center business. East-west means traffic that moves between servers in a single data center. North-south means traffic that moves between a data center and other locations.

But the terms also are important in a traditional geographic sense. Most global traffic flows--geographically--east and west, although a growing volume of traffic also flows north-south. That is the sense in which executives at FP Telecom describe their business.

FP Telecom's primary business is long haul transport between North America and South America. 

Hyperscale Data Centers Now Originate, Terminate Most Global Traffic

Among many other changes to the global telecom business caused by a shift to internet, mobile and cloud communications, global bandwidth patterns have changed. In the past, long haul traffic was originated and terminated at central offices. As the internet became dominant, long-haul traffic began to originate and terminate at internet exchange points and other internet points of presence.

Now, with the dominance of mobile-consumed content and apps, generally cloud-based, traffic originates and terminates heavily at hyperscale data centers.

Asia-Pacific has been the fastest growing region in terms of hyperscale data center location and will continue to grow more rapidly over the next five years, although North America will still account for 43 percent of hyperscale data centers by the end of 2020, says Cisco.

Annual global cloud IP traffic  will reach 14.1 ZB (1.2 ZB per month) by the end of 2020, up from 3.9 ZB per year (321 EB per month) in 2015, Cisco predicts, nearly quadrupling (3.7-fold) over the next five years.


Overall, cloud IP traffic will grow at a CAGR of 30 percent from 2015 to 2020. On a global basis, cloud IP traffic will account for more than 92 percent of total data center traffic by 2020.


Hyperscale cloud operators are defined by Cisco as entities meeting at least one of the following thresholds:
  • more than US$1 billion in annual revenue from infrastructure as a service (IaaS), platform as a service (PaaS) or infrastructure hosting services (for example, Amazon/AWS, Rackspace, Google)
  • more than US$2 billion in annual revenue from software as a service (SaaS) (for example, Salesforce, ADP, Google)
  • more than US$4 billion in annual revenue from Internet, search and social networking (for example, Facebook, Yahoo, Apple)
  • more than US$8 billion in annual revenue from e-commerce and payment processing (for example, Amazon, Alibaba, eBay)
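
Expressed as a simple rule, using the thresholds above; the revenue inputs in the example are hypothetical, purely to show how the classification works.

```python
# Cisco's hyperscale thresholds, as listed above, expressed as a simple check.
HYPERSCALE_THRESHOLDS_USD = {
    "iaas_paas_hosting": 1e9,        # IaaS, PaaS or infrastructure hosting
    "saas": 2e9,                     # software as a service
    "internet_search_social": 4e9,   # internet, search and social networking
    "ecommerce_payments": 8e9,       # e-commerce and payment processing
}

def is_hyperscale(annual_revenue_by_segment):
    """True if the operator clears the threshold in at least one segment."""
    return any(
        annual_revenue_by_segment.get(segment, 0) >= threshold
        for segment, threshold in HYPERSCALE_THRESHOLDS_USD.items()
    )

# Hypothetical inputs, purely to illustrate the rule:
print(is_hyperscale({"saas": 2.5e9}))              # True  (clears the $2B SaaS bar)
print(is_hyperscale({"ecommerce_payments": 5e9}))  # False (below the $8B e-commerce bar)
```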

Hyperscale data centers will grow from 259 in number at the end of 2015 to 485 by 2020, according to Cisco,  representing 47 percent of all installed data center servers by 2020.


Hyperscale data centers also represent a large portion of overall data, traffic, and processing power in data centers, accounting for 34 percent of total traffic within all data centers and driving 53 percent of in-data-center traffic by 2020.

Hyperscale data centers will also represent 57 percent of all data stored in data centers and 68 percent of total data center processing power.

Yes, Follow the Data. Even if it Does Not Fit Your Agenda

When people argue we need to “follow the science” that should be true in all cases, not only in cases where the data fits one’s political pr...