Monday, July 7, 2014

Mobile ISP Role in M2M Will Largely be the Same Role as in Mobile Phone Business, Despite All Efforts

If the "machine to machine" (M2M) or "Internet of Things" market consists of “hundreds of micro-markets,” not a single industry, as a new Vodafone report suggests, then the logical conclusion also is that the role of mobile service providers in M2M or IoT markets will substantially replicate the industry's role in the mobile phone business.

In other words, the mobile service provider role largely will involve communications connectivity, not application-layer functions. Those who worry about "dumb pipe" roles can start worrying now.

If applications are highly fragmented and discrete markets are individually small, then it will make sense for mobile service providers to focus on the general purpose communications role, not a possible role as application suppliers.

That means mobile service providers--like it or not--will be "pipe suppliers," earning nearly all their M2M or IoT revenue from access services.

There are 4.4 billion machines or devices now connected to each other or to servers, growing to 10.3 billion by 2018, a study sponsored by Vodafone predicts.
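Those figures imply a steep compound growth rate. A quick back-of-the-envelope check, assuming the 4.4 billion base is the 2014 figure and that growth compounds annually (both assumptions, since the study does not state them):

```python
# Implied compound annual growth rate (CAGR) for connected devices,
# assuming 4.4 billion is the 2014 base and 10.3 billion the 2018 forecast.
devices_2014 = 4.4e9
devices_2018 = 10.3e9
years = 2018 - 2014

cagr = (devices_2018 / devices_2014) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 24 percent per year
```

That is, the forecast amounts to the connected-device base more than doubling in four years.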

Still, since definitions of “machine-to-machine” or “Internet of Things” vary, it is hard to separate connected appliances such as televisions or game consoles from industrial sensors and monitors.

Within three years, most firms will be embedding M2M into the actual products and services sold to customers.

That likely has implications for the role of the mobile service provider in the ecosystem. If the application settings are highly fragmented, the easiest role for an access provider to adopt is “horizontal,” not “vertical.”

In other words, mobile service providers supply the communications function, not primarily vertical applications, much as the primary value provided to business or consumer customers is mobile access (for phones, other personal devices and sensors), plus a couple of general purpose applications (text messaging and voice).

About 22 percent of 600 executives involved in machine-to-machine strategy say they already have at least one active M2M deployment in operation, up about 80 percent from 2013, according to a new study sponsored by Vodafone and conducted by Circle Research.

The three leading industries, in terms of deployment, are the consumer electronics, energy and utilities, and automotive industries, each with a minimum of 30 percent adoption by respondent firms.

By 2016, the percentage of respondents with at least one M2M deployment will be 74 percent, the study predicts, based at least in part on the embedding of M2M features into products such as thermostats and kitchen appliances.

Energy and utility respondents will have boosted M2M deployment to about 62 percent by 2016, based on smart meters and grid monitoring programs.

Use of M2M in the transportation and logistics verticals will be 57 percent in 2016, based largely on fleet logistics applications, and adoption in the healthcare and life sciences industry will be identical, the Vodafone survey found.

Automotive segment adoption will reach 53 percent, while retail deployment reaches 51 percent. M2M deployment in manufacturing will be at least 43 percent, though self-reporting might understate the actual state of deployment, given the widespread use of automation in manufacturing. Some respondents might not call what they already are doing an instance of M2M deployment.

Safety and security applications are the leading uses of M2M in automotive settings, partly because in many regions they are being driven by regulation, such as the eCall programme in the European Union.

Consumer electronics is at present the leading adopter of M2M, with the highest adoption of external-facing strategies, at 71 percent.

Those applications primarily include tracking mobile assets including shipping containers. But 20 percent of all company executives surveyed in the consumer electronics segment already are selling connected devices directly to consumers.

Asset tracking is expected to be the lead application in the energy and utility segment, as will monitoring in health care, connected car services in the auto industry, monitoring in manufacturing, and connected cabinets or asset tracking in retail.

Early adopters tend to say productivity and cost savings are the deployment drivers, with projects tending to be in the internal processes areas, rather than external operations visible to customers.

At the moment, adoption of at least one active deployment is highest in the “Africa, Asia, Middle East” region, a rather broad category of limited analytical usefulness, one might argue. But 27 percent of executives surveyed in that region had projects underway.

The study included respondents from Australia, Brazil, China, Germany, India, Italy, Japan, the Netherlands, South Africa, South Korea, Spain, Turkey, the United Kingdom, and the United States.

In Europe, 21 percent of respondents reported they had at least one M2M project in action. In the Americas, 17 percent of executives said they had at least one project in progress.

But Vodafone expects that, by 2016, deployment profiles will be quite similar, with more than half of all respondents supervising actual deployments.

As predicted in last year’s Vodafone M2M Adoption Barometer report, the United States has been overtaken by the Asia-Pacific region as the geography with the widest adoption of M2M. This year’s report suggests that by 2016 the gap will be negligible, with all regions close to a 55 percent average for adoption.

The survey, carried out by Circle Research, captured the views of more than 600 executives involved in setting M2M strategy in seven key industries across 14 countries.

Three sectors have emerged as front runners in M2M with nearly 30 percent adoption rates: automotive, consumer electronics, and energy and utilities.

Automotive is the most mature of the sectors where M2M is now seen as an enabler for additional services such as remote maintenance and infotainment. M2M adoption in energy and utilities is also growing rapidly as ‘smart’ home and office services such as intelligent heating and connected security gain popularity.

This uptake is being fuelled by the use of M2M in connected devices such as smart televisions and games consoles. The research shows that nearly three quarters of consumer electronics companies will have adopted some form of M2M by 2016, whether for new products, logistics or production.

Similarly, the report anticipates that 57 percent of healthcare and life sciences companies will have adopted M2M technologies by 2016.

Sunday, July 6, 2014

Amazon Fire Phone is About the Future of Mobile Commerce, Not Phones

It can be argued that Amazon's Fire Phone has not gotten off to a big start, in terms of sales. And it isn't hard to find detractors who think the device is too late to market. 



But it also can be argued Amazon is testing something more than "one more smartphone." To be sure, it might be an expensive but important test. But what sort of test?



As Google's efforts center on supporting its advertising business on mobiles, so some might argue the Fire Phone is about Amazon's e-commerce business.



Some argue the issue is whether a smartphone can in significant ways replace the traditional PC-based e-commerce site.



What if, instead of going to an online store to buy something, the phone becomes the store? Taking a picture of an object, or scanning its barcode, or saying its name then pulls up an Amazon "click to order" screen. 



If so, Firefly is an effort to turn the entire physical world--anything that can be photographed, for example--into a buying opportunity for a consumer, using Amazon.



Kindle hasn't managed to become the world's top tablet device, either. But Kindle users almost certainly spend more money with Amazon than owners of other brands of tablets.



The Fire Phone most likely is seen as a gamble on similar behavior on smartphones. If you have used a Kindle, you might agree it is less useful as a general purpose device, but excels as a gateway to content sold by Amazon. 



Likewise, the Fire Phone might not so much be viewed as a general purpose smartphone but more like a Kindle, a device optimized for Amazon e-commerce.



That might limit its appeal as a general purpose phone. But the Fire Phone might be useful for some users who actively engage in mobile commerce. 


Are ISPs Responsible for Video Stalling?




Google's YouTube now offers consumers reports about video streaming performance, much as Netflix displays messages suggesting that video stalling at the moment is caused by the Internet service provider.


To be sure, Internet service providers directly control contention ratios, capacity investments, network architectures and other network elements that enable and limit both bandwidth and latency.


On the other hand, other users also are, in a direct sense, the cause of congestion, as always is the case on shared networks.


And some apps impose more load on any network than others, video being the primary and common example, representing one to two orders of magnitude more bandwidth load than apps such as voice.


Compounding the problem, most streamed video entertainment is a zero-revenue driver of consumption, as are most apps, for the ISP. That is not to say apps should be sources of direct revenue for ISPs, but only to note that the business context for supplying more network resources is affected by that lack of revenue.


Any ISP will have a direct incentive to invest in additional capacity to support revenue-generating apps and services, and incentive to provide whatever level of network support is required to ensure good app performance.


Consumer Internet access is more problematic. On one hand, consumers expect affordable prices and virtually unlimited usage. On the other hand, suppliers have to balance expectations with a business case for more investment.


In other words, no ISP can afford, over the long term, to invest more in facilities and support than the customer is willing to spend to use the product, unless there are compensating revenue streams that can be used to augment the business case.


Voice services, texting, revenue-producing video entertainment or ancillary services are examples.


And there is no question but that widespread use of entertainment apps and services has dramatically different revenue implications for ISPs, compared to other apps.


How much bandwidth is required to earn those $43 revenue components? Almost too little to measure in the case of voice; gigabytes for Internet content consumption and possibly scores of gigabytes for video.


By some estimates, where voice might earn 35 cents per megabyte, revenue per Internet app might generate a few cents per megabyte. At one level, a network engineer might argue that such fine distinctions do not matter. The network has to be sized to handle the expected load.


McKinsey analysts have argued in the past that a 3G network costs about one U.S. cent per megabyte. The problem, in many developing markets, is that revenue could drop to as little as 0.2 cents to 0.4 cents per megabyte, for any mobile Internet usage.

That implies a strategic need to reduce mobile Internet costs to as little as 0.1 cent per megabyte, an order of magnitude reduction. Tellabs similarly has warned about revenue per bit dipping below cost per bit, leading to an "end of profit" for the mobile business.
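The squeeze can be made concrete with the figures cited above: roughly 35 cents per megabyte for voice, a few cents (or fractions of a cent) per megabyte for Internet apps, against a network cost of about one cent per megabyte. A minimal sketch, using the estimates quoted in the text (these are rough industry estimates, not audited figures):

```python
# Margin per megabyte under the estimates quoted above.
# All values are rough industry estimates cited in the text.
cost_per_mb = 0.01  # McKinsey estimate: ~1 U.S. cent per MB on a 3G network

revenue_per_mb = {
    "voice": 0.35,                    # ~35 cents per megabyte
    "internet_apps": 0.03,            # "a few cents" per megabyte
    "developing_market_data": 0.003,  # 0.2 to 0.4 cents per megabyte
}

for service, revenue in revenue_per_mb.items():
    margin = revenue - cost_per_mb
    print(f"{service}: revenue {revenue:.3f}, margin {margin:+.3f} per MB")
```

Developing-market data revenue sits below cost per megabyte, which is why the strategic target becomes cutting cost by an order of magnitude, to roughly 0.1 cent per megabyte.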

The point is that ISP investment in higher-capacity networks does affect app quality. But so do prevailing business models, app bandwidth requirements, end user demand for video content and contention from other users.

Video at standard definition is one issue. High definition requires even more bandwidth. And "4K" video, requiring bandwidth four times that of HDTV, is coming. Few networks can be upgraded fast enough to cope with those sorts of capacity demands, to say nothing of changing business models to support continual capacity upgrades.
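The jump in load from standard definition to HD to 4K can be illustrated with typical streaming bitrates. The bitrates below are common rules of thumb assumed for illustration, not figures from the text; only the 4x ratio between 4K and HD comes from the paragraph above:

```python
# Rough monthly data volume for one hour of streaming per day,
# at assumed, commonly cited streaming bitrates (megabits per second).
BITRATES_MBPS = {"SD": 3, "HD": 5, "4K": 20}  # 4K ~ 4x HD, per the text

hours_per_day = 1
days_per_month = 30

gb_per_month = {
    quality: mbps * 3600 * hours_per_day * days_per_month / 8 / 1000
    for quality, mbps in BITRATES_MBPS.items()
}

for quality, gb in gb_per_month.items():
    print(f"{quality}: ~{gb:.0f} GB/month")
```

Even one hour per day of SD viewing dwarfs the bandwidth a month of voice calling consumes, which is the asymmetry the surrounding paragraphs describe.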

Yes, ISPs control their own investment levels. But they are not singularly responsible for quality of experience in the whole content ecosystem. Consumers, app providers, app technology requirements, display devices and communication networks and revenue relationships all play some part, even if it is the ISP that controls access bandwidth.

Saturday, July 5, 2014

Major Changes Coming for Mexico Telecommunications

Mexico is moving towards the biggest overhaul of the telecommunications sector in more than two decades, as legislation has passed the Senate authorizing a number of changes to increase competition in the mobile and fixed communications markets, as well as television broadcasting.

The new rules will mandate lower termination charges for competitors completing calls to Telmex customers. 

But the legislation also eases restrictions in investment by non-Mexican firms, and is expected to allow new TV networks to be created. 

Telefonica is expected to be among the firms that takes advantage of the regulatory changes to expand its market share in the Mexican market. 

On the other hand, observers also expect that América Móvil, owner of Telmex, will make a move into broadcast television as well.

The legislation also does allow Internet service providers to provide quality of service mechanisms for their access services. 

In other words, "best effort" access can be supplemented by access with quality of service features such as provided by content delivery networks in the wide area network. 



5G Might be Powerful, but Also Will be Fragile



Future 5G networks are expected to feature much-faster speeds. Compared to fixed network access, though, there still will be issues. Total mobile bandwidth will never approach what is possible using a fiber connection, and interference will remain an issue for wireless approaches.

Friday, July 4, 2014

Government Content Blocking, Commercial Pressures Are Internet Dangers

Government blocking and filtering of content poses a big danger to the future Internet, but so does growing commercialization of Internet apps, a survey of 1,400 experts by the Pew Research Center's Internet & American Life Project has found.



Less content access is a possible or likely consequence of government action, but a growing commercial context also will shape the unrestricted flow of information, the experts say.


But growing lack of trust also will reduce end user willingness to share using the Internet. And "too much information" might likewise reduce end user desire to share content and information, as use of content filtering grows.


Despite those perceived threats, many respondents expressed optimism that the problems can, and will, be addressed.

In fact, a majority of respondents say they hope that by 2025 there will not be significant changes for the worse and hindrances to the ways in which people get and share content online today. 



And they said they expect that technology innovation will continue to afford more new opportunities for people to connect.





By 2025, about 35 percent of respondents thought there would be significant hindrances to the free exchange of information, while 65 percent predicted the obstacles would be overcome, and that the free exchange of information would not be significantly dampened.


To be sure, some who are optimistic said they “hoped” that would be the case, not necessarily that they expected such an outcome.


Those who expressed hope or the expectation that access and sharing would survive challenges between now and 2025 also often noted that billions more people may gain access and begin sharing online over the next 11 years, allowing content sharing to survive the challenges.

Digital Divide Now is More Subtle

There is no digital divide on inter-city trains, inter-city buses or airplanes, a study of personal device use on those travel modes suggests.

Internet access remains an issue in rural and lower-income areas, compared to suburban and mid-income urban areas, to be sure, but the issues now are more subtle, having as much to do with people not seeing Internet access as useful as with actual physical lack of access.

To be sure, use of fixed high speed access services is lower among households with less income. Access is 70 percent amongst households with $10,000 or less annual income, in the 85-percent range for households with income between $20,000 and $40,000, and above 90 percent for households in higher income ranges.

But age explains non-use of the Internet as much as income. Also, mobile Internet access is substantial among lower-income households, ranging from 50 percent to 60 percent among lower-income groups.

In fact, many younger users use mobile Internet access, rather than fixed network access. In other words, much of the digital divide that remains in U.S. Internet access is explained by age or use of mobile access.

Access speeds in rural areas continue to lag offered speeds in urban and suburban areas, as a rule, though the gap is closing, as cable TV high speed access services tend to be much faster than all-copper digital subscriber line connections. That is true even in India.

That is one good reason why AT&T, among others, is upgrading rural networks with fiber.


In fact, widespread use of connected personal devices on inter-city transportation services suggests the important role ownership of connected devices now plays.

On Greyhound inter-city buses, use of personal technology is now significantly higher than on airplanes and is only marginally below that on Amtrak and discount bus lines, a study by the Chaddick Institute for Metropolitan Development has found.

In fact, for the first time in five years, use of personal devices on at least one inter-city bus service was higher than on airplanes or Amtrak.

Among the 505 passengers observed on 20 Megabus and Van Galder buses operating from curbside locations in 2013, 59 percent were using technology, compared to 46 percent in 2012.

In large part, that might be because the new “connected” bus services--which offer travelers uninterrupted cell phone signals as well as free Wi-Fi and power outlets--grew 30 percent between 2012 and 2013.

On Amtrak, the share of technology users was flat at 52 percent in 2013, the study found.

Lack of power outlets, Wi-Fi and mobile access likely explains the lighter use of personal devices on airplanes, according to the Technology in Intercity Travel Study.

Technology use on airlines remained virtually flat and continues to lag behind other modes in 2013, suggesting that lack of communications “for no incremental cost” is an issue.

But the ban on phone calls aboard aircraft, as well as the lack of power outlets, likely also are issues.

The two fastest growing modes of intercity travel over the calendar years 2012 and 2013—intercity trains and discount buses—were also those in which the technology use was observed to be the highest in early 2013.

The amount of discount bus service grew by four percent between 2012 and 2013, while the number of Amtrak seat-miles grew by 1.4 percent, as did airline seats.

Availability of Wi-Fi and mobile Internet connections, the “no incremental cost” access and lawfulness of device app use on trains and buses possibly explains the higher use of personal devices on buses and trains.

Mobile device connections are disabled in the air, on airplanes, in addition to being unlawful. When Wi-Fi is available, usage requires payment, and power outlets often are not available.

But there seems to be no “digital divide” between passengers on inter-city buses, trains or airplanes.

The Chaddick Institute survey in 2014 consisted of 1,659 airline travelers, 1,608 intercity train (Amtrak) passengers, 505 discount city-to-city bus passengers (Megabus and Coach USA), 270 conventional intercity bus passengers, and 2,992 commuter rail passengers.





Wednesday, July 2, 2014

FTC Charges Un-Carrier with Un-Cool Cramming

It is hard to know what is worse: T-Mobile US being charged by the Federal Trade Commission with cramming, or the abuse by third party information and content suppliers using third party billing.



Cramming is the practice of placing unauthorized, misleading or deceptive charges on a telephone bill.


Crammers rely on confusing telephone bills in an attempt to trick consumers into paying for services they did not authorize or receive, or that cost more than the consumer was led to believe.


Purportedly, T-Mobile USA made hundreds of millions of dollars by charging customers for “premium” text messaging services that never were authorized by its customers.


The FTC alleges that T-Mobile received anywhere from 35 to 40 percent of the total amount charged to consumers for subscriptions for content such as flirting tips, horoscope information or celebrity gossip that typically cost $9.99 per month.

According to the FTC’s complaint, T-Mobile in some cases continued to bill its customers for these services offered by scammers years after becoming aware of signs that the charges were fraudulent.

Which Firms Will Lead the Next Generation of Video Aggregation?

As surely as night follows day, one already can predict what happens as consumer demand for unbundled, on-demand access to TV series content is satisfied: the fragmentation of desired content across many different distribution services will lead to dissatisfaction with the unbundled approach, leading suppliers and distributors to try to recreate the linear video subscription bundle.

As leading over the top video distributors work to create unique “must see” video series that create product differentiation, many consumers will find they are buying multiple subscriptions.

As always, that is going to create demand for a bundled approach that allows convenient access to multiple services. That is why numerous suppliers are creating new devices that can aggregate video from multiple sources and display that content directly on a TV.

That creates a “logical” or “virtual” bundle rather than the formal bundle now sold as “cable TV.” Only this time, the distributors are Amazon, Google, Apple and others.

That also is why some linear video distributors already have moved to add Netflix access to standard TV decoders.

In principle, the new aggregators (Apple TV, Amazon Fire and Google Chromecast) threaten to rival or displace traditional aggregators, over time, allowing consumers to create their own bundles.

Though there is risk for traditional programmers as well--who might well see far smaller audiences--the new aggregators should make it easier for consumers to buy and then watch content from multiple independent sources.

The habit of getting and watching over the top television already is well established.

Some 77 percent of U.S. adults say they regularly watch television shows using either cable TV (55 percent) or satellite TV (23 percent), while 43 percent view streamed video. About 67 percent of Millennials report they watch streamed video, according to Harris Interactive.

About 38 percent of respondents say they've subscribed to premium cable TV channels in order to watch specific shows, while 24 percent have subscribed to one or more streaming services for the same reason, Harris Interactive reports.

Among those who regularly watch television shows using streaming, 74 percent use a computer to do so, while 55 percent use a television (attached to a set-top box, a game system or a television with integrated online capabilities).

About 37 percent watch on tablets, including 63 percent of tablet owners. Some 30 percent watch on smartphones, including 42 percent of smartphone owners.

It isn’t clear how those preferences might change, though, as more content traditionally available only from a bundled linear video subscription gets “unbundled,” albeit slowly.

Looking specifically at streaming TV's likely "core" constituents, half of those who list streaming among their top venues for television shows say they've subscribed to streaming services for access to specific shows, according to Harris Interactive.

In the near term, that means more fragmentation of the video subscription business, as consumers buy discrete services to get access to a couple of lead unique series (“House of Cards” or “Orange is the New Black,” for example).

Also, there is new potential for creation of “premium” services. About 40 percent of respondents say they would be willing to pay more for a service that allowed them to stream current shows ad-free.

About 37 percent of respondents report they  would pay more for a streaming service that allowed them to temporarily download TV episodes, for when they're away from an Internet connection.

As you would guess, streamed video gets viewed on a variety of screens. About 85 percent of respondents report they most often watch TV on an actual TV (live feed, recorded or on demand). That is down from about 89 percent in 2012.

Streaming use, meanwhile, has grown from 20 percent of respondents in 2012 to 23 percent in 2014. Among Millennials, 47 percent of respondents say they use alternate screens, while use of TVs has dropped from 77 percent to 68 percent.

About 23 percent of respondents say they are watching more online or streaming television in 2014 than they were a year ago.

Some 37 percent say their online or streaming viewership is unchanged over the last 12 months, while seven percent say they watch streamed content less than they did a year ago.

Half of respondents say they expect no change in viewing habits over the next year. Some 18 percent say they think they will watch more streaming or online video in the next 12 months.

Some four percent of respondents think they will watch less online or streamed video.

The Harris Poll included 2,300 U.S. adults surveyed online between April 16 and 21, 2014.

Tuesday, July 1, 2014

Does Product Reinvention in Telecom Work, Long Term?

Frontier Communications Corporation is launching a text messaging feature for business voice accounts that allows business fixed network numbers to support text messaging. That is a primary example of one way service providers try to prop up the fortunes of a declining product.


That itself often is one part of a two-pronged effort to maintain revenue growth. Adding value to an existing product hopefully enhances a particular product enough to slow or arrest rates of decline.


The other challenge is to create new lines of business and sources of revenue to displace declining lines of business.


Some are more sanguine about ways to change the value of voice services than are others.


When confronted with a sustained drop in average revenue per minute of long distance usage, AT&T essentially decided to harvest the revenue stream rather than reinvent the product, turning its attention instead to measures that would create new lines of business.


In 1991 AT&T bought NCR in an effort to enter the computing business. NCR eventually was spun out as a separate company.


In 1993 AT&T entered the mobile business by buying McCaw Cellular.


AT&T bought the largest U.S. cable TV company in 1998, and MediaOne, in 2000, becoming the largest provider of cable TV services in the U.S. market, only to dismantle the strategy and sell those assets in 2001.


In 2005, AT&T was acquired by SBC Communications. The point is that, in essence, AT&T never was able to create new lines of business big enough to displace the older revenue sources.


Still, “harvesting” legacy revenues while creating new lines of business is the fundamental strategy for any service provider facing challenges in its original lines of business. History also suggests the task is fraught with uncertainty.


Arguably, most such efforts essentially fail, in the sense that the firms simply are acquired by other firms, and cease to exist.


But there is a difference between transforming a whole industry, or a whole firm, and changing key revenue sources and product lines. Whole industries and firms can manage big transitions more easily than they can manage to reverse the fortunes of a declining and key line of business.


Frontier Communications hopes, at the very least, to slow the rate of decline of its business voice lines.


So far, it has not found any solution for arresting the decline of the consumer voice business, even if bundling arguably has proven the most-successful tactic so far, in the consumer business.


Frontier Texting and high-definition voice are both examples of efforts to reinvigorate voice services by “adding more value,” enhancing the core functionality in some customer-significant way.


Whether such techniques can do much more than slow the rate of decline is the issue. At the moment, there is virtually no evidence that this sort of product revamping actually can reverse a product line’s decline, though one might argue such measures slow rates of decline.


That is worth doing, so long as other growth initiatives also are underway.


What the industry has yet to prove, though, is that it actually can enhance legacy products enough to reverse losses.


In other words, there is a strategy challenge: should capital be invested in revamping legacy products, and if so, how much? The alternative is to harvest revenues, deploying available capital into creating new lines of business.


Though industries sometimes can reinvent themselves, the issue is whether specific products can be reinvented, and if so, how often that actually happens. At a high level, the global telecom business has managed to replace declining revenue sources with new sources.


Mobile revenues have supplanted long distance revenues, while video and high speed access revenues have replaced voice revenues. Some firms have shifted from a reliance on consumer customers to business customers.

But that really does not address the specific challenge of adding enough value to a declining product to stem losses long term, or possibly reignite growth.

Backhaul Increasingly is a Strategic Matter for ISPs

Backhaul sometimes is a strategic advantage or key impediment for one or more service providers, as well as an important driver of operating cost. Backhaul often accounts for as much as 25 percent of total operating cost for a mobile service provider, for example.

As transmission networks become more dense, using small cell and carrier Wi-Fi architectures, for example, backhaul will be a major issue, mostly because the cost of backhaul has to scale to much-lower levels than has been the case for mobile and enterprise backhaul prices.

Where a traditional enterprise backhaul link generated substantial revenue, a carrier Wi-Fi or small cell link might generate close to zero incremental revenue.
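The scaling problem can be sketched numerically. The figures below are entirely hypothetical, chosen only to illustrate why a backhaul cost that is tolerable for a high-revenue macrocell or enterprise link becomes untenable for a small cell:

```python
# Illustrative per-link backhaul economics. All dollar figures are
# hypothetical, for illustration only; the text cites no actual prices.
links = {
    # name: (monthly revenue attributable to the link, monthly backhaul cost)
    "enterprise_macrocell": (10_000, 1_000),
    "small_cell": (100, 1_000),  # near-zero incremental revenue, similar cost
}

for name, (revenue, backhaul_cost) in links.items():
    ratio = backhaul_cost / revenue
    print(f"{name}: backhaul equals {ratio:.0%} of link revenue")
```

Under any plausible set of numbers with this shape, small-cell backhaul cost must fall by roughly the same factor as per-link revenue does, which is the point the paragraph above makes.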

The cost of backhaul also has been a key impediment for ISPs seeking to provide higher access speeds in regions distant from an Internet access point.

When the “Broadband Technology Opportunities Program” (BTOP) was launched in 2009 to promote high speed access advances in rural areas, you might have predicted that most of the money would be spent to create or augment access facilities.

Instead, middle mile backhaul facilities received significant funding. The reason was simple enough: in many rural areas, it is the backhaul to Internet points of presence that is the key impediment to faster end user Internet access.

You might argue that is the case in many parts of South Asia and Africa as well. There is little point in creating new access networks where backhaul is insufficient to support those access assets and potential customers.

Wi-Fi also now is an essential part of the backhaul strategy for most mobile service providers, allowing carriers to offload half or more of total Internet access traffic from the mobile network to the fixed network.

In some cases, up to 80 percent of mobile traffic is offloaded to Wi-Fi networks.

In similar fashion, deployment of mobile cell capacity likewise drives growth of demand for backhaul. And though much attention has been focused on the impact of new small cells and carrier Wi-Fi, standard macrocell deployments can be important as well.

“Over the past several years of experience, a fairly strong correlation between domestic carriers, aggregate CapEx and our level of organic growth in American Tower” can be seen, said Jim Taiclet, American Tower Corp. CEO. “For example, from 2010 to 2012, we saw aggregate spend on wireless CapEx of about $25 billion to $30 billion supporting our organic core growth rates in the range of seven percent to eight percent during those years.”

In 2013, when U.S. mobile service provider capital investment grew to nearly $35 billion a year, American Tower saw revenue growth of nine percent.

Backhaul bandwidth demand scales in other ways beyond the number of tower sites and radios, though.

Between 2012 and 2013, average daily U.S. smartphone data consumption grew by almost 40 percent, while connected tablet usage increased by over 50 percent. And then there is mobile video, consumption of which might grow an order of magnitude between 2013 and 2018, according to Cisco projections.
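An order-of-magnitude increase over five years implies a very steep annual growth rate. A quick check of what a 10x jump from 2013 to 2018 would require, if growth were compounded annually:

```python
# Annual growth rate implied by 10x mobile video growth over 2013-2018.
growth_factor = 10
years = 5

annual_growth = growth_factor ** (1 / years) - 1
print(f"Implied annual growth: {annual_growth:.0%}")  # ~58% per year
```

That is the kind of sustained traffic growth backhaul capacity planning has to anticipate.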

Small cell deployments will have an impact, but American Tower presently generates 95 percent of its revenue from macrocell sites.

And one big question is how much incremental demand Sprint might drive, as it activates new 2.5 GHz capacity.

As an example, said Taiclet, Sprint would have to add 30,000 to 40,000 transmission locations to have 2.5 GHz coverage match the existing 1.9 GHz network footprint. That could possibly double the number of tower locations operated by Sprint.

All of those examples--Wi-Fi offload, small cell backhaul, existence of backhaul facilities in emerging markets, additional 2.5-GHz cell sites and BTOP funding--illustrate the roles backhaul often plays as a strategic matter for mobile service providers, not merely a tactical necessity.
