Tuesday, October 27, 2015

T-Mobile US Adds 2.3 Million Net New Customers

T-Mobile US third quarter 2015 results continued a recent streak of subscriber, revenue and earnings growth.


T-Mobile US added 2.3 million total net customers, grew service revenue 11 percent and “adjusted earnings” 42 percent, year over year. Total revenues for the third quarter of 2015 grew 6.8 percent year over year.


The 2.3 million total net additions mark the 10th consecutive quarter in which T-Mobile US has added more than one million net new customers.


Of the total net adds, 1.1 million were branded postpaid net adds, of which 843,000 were branded postpaid phone net adds. T-Mobile US also added 595,000 branded prepaid accounts.
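
For perspective, the branded figures account for most, but not all, of the total; the remainder presumably is wholesale and other connections. A rough sketch of the arithmetic (the wholesale remainder is inferred here, not reported):

```python
# Rough breakdown of T-Mobile US Q3 2015 net additions, using the figures
# cited above. The "wholesale/other" line is inferred as the remainder and
# is an assumption, not a reported figure.
total_net_adds = 2_300_000
branded_postpaid = 1_100_000        # of which 843,000 were phones
branded_postpaid_phones = 843_000
branded_prepaid = 595_000

branded_total = branded_postpaid + branded_prepaid
wholesale_and_other = total_net_adds - branded_total  # inferred remainder

print(f"Branded net adds:         {branded_total:,}")        # 1,695,000
print(f"Inferred wholesale/other: {wholesale_and_other:,}")  # 605,000
```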

Verizon added 430,000 net accounts during the same quarter, while AT&T lost 333,000 postpaid accounts.  

European Parliament Sets Network Neutrality Rules

The European Parliament has ratified net neutrality rules applying across the European Union. Blocking of lawful apps has been an issue.  

In 2012, the Body of European Regulators of Electronic Communications (BEREC) reported that between 21 percent and 36 percent of Internet access subscribers were affected by blocking or throttling depending on the type of application.

The new rules forbid blocking or throttling of lawful online content, applications and services, on mobile or fixed networks. That will have possible implications for Skype, Facetime or other apps sometimes blocked by ISPs. Nor will it be lawful to charge a fee to “unblock” those apps.

No traffic can be prioritized, whether on a paid or unpaid basis. At the same time, equal treatment still allows reasonable day-to-day traffic management based on justified technical requirements, provided it is independent of the origin or destination of the traffic and of any commercial considerations.

The rules also clarify the conditions under which “Internet” and “managed services” can be offered.
Basically, managed services may be offered only where sufficient capacity for Internet access remains available.

The rules forbid any special treatment of different classes of Internet traffic except for “reasonable traffic management” to optimize overall transmission quality.

Reasonable traffic management therefore cannot be used to discriminate against specific categories of content or services such as peer-to-peer (P2P) traffic.

The legislation allows operators, under very strict conditions, to take action in the network that may affect certain types of traffic in order to mitigate the effects of congestion. Such measures are only permitted if congestion is "exceptional" or "temporary", and provided all traffic of the same category is treated alike.

Zero rating, also called sponsored connectivity, is a commercial practice used by some providers of internet access, especially mobile operators, not to count the data volume of particular applications or services against the user's limited monthly data volume. Zero rating is not forbidden, but must comply with the other provisions of the rules, in particular those on non-discriminatory traffic management.
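
Mechanically, zero rating is just an exemption in the operator's data-cap accounting. A minimal sketch of the idea, using hypothetical app names, byte counts and cap:

```python
# Minimal sketch of zero-rated data-cap accounting. The app names, byte
# counts and the cap are hypothetical, purely to illustrate the mechanism.
MONTHLY_CAP_BYTES = 2 * 1024**3           # e.g., a 2 GB plan
ZERO_RATED_APPS = {"sponsored_music"}     # traffic the operator does not count

usage = [
    ("web_browsing", 400 * 1024**2),
    ("sponsored_music", 1200 * 1024**2),  # zero-rated: excluded from the cap
    ("video", 700 * 1024**2),
]

counted = sum(b for app, b in usage if app not in ZERO_RATED_APPS)
print(f"Counted against cap: {counted / 1024**2:.0f} MB "
      f"of {MONTHLY_CAP_BYTES / 1024**2:.0f} MB")
```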

The rules on net neutrality will apply starting 30 April 2016.

European Parliament Approves End to Mobile Roaming Charges

The European Parliament has ratified an end to retail mobile roaming charges by June 2017 and also set net neutrality rules for the first time in EU law.

For consumers, the new rules mean that, starting 15 June 2017, EU mobile customers will pay home rates for voice, texting and mobile data when traveling within the EU, subject to a fair use cap that is not yet defined.

Since the EU took action in 2007, prices that consumers pay for roaming across calls, texting and data have decreased by over 80 percent.

Data roaming is now up to 91 percent cheaper compared to 2007, with perhaps predictable results:  the volume of the data roaming market has grown by 630 percent.
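
Putting those percentages together: a 91 percent price cut and a 630 percent volume increase still imply lower roaming revenue overall, since the price and volume effects multiply. A rough illustration, using an assumed 2007 price:

```python
# What "up to 91 percent cheaper" data roaming means in practice. The 2007
# starting price is an assumed illustrative figure; only the percentages
# come from the text above.
price_2007_per_mb = 5.00                       # assumed, EUR per MB
price_now_per_mb = price_2007_per_mb * (1 - 0.91)

volume_multiple = 1 + 6.30                     # +630 percent means 7.3x the 2007 volume

# Revenue moves by price multiple x volume multiple (holding mix constant).
revenue_multiple = (1 - 0.91) * volume_multiple

print(f"Price per MB: {price_2007_per_mb:.2f} -> {price_now_per_mb:.2f}")
print(f"Volume multiple: {volume_multiple:.1f}x")
print(f"Implied revenue multiple: {revenue_multiple:.2f}x")   # ~0.66x: revenue still falls
```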

Current Revenue Opportunities Dictate AT&T, Verizon Linear Video Strategies

It might not have seemed obvious, a few years ago, what stance either AT&T or Verizon "should" take towards future video entertainment strategy. Both firms had modest share of the linear video business, even if video was an essential ingredient for the triple play anchor service.

Today, AT&T and Verizon are taking different tacks to video entertainment. AT&T has made a bigger commitment to linear video. Verizon is emphasizing mobile video.

To be sure, there are some commonalities. AT&T believes its ability to bundle video entertainment, on a national basis, will help it sell and retain mobile accounts that also are sold nationwide. 

Verizon, on the other hand, had a smaller fixed network footprint to begin with, and concluded for several reasons that the better bet was to "go mobile," since perhaps 85 percent of total Verizon revenue is generated by mobile services. 

Aside from other considerations, the DirecTV acquisition was immediately accretive for AT&T in terms of free cash flow. That is an important consideration for a firm committed to continual dividend payments and dividend increases. 

Long term, neither firm believes linear video will continue to be as big a business, or have the profit margins, as at present. 

In the near term and medium term, however, linear and other forms of video entertainment often are seen as essential products to support the fixed network business case, which no longer can be supported by voice, Internet access as a stand-alone product, or a dual-play voice-plus-Internet-access approach. 

Some have argued AT&T would have been better served had it not acquired DirecTV.  The thinking there is that the capital could have been deployed in Internet access facilities. Whatever the merits of those arguments, there would have been no immediate lift in free cash flow or revenue magnitude, had that choice been made. 


Does 25 Mbps Change Anything?

With one obvious possible exception--doling out of federal funds to support high speed access in rural and underserved areas--what is the impact of the Federal Communications Commission’s change in definition of “broadband” from 4 Mbps (downstream) to 25 Mbps?

Record keeping obviously changes. When the next report appears, many service providers will no longer be said to be providing “high speed access,” their services being defined out of existence.


Many more residents will be said to be “lacking” such access. That, of course, is the purpose: creating a gap, a problem to be fixed. And the shift could be as high as 90 percent, based on the latest Akamai study results.


Akamai also says U.S. “adoption rates for 25 Mbps broadband remain fairly low nationwide, with 46 states seeing levels below 10 percent.” That’s the magnitude of the likely reporting change.
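
A simple way to see how a shift of roughly 90 percent follows from those adoption levels (the figures below are placeholders, not Akamai data):

```python
# Illustration of how the reported "broadband" share changes when the
# threshold moves from 4 Mbps to 25 Mbps. The adoption shares below are
# hypothetical placeholders, not Akamai figures.
share_at_or_above_4mbps = 0.93    # assumed: most subscribers clear 4 Mbps
share_at_or_above_25mbps = 0.10   # in line with "below 10 percent" in many states

drop = (share_at_or_above_4mbps - share_at_or_above_25mbps) / share_at_or_above_4mbps
print(f"Reported 'served' share: {share_at_or_above_4mbps:.0%} -> {share_at_or_above_25mbps:.0%}")
print(f"Relative drop in reported coverage: {drop:.0%}")   # ~89%, close to 90 percent
```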


It matters “what” is being measured, of course. What likely matters more are Internet service provider investment decisions about what is required in each market to remain competitive. And those decisions are not necessarily related to definitional changes.


Google Fiber, many would argue, is largely responsible for the widespread shift in thinking about headline speeds, moving the marketing battle to “gigabit” levels, even if most consumers, given a choice, seem to opt for speeds ranging from 40 Mbps to 100 Mbps when they could buy a gigabit service.


As often happens, the market and the technology are moving faster than the definitions or regulations. Cable TV companies, Verizon and many independent ISPs already have eclipsed the new FCC definition, and had done so before implementation of the new minimums.  




Cable TV companies also, by virtue of their use of hybrid fiber coax networks, had generally surpassed all but the fiber to home providers in terms of Internet speed. That largely accounts for the slower speeds notched by CenturyLink and AT&T, for example.




In 2015, according to Ookla tests reported by PC Magazine, Verizon was the only telco ranked among the fastest providers of Internet access in the U.S. market. Obviously, where Google Fiber operates, it has been the provider of the fastest service, by an order of magnitude.


That should continue to be the case, even where Comcast and other competitors boost speeds. The reason is that Google Fiber sells just one service--at a gigabit--where the other providers tend to offer multiple tiers of service, including services ranging from 50 Mbps to hundreds of megabits, in addition to the headline gigabit services.


That essentially means all consumers buying from Google Fiber are taking the gigabit service. Only some of the customers of other ISPs offering gigabit services will buy the fastest service.
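
The effect on measured averages is simple weighted-average arithmetic; a sketch with a hypothetical tier mix:

```python
# Why a single-tier gigabit ISP posts higher measured averages than a
# multi-tier ISP, even if both sell gigabit service. The tier mix below is
# hypothetical, purely to illustrate the weighted-average effect.
multi_tier_mix = {50: 0.40, 100: 0.35, 300: 0.20, 1000: 0.05}   # Mbps -> share of customers
single_tier_mix = {1000: 1.00}                                   # everyone on the gigabit tier

def weighted_average(mix):
    return sum(speed * share for speed, share in mix.items())

print(f"Multi-tier ISP average:  {weighted_average(multi_tier_mix):.0f} Mbps")   # ~165 Mbps
print(f"Single-tier ISP average: {weighted_average(single_tier_mix):.0f} Mbps")  # 1000 Mbps
```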




Average global speeds in the first quarter of 2015 were about 5 Mbps, said Akamai. But global  peak speeds were about 32 Mbps.


In the second quarter of 2015, average global speed was still about 5.1 Mbps, while peak speeds globally were still about 32.5 Mbps.


So it matters which figure is cited and used. In U.S. cities, Akamai says average speeds were between 14 Mbps and 19 Mbps, while peak speeds ranged from 62 Mbps to 73 Mbps. That also is true at the state level, according to Akamai.
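
The gap between the two measures is simply the difference between a mean over all connections and the fastest observations; a toy example with hypothetical connection speeds:

```python
# Why "average" and "peak" connection speeds diverge. The sample speeds are
# hypothetical; the point is that many slower connections pull the mean
# well below what the fastest links achieve.
speeds_mbps = [2, 3, 3, 4, 5, 6, 8, 10, 25, 32]   # assumed sample of connections

average = sum(speeds_mbps) / len(speeds_mbps)
peak = max(speeds_mbps)

print(f"Average: {average:.1f} Mbps")   # 9.8 Mbps
print(f"Peak:    {peak} Mbps")          # 32 Mbps
```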




You can agree--and most will--that higher speeds are a good thing, and that bumping up a standard to a minimum of 25 Mbps only reflects reality. In fact, that is the case, many would argue, even if “average” and “peak” speeds diverge.


Traditionally, long access loops have been an inhibitor. But investment in optical fiber networks has been ramping up. Comcast, especially, will have the greatest immediate impact, as it will upgrade virtually 100 percent of its consumer locations to 1 Gbps in 2016, with some 85 percent of locations capable of buying a 2-Gbps service.


AT&T likewise has committed to a big upgrade across its footprint as part of its Project VIP, something CenturyLink also is doing, introducing faster speeds and gigabit access in metro areas.


That refers to the supply situation. The other element is the demand. According to the FCC, consumers are upgrading speeds at about a 20-percent annual rate, as well.
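
Compounded, a 20 percent annual upgrade rate works out to roughly 2.5 times today's speed within five years; a quick illustration, starting from an assumed 25 Mbps tier:

```python
# What a roughly 20 percent annual rate of consumer speed upgrades implies
# over time. The 25 Mbps starting tier is assumed; the 20 percent rate is
# the FCC figure cited above.
starting_speed_mbps = 25.0
annual_growth = 0.20

for year in range(6):
    speed = starting_speed_mbps * (1 + annual_growth) ** year
    print(f"Year {year}: ~{speed:.0f} Mbps")
# After 5 years: ~62 Mbps, i.e. demand roughly 2.5x in five years.
```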


The point is that there arguably has been remarkably little impact from the definitional change. For starters, the definition simply raised the minimum to a level many suppliers already were meeting or beating, especially in the case of cable TV operators who are the majority suppliers in the U.S. market.

At the same time, competition from the likes of Google Fiber definitely has encouraged cable and telco suppliers to boost speeds. That, and not a definitional change, is what is driving investment.

Monday, October 26, 2015

How Big Will Proximity Mobile Payments Be in 2016?

It always is difficult to predict the likely growth of the mobile payments business.

In part, that is because there are different mobile payment segments; mobile payment revenue for providers is distinct from transaction volume, the number of suppliers is quite fragmented and the actual revenue for payment system suppliers is quite variable.

Aside from all that, even successful innovations in the financial services business can take quite some time to get traction. In the case of mobile proximity payments, the issues range from the base of active devices able to support specific mobile payment systems to retailer adoption of the services and terminal deployments.

All of that adds up to a huge challenge for the various payment system contestants.

Nor is it clear what transactions will make most sense, for most people, based on which services are supported at the venues they frequent, and the types of purchases where using proximity payments makes the most sense.

According to the latest proximity mobile payments forecast from eMarketer, the total value of mobile payment transactions in the US will grow 210 percent  in 2016.

In 2015, mobile payments transactions will total $8.71 billion in the United States, with users spending an average of nearly $376 annually using their mobile phone as a payment method.

By way of comparison, other forms of mobile payment or mobile banking represent far larger transaction volumes. In 2015, for example, more than $53 billion in remote mobile payments will be made.

By 2016, total mobile payment transactions at retail locations are estimated by eMarketer to reach $27.05 billion, with users spending an average of $721.47 annually. By 2016, according to Forrester Research, about $63.4 billion in remote mobile payments will occur.

There will be 23.2 million people in the United States using proximity mobile payments in 2015, eMarketer expects. By 2016, that number will grow to 37.5 million.
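
The per-user figures follow directly from the transaction totals and user counts; a quick cross-check of the eMarketer numbers:

```python
# Cross-checking the eMarketer figures cited above: total transaction value
# divided by the number of proximity-payment users gives the average annual
# spend per user.
totals = {2015: 8.71e9, 2016: 27.05e9}   # USD transaction value, per eMarketer
users = {2015: 23.2e6, 2016: 37.5e6}     # proximity-payment users, per eMarketer

for year in totals:
    print(f"{year}: ${totals[year] / users[year]:,.2f} per user")
# 2015: ~$375.43 (the "nearly $376" above); 2016: ~$721.33 (close to $721.47)

growth = totals[2016] / totals[2015] - 1
print(f"Implied 2016 growth in transaction value: {growth:.0%}")  # ~211%, i.e. the 210% forecast
```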

“Mobile wallets like Apple Pay, Android Pay and Samsung Pay will become a standard feature on new smartphones,” said eMarketer analyst Bryan Yeager. “Also, more merchants will adopt point-of-sale systems that can accept mobile payments, and incentives like promotions and loyalty programs will be integrated to attract new users.”

eMarketer defines proximity mobile payments as point-of-sale transactions that use mobile phones as a payment method, via tapping, waving and similar functionality.

OTT is Becoming a Bigger Issue Even in Physical Access, Backhaul and Transport

Neither Google nor Facebook is being anything but transparent about its roles in new parts of the Internet ecosystem, whether that means moving into e-commerce as a foundational business model, pioneering new Internet access platforms or taking on new roles in app hosting, transport or backhaul.

Long past is the time when Google or Facebook solely sourced their global connectivity from capacity providers, their networking gear from traditional original equipment manufacturers or their data center requirements from third party data center operators.

Having achieved huge scale, both firms now can make different “make versus buy” decisions. And the increasing trend is to “make” rather than “buy.” That makes the large app providers clear examples of “frenemies.”

They both buy from, and compete with, traditional suppliers of transport, access, data center support, networking gear and software.

In many markets, that should lead to many new opportunities to work with the likes of Google and Facebook as backhaul and transport providers.

That is how Google, Facebook and most proposed low earth orbit satellite constellations plan to come to market with new Internet access platforms. Basically, all propose using their new platforms for backhaul and transport.

That said, Google Fiber and Nexus also directly compete with existing suppliers. In India, both Google and Internet Basics (Facebook) are working with existing Internet service providers to build out public Wi-Fi networks.

And though the decisions will be no easier than in the past, affected contestants will have to decide how to work with, or compete with, the likes of Google and Facebook at many layers of the protocol stack, not just the physical and transport layers.

In most cases, access and transport providers will mostly avoid competing at the application layer. At some point, scale advantages held by the biggest app providers are too great to overcome.

That appears to be a growing reality in the cloud computing business, where public cloud increasingly is concentrated among the ranks of Amazon Web Services, Microsoft and Google, many would argue. Private and hybrid cloud computing continues to be the place where others have significant market share.  


Private cloud continues to offer more opportunity for rival suppliers. By 2018, 31 percent of cloud workloads will be in public cloud data centers, up from 22 percent in 2013. That represents a compound annual growth rate (CAGR) of 33 percent from 2013 to 2018.

By 2018, 69 percent of the cloud workloads will be in private cloud data centers, down from 78 percent in 2013. That represents a CAGR of 21 percent from 2013 to 2018, according to Cisco.
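
Those two CAGRs and the 2013 shares are consistent with the 2018 split; a quick check (workload totals normalized to 1.0 in 2013):

```python
# Checking that the Cisco CAGRs cited above reproduce the 2018 workload split.
# Only the shares and CAGRs come from the text; totals are normalized to 1.0 in 2013.
public_2013, private_2013 = 0.22, 0.78
public_cagr, private_cagr = 0.33, 0.21
years = 5  # 2013 -> 2018

public_2018 = public_2013 * (1 + public_cagr) ** years
private_2018 = private_2013 * (1 + private_cagr) ** years
total_2018 = public_2018 + private_2018

print(f"Public cloud share in 2018:  {public_2018 / total_2018:.0%}")   # ~31%
print(f"Private cloud share in 2018: {private_2018 / total_2018:.0%}")  # ~69%
```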

The point is that there are few completely fixed roles in the Internet ecosystem. That means every contestant will compete with, and cooperate with, other participants at various times, in different places, for distinct reasons.

Perhaps there are other implications as well. Most observers of business strategy would note that, over the long haul, a bifurcated, or barbell, structure tends to develop.

That is to say, there are a relatively few market leaders with most of the share (customers and revenue), but also many, many small firms with some niche strategy, dependent for success on the very inability of large firms to compete in small markets.

If so, then there might be many new opportunities for specialists, even in the scale-driven consumer services business. Large providers will continue to dominate the “commodity” parts of the business.

Specialists will have greatest opportunities in niches of various types, geographic or functional. Only a large firm might be able to supply mobile services effectively and sustainably across a subcontinent.

But it also is likely there will be room for local specialists as well, even in the “commodity” Internet access business, simply because the large-firm business model will not work everywhere, always.

DirecTV-Dish Merger Fails

DirecTV’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...