Thursday, October 16, 2014

Why Common Carrier Regulation is Such a Bad Idea Now

source: ITU
One of the ironies of the debate over how best to regulate Internet access services is the call for utility-style common carrier regulation of the industry, a position that arguably could be held only by observers who do not remember what monopoly telecommunications was really like.

As the old adage goes, "if you do not know your history, you are doomed to repeat it."

Those who call for a return to common carrier regulation simply do not remember, or have not studied, what end user value, choices, prices and features were like in the monopoly era.

According to virtually any analysis, global investment and consumer benefit have improved since the 1980s shift to privatization and competition.
source: Management Information Systems

For example, since 1981, when long distance competition began to take hold in the U.S. market, long distance calling prices within the continental United States have decreased 95 percent.

Since 1999, after the Telecommunications Act of 1996 was passed, business single-line voice prices have dropped 67 percent. Since 1999, T-1 prices dropped 88 percent.

Since the refrain often is heard that U.S. Internet access prices are “too high,” one might note that the claim is possible only if “price per Mbps” trends in the U.S. market are ignored.

In the mobile segment, the effective price per megabyte of Internet access has declined from 47 cents per megabyte in the third quarter of  2008 to about 5 cents per megabyte in the fourth quarter of 2010, about an 89 percent decrease.

Likewise, the cost of a text message dropped from about six cents a message in 2005 to about one cent a message in 2010.

On the wholesale side of the business, Internet transit prices declined from about $1200 per Mbps in 1998 to about $0.94 in 2014.
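The implied rates of decline are worth checking; here is a minimal sketch of the arithmetic, using the figures cited above (the helper function is illustrative, not from any cited source):

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by start and end values."""
    return (end / start) ** (1 / years) - 1

# Internet transit: $1,200 per Mbps in 1998 to $0.94 per Mbps in 2014
transit = cagr(1200, 0.94, 16)
print(f"{transit:.0%}")  # roughly -36% per year, compounded

# Mobile data: 47 cents per MB (3Q 2008) to 5 cents per MB (4Q 2010)
mobile_decline = (47 - 5) / 47
print(f"{mobile_decline:.0%}")  # about 89%, matching the figure above
```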

At the same time, end user bandwidth has grown about 50 percent a year.

At the same time, though retail high speed access prices in developed countries have shrunk only modestly as a percentage of gross national income (from 2.5 percent in 2008 to 1.7 percent in 2012), the effective price per Mbps has declined substantially.

As is true for many products related to computing, cost might not change much from year to year, but features, processing speed and memory grow at about 60 percent a year.

From 1995 to 2003, for example, the cost of a kilobit per second of Internet access fell from about $1.50 to about two cents.

The point is that calls for a return to utility regulation (“common carrier”) would jeopardize such achievements. Only observers who have not lived through deregulation and the advent of competition would expect common carrier rules to produce similar boosts in capacity and declines in cost.


Spectrum Futures 2014
Find out more


For Massachusetts Town, 2 Bad Choices for Entertainment Video Service?

There is some irony in a recent city council vote to reject transfer of a cable TV franchise from Charter Communications to Comcast. For starters, the council vote is not binding. Second, the rejection is intended only to allow two more weeks for Comcast to address council concerns.

The irony is that, in the most recent rankings of customer satisfaction by the American Customer Satisfaction Index, a national cross-industry measure of customer satisfaction in the United States, Charter and Comcast have identical scores of 60. Only Time Warner Cable, with a score of 56, ranks lower.

Among the various industries tracked by the ACSI, only the Internet service provider industry has worse overall scores. Video subscription service ranks next to last out of all industries monitored by ACSI.

“Customer satisfaction is deteriorating for all of the largest pay TV providers,” ACSI has said. To be sure, there are nuances.

Consumers are much more dissatisfied with cable TV service (average score of 60) than fiber optic and satellite service (average score of 68).

Though both companies dropped in customer satisfaction, DIRECTV (down four percent) and AT&T (down three percent) are tied for the lead with ACSI scores of 69.

Verizon Communications FiOS (68) and DISH Network (67) follow.

DISH Network may be the lowest-scoring satellite TV company, but it still scores better than the top-scoring cable company, Cox Communications (down three percent to 63), according to ACSI.

Cable giants Comcast and Time Warner Cable have the most dissatisfied customers. Comcast’s score fell five percent to 60, while Time Warner registered the biggest loss, plunging seven percent to 56, its lowest score ever.



SDN, NFV Seen Boosting Revenue Potential, Reducing Capex

A survey of global service providers suggests that software defined networking and network functions virtualization are seen as valuable means of enhancing revenue and reducing capital cost.


A study conducted by Infonetics Research found that “service agility,” the ability to quickly add, drop and change services and applications, was viewed as an enabler of revenue.


At the same time, NFV functions were seen as a way to reduce capital spending in the network.


(SDN tends to be the term of art for data center personnel, while NFV tends to be the term of art used by a growing number of service providers.)


The study by Infonetics Research suggests 29 percent of respondents already have plans to deploy SDN or NFV solutions for mobile backhaul networks, for example, to gain flexibility and achieve cost savings.


Respondents suggest they might shift as much as 20 percent of backhaul traffic from the macrocell network to small cells of some type by 2018. And it is those new network elements where one might expect SDN or NFV deployments to happen, as adoption will not require displacing existing network elements or systems.


Ranked on a scale of one to seven, where one is “not important” and seven represents something “very important” for producing new revenue, bandwidth on demand was rated important by about 58 percent of respondents, who gave that value a score of six or seven.


And service providers see advantages in both the consumer and business customer segments.
Businesses can get more bandwidth, instantly,  if they expect an uptick in web traffic or host a videoconference or must support seasonal shopping traffic peaks.


Consumers could be offered instant “turbo” boosts when they are watching videos, with bandwidth scaling back down when they are finished.


As you would expect, service providers believe that dynamic bandwidth policies could create an opportunity for dynamic pricing, or at least dynamic provisioning of bandwidth to priority or high-value customers, as well.
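As a rough illustration (all names and numbers here are hypothetical, not from the study), the sort of bandwidth-on-demand policy respondents described can be sketched in a few lines:

```python
from dataclasses import dataclass

@dataclass
class Subscriber:
    base_mbps: int             # provisioned steady-state rate
    boost_mbps: int = 0        # extra rate granted on demand
    boost_expiry: float = 0.0  # time (seconds) at which the boost lapses

    def current_rate(self, now: float) -> int:
        """Effective rate: base plus boost while the boost is active."""
        if now < self.boost_expiry:
            return self.base_mbps + self.boost_mbps
        return self.base_mbps

def grant_boost(sub: Subscriber, extra_mbps: int, duration_s: float, now: float) -> None:
    """Provision extra bandwidth for a fixed window, e.g. during a videoconference."""
    sub.boost_mbps = extra_mbps
    sub.boost_expiry = now + duration_s

sub = Subscriber(base_mbps=10)
grant_boost(sub, extra_mbps=40, duration_s=3600, now=0.0)
print(sub.current_rate(now=60))    # during the boost: 50
print(sub.current_rate(now=4000))  # after expiry: back to 10
```

A dynamic pricing scheme would simply attach a tariff to each `grant_boost` call, billing for the boosted window rather than the flat rate.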


About 52 percent of respondents saw that as a value provided by NFV and SDN, the study suggests.


About 48 percent of respondents saw value in “elastic service chaining,” which allows processing of different services to scale out when needed and scale back in when not required.


A similar percentage saw SDN and NFV as helpful for creating, selling and supporting virtual managed services.


Scaling services up or down quickly was rated six or seven by 86 percent of respondents asked about NFV. That was seen as an advantage for introducing new services quickly, by about 69 percent of respondents asked to rank their interest in NFV.


Also important: the ability to test new services on a small group of customers before expanding for wider commercial availability, modify the service and give it another try, or scrap a service without too great an investment if it is not working out.


Operationally, respondents valued the “global view” of the network across multi-vendor networks and multiple layers.

That is believed to offer more granular control of resources, which in turn will allow networks to be operated more efficiently, reducing capital investment.


Wednesday, October 15, 2014

"Growing Revenue" and "Cutting Costs" are Different; So is Customer or Employee Satisfaction

Enterprises are likely to invest three to four times more in technology aimed at growing revenue than technology to lower business costs, predicts Andrew Bartels, Forrester Research VP and principal analyst.

That represents a change from practices of the past six decades, when most information technology spending was designed to improve efficiency and lower cost.

The new emphasis is on revenue growth, supporting enterprise activities related to winning, serving, and retaining customers, Bartels says.

In addition to higher spending on front-office systems for sales and marketing, organizations will be spending to develop new products, handle and fulfill orders, serve customers and acquire the human and partner resources for doing this effectively.

Still, what might be called “hygienic” spending (to maintain health), supporting efficiency and cost reduction, will continue to represent over 70 percent of total enterprise technology spending through 2017.

Much of that spending will support core systems already put into place, and represents maintenance and operations spending.

Still, spending aimed at growing revenue will increase more than three times as fast as “hygienic” spending to gain efficiencies or support existing programs.

Spending on technologies to grow revenues through customer focus will increase at 10 percent to 12 percent rates through 2017, while spending on hygienic systems will grow at two percent to four percent rates.

As a proportion of new project spending, purchases to support revenue initiatives will exceed hygienic spending starting in 2014. And more of that new spending will use cloud computing mechanisms.

Those spending priorities are similar, in principle, to a theory about employee satisfaction, originally developed by Frederick Herzberg, and known as two-factor theory.

The basic insight is that “satisfaction” and “dissatisfaction” are not points on a single scale, but actually two different scales. One scale measures degree of happiness (or satisfaction), while the other measures degree of unhappiness (or dissatisfaction).

That is not the way most of us think about the matter. Instead, we tend to see a single scale, with happiness on one end, and unhappiness on the other.

As applied to customer satisfaction, employee satisfaction or even organization success, two-factor theory suggests that different actions are needed to prevent unhappiness and increase happiness; prevent dissatisfaction and increase satisfaction.

The practical implications are enormous. You can pay employees enough to keep them from leaving. But paying them more does not automatically increase output, motivation, creativity or commitment.

To increase employee motivation and commitment, you have to do other things that speak to employees’ sense that the firm recognizes and values them, and that people have a chance to contribute and be recognized.

The same applies for customers. A perceived fair value for fair price will help keep customers from deserting to other providers.  

To make customers advocates, an organization must do different things. Somehow, customers must come to feel that a product or service reflects who they are, mirrors and shares their values, adds prestige or other feelings of well-being. Think Apple, Starbucks, Whole Foods.

What makes them “unhappy” is on a different scale.

Think of one scale as “motivation” (or customer happiness or job satisfaction) with the scale running between “satisfied and motivated” (high satisfaction) on one end, and “not satisfied and unmotivated” (low satisfaction) on the other end of a linear scale.

Think of the other scale as “hygiene,” with the scale running between “not dissatisfied” on one end of the scale and “dissatisfied” on the other end of the scale.


All that might seem esoteric. It is as practical as possibly could be. It means managers must offer working conditions and pay viewed as adequate enough to avoid employee dissatisfaction. That just keeps people from leaving, though.

To create active, contributing, motivated employees, organizations have to work at other levels, offering opportunities for personal growth, recognition and achievement that are relatively non-material.

The “hygiene” factors, in other words, keep people from leaving. The “motivation” factors generate enthusiasm, creativity and energy. Dealing with hygiene (pay, working conditions) is necessary. But it isn’t sufficient to create commitment.

In the customer realm, good “hygiene” means customers do not churn. But high “motivation” efforts mean customers recommend the service to friends. Successful hygiene means people say “it works.” Successful motivation efforts mean people say “I love this product.”

Why Netflix 4K Matters

Why does Netflix 4K video matter? Because 4K will boost bandwidth consumption by viewers watching 4K video.

Netflix says it now is shooting all its original series content in 4K format. Netflix believes 4K will “completely change” end user expectations for online video entertainment image quality.

Where a standard-definition Netflix stream is designed to work at about 3 Mbps, 4K will require 15 Mbps to perhaps 17 Mbps, as a minimum. That might be an advantage for Internet service providers able to support those rates easily, but will cause some potential issues for ISPs not able to comfortably deliver streams at such speeds.

At peak hours, Netflix accounts for 34 percent of all downstream usage, up from 31.6 percent in the second half of 2013, according to Sandvine. Peak period is defined as 7 p.m. to 11 p.m. in Sandvine’s reports.

Real-time entertainment is the dominant traffic category in North America, Sandvine says, representing more than 63 percent of downstream bytes during peak hours.

A shift to 4K video, as it grows, will increase the amount of real-time entertainment traffic during peak hours.

As with most phenomena related to the Internet, “average” or “typical” is misleading.

U.S. consumers who use video streaming as a primary form of entertainment consume about 212 gigabytes of data per month (with 153 GB of that consumption representing the impact of “real-time entertainment”).

“Typical subscribers” had a mean monthly usage of 29 GB.

“Non-streamers,” who are consumers typically streaming less than 100 megabytes of audio or video each month, had a mean monthly usage of 4.5 GB.

While the number of streaming hours consumed by people who fit a cord-cutting profile might seem “shockingly high to some,” Sandvine said it’s “quite easily achievable” when homes have multiple people using multiple screens.

Shifting a significant percentage of Netflix views to 4K format will boost consumption about five times for the 4K items watched.
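The multiplier follows directly from the stream rates cited above:

```python
sd_mbps = 3        # typical standard-definition Netflix stream
uhd_min_mbps = 15  # low end of the 4K requirement
uhd_max_mbps = 17  # high end of the 4K requirement

print(uhd_min_mbps / sd_mbps)            # 5.0, the "about five times" figure
print(round(uhd_max_mbps / sd_mbps, 1))  # 5.7 at the high end
```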

And that is why 4K matters for ISPs.


What Drives Postpaid Net Adds: Tablets or Former Prepaid Accounts?

Over the past year, tablet connections have largely accounted for net subscriber growth at AT&T Mobility, Verizon Wireless and Sprint. What now remains to be seen is whether growth over the next couple of years might shift back to phone account additions.

The reason is an apparent shift of policy by Sprint and T-Mobile US to convert more prepaid accounts into postpaid accounts, principally by relaxing traditional credit standards.

Some will immediately grasp the other implications. Churn rates might increase and the size of the prepaid market might level off or possibly shrink, at least for a time.

On the other hand, between tablet net adds and a faster conversion of prepaid accounts to postpaid, the number of U.S. mobile accounts might grow substantially over the next few years.

Tablets with mobile Internet subscriptions using 3G and 4G LTE will grow more than five times in the next five years to reach nearly 250 million in 2018 at a global level, according to Strategy Analytics.

Strategy Analytics predicts there will be 247 million tablet subscriptions by 2018, up from 45 million in 2013.
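The “more than five times” claim is easy to sanity-check; the compound growth arithmetic below is mine, not Strategy Analytics’:

```python
subs_2013 = 45e6    # global tablet subscriptions, 2013
subs_2018 = 247e6   # Strategy Analytics forecast, 2018

multiple = subs_2018 / subs_2013
growth_rate = multiple ** (1 / 5) - 1  # five years, 2013 to 2018

print(round(multiple, 1))    # 5.5, i.e. "more than five times"
print(f"{growth_rate:.0%}")  # roughly 41% per year at the global level
```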

In the U.S. mobile market, that will translate into 50 million tablet subscriptions between 2014 and 2018. Verizon Wireless, Sprint and AT&T combined added nearly 1.5 million tablet subscriptions in the first quarter of 2014, for example.

That implies a growth rate of about 25 percent a year.

But connected tablet forecasts might also have to contend with a seemingly higher rate of postpaid smartphone net additions as well, as at least some of the largest U.S. mobile service providers attempt to convert more prepaid accounts into postpaid accounts.

By at least one analysis, it is conceivable the U.S. mobile industry could add four times more net new phone accounts in 2014 than it did in 2013, and possibly three times more phone accounts in 2015, compared to 2013 levels.

For reasons related to tightened restrictions on fraud, a “lifeline service” program propping up prepaid account volume saw a big fall in 2013.

Where in 2012 the mobile industry added 4.5 million net new accounts because of the lifeline program, in 2013 only 1.2 million were added.

A continued decline in net prepaid additions in 2014, even after the one-time adjustment for program rule changes, is what requires explanation.

The industry should add only 959,000 prepaid subscribers in 2014, UBS says. Since the slowdown cannot be attributed to the impact of lifeline program rule changes, something else is going on.

And that something is likely bigger numbers of consumers shifting from prepaid to postpaid accounts as credit standards are relaxed. Some might argue that process will result in higher churn rates, higher bad debt and higher customer service costs for the firms that relax the standards.

As of the end of the second quarter, there were about 74 million prepaid subscribers in the U.S. versus 228 million postpaid subscribers.

The big question now is how many of those prepaid accounts actually can become long-term postpaid accounts.



Monday, October 13, 2014

70% Improvement in Mobile Network App Performance is Possible, Study Finds

Mobile networks can be optimized for app performance, a test by XL Axiata, Ericsson and Facebook has found.

The resultant optimizations improved app coverage by up to 70 percent, according to Internet.org.

The number of connections completed within three seconds improved up to 70 percent.

Time to content improved up to 70 percent and upload time improved up to 50 percent. In the radio access network, problems were found with parameter settings and capacity bottlenecks. 

“For example, the higher-end smartphone took significantly longer than the lower-end smartphone to upload photographs, which was due to a parameter settings issue,” Internet.org says. 


To remedy that problem, Ericsson and XL Axiata performed optimizations of coverage, uplink performance parameters and RAN capacity parameters. DNS servers also were an issue. XL Axiata’s DNS servers experienced a high processing load, and measurements revealed that this had a significant impact on user quality of experience.


Results showed that objects, such as photographs, might download in 0.5 to 3 seconds, while DNS resolution could skyrocket to 10 to 35 seconds.

This delay would render the user experience so bad that the app would appear to be not working.

To remedy that problem, XL Axiata reconfigured its servers, changed parameter tunings and performed some capacity upgrades, according to Internet.org. Content delivery networks also had an impact on user experience.

Looking at the worst 10 percent of samples, the local servers, which processed 16 percent of data, had a time to content of three seconds, while more distant servers had a time to content of up to 20 seconds.


In response, Facebook redirected traffic to different servers that were closer and had better connectivity to XL Axiata’s network. In stationary rural tests on the worst 10 percent of scenarios, time to content was lowered from nine seconds to 2.8 seconds on the lower-end smartphone.

The final results of the combined changes to the RAN, DNS and CDN showed app coverage improvements of 40 to 70 percent. Within that scope, time to content improved by up to 80 percent, while upload time improved by up to 50 percent.

Iliad Ends Bid to Buy T-Mobile US

In the continuing saga of “who will buy T-Mobile US,” France’s Iliad ended its long-shot effort to buy T-Mobile US from Deutsche Telekom, apparently after Deutsche Telekom board members “refused to entertain its new offer.”

That clears the way for an expected bid by Dish Network sometime in early 2015, many predict. The reason for the expected delay is that spectrum auctions to be held in late 2014 might put a higher value on Dish Network spectrum holdings, allowing Dish Network to benefit from a higher equity value when and if it makes a bid to buy T-Mobile US.
 
Buying T-Mobile US would allow Dish to offer entertainment video, high-speed Internet and mobile voice service nationwide.

Dish Network’s strategy is in one way the mirror image of what AT&T is attempting with its purchase of DirecTV.

Where Dish Network is a satellite video provider trying to create a national triple play, AT&T will try to add a national video entertainment offer to its national mobile network, creating a similar triple play.
 
T-Mobile US serves approximately 47 million subscribers, generating revenues of over $24 billion with a current market value of close to $25 billion.

But the real prize is the means for Dish Network to fulfill the conditions of its spectrum licenses, which require building an actual transmission network; without that network, Dish Network loses its licenses, and the equity value the licenses represent.

Should Dish Network make such a successful bid, and should AT&T’s bid to buy DirecTV be approved, we also would see the end of the U.S. satellite entertainment business as an independent segment of the U.S. consumer services business.

What Accounts for High LTE Adoption in U.S. Market?

Ironically, “receiving party pays,” a billing practice distinct from “calling party pays,” might have created the incentive for larger usage buckets in the U.S. market, moves that in turn might have contributed directly to higher rates of use of all mobile services, including voice, text messaging and mobile Internet access.

“Receiving party pays” tends to discourage users from keeping their devices powered up, since they pay for all received calls and messages, and cannot control them. On the other hand, that policy creates the incentive for service providers to create large usage buckets that alleviate user concern about such inbound traffic.

Also, the U.S. market historically has been based on postpaid, rather than prepaid retail plans. That encourages customers to use what they already have paid for, rather than restrict usage because they pay by the minute or byte.

One direct result is higher usage and lower retail fees, the GSMA argues.

The typical North American mobile user consumes 629 minutes of voice a month, compared to 334 minutes in the Asia-Pacific region and 151 in Europe, according to GSMA.

U.S. mobile consumers send or receive 467 text messages a month, compared to 122 a month in Germany, 188 a month in the United Kingdom and 199 per month in Italy.

At the same time, shared data plans have encouraged consumers to connect a broad range of data-intensive devices, including dongles, laptops and tablets, with increased data traffic leading consumers toward more expensive data plans. In turn, this generates higher average revenue per account.

It isn’t clear that those usage patterns are a result of LTE availability, or whether LTE adoption is a result of high demand for bandwidth. The former might suggest that making LTE available causes people to use more data. The latter might suggest LTE is simply a response to high demand.

The correlation (or causation, to some extent) matters when policymakers consider the role of LTE as an economic tool, or service providers look to boost use of mobile Internet access services.

Does LTE availability “cause” more data usage, or does more data usage require LTE?



LTE adoption so far has been uneven globally, but probably for reasons related to market dynamics in each country or region. In other words, it is not so clear that introducing LTE necessarily changes the demand curve for mobile Internet access.

On the demand front, U.S. operators decided not to charge even a slight premium for LTE access, compared to 3G. So pricing might matter.

At the same time, consumer unhappiness with performance of the Apple iPhone on 3G networks quickly convinced mobile service providers to boost performance of the access networks.

Sprint, on the other hand, charged a $10 a month premium for users of its WiMAX 4G network. That probably retarded adoption.

At the same time, high use of streaming video in the U.S. market arguably created more need for bandwidth, at a time when spectrum was released to market to support LTE networks.

At the same time, Sprint’s WiMAX offer also meant that other service providers were “behind” in the marketing of fourth generation network services, and had to catch up.

The first commercial LTE network in the region was launched in the third quarter of 2010.

But Sprint had launched its WiMAX 4G network in 2008, reaching a relative handful of cities by 2009 and widespread service in 2010. So many would argue Sprint squandered a two-year head start gained from fielding the first 4G network.

In the meantime, rival carriers also boosted the speeds of their 3G networks. In some cases, 3G networks offered access speeds equal to, or greater than, WiMAX.

By the end of 2013, the United States had 85 million 4G mobile connections, making it the world’s single largest 4G market. Japan ranked second with 44 million accounts. South Korea had 29 million accounts.

Total mobile connections (SIM cards) in the region stood at 341 million at the end of 2013, excluding M2M connections. However, the number of unique mobile subscribers (individuals) was significantly lower at 250 million, reflecting the high levels of multiple SIM and device ownership in the region.

4G accounted for approximately one in four of the total mobile connections in North America in 2013, the highest proportion of any global region.

Close to 97 per cent of the entire population in North America lived within the coverage range of 4G networks at the end of 2013, also one of the highest levels globally.

Build-out of 4G networks has occurred at a more rapid rate than the earlier move to 3G. It took around four years for 3G coverage to reach 95 percent of the population; 4G took just two and a half years.

Some might say that is because of the prominent role now played by Internet apps and access in driving the overall value proposition for use of smartphones. And U.S. adoption of smartphones is high.

North America had the highest levels of smartphone adoption of any region at the end of 2013, with smartphones accounting for 60 percent of total connections.

The point is that a confluence of factors likely accounts for high Long Term Evolution adoption in the U.S. market.
