Friday, January 9, 2015

Video Autoplay Drives 60% to 200% Facebook Data Consumption Growth

Autoplay video is having a huge impact on Internet access networks, and is also likely to shape end user behavior in ways that go beyond new concerns about uncontrollable bandwidth usage.

Over the 12 months preceding September 2014, Facebook traffic increased by 60 percent on the mobile network, and by over 200 percent on the fixed network, driven mainly by the addition of autoplay video to the Facebook feed, according to Sandvine.

Autoplay on Instagram and many other sites probably is having similar effects, namely increasing data consumption in an involuntary way.  

Users probably are starting to shut off audio when working in quiet places, or all the time. People might start reducing or avoiding visits to sites that are aggressive about autoplay video. And demand for third-party apps and extensions that disable such features is likely to grow.

Here are some ways to disable autoplay in your browser.
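For sites that deliver clips through standard HTML5 video elements, a rough sketch of what such a blocker does is shown below. It is illustrative only (a hypothetical userscript, not a tested product), and it would not help with Flash-based players, which are better handled through a browser's "click to play" plug-in setting.

```typescript
// Minimal sketch of a userscript that stops HTML5 <video> elements from
// autoplaying. Assumes standard <video> tags; Flash-based players (still
// common in 2015) need browser plug-in settings instead.

function stopAutoplay(root: ParentNode): void {
  root.querySelectorAll("video").forEach((video: HTMLVideoElement) => {
    video.removeAttribute("autoplay"); // prevent future automatic starts
    video.autoplay = false;
    if (!video.paused) {
      video.pause();                   // halt anything already playing
    }
    video.preload = "none";            // avoid silently downloading data
  });
}

// Handle videos present at page load, plus those injected later by
// infinite-scroll feeds such as Facebook's.
stopAutoplay(document);
new MutationObserver(() => stopAutoplay(document))
  .observe(document.documentElement, { childList: true, subtree: true });
```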

Thursday, January 8, 2015

Did an Understanding of Moore's Law "Save" the 1980s Cable TV Business?

Did an understanding of Moore's Law "save" the U.S. cable TV industry in the mid-1980s, in the same way Moore's Law enabled Microsoft and Netflix?

Maybe so.

Most will agree that it matters greatly whether Moore’s Law continues at historic rates, and whether bandwidth advances do as well.

The reason is that so many businesses implicitly or explicitly embed such assumptions into business models and expected or potential rates of growth.

And the continuation of that trend has been highly contestable. For three decades, observers have predicted that the rate of improvement simply could not continue, as we would reach the limits of our ability to etch smaller pathways onto silicon substrates. Optimists have countered that we would begin working with different substrates.

Stubbornly, Moore’s Law has, so far, defied projections. Over the past three decades, big businesses, and big bets, have been contingent on Moore’s Law.

Perhaps the biggest early bet was made by a few in the U.S. cable TV industry.

Way back in the 1980s, for example, proposed high definition TV standards threatened to choke off growth of the U.S. cable TV industry.

Cable was still a smallish industry, with perhaps $30 billion in annual revenue, and the initial standards proposed by Japanese electronics interests and over-the-air broadcasters would have severely disrupted its business model.

At that time, Japanese suppliers dominated and led the TV set business, and cable operators were struggling with the cost of supporting complicated in-home installations of cable, plus TVs, plus off-air antenna service, plus videocassette recorders and multiple remotes.

All that complexity generated consumer unhappiness.

At the time, the proposed HDTV standards were partly analog, partly digital and might have required about 45 Mbps of bandwidth per channel, at a time when cable access networks were set up to use 6-MHz channels.

In addition to requiring completely new electronics across the network, some also suggested the initial standard would not last more than five to 10 years, requiring yet another “rip and replace” investment cycle at the end of that period.

Astute cable TV industry executives knew they could not easily afford to make major upgrades twice within 15 years. Consumer electronics suppliers would win, because they could expect two waves of device replacement (consumer and industrial) within 15 years.

Broadcasters might also have reasonably assumed they would gain strategic advantages over cable TV, then seen as a direct competitor. Also, given the growing trend to greater realism in TV image quality, the quality of the existing product would be enhanced, at less cost per TV station than a cable operation would face.

Enter Moore’s Law. Few experts at the time believed it was possible to move directly to an “all-digital” form of HDTV, in one step, and yet retain the standard channelization. The reason was simple enough.

Decoding such a signal, massively compressed and processed, would require the equivalent of a mainframe computer in the home.

The issue, though, was whether processing power actually would continue to improve at the rates Moore’s Law predicted, and therefore provide affordable mainframe-class computing capabilities. Most believed that unlikely. But a few did bet on Moore’s Law continuing, which would make possible a consumer decoder at a price that, while significant, would still allow cable operators to support HDTV.

To make a longish story short, Moore’s Law remained intact, and it indeed became possible to compress a roughly 45-Mbps to 50-Mbps “raw” data stream into a single 6-MHz channel.
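As a rough back-of-the-envelope illustration (the uncompressed studio bit rate and channel payloads below are assumed round numbers, not figures from the actual standards proceedings), the sketch shows the scale of the compression, and therefore the decoder horsepower, an all-digital approach required.

```typescript
// Back-of-the-envelope sketch; the bit rates are illustrative assumptions.

const RAW_STUDIO_HD_MBPS = 1000;   // ~1 Gbps uncompressed HD (assumed round number)
const PROPOSED_HYBRID_MBPS = 45;   // payload cited for the early hybrid proposal
const QAM256_CABLE_MBPS = 38.8;    // what 256-QAM later carried in one 6-MHz cable channel
const VSB8_BROADCAST_MBPS = 19.4;  // what 8-VSB broadcast carries in one 6-MHz channel

// Compression ratio needed to fit raw HD into a single 6-MHz channel.
console.log(`Cable: ~${(RAW_STUDIO_HD_MBPS / QAM256_CABLE_MBPS).toFixed(0)}:1 compression`);
console.log(`Broadcast: ~${(RAW_STUDIO_HD_MBPS / VSB8_BROADCAST_MBPS).toFixed(0)}:1 compression`);

// Even the 45-Mbps hybrid proposal would not have fit one digital broadcast channel.
console.log(`45 Mbps / 19.4 Mbps = ${(PROPOSED_HYBRID_MBPS / VSB8_BROADCAST_MBPS).toFixed(1)} channels`);
```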

Some would say, in that instance, Moore’s Law “saved” the economics of the whole U.S. cable TV industry.

Some also would note that Reed Hastings of Netflix made a similar bet on Moore’s Law. Earlier, some would argue, Microsoft was built on an understanding of the implications of Moore’s Law. What a business could look like if computing or bandwidth were free is the key question.

For Gates, the key assumption was that Moore’s Law would make the cost of computing hardware a non-problem for a software supplier, and also would create huge new markets for computers.

For Hastings, Moore’s Law, as embedded in Internet access prices, would make possible streaming services even lower in cost than mailing DVDs using the postal service.

The point is that, sometimes, a big bet on a key trend forecast can enable a whole new industry or business, or perhaps save one.

Most other attempts to quantify the future also are subject to uncertainty. So forecasting errors always are possible. In fact, they might be the normal state of affairs.

Philip Tetlock's Expert Political Judgment: How Good Is It? How Can We Know? found that “specialists are not significantly more reliable than non-specialists in guessing what is going to happen in the region they study.”

Sam L. Savage’s The Flaw of Averages points out that plans based on average assumptions are wrong on average, because uncertainty in life is much more pronounced than people generally assume to be the case.

Nassim Taleb’s The Black Swan likewise deals with the powerful impact of unpredictable and unexpected developments.

In fact, some would go so far as to say that forecasts always are wrong, to some degree. That isn’t necessarily a bad thing, as minor fluctuations along a predicted trend line nearly always happen. That is true of most economic forecasting, some argue.

That doesn't mean people will stop listening to forecasts, or that experts will fail to make them. Occasionally, though, big bets are made based on such forecasts, no matter how inaccurate forecasts might be.

Sprint, T-Mobile US Tout Subscriber Growth: Will AT&T and Verizon Tout ARPU?

Average revenue per user (ARPU) and average revenue per account (ARPA) are important measures of mobile service provider operational performance. The total number of accounts, and the composition of accounts (prepaid or postpaid; tablet or phone), also are important.

With a mobile marketing war going on in the U.S. market, the value of each metric might change. At a high level, Sprint and T-Mobile US are likely to tout subscriber growth. AT&T and Verizon are likely to soon emphasize ARPU or ARPA.

The reasons are prosaic as well as strategic. No company likes to report negative numbers, and eventually it is possible that AT&T, or Verizon, or both, might actually have to report net account losses. It hasn’t happened yet, but it is possible.

T-Mobile US and Sprint, on the other hand, are gambling on a furious pursuit of account growth, at the expense of ARPU or ARPA. Executives at those firms will be no happier to report negative numbers for ARPU or ARPA, and so will tout account growth.
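A minimal sketch of the arithmetic behind that tradeoff, using invented figures rather than any carrier's actual results, shows why each camp picks the metric it does: service revenue is simply accounts times ARPU, so account growth bought with price cuts can leave revenue flat or falling.

```typescript
// Hypothetical illustration of the accounts-versus-ARPU tradeoff; the
// figures are invented, not carrier results.

interface CarrierQuarter {
  accounts: number;        // subscribers or accounts at period end
  monthlyArpu: number;     // average revenue per user, USD per month
}

function quarterlyServiceRevenue(q: CarrierQuarter): number {
  return q.accounts * q.monthlyArpu * 3;  // three months per quarter
}

const before: CarrierQuarter = { accounts: 50_000_000, monthlyArpu: 60 };
// A price-led push for growth: six percent more accounts, 10 percent lower ARPU.
const after: CarrierQuarter = { accounts: 53_000_000, monthlyArpu: 54 };

const delta = quarterlyServiceRevenue(after) - quarterlyServiceRevenue(before);
console.log(`Revenue change: ${(delta / 1e9).toFixed(2)} billion USD per quarter`);
// Accounts grew, yet revenue fell, which is why the challengers tout
// subscriber additions while the leaders emphasize ARPU or ARPA.
```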

Sprint preannounced net account growth in the calendar fourth quarter (Sprint’s third fiscal quarter) of 2014.

During the quarter, Sprint platform net additions totaled 967,000 including postpaid net additions of 30,000, prepaid net additions of 410,000 and wholesale net additions of 527,000. That reversed a trend of account losses.

“The growth in postpaid customers was driven by the highest number of postpaid gross additions in three years, and postpaid phone gross additions increased 20 percent for the quarter year-over-year,” Sprint said. “In addition, the percentage of prime customers was the highest on record.”

Non-prime customers are those more likely to be subject to involuntary churn, being disconnected for non-payment of their bills. In fact, Sprint’s financial performance is blamed for a 10-percent drop in profits at Sprint parent SoftBank.

T-Mobile US, for its part, likewise touted high account growth for its fourth quarter of 2014.

In the fourth quarter, T-Mobile US added 2.1 million net customers,  the seventh quarter in a row that T-Mobile US has generated more than one million net customer additions.

Branded postpaid net customer additions were 1.3 million, a 47 percent improvement compared to the fourth quarter of 2013, T-Mobile US said.   

Branded postpaid phone net additions were over one million, implying T-Mobile US activated about 300,000 tablet connections.   

For the full-year 2014, the company reported branded postpaid net customer additions of 4.9 million and total net customer additions of 8.3 million, an 89 percent increase from the prior year.
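As a simple consistency check on the figures quoted above (nothing more than the arithmetic already implied), Sprint's segment additions sum to the 967,000 headline number, and the gap between T-Mobile US branded postpaid additions and postpaid phone additions is what implies the roughly 300,000 tablets.

```typescript
// Quick arithmetic check on the net-addition figures quoted above.

// Sprint, calendar Q4 2014: the segments should sum to the 967,000 headline figure.
const sprintPostpaid = 30_000;
const sprintPrepaid = 410_000;
const sprintWholesale = 527_000;
const sprintTotal = sprintPostpaid + sprintPrepaid + sprintWholesale;
console.log(`Sprint platform net adds: ${sprintTotal.toLocaleString()}`); // 967,000

// T-Mobile US, Q4 2014: branded postpaid adds minus postpaid phone adds
// approximates tablet (and other non-phone) activations.
const tmoBrandedPostpaid = 1_300_000;
const tmoPostpaidPhones = 1_000_000; // "over one million"
console.log(`Implied T-Mobile US tablet adds: ~${(tmoBrandedPostpaid - tmoPostpaidPhones).toLocaleString()}`);
```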

Killer App for LTE Might be Entertainment Video

What is the “killer app” for fourth generation Long Term Evolution networks, if in fact there is such a killer app? The easy answer is to reiterate that there is no killer app for 4G or 3G, only many apps and services that contribute to the total value, and of course, faster speeds.


That answer, even if arguably correct, is not granular enough to be of any value to practitioners and companies in the ecosystem. If faster speed alone were the key driver, and if retail pricing and packaging allowed mobile to become a viable substitute for fixed network access, a variety of revenue opportunities based on “substitution” would immediately become relevant.


If mobile video, specifically, were to become a lead app, that would imply there is a new opportunity for mobile streaming services. The same is true if LTE creates a better mobile gaming experience (latency performance, not just speed).


Other opportunities arise if cloud-based apps, in general, are widely accessible on mobile devices.


But it always has seemed as if entertainment video was likely to be the key application that distinguishes 4G from 3G. A study by the Office of Communications (Ofcom), the U.K. communications regulator, suggests that, in most countries, 4G Long Term Evolution networks lead to more video streaming than other mobile networks.


More video consumption also was among the predicted benefits of 4G networks, according to the conclusions of a study prepared for Ofcom.


Among U.S. internet users polled, 50 percent of respondents who used 4G streamed or downloaded mobile video, according to a study by eMarketer.


About 32 percent of non-4G users reported they downloaded or streamed video. And new smartphone users on 4G networks say video is among the new apps they use most.


When a July 2014 Deloitte study asked subscribers in the US about which activities they conducted more often on their mobile networks since signing up for 4G, 33 percent said they watched more video.


Another example is skyrocketing video on Facebook, an app used exclusively on a mobile phone by about 30 percent of Facebook users. About 78 percent of Facebook users use the app on their mobile devices at least some of the time.  
So a shift towards visual content on Facebook, especially video, automatically means more usage on mobile networks.  


In one year, the number of Facebook video posts per person has increased 75 percent globally and 94 percent in the United States.


Globally, the amount of video from people and brands in News Feed has increased 3.6 times year-over-year.


Since June 2014, Facebook has averaged more than one billion video views every day, the company says.  


On average, more than 50 percent of people visiting Facebook in the United States every day watch at least one video daily, and 76 percent of people in the United States who use Facebook say they tend to discover the videos they watch on Facebook.


A Sandvine report shows that Facebook now accounts for 19.43 percent of all smartphone data consumed in North America.


Facebook leads in “upstream” data, accounting for 22.4 percent of that traffic, and is behind only YouTube in “downstream” data.


YouTube accounts for 19.8 percent of that traffic, compared to Facebook’s 19 percent.


But Facebook has a 19.4 percent share of aggregate upstream and downstream data, exceeding YouTube’s 18 percent share.

Facebook-owned Instagram also accounts for an additional 2.6 percent of upstream data, 4.5 percent of downstream data, and 4.3 percent of total smartphone data consumption.
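Since the aggregate share is just a traffic-weighted average of the upstream and downstream shares, Facebook's three published figures also imply how much smartphone traffic flows upstream; the calculation below is an illustrative exercise, not a Sandvine number.

```typescript
// Illustrative exercise (not a Sandvine calculation): the aggregate share
// is a traffic-weighted average of upstream and downstream shares,
//   aggregate = u * upstreamShare + (1 - u) * downstreamShare,
// so Facebook's three published figures imply the upstream fraction u.

const fb = { upstream: 22.4, downstream: 19.0, aggregate: 19.4 }; // percent

const impliedUpstreamFraction =
  (fb.aggregate - fb.downstream) / (fb.upstream - fb.downstream);

console.log(`Implied upstream share of smartphone traffic: ` +
  `${(impliedUpstreamFraction * 100).toFixed(0)}%`); // roughly 12%
```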

Wednesday, January 7, 2015

Facebook Video Traffic Grows 94% in U.S. Market

Facebook says it increasingly is seeing a shift towards visual content on Facebook, especially video.

In one year, the number of video posts per person has increased 75 percent globally and 94 percent in the United States.

Globally, the amount of video from people and brands in News Feed has increased 3.6 times year-over-year.

Since June 2014, Facebook has averaged more than one billion video views every day, the company says.  

On average, more than 50 percent of people visiting Facebook in the United States every day watch at least one video daily, and 76 percent of people in the United States who use Facebook say they tend to discover the videos they watch on Facebook.

A Sandvine report shows that Facebook now accounts for 19.43 percent of all smartphone data consumed in North America.

Facebook leads in “upstream” data, accounting for 22.4 percent of that traffic, and is behind only YouTube in “downstream” data.

YouTube accounts for 19.8 percent of that traffic, compared to Facebook’s 19 percent.

But Facebook has a 19.4 percent share of aggregate upstream and downstream data, exceeding YouTube’s 18 percent share.

Facebook-owned Instagram also accounts for an additional 2.6 percent of upstream data, 4.5 percent of downstream data, and 4.3 percent of total smartphone data consumption.

Major Innovations--Even IoT--Take Decades to Produce Clear Productivity Increases

Vodafone notes it has been 30 years since the first mobile call was carried on the Vodafone network in January 1985. About that time, Vodafone forecast it would sell about a million subscriptions. BT predicted the market at only about 500,000 subscriptions.

In 1995, after a decade of availability, U.K. mobile adoption had reached seven percent. By 1998 adoption reached 25 percent. By 1999 adoption had reached 46 percent. Just five years later, adoption exceeded 100 percent.

We might argue about when mobility became an essential service for consumers or businesses. But we might all agree that point has been reached, and that the bigger question is how much more vital mobility will become, and how it displaces older modes of communication, computing, shopping, working and learning.

Mobile usage in the U.S. market followed a similar trajectory, with 340,000 subscribers in 1985, growing to 33.8 million by 1995. By 2005, mobile adoption had grown exponentially to about 208 million accounts.
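Those subscriber counts imply compound annual growth rates worth making explicit; treating the reported figures as exact (which they are not), the sketch below works out to roughly 58 percent a year for 1985 to 1995 and roughly 20 percent a year for 1995 to 2005.

```typescript
// Compound annual growth rate implied by the U.S. subscriber figures above.
function cagr(startValue: number, endValue: number, years: number): number {
  return Math.pow(endValue / startValue, 1 / years) - 1;
}

console.log(`1985-1995: ${(cagr(340_000, 33_800_000, 10) * 100).toFixed(0)}% per year`);   // ~58%
console.log(`1995-2005: ${(cagr(33_800_000, 208_000_000, 10) * 100).toFixed(0)}% per year`); // ~20%
```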

Those figures hint at the perception of value by consumers and businesses. Growing abandonment of fixed line voice, greater volumes of mobile-initiated Internet sessions, use of websites, email and social media provide other bits of evidence about mobile’s perceived value.

But even looking only at ubiquity of usage, it took 20 years for mobility to become something “everybody” uses. It took 40 years for electrification to change productivity in measurable ways.

Keep that in mind when thinking about the “Internet of Things.” Despite the fact that U.S. businesses and organizations made huge investments in information technology in the 1980s, many would argue the benefits did not appear until much later in the 1990s.

Most of us likely instinctively believe that applying more computing and communications necessarily improves productivity, even when we can’t really measure the gains.

But investments do not always immediately translate into effective productivity results. This productivity paradox was apparent for much of the 1980s and 1990s, when one might have struggled to identify clear evidence of productivity gains from a rather massive investment in information technology.

Some would say the uncertainty covers a wider span of time, dating back to the 1970s and including even the “Internet” years from 2000 to the present.

Computing power in the U.S. economy increased by more than two orders of magnitude between 1970 and 1990, for example, yet productivity, especially in the service sector, stagnated.

And though it seems counter-intuitive, some argue the Internet has not clearly affected economy-wide productivity.

Whether that is simply because we cannot yet measure the changes is part of the debate. To be sure, it is hard to assign a value to activities that have no incremental cost, such as listening to a streamed song instead of buying a compact disc. And many of the potential productivity gains we might be seeing are of that sort.

The other issue is that revenue is decreasing, in many industries, even if most users and buyers would say value is much higher.

A productivity gain, by definition, means getting more output from less input.

In other words, it is “how” technology is used productively that counts, not the amount of raw computing power or connectivity. And there is good reason to believe that new technology does not reshape productivity until whole processes are changed. Automating typing is helpful. But changing the content production ecosystem arguably is where the biggest productivity gains come, for example.

Windows Phone Bigger Than Apple iOS?

(chart: “Windows Phone beats iOS?!” infographic; source: Statista)
Forecasting is a tough challenge, even if it is natural to wonder what will happen this year and beyond, something we always hear lots about at the start of a year.

Most of us would not wish to be reminded of how wrong our own forecasts have been. 

Consider projections about market share in the mobile market made just a couple of years ago, when Microsoft agreed to buy Nokia's handset business in 2013 and made a renewed push into the mobile operating system business.

That move of course required analysts to make estimates of potential changes in the mobile operating system market. Many expected Windows Phone to become the third largest OS in terms of market share.

At least a few major firms forecast that Windows Phone would eclipse Apple iOS to become the second largest OS by market share.

That is not to diminish the accuracy of other predictions made by those firms, or the overall accuracy of other forecasts made by market researchers, weather experts, economists and executives in general. Still, forecasting is a tough business. 

In 2011, Gartner expected that Microsoft would have nearly 20 percent OS market share by 2015. 

But Windows Phone seems unable to grow beyond about three percent of the installed base, even if its recent share of sales was as high as 13 percent in Italy. Over the same period in Italy, Android grew 68 percent, by way of comparison.

It's just hard to predict the future.
