Friday, November 8, 2013

Telenor Users to Get Free Wikipedia Access in Myanmar

Myanmar is the latest country where mobile phone users will be able to access Wikipedia free of charge (that is, without paying data plan charges), provided by Telenor Group.

Telenor and Ooredoo recently won licenses and spectrum to create two new mobile networks in Myanmar.

The partnership between Telenor and the Wikimedia Foundation was established in February 2012 and was founded on a shared commitment to bring Wikipedia to Telenor customers free of data charges.

The initiative is part of the Wikimedia Foundation’s mobile strategy, which focuses on reaching the billions of people around the world whose primary opportunity to access the Internet is by means of a mobile device.

Following the agreement, special versions of Wikipedia for mobile phones were launched in Thailand, Malaysia and Montenegro. In addition, Telenor aims to launch Wikipedia Zero in Pakistan, Bangladesh, India and Serbia in 2014.

The initiative is similar to Google Free Zone, which makes mobile search available to any mobile phone equipped for Internet access, without requiring a data plan.

Google South Africa TV White Spaces Trial Ends, No Interference Encountered

As you probably would expect for a high-profile TV white spaces trial with regulatory implications, the Google-sponsored trial of white spaces technology in Cape Town, South Africa has ended in success, connecting 10 schools without causing interference. Avoiding interference is a key issue for TV white spaces systems, which must dynamically select which frequencies to use in a particular area.
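
To make that concrete: conceptually, a white spaces radio checks a geolocation database of licensed broadcasters before transmitting, then picks an unoccupied channel at its location. The sketch below is purely illustrative; the channel numbers, location names and selection rule are assumptions, not details of the Cape Town trial.

```python
# Illustrative sketch only: real white spaces devices query a regulator-approved
# geolocation database; the channel data and locations below are hypothetical.

# UHF TV channels the radio is capable of using (hypothetical subset)
CANDIDATE_CHANNELS = set(range(21, 36))

# Channels already occupied by licensed broadcasters at each location (hypothetical)
OCCUPIED_BY_LOCATION = {
    "cape_town_school_1": {22, 25, 29, 33},
    "cape_town_school_2": {21, 25, 30},
}

def select_channel(location: str) -> int:
    """Return a free TV channel at this location, or raise if none is available."""
    occupied = OCCUPIED_BY_LOCATION.get(location, set())
    free = sorted(CANDIDATE_CHANNELS - occupied)
    if not free:
        raise RuntimeError(f"No white space channel available at {location}")
    return free[0]  # a real device would also weigh signal quality, not just pick the lowest

if __name__ == "__main__":
    print(select_channel("cape_town_school_1"))  # -> 21
```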


The trial partners included Google, the Wireless Access Providers' Association, a non-profit industry representative for more than 170 independent wireless operators in South Africa, CSIR Meraka, TENET, e-Schools Network and Carlson Wireless.

The trial was intended to demonstrate the value of TV white spaces technology, helping to persuade regulators that TV white spaces can be licensed for use in South Africa.

Microsoft also is running TV white spaces trials in Africa, in Kenya and South Africa.

Thursday, November 7, 2013

Device Preferences Shape Service Provider Opportunities

“Screens” are important to Internet service providers, telcos, cable TV companies and satellite video and Internet providers for one simple reason: ownership and use of various screens are a precondition for service demand.


For decades, the primary screen used by most consumers was the television. Then followed the PC, then the mobile phone, the MP3 player and now the tablet. Whole industries, ranging from broadcast TV and cable TV to DVD rentals and sales to the Web, were created by use of those screens.


So changes in device preferences should shape demand for services provided to those devices.


And though the data remains fragmentary, it appears younger consumers see less need to own and use desktop or notebook PCs and televisions, and more often substitute a mobile device for both TVs and PCs.


Nearly half of all people 18 to 34 in the United Kingdom, for example,  consider mobile a more important screen than television for media consumption, a study by Weve has found.


Separately, Gartner estimates PC shipments in Western Europe declined 12.8 percent from the same period in 2012.  PC sales have been dropping since at least 2011.


"The PC market in Western Europe continued to shrink, declining faster than expected," said Meike Escherich, principal research analyst at Gartner. To be sure, demand has shifted to tablets, so even if mobile and desktop PC shipments declined by 14.5 percent and 9.8 percent, respectively, users can use tablets as functional substitutes.


Those trends bear watching. If people don’t own and use TVs, they don’t need a video subscription service aimed at TVs. If people don’t own and use PCs and the Internet, they will have no need for Internet access. If people do not want to use landline phones, they don’t need landline phone service.


On the other hand, demand could be shaped in new ways. In the dial-up era, the primary reason for buying Internet access was email; in the broadband era, that reason morphed into access to the Web.


In the future there could be multiple reasons for buying Internet access, ranging from offload of mobile device data usage to the fixed connection, support of in-home Wi-Fi for tablets and PCs, as well as video consumption on all devices, including TVs.


Historically, device penetration created a ceiling for service adoption. In any given market, if 15 percent of households find no reason to use the Internet or PCs, those households are not going to be prospects for buying Internet access.


Roughly the same argument applies to ownership of televisions, with the caveat that televisions can be used without video subscriptions, as displays for non-connected game players, DVD and Blu-ray devices.


What is new is that widespread use of a variety of connected devices changes the demand driver for Internet access, even if use of some dedicated (TV) or general purpose (PC) devices is lessening, to some extent, in some demographic groups.


One wild card is the suitability of mobile and fixed networks to support the range of devices and use cases many users will have.

The data also suggests mobile devices now are firmly established as competitors to conventional media channels in the United Kingdom, Weve argues. And if people prefer to use their mobiles as content consumption devices over the home Wi-Fi connection, rather than the mobile network, while in the home, that usage also supports demand for the fixed broadband connection.


When screens used for work as well as personal purposes are included, 40 percent of respondents consider the PC the most important screen, especially for work activity.


But 28 percent of respondents say that mobile devices are now their first screen for media consumption, ahead of TV at 27 percent.


And demographics matter, as 46 percent of respondents 18 to 34 years old consider their mobile device their first and most important screen.


Over a quarter of surveyed consumers turn to their mobile first to interact with online content, rising to 45 percent among 18-34 year olds.


Nearly 10 percent of consumers turn to their mobile first to make online purchases.


The nationwide survey of 2,000 adults aged 18 to 55 and older found that about 39 percent say their mobile device is the screen they look at most often.


One might argue those findings could have implications beyond the advertising and media business, and affect fortunes for video entertainment providers.


Many would note that rates of television ownership among Millennials are lower than might have been expected in past decades, mirroring a trend to rent rather than own homes and cars.


Nielsen found in 2011 that U.S. television ownership actually dropped for the first time.


That is a break from past behavior: in 2010, Nielsen estimated the typical U.S. home owned more than two TV sets.


One might argue, impressionistically, that younger people view televisions as quite optional, when forming their own households. 

Though lack of income is an issue for some, quite often, even consumers who can afford to own televisions simply do not buy them, getting most of their video from streaming sites, viewed on tablets, PCs and phones.


Ironically, less demand for some devices might change consumption in ways that create new use cases for service provider products.


Traditional video subscription services might be supplanted, eventually, by streaming alternatives that make the Internet access connection more valuable, while devaluing the legacy service.


In some other cases, a mobile connection might supplant the fixed connection.


But all the trends mean that the historic market ceiling for PC-based Internet access, or the historic floor for video subscriptions, potentially are changing. Increasingly, even households that do not use “the Internet” might discover broadband access is useful to support their mobile device usage and to supply TV and voice services.


Put simply, the way Internet access in any market potentially grows to 100 percent is not just “use of PCs” or “use of the Internet,” but any of those reasons, plus “use of mobile phones,” “use of tablets” or “want to watch TV.”

Wednesday, November 6, 2013

Inhabitants Per Household Drives Bandwidth Demand, Study Finds

Number of users per household is likely to emerge as the key driver of bandwidth demand, according to consultants Robert Kenny & Tom Broughton, in a study conducted for the U.K. Broadband Stakeholder Group.

A single-person household in 2023 will require only about 5 Mbps most of the time, and 8 Mbps to 10 Mbps for shorter periods. Also, the broadband connection will be idle for most of the time.

A multi-person household with four occupants using high-definition streaming will experience appreciable usage almost constantly during the busy hours, with approximately 90 minutes of demand of 25 Mbps or more per month, the study suggests.

Usage of video streaming, and especially use of higher-definition video sources, also will affect bandwidth consumption, but the key factor will be number of users per connection, one might reasonably conclude.

Looking across all households, the model indicates that the median household will require bandwidth of 19 Mbps by 2023, while the top one percent of high-usage households will have demand of 35 Mbps to 39 Mbps.

Those forecasts will strike some as low, given expectations that many consumers will have access to much higher speeds by 2020, perhaps 100 Mbps or higher.

But the analysts note that 64 percent of U.K. households are composed of just one or two people, limiting the effective amount of required bandwidth.

For example, even if two people are each watching their own HDTV stream, surfing the web and having a video call, all simultaneously, the total bandwidth for this use case is 15 Mbps in 2023.
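
A rough way to see why occupancy drives the numbers is simply to sum assumed per-activity bit rates across concurrent users. The per-activity figures below are illustrative assumptions, not the inputs used in the Kenny & Broughton model.

```python
# Illustrative only: per-activity bit rates (Mbps) are rough assumptions,
# not the figures used in the Kenny & Broughton model.
ACTIVITY_MBPS = {
    "hd_stream": 5.0,    # one HD video stream
    "web_browsing": 1.0,
    "video_call": 1.5,
}

def household_demand(occupants: int, activities_per_person: list[str]) -> float:
    """Sum concurrent per-person activity bit rates across all occupants."""
    per_person = sum(ACTIVITY_MBPS[a] for a in activities_per_person)
    return occupants * per_person

# Two people, each streaming HD, browsing and on a video call at the same time:
print(household_demand(2, ["hd_stream", "web_browsing", "video_call"]))  # -> 15.0 (Mbps)
```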

Also, the growth of video consumption will be matched, to some extent, by improvements in video compression techniques that will reduce required bandwidth by about nine percent annually, for standard definition, high definition and 4K TV alike.
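
Compounding makes that nine percent figure more significant than it might sound. A quick, illustrative calculation, assuming a 5 Mbps HD stream today, shows how far required bit rates could fall over a decade.

```python
# Compounding effect of an assumed 9% annual improvement in video compression.
rate_today_mbps = 5.0          # assumed bit rate for an HD stream today (illustrative)
annual_improvement = 0.09

for year in (5, 10):
    remaining = (1 - annual_improvement) ** year
    print(f"After {year} years: {rate_today_mbps * remaining:.2f} Mbps "
          f"({remaining:.0%} of today's bit rate)")
# After 5 years: 3.12 Mbps (62% of today's bit rate)
# After 10 years: 1.95 Mbps (39% of today's bit rate)
```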

In addition,  an increase in traffic does not necessarily equal an increase in maximum bit rate requirements. Up to a point, higher usage can occur without necessarily requiring an upgrade of top speeds.

The other issue is that the crucial parameter is peak usage, not average usage, which tends to be quite low, across a whole network. Even when data consumption per connection, over a month, is 23 GB, most of that consumption occurs in a “spiky” manner.

About 34 percent of monthly consumption happens in the 6 p.m. to midnight period. During those “busy hours,” data consumed per connection is 7.8 GB.
Traffic per hour during the busy hours is 43.4 MB, but average usage is just 0.10 Mbps. Likewise, average modem sync speed is 12.7 Mbps, while average utilisation of the network is just 0.9 percent.
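
Those figures hang together arithmetically. Treating the 6 p.m. to midnight window as six busy hours a day over a 30-day month, a back-of-the-envelope check (not the study's own calculation) reproduces them closely.

```python
# Back-of-the-envelope check of the busy-hour figures cited above.
monthly_gb = 23.0        # data consumed per connection, per month
busy_hour_share = 0.34   # share of consumption in the 6 p.m. to midnight window
busy_hours_per_day = 6
days_per_month = 30

busy_gb = monthly_gb * busy_hour_share                                       # ~7.8 GB
mb_per_busy_hour = busy_gb * 1000 / (busy_hours_per_day * days_per_month)    # ~43.4 MB
avg_mbps = mb_per_busy_hour * 8 / 3600                                       # average rate

print(f"{busy_gb:.1f} GB, {mb_per_busy_hour:.1f} MB/hour, {avg_mbps:.2f} Mbps average")
# -> 7.8 GB, 43.4 MB/hour, 0.10 Mbps average
```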

On the other hand, the study deliberately excluded the top four minutes a month of usage (peak demand) to get a better sense of sustained or typical demand. Accommodating those four minutes of absolute peak usage would boost the speed threshold for a four-user household to 50 Mbps.

The authors also note that changing end user expectations could significantly affect supplier requirements. For example, the analysis assumes users will tolerate 10 minutes of waiting time for a console game to load. If that load time were reduced to 2.5 minutes, then 16 percent of households would require 83 Mbps.

Reducing the waiting time further would quickly take demand over 100 Mbps for those households.
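
The underlying relationship is simply payload size divided by tolerated waiting time, so halving the wait doubles the required bit rate. The sketch below uses a hypothetical 1.5 GB game download, chosen only because it is roughly consistent with the 83 Mbps figure above.

```python
# Required bit rate = payload size / tolerated waiting time.
# The 1.5 GB download size is a hypothetical figure chosen for illustration.
payload_gb = 1.5

def required_mbps(wait_minutes: float) -> float:
    """Bit rate (Mbps) needed to move the payload within the given waiting time."""
    payload_megabits = payload_gb * 8 * 1000
    return payload_megabits / (wait_minutes * 60)

for wait in (10, 2.5, 1):
    print(f"{wait:>4} min wait -> {required_mbps(wait):.0f} Mbps")
# 10 min -> 20 Mbps, 2.5 min -> 80 Mbps, 1 min -> 200 Mbps
```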

Perhaps wisely, the study notes that “one cannot predict the future with exact certainty.” So the study conclusions do not include any impact of demand stimulation by providers or potential new applications that could boost demand for faster connections.

“In some cases we believe that current usage was constrained by current bandwidth, rather than reflecting what might be reasonably expected absent this constraint,” the authors note. In other words, actual future levels of demand could vary significantly from past consumer behavior.

That is key. Economists couch their conclusions using an important qualifier, “ceteris paribus” (all other things being equal). Of course, when new applications, new devices, new retail offers, new bandwidth and new access technologies become available, they change existing behavior. So behavior in the real world tends not to reflect “ceteris paribus.”

Something similar holds for the Heisenberg Principle (often called the “uncertainty principle”), which stipulates that, when attempting to measure a particle’s position, the more precisely the position is determined, the less precisely the momentum can be known at that instant, and vice versa. In other words, one can know where a particle is, or its momentum, but not both with equal precision.

A related “observer effect” is probably more germane. The problem, in essence, is that the act of measurement changes the process or thing being observed, because the measurement instrument itself necessarily alters the quantity being measured.

An easy example is the use of a tire pressure gauge to measure tire pressure: applying the gauge lets some air out of the tire, changing the status of tire air pressure somewhat.

For Internet apps, almost anything “new,” ranging from an app or device to the way retail services are priced and packaged, can change user behavior.

LTE Capex Shifts to Software

The cost of mobile site hardware has become almost irrelevant to operators deploying Long Term Evolution, argue analysts at Maravedis-Rethink. Not only are base station costs crashing, but investment is shifting to software and operating expense.

Spending on radio hardware equipment by the top 100 LTE operators will rise by only 4.2 percent, to $22.4 billion, in 2013, according to Maravedis-Rethink.

By 2018 base station costs will have fallen to just 15 percent of mobile operator network capex, compared to about 33 percent in 2012.

In part, that change will be propelled by the shift to small cells, which are expected to help mobile service providers reduce the cost of delivering data by 75 percent compared to conventional macrocell networks.

That architectural shift will affect capital investment as well. Small cell radio investments will account for $12 billion in spending in 2018, while servers to run cloud-based radio access networks will add $2.63 billion, according to Maravedis-Rethink.

Software defined networking (SDN) will be a growing investment driver as well. By 2018, 62 percent of tier-one Long Term Evolution carriers will be supporting virtualized networks.

Software already represents as much as 70 percent of modern communications network capex, and SDN will further that trend, potentially allowing mobile service providers to reduce costs by running more functions in software, rather than in network elements.

Gigabit Connections Will Be Commonplace by 2020, Really

Predictions always are difficult, under the best of circumstances, because researchers cannot really account for the unexpected, principally unexpected developments that analyst Nassim Taleb refers to as black swan events.

And it now appears all predictions about the typical U.S. Internet access speed, or its pricing, are becoming less reliable as the market becomes more unstable, in what many would consider a very good way.

You might argue Google Fiber, with its symmetrical gigabit service, accelerated the trend. Others might argue that the historic rate of growth of typical Internet access speeds would have suggested gigabit networks would arrive.

The point is that activity leading to higher speeds is happening at a faster pace. Since the Google Fiber launch, many other major ISPs have been committing to, and deploying, higher speed networks. AT&T and CenturyLink are among them, each committing to new gigabit networks in some markets where competition now requires it.

Separately, existing suppliers of gigabit services recently lowered prices from the earlier $300 a month level to match Google Fiber pricing of $70 a month, or come close, at $80 a month. Obviously, Google Fiber now creates the pricing umbrella.

Now the Los Angeles City Council, which has been looking at ways to provide a metro Wi-Fi network providing free service to residents, has decided to explore creation of a new citywide gigabit network.

The Council now apparently will issue a request for proposals for vendors willing to fund and build a fiber-to-the-home network reaching every home and business in Los Angeles, with wholesale access requirements as well, to be funded entirely from private sources.

The network would be required to offer free access at rates between 2 Mbps and 5 Mbps, but would be allowed to offer paid service at speeds up to a gigabit per second.

At the same time, new suppliers, seeing new business opportunities in the “heterogeneous networks” trend, are attempting to create new public Wi-Fi networks.

Gowex, a Spanish firm that already provides Wi-Fi networks in more than 80 cities globally, wants to combine public Wi-Fi and private access Wi-Fi services from mobile operators and businesses to create seamless mobile data coverage for consumers in New York, N.Y., and hopes to expand beyond New York later.

Called We-2, the new service will launch in December with an initial network of more than 2,000 Wi-Fi hotspots across New York, with plans to expand aggressively across the busiest corridors of Manhattan, Queens and The Bronx.

The company also hopes to create We-2 Wi-Fi hotspot networks in more than 300 cities by 2020.

The business model is what makes the initiative different from Fon.

Apparently, We-2 will be a network created and sold to mobile service providers who want Wi-Fi offload capabilities.

The larger point is that most existing predictions about how fast the typical U.S. Internet access connection will be, or what it will cost, are going to be wrong if they do not account for the actual pattern of supply growth we already have seen.

And that pattern suggests growth of two or three orders of magnitude per decade. Putting a typical connection of today (perhaps 15 Mbps) at roughly a gigabit or more by 2020 (about two orders of magnitude of growth in seven years) would simply be in keeping with past trends.
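
For perspective, here is the compound annual growth implied by two orders of magnitude of improvement in seven years, taking today's assumed 15 Mbps connection as the starting point.

```python
# Compound growth needed to move a ~15 Mbps connection to ~1.5 Gbps (100x) by 2020.
start_mbps, growth_factor, years = 15, 100, 7

annual_growth = growth_factor ** (1 / years) - 1
print(f"Required growth: {annual_growth:.0%} per year")   # -> Required growth: 93% per year
print(f"End speed: {start_mbps * growth_factor} Mbps")    # -> End speed: 1500 Mbps
```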

Notably, even large Internet service providers often are unable to accurately forecast how much bandwidth their own networks will require. For example, in a March 2011 presentation AT&T projected that data volumes would grow by eight to 10 times between the end of 2010 and the end of 2015.

That forecast appears to be based on an expectation that volumes would roughly double in 2011 and then increase by a further 65 percent in 2012.

Instead, AT&T in 2012 revised that projection to 40 percent annual growth. Now, 40 percent annual growth is significant. It means bandwidth consumption doubles about every two to three years.  

Still, bandwidth growth of 50 percent a year would be well within historical ranges for aggregate long-haul bandwidth consumption. But policies and end user behavior can change the demand curve.
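
The doubling time follows directly from the growth rate, as ln(2) divided by ln(1 + growth); a quick check of the 40 percent and 50 percent figures:

```python
import math

# Doubling time implied by a constant annual growth rate.
def doubling_time_years(annual_growth: float) -> float:
    return math.log(2) / math.log(1 + annual_growth)

for growth in (0.40, 0.50):
    print(f"{growth:.0%} annual growth -> doubles every "
          f"{doubling_time_years(growth):.1f} years")
# 40% annual growth -> doubles every 2.1 years
# 50% annual growth -> doubles every 1.7 years
```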

Some would suggest users learned to shift consumption to Wi-Fi, that some users had unexpectedly light usage profiles and that end users learned to modify their behavior in ways that reduced overall consumption or demand.

If demand grows at that level (doubling roughly every two years), it is obvious that supply also has to grow to match consumption, all other things being equal.

And though it might seem improbable that typical purchased speeds could reach the gigabit level by 2020, that is indeed likely, based strictly on past precedent.

In August 2000, only 4.4 percent of U.S. households had a home broadband connection, while  41.5 percent of households had dial-up access.

A decade later, dial-up subscribers declined to 2.8 percent of households in 2010, and 68.2 percent of households subscribed to broadband service.

In other words, as households moved from dial-up connections of tens of kilobits per second to broadband connections of several megabits per second or more, the typical purchased access connection grew by about two to three orders of magnitude in roughly a decade.

“Why would a consumer pay for a gigabit connection?” has been a reasonable question, given the costs and expected revenues.

Increasingly, that is the wrong question to ask. The relevant question is “why would a consumer want to buy an Internet access connection?” Speed grows, by about two to three orders of magnitude, every decade.

So “speed” is not the most relevant question. Speed grows. The question is the value of an Internet access connection.

Based on history and demand for access to the Internet, plus the dramatic compression of prices, gigabit connections will be common in 2020.

Tuesday, November 5, 2013

Los Angeles Wants Bidders for a New Fiber-to-the-Home Network Serving All Businesses and Homes

The Los Angeles City Council has been looking at ways to provide a metro Wi-Fi network providing free service to residents.

But the Council now apparently will issue a request for proposals for vendors willing to fund and build a fiber-to-the-home network reaching every home and business in Los Angeles, with wholesale access requirements as well, to be funded entirely from private sources.


The network would be required to offer free access at rates between 2 Mbps and 5 Mbps, but would be allowed to offer paid service at speeds up to a gigabit per second.

The issue is whether any entities want to take on the challenge of doing so, and overbuilding AT&T, Time Warner, Verizon, Cox, and Charter Communications, all of which offer triple play services in some parts of the city.

According to Steve Reneker, general manager of the Los Angeles Information Technology Agency, the network could cost  $3 billion to $5 billion.

Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...