Wednesday, October 5, 2016

U.S. 4G Network Performance Drops 40% to 50% Over Last Year

Source: TwinPrime
T-Mobile US had the fastest 4G mobile network speeds, according to an analysis by TwinPrime, meaning its customers got access to the network faster and could download content faster.
T-Mobile US also had the lowest 4G network latency, at an average of 52 milliseconds, followed by Sprint at 55 milliseconds, and Verizon Wireless and AT&T Mobility both at 56 milliseconds.
Verizon Wireless ranked highest in terms of LTE network reach, covering 95.3 percent of the U.S. population, compared with 91.7 percent for T-Mobile US, 91.2 percent for Sprint and 91 percent for AT&T Mobility.
As eventually happens when networks become more heavily loaded, performance is showing signs of strain. LTE 4G networks now carry 91 percent of total U.S. mobile traffic.
The study found that median LTE performance in most U.S. cities has dropped by 40 percent to 50 percent compared to the previous report. That staggering drop in performance could partly be explained by the increased LTE traffic share in those cities, or by an increase in overall mobile data consumed by LTE devices.
Source: TwinPrime
Furthermore, the study found that Wi-Fi tends to outperform LTE in every major city in the United States by a factor of two. In real user conditions, that means using an app over Wi-Fi is twice as fast as using it over LTE.
That is a return to the situation that held for 3G mobile networks, when Wi-Fi often was used because it was faster than 3G. Other tests had shown that LTE 4G network access speeds outperformed Wi-Fi.
If you want to know why mobile operators argue they need more spectrum, the TwinPrime tests provide the answer.
Traffic is growing fast enough that performance now is visibly degraded. That same pressure is why small cell architectures are deemed strategic: small cells allow networks to reuse more of any amount of available spectrum.
Those same concerns drive interest in shared spectrum, bonding mobile and Wi-Fi spectrum, and LTE 4G networks accessing Wi-Fi directly.
The study draws on more than six billion data points collected from 600 apps with traffic across the United States, India and Europe, covering more than 1,500 network operators and 2G, 3G, HSPA, HSPA+, LTE and Wi-Fi networks.

Tuesday, October 4, 2016

T-Mobile US to Prioritize Smartphone Data Over Tethered Device Data When Network is Congested

Ignoring the issue of whether one can clearly distinguish between--or accomplish--network management and network neutrality, one frustration some of us might have about network neutrality is the insistence that packet prioritization is simply wrong, always and everywhere.


That arguably is not the case. Under conditions of congestion (when network management is required), user experience benefits from packet prioritization (non-neutral treatment) that preserves the experience of apps highly sensitive to latency, such as voice or videoconferencing.
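The logic of that argument can be illustrated with a minimal, hypothetical scheduler sketch: under congestion, latency-sensitive packets are dequeued ahead of bulk traffic, while arrival order is preserved within each class. The traffic classes and names here are illustrative, not any operator's actual policy; real schedulers (DiffServ and the like) are far richer.

```python
import heapq
from itertools import count

# Hypothetical traffic classes: lower number = higher priority under congestion.
PRIORITY = {"voice": 0, "videoconference": 0, "web": 1, "bulk_download": 2}

class CongestionScheduler:
    """Dequeues latency-sensitive packets first; FIFO within a class."""
    def __init__(self):
        self._heap = []
        self._seq = count()  # tie-breaker preserves arrival order within a class

    def enqueue(self, app, packet):
        prio = PRIORITY.get(app, 1)  # unknown apps get middling priority
        heapq.heappush(self._heap, (prio, next(self._seq), packet))

    def dequeue(self):
        return heapq.heappop(self._heap)[2]

sched = CongestionScheduler()
sched.enqueue("bulk_download", "iso-chunk-1")
sched.enqueue("voice", "rtp-frame-17")
sched.enqueue("web", "http-get")
print(sched.dequeue())  # rtp-frame-17: voice jumps the queue
```

Under a neutral (strict FIFO) regime, the bulk-download chunk would have gone out first and the voice frame would have waited behind it, which is exactly the experience cost the prioritization argument points to.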


Now T-Mobile US says it will--under conditions of congestion--prioritize traffic used directly by devices connected to the mobile network, compared to traffic used by tethered devices.


In other words, the management choice is to preserve smartphone experience over that of tethered devices, when the network is congested.

That is a preference for supporting smartphone access (bandwidth supplied to customers when out and about) over tethering, which presumably implies stationary usage settings.

To the extent that all networks are built with contention in mind, there always is a need for network management when congestion occurs. No network is built on the assumption that all conceivable customer demand, at the peak hour of the peak day, always will be supported without congestion.

That means some amount of network management always is necessary. But it is hard to clearly distinguish between management to preserve user experience, under conditions of congestion, and "treating every packet equally."


DirecTV is Not a Competitor to U-verse; OTT is the Replacement for Linear TV

AT&T executives now have said that the company's new DirecTV Now over-the-top service will become its primary video service within three to five years. That might be called the third shift in AT&T video services strategy.

First, there was U-verse video, delivered over the fiber-to-neighborhood network. Now AT&T primarily relies on satellite-delivered DirecTV.

So a shift to primary reliance on OTT delivery would be the third shift in platform. There are some important potential implications. Some now argue that AT&T will shut down the satellite service entirely, at some point, in favor of OTT delivery as the sole distribution method.

There is a bit of irony there. When the DirecTV purchase first was announced, and before the approvals process was completed, some might have argued that AT&T was going to abandon U-verse.

That basically is happening. AT&T pushes new subscribers toward DirecTV, and not U-verse video. The U-verse brand apparently will be discontinued. At the same time, AT&T plans to zero rate DirecTV Now usage by customers of its mobile services.

What now seems to be shaping up is not just a DirecTV replacement of U-verse, but replacement of both U-verse and satellite delivery by a switch to streaming, on all AT&T network platforms, mobile and fixed.

So as it might well turn out, DirecTV was not a threat to U-verse. Instead, all linear delivery will be phased out in favor of streaming over all AT&T networks, fixed or mobile.

By zero rating entertainment video consumption, AT&T and others also are demonstrating that the entertainment video service is a managed service, like cable TV, satellite or telco TV, and not an “Internet” service.

By incorporating access into the cost of the purchased video content, AT&T and others are using the media, broadcast and linear video models, where delivery bandwidth is simply incorporated into the price of the product.


Asset-Light or Asset-Heavier Models for Video Services?

The video content business might be moving, to some extent, towards a more “asset light” model, as it increasingly is possible to supply content directly to consumers “over the top,” without building or owning physical access assets (transmission towers, cable or other broadband networks, satellite fleets, spectrum, rights of way).

On the other hand, content creation and acquisition then become a bigger portion of the total cost model. Where content costs might be 44 percent to 49 percent of costs for a firm such as Comcast, content costs might represent as much as 68 percent to 71 percent of total cost for a firm such as Netflix.

In at least one sense, that higher percentage of total cost attributed to content might be a good thing for OTT providers.

To the extent that original content largely provides the distinctiveness and uniqueness for any video entertainment supplier, the ability to spend more on those assets, rather than on technology and platform costs, advertising, marketing and overhead, is an arguable advantage.

So does that mean an “asset light” approach is the best strategy for every contestant? Not necessarily. As Apple famously has demonstrated, there are clear advantages to a “closed” or “integrated and controlled” approach to platforms and services.

Google, originally a big proponent of “open” approaches, now experiences the downside of openness, namely fragmentation of the experience.

“Asset heavier” models that bundle content and apps with services and the access function have advantages as well. Suppliers able to do so have “owners economics” and can better control their costs, while gaining more flexibility in terms of pricing, compared to competitors who lease such services or cannot control access asset performance.

Also, at least in principle, access asset ownership provides another element of data on user behavior, in real time, plus ability to shape experience.

Perhaps oddly, in an era of loosely coupled networks, larger platform providers increasingly look to integrate experiences in ways that look more like “vertical integration” than “horizontal” dominance of one function, relying on others for other key elements of experience.

Emphasis needs to be placed on “larger.” Webscale firms can think about integrating key functions from content creation/acquisition to access. Smaller firms cannot do so.
source: Accenture

Monday, October 3, 2016

Google Fiber Really is Going Hybrid

Google Fiber, Comcast and CableLabs all are interested in the business potential of shared spectrum in the 3.5-GHz band. For Comcast and CableLabs, the attraction might skew more to use of the shared spectrum to support mobile access.

For Google Fiber the issue is more likely related to fixed wireless as a complement to the Google Fiber service.

If you had questions about whether Google Fiber was serious about using fixed wireless to reach its customers, consider what Dennis Kish, Google Fiber president, says: “As we’ve said, our strategy going forward will be a hybrid approach with wireless playing an integral part.”


Among other things, Google has applied for permission to widely test 3.5-GHz shared spectrum at 24 U.S. locations, especially including areas where interference with licensed users would be expected to be an issue. Google already is testing shared spectrum in that band in Kansas City, Mo.

“Webpass (a wireless Internet service provider now part of Google Fiber) has proven that point-to-point wireless is a reliable way to connect more people to high-speed Internet in a densely populated environment,” said Kish. “Google Fiber will continue to build out our portfolio of wireless and fiber technologies.”

Nokia Bell Labs Shows Self-Activating, Drone-Delivered Small Cell

Nokia Bell Labs recently demonstrated the world's first drone-based delivery of an F-Cell to a Nokia office rooftop in Sunnyvale, Calif. The F-Cell wirelessly self-powered, self-configured and auto-connected to the network and instantly began to stream high-definition video.
The importance is that the approach could allow very rapid activation of small cells to support 4G or 5G, ideally without the time and expense required to lay cables, connect power sources and attach radio arrays to poles.
None of that dispenses with the need to negotiate roof rights with property owners, of course. Still, the innovation shows the sort of work now being widely conducted to dramatically lower the cost of Internet access infrastructure.
The F-Cell system features a 64-antenna massive MIMO array supporting eight beams to eight energy-autonomous (solar-powered) F-Cells.
The advantage is easy small cell deployment and backhaul services. The system supports non-line-of-sight wireless networking in frequency division duplex (FDD) or time division duplex (TDD) mode.
Each F-Cell supports up to eight individual 20-MHz channels, allowing a system throughput rate of about 1 Gbps over existing LTE networks.
The system is designed to scale up to tens of gigabits per second speeds as well.
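The quoted ~1 Gbps figure holds up as back-of-the-envelope arithmetic. The per-channel spectral efficiency below is an assumption (a plausible value for MIMO-assisted LTE), not a Nokia specification:

```python
# Back-of-the-envelope check of the ~1 Gbps system throughput figure.
channels = 8                           # eight individual channels per F-Cell system
channel_bw_mhz = 20                    # 20 MHz each
spectral_efficiency_bps_per_hz = 6.5   # assumed, plausible for MIMO LTE

per_channel_mbps = channel_bw_mhz * spectral_efficiency_bps_per_hz  # 130 Mbps
system_mbps = channels * per_channel_mbps
print(system_mbps)  # 1040.0 Mbps, i.e. roughly 1 Gbps
```

Scaling to tens of gigabits per second, as the design intends, then implies some combination of more channels, wider channels, or higher spectral efficiency per beam.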

source: Nokia

AI Impact on Data Centers

source: PTC