Tuesday, October 4, 2016

DirecTV is Not a Competitor to U-verse; OTT is the Replacement for Linear TV

AT&T executives now have said that the company’s new DirecTV Now over-the-top service will become its primary video service within three to five years. That would be the third platform in AT&T’s video services strategy.

First, there was U-verse video, delivered over a fiber-to-the-neighborhood network. Now AT&T primarily relies on satellite-delivered DirecTV.

So a shift to primary reliance on OTT delivery would move AT&T to its third video platform. There are some important potential implications. Some now argue that AT&T eventually will shut down the satellite service entirely, in favor of OTT delivery as the sole distribution method.

There is a bit of irony there. When the DirecTV purchase first was announced, and before the approval process was completed, some argued that AT&T was going to abandon U-verse.

That basically is happening. AT&T pushes new subscribers toward DirecTV, not U-verse video, and the U-verse brand apparently will be discontinued. At the same time, AT&T plans to zero-rate DirecTV Now usage by customers of its mobile services.

What now seems to be shaping up is not just a DirecTV replacement of U-verse, but replacement of both U-verse and satellite delivery by a switch to streaming, on all AT&T network platforms, mobile and fixed.

As it might well turn out, DirecTV was not a threat to U-verse. Instead, all linear delivery will be phased out in favor of streaming over all AT&T networks, fixed or mobile.

By zero rating entertainment video consumption, AT&T and others also are demonstrating that the entertainment video service is a managed service, like cable TV, satellite or telco TV, and not an “Internet” service.

By incorporating access into the cost of the purchased video content, AT&T and others are using the media, broadcast and linear video models, where delivery bandwidth is simply incorporated into the price of the product.

Asset-Light or Asset-Heavier Models for Video Services?

The video content business might be moving, to some extent, towards a more “asset light” model, as it increasingly is possible to supply content directly to consumers “over the top,” without building or owning physical access assets (transmission towers, cable or other broadband networks, satellite fleets, spectrum, rights of way).

On the other hand, content creation and acquisition then become a bigger portion of the total cost model. Where content might represent 44 percent to 49 percent of total costs for a firm such as Comcast, it might represent as much as 68 percent to 71 percent of total cost for a firm such as Netflix.

In at least one sense, that higher percentage of total cost attributed to content might be a good thing for OTT providers.

To the extent that original content largely provides the distinctiveness and uniqueness of any video entertainment supplier, the ability to spend more on content, rather than on technology and platform costs, advertising, marketing and overhead, is an arguable advantage.

So does that mean an “asset light” approach is the best strategy for every contestant? Not necessarily. As Apple famously has demonstrated, there are clear advantages to a “closed” or “integrated and controlled” approach to platforms and services.

Google, originally a big proponent of “open” approaches, now experiences the downside of openness, namely fragmentation of the experience.

“Asset heavier” models that bundle content and apps with services and the access function have advantages as well. Suppliers able to do so have “owner’s economics” and can better control their costs, while gaining more flexibility in pricing, compared to competitors who lease such facilities or cannot control access asset performance.

Also, at least in principle, access asset ownership provides another element of data on user behavior, in real time, plus ability to shape experience.

Perhaps oddly, in an era of loosely-coupled networks, larger platform providers increasingly look to integrate experiences in ways that resemble “vertical integration” more than “horizontal” dominance of one function, with reliance on others for the other key elements of the experience.

Emphasis needs to be placed on “larger.” Webscale firms can think about integrating key functions from content creation/acquisition to access. Smaller firms cannot do so.

Monday, October 3, 2016

Google Fiber Really is Going Hybrid

Google Fiber, Comcast and CableLabs all are interested in the business potential of shared spectrum in the 3.5-GHz band. For Comcast and CableLabs, the attraction might skew more toward using the shared spectrum to support mobile access.

For Google Fiber the issue is more likely related to fixed wireless as a complement to the Google Fiber service.

If you had questions about whether Google Fiber was serious about using fixed wireless to reach its customers, consider what Dennis Kish, Google Fiber president, says: “As we’ve said, our strategy going forward will be a hybrid approach with wireless playing an integral part.”


Among other things, Google has applied for permission to test 3.5-GHz shared spectrum widely, at 24 U.S. locations, including areas where interference with licensed users would be expected to be an issue. Google already is testing shared spectrum in that band in Kansas City, Mo.

“Webpass (a wireless Internet service provider now part of Google Fiber) has proven that point-to-point wireless is a reliable way to connect more people to high-speed Internet in a densely populated environment,” said Kish. “Google Fiber will continue to build out our portfolio of wireless and fiber technologies.”

Nokia Bell Labs Shows Self-Activating, Drone-Delivered Small Cell

Nokia Bell Labs recently demonstrated the world's first drone-based delivery of an F-Cell, to a Nokia office rooftop in Sunnyvale, Calif. The F-Cell wirelessly self-powered, self-configured and auto-connected to the network, and instantly began to stream high-definition video.

The importance is that the approach could allow very rapid activation of small cells to support 4G or 5G, ideally without the time and expense required to lay cables, connect power sources and attach radio arrays to poles.

None of that dispenses with the need to negotiate roof rights with property owners, of course. Still, the innovation shows the sort of work now being widely conducted to dramatically lower the cost of Internet access infrastructure.

The F-Cell system features a 64-antenna massive MIMO array supporting eight beams to eight energy-autonomous (solar-powered) F-Cells.

The advantage is easy small cell deployment and backhaul. The system supports non-line-of-sight wireless networking in frequency division duplex (FDD) or time division duplex (TDD) mode.

Each F-Cell supports up to eight individual 20 MHz channels, allowing a system throughput rate of about 1 Gbps over existing LTE networks.
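As a rough sanity check on that 1 Gbps figure, the arithmetic is simple. A minimal sketch, assuming (this is not a Nokia-published number) that one 20 MHz LTE carrier delivers roughly 125 Mbps under favorable conditions:

```python
# Back-of-the-envelope check of the ~1 Gbps F-Cell system figure.
# Assumption (not a Nokia number): one 20 MHz LTE carrier delivers
# roughly 125 Mbps under favorable conditions.

CHANNELS = 8            # 20 MHz channels per F-Cell system
MBPS_PER_CHANNEL = 125  # assumed per-carrier throughput

aggregate_gbps = CHANNELS * MBPS_PER_CHANNEL / 1000
print(f"Aggregate system throughput: ~{aggregate_gbps:.0f} Gbps")
# -> Aggregate system throughput: ~1 Gbps
```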


In Gigabit Era, Fixed and Mobile Networks Will Reach 10 Gbps Per Device

With gigabit Internet access speeds now shaping the consumer Internet access market, it should be acknowledged that current development work, historical precedent and competitive markets suggest that multi-gigabit speeds, up to 10 Gbps, already are on the standards agenda and product roadmaps for both fixed and wireless networks.

At a time when consumers actually do not have access to applications that require a gigabit, much less 10 Gbps, that might seem an example of pure marketing hyperbole.

What might be shocking is that history suggests 10 Gbps is precisely where we should expect bandwidth to go.

Basically, since the time of dial-up access, we have seen steady increases in bandwidth that very much resemble Moore’s Law. Indeed, some even argue that Internet access bandwidth, in terms of the top marketed speeds for consumers, has increased precisely as Moore’s Law would suggest computing power grows.
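The compounding involved is easy to illustrate. A minimal sketch, assuming a 56 kbps dial-up baseline in 1996 and an 18-month doubling period (both figures are illustrative assumptions, not measured data):

```python
# Illustrative Moore's-Law-style projection of top marketed access speeds.
# Assumptions (for illustration only): 56 kbps dial-up in 1996,
# capacity doubling every 18 months.

def projected_speed_kbps(start_kbps, start_year, year, doubling_months=18):
    """Compound doublings over the months elapsed since the start year."""
    months = (year - start_year) * 12
    return start_kbps * 2 ** (months / doubling_months)

for year in (1996, 2004, 2010, 2016):
    mbps = projected_speed_kbps(56, 1996, year) / 1000
    print(f"{year}: ~{mbps:.1f} Mbps")
# -> 1996: ~0.1 Mbps, 2004: ~2.3 Mbps, 2010: ~36.1 Mbps, 2016: ~578.0 Mbps
```

Under those assumptions, two decades of doublings take dial-up-era speeds to the threshold of gigabit service, which is roughly what the market has delivered.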

What will likely come as a bigger surprise is the improvement we will see with 5G and future mobile generations, where mobile or fixed wireless speeds will reach gigabit ranges (1 Gbps up to 10 Gbps) as part of the standards.



Precisely how all that bandwidth can be provided at prices regular consumers are willing to pay is the issue.

And that is why any number of development initiatives, mobile and fixed, are key. Basically, all the efforts aim to supply gigabit bandwidth on networks that are efficient enough to support consumer price points.

In that regard, much attention now is going into fixed wireless.

AT&T, for example, is working on “AirGig,” a method for combining fixed wireless with power line transmission for communications, without building towers, laying cables or acquiring new spectrum.

All three of those attributes have the potential to dramatically lower the cost of delivering gigabit services in the access network.

AT&T’s Project AirGig has several key advantages:
  • Easier to deploy than fiber
  • Uses license-free spectrum
  • No need to deploy towers, dig trenches or connect cables
AT&T expects to conduct field trials in 2017.

Given the dominant role of cable TV networks in the access business, and their upgrades to gigabit speeds, serious questions can be asked about whether fiber to the home will continue to be viewed as the “best” way to deliver gigabit Internet access and other services to consumers.

“Project AirGig has tremendous potential to transform internet access globally, well beyond our current broadband footprint and not just in the United States,” said John Donovan, AT&T chief strategy officer.

AT&T says it has more than 100 patents or patent applications supporting this new technology and other access technologies.


“We’re experimenting with multiple ways to send a modulated radio signal around or near medium-voltage power lines,” said Donovan. “There’s no direct electrical connection to the power line required and it has the potential of multi-gigabit speeds in urban, rural and underserved parts of the world.”

Project AirGig is therefore one more potential platform for Internet access and communications that uses fixed wireless.

As part of Project AirGig, AT&T Labs invented low-cost plastic antennas and devices located along the power line to re-generate millimeter wave (mmWave) signals that can be used for 4G LTE and 5G multi-gigabit mobile and fixed deployments.

“These patent-pending devices can mean low hardware and deployment costs while maintaining the highest signal quality,” said Donovan.

Also, 5G standards will include multi-gigabit speeds, and the cable TV industry already envisions 10 Gbps service. Nokia already has demonstrated 10 Gbps symmetrical speeds on hybrid fiber coax, supporting the CableLabs “full duplex” version of DOCSIS 3.1.

All that illustrates a principle: advertised Internet speeds are mostly about marketing, at this point, not “need.” The clearest use case for most accounts is that multiple users in a family or household watch lots of high-definition format streaming video simultaneously.

Generously allocating 10 Mbps per stream would mean a need for 70 Mbps for seven simultaneous HDTV streams.

Some users running servers out of their homes plausibly need similar levels of bandwidth. But the average consumer arguably needs nowhere near 100 Mbps.
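That arithmetic is simple enough to sketch. A minimal illustration, using the generous 10 Mbps-per-HD-stream allocation from the example above (the stream counts are illustrative):

```python
# Household bandwidth need, using the generous 10 Mbps-per-HD-stream
# allocation from the example above.

MBPS_PER_HD_STREAM = 10  # generous allocation for one HD stream

def household_need_mbps(simultaneous_streams):
    """Total downstream bandwidth for N simultaneous HD streams."""
    return simultaneous_streams * MBPS_PER_HD_STREAM

for streams in (1, 4, 7):
    print(f"{streams} simultaneous HD streams -> ~{household_need_mbps(streams)} Mbps")
# -> even 7 simultaneous streams comes to ~70 Mbps, still under 100 Mbps
```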

A reasonable 2015 analysis of functional need, per user, might have looked something like this:
  • 5 Mbps or less: Basic web surfing and email
  • 5-10 Mbps: Web surfing, email, occasional streaming and online gaming with few connected devices
  • 10-25 Mbps: Moderate HD streaming, online gaming and downloading with a moderate number of connected devices
  • 25-40 Mbps: Heavy HD streaming, online gaming and downloading with a lot of connected devices
  • 40+ Mbps: Hardcore streaming, gaming and downloading with an extreme number of connected devices

The fundamental point is that we now are in a phase of development where end user need does not drive bandwidth growth. Suppliers and apps are pushing the trends.

1/3 of Telco Execs Ponder Moves "Up the Stack" into Applications

There are very good reasons why global telecom executives are looking for a range of new revenue generators: the legacy revenue streams are shrinking.

Over the past several years, the telecom business has entered a period of slow decline, with revenue growth down from 4.5 percent to 4 percent, EBITDA margins down from 25 percent to 17 percent, and cash-flow margins down from 15.6 percent to 8 percent, say Paul-Louis Caylar and Alexandre Ménard, McKinsey partners.

Among U.S. telecom companies, landline and mobile voice now account for less than a third of total revenues, down from 55 percent in 2010.

Over the last half decade, mobile data revenue growth has offset the losses in voice and messaging. Mobile data, in fact, now represents 65 percent of total revenues. In 2010, mobile data represented just 25 percent of total revenues.

A third of the 104 respondents to a 2015 McKinsey survey of senior industry leaders said they were preparing to move into adjacent businesses such as financial services, information technology services, media, or utilities in search of new opportunities and revenue streams, McKinsey says.

With the possible exception of media services, most of the opportunities seem to involve Internet of Things to a great extent.


DirecTV-Dish Merger Fails

DirecTV’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...