Monday, September 19, 2016

What Does a Name Change Mean?

source: Cultofmac
Nomenclature always tells you something about how the telecommunications market is changing. NCTA, the U.S. cable TV industry association, is formally changing its name to “NCTA-The Internet & Television Association.”


This is not the first re-branding of NCTA. The group began as the National Community Television Council in 1951 and then became the National Community Television Association in 1952.


The NCTA was known as the National Cable Television Association for a few decades beginning in 1968.


In 2001, Cable Television was dropped from the name and replaced with Cable & Telecommunications.


In 2015, the NCTA also changed the name of its annual conference from "The Cable Show" to "INTX: The Internet & Television Expo."


Each of those nomenclature changes reflected some change in business models, from off-air signal re-transmission to “program choice” to multiple services, including telecommunications and especially Internet-related services.


There are other triggers for re-branding, though, often based on big acquisitions. Those costs are less discretionary.


By some estimates, the at&t brand is worth more than $45 billion. Reason enough, one might argue, to simplify branding of key products around at&t. It appears that the U-verse brand is going away.


It isn’t the first time AT&T has ditched a major brand name.


When SBC Communications was re-branded as AT&T, the cost was said to involve spending of about $1 billion.


When AT&T Wireless was re-branded as Cingular, the move is reported to have cost $4 billion. Some three years later, Cingular went back to AT&T, for possibly another $2 billion.


There have been other, arguably less-expensive re-branding efforts as well. Re-branding BellSouth as AT&T is said to have cost as much as $2 billion. One has to assume the re-branding of Ameritech as SBC cost at least $1 billion. Add on the earlier re-branding of Pacific Telesis as SBC as well.


Add it up and those earlier re-branding efforts probably have cost about $9 billion.
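As a sanity check, the point estimates cited above can be summed. Since several of the figures are ceilings (“as much as”) and one is a floor (“at least”), the sum is a ballpark in the $9 billion to $10 billion range, not a precise total:

```python
# Rough tally of the re-branding cost estimates cited above,
# in billions of USD. These are the article's estimates, some of
# which are upper bounds, so treat the sum as a ballpark.
rebranding_costs_billions = {
    "SBC -> AT&T": 1,
    "AT&T Wireless -> Cingular": 4,
    "Cingular -> AT&T": 2,
    "BellSouth -> AT&T": 2,   # "as much as" estimate
    "Ameritech -> SBC": 1,    # "at least" assumption
}
total = sum(rebranding_costs_billions.values())
print(f"Point estimates sum to about ${total} billion")
```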

Big companies and associations often change their names to reflect changes in markets, business models and missions. Big cable and telco organizations are no different.

Virgin Media Makes the Best Argument for Openreach Separation, Even as It Opposes Separation

source: FCC National Broadband Plan
Sometimes competitors line up on the same side of an argument. Often, the issue is what that means.


Consider that Virgin Media supports Openreach remaining integrated with BT, even while other customers of Openreach want the business completely separated.

There is a straightforward reason why Virgin Media, which operates on its own infrastructure, might support BT keeping Openreach, the unit of BT that supplies wholesale Internet access to other Internet service providers.

And that reason tends to support the arguments for full separation of Openreach from BT. Simply, Virgin Media apparently believes BT keeping Openreach will limit the wholesale unit’s investments in faster speeds, which benefits Virgin Media.


Already the leader in consumer access speeds, Virgin Media apparently believes a separated Openreach will invest faster in access speeds used by virtually all other ISPs in the United Kingdom.


And that would allow all those competitors to better position their offers in relationship to Virgin Media, which has been the market leader in terms of speed for several years.

Oddly enough, Virgin Media’s support for keeping Openreach part of BT supports the notion that full separation would lead to more investment in faster speeds by the access wholesaler.

In the U.S. market, where cable operators lead the market in terms of subscriber share and access speeds, the leadership in speed has been clear for more than a decade.


"Fiber Versus Copper" Argument is Stale

Telecom network strategists have been debating the feasibility of copper, optical fiber and mixed-media access networks for a few decades. But a couple of developments show how “stale” many of those arguments have become.

For starters, with the rise of hybrid fiber coax as the leading platform for consumer Internet access in the United States, and the leader in gigabit access availability, “sustainable bandwidth,” not access media, is the issue.

It is becoming irrelevant what the access medium happens to be. What consumers are buying is fast Internet access, which can be supplied in a growing number of ways.

Also, rapid advances in fixed wireless and mobile platforms will make wireless a viable and sustainable way to supply huge amounts of bandwidth to fixed locations.

Ronan Dunne, O2 UK’s former CEO and new Verizon Wireless president, thinks the arguments about “fiber versus copper” are stale.

“My sense is that there’s a more forward looking context for the delivery of regulation and policy there, which is adopting the notion of a digitally-led mobile first,” Dunne said.

“In the longer-term, we will forget this stupid debate about rolling out fiber cables,” he said.

Sunday, September 18, 2016

Fast Next-Generation Network Deployment Has to Balance "Competition" and "Incentives for Investment"

Verizon's new understanding of the value of its fiber-rich fixed networks (supports business cases ranging from enterprise services to small cell backhaul, fixed wireless and mobile 5G) now mirrors the cable TV industry's belief in the value of its consumer triple-play networks.

That is to say, the same network required to drive gigabit consumer Internet access creates the foundation for a dense network of small cells to support 5G services.

Where we once talked about "fiber to the home" or "fiber to the building," or "fiber to the neighborhood," it now is generally understood that the same fiber "to lots of places" creates the possibility of backhaul for small cells that are the foundation for both mobile 5G and new fixed wireless services.

So incentives always matter, when regulators and policymakers want robust investment in next-generation networks. Any policies that promote more-extensive gigabit or fiber investments necessarily ease the business case for tomorrow's services.

But policymakers also want more competition. The problem is that competition and investment tend to be rival goods in competitive telecom markets. Success on one dimension tends to diminish the other.

Consider the European Community broadband targets, which call for gigabit access by 2025 for “all schools, transport hubs and main providers of public services as well as digitally intensive enterprises.”

The problem: the EC itself believes this will cost about €500 billion ($558 billion) in investment over the next eight or nine years.

Right now, expected investment falls short of the €500 billion the commission estimates is required to deliver the so-called “gigabit society,” by about €155 billion ($173 billion).
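The arithmetic behind those figures is simple; a quick sketch using the numbers cited above:

```python
# EC "gigabit society" investment figures cited above, in billions of EUR.
required = 500    # estimated total investment needed
shortfall = 155   # portion current plans do not cover
covered = required - shortfall
shortfall_share = shortfall / required
print(f"Expected investment covers about EUR {covered} billion of EUR {required} billion;")
print(f"the shortfall is roughly {shortfall_share:.0%} of the total.")
```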

As plans for 5G mobile networks advance, additional spectrum--and lots of it--will be needed.

Spectrum below 1 GHz and between 1 GHz and 6 GHz will be important. But spectrum in millimeter wave regions will be crucial, as there is not enough additional bandwidth available in the low bands (below 1 GHz) or the middle bands (up to 6 GHz).

That means the cost of acquiring or using spectrum will be an important driver of how much investment happens, and how fast.

The planned 5G networks are expected to serve up to one million connected devices per square kilometer, about one thousand times (three orders of magnitude) more than current levels.

Though investment will not be linear (three orders of magnitude more investment to support three orders of magnitude more devices), the incremental investments will be quite substantial.

High spectrum prices, tax policies, mandatory fees and other conditions of doing business also shape the possible business models and therefore the magnitude and timing of investment.

Telecommunications always has been a highly-regulated business, where government decides whether market entry is possible, by whom, under what conditions, what prices can be set and what services can be offered.

Creation of incentives to invest is now, and always will be, a crucial element of policy, especially if 5G requires use of small cells, on networks that are very dense users of optical fiber.

Looking only at service provider cell sites (not enterprise or consumer), millions of new sites will be added globally by about 2020.

In some urban areas, that density could mean roughly “fiber to every other light pole.”

If, as expected, millimeter wave small cells have a transmission radius of about 50 meters (165 feet) to 200 meters (perhaps a tenth of a mile), it is easy to predict that an unusually-dense backhaul network will have to be built (by mobile network standards).

In the past, mobile operators have required backhaul only to macrocell towers spaced many miles apart. All that changes with new small cell networks built using millimeter wave spectrum (whether for 5G mobile, fixed use, or ISP fixed access).

Keep in mind that street lights are spaced at distances from 100 feet (30.5 meters) to 400 feet (122 meters) on local roads.

As a rough approximation, think of a small cell, in a dense deployment area, spaced at roughly every other street light, up to small cells spaced at about every fourth light pole.
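Those spacings imply a striking cell density. A back-of-the-envelope sketch, using the pole spacings and cells-per-pole ratios cited above (illustrative assumptions, not engineering figures):

```python
# Rough small cell counts per kilometer of road, given street-light
# spacing and how many poles sit between adjacent small cells.
# Figures are the illustrative ranges cited above.

def cells_per_km(pole_spacing_m: float, poles_per_cell: int) -> float:
    """Small cells per km of road for a given street-light spacing."""
    cell_spacing_m = pole_spacing_m * poles_per_cell
    return 1000 / cell_spacing_m

# Dense case: poles 30.5 m (100 ft) apart, a cell every other pole.
dense = cells_per_km(30.5, 2)
# Sparse case: poles 122 m (400 ft) apart, a cell every fourth pole.
sparse = cells_per_km(122, 4)
print(f"Roughly {sparse:.1f} to {dense:.1f} small cells per km of road")
```

Even the sparse case implies far more sites per route-kilometer than macrocell networks ever required, which is why backhaul cost dominates the discussion.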

That is a lot of new cells, with a low-cost backhaul requirement. That is why dense fiber networks now are seen as a business asset by Verizon and Comcast, for example. Very few other providers will be able to connect “every other light pole” to high-capacity backhaul, affordably.

Among other things, 5G networks should dramatically expand what we have traditionally viewed as the “fiber to tower” backhaul market. There are about 300,000 macro cell sites in the United States and perhaps 200,000 towers.

It remains unclear how many 5G or 4G small cells ultimately will be built. But it is reasonable to assume almost all the growth will come from putting small cells on existing structures, not installing towers.

And that is why cable operators believe small cell backhaul will be an important new business opportunity.


Will Carrier Cost Reduction Efforts Boost Channel Sales?

Historically, the reason telecom and information technology providers have used indirect distribution (channel partners) is because they cannot afford to sell direct, using internal sales forces.

The reason major service providers use mass media and retail channels to reach consumers has the same business drivers. Service providers cannot afford to sell direct to the mass market, and cannot actually rely on channel partners, either, as there is not enough revenue per account.

Retail stores, on the other hand, which generally are a form of direct sales--and more importantly a fulfillment mechanism--are necessary because few customers are completely comfortable buying devices “sight unseen.”

But as tier-one service providers continue to face a need to reduce costs, they may come to rely on channel partners for an increasing portion of overall sales efforts.

The reason is drop-dead simple: as profitability becomes tougher, and aggregate sales volumes decline, “cost of sales” also must drop. Reliance on channel partners is one way to do so.

Consider recent developments at CenturyLink, which plans to lay off seven percent to eight percent of its fixed network workforce by the end of 2016.

CenturyLink revenue fell 0.7 percent in 2015 to $17.9 billion. Analysts project revenue will decline two percent in 2016, according to Bloomberg. So revenue is shrinking.

So CenturyLink--like any other business--has to match costs to expected revenue. “We all understand the pressure caused by the decline in our legacy revenues; it creates a $600 million negative impact on our business each year,” said Glen Post, CenturyLink CEO.

“While we continue to see positive growth in our strategic products, the profit margins of these strategic products and services are considerably lower than those associated with the legacy revenue we are losing,” Post added.


CenturyLink faces some of the same issues Frontier Communications and Windstream face. All are former rural fixed network telcos that grew and repositioned, in major ways, as business specialists.

But all three firms are fixed-network-only operators, in a broader market where mobile drives revenue growth. None of the three owns mobile revenue streams.

AT&T has become one of the biggest linear video providers in the U.S. market by virtue of its acquisition of DirecTV, and many believe the company will deemphasize fixed network linear video services in favor of satellite delivery.

So the big challenges include how to restructure their businesses for potentially smaller gross revenues and lower profit margins in the mass market portions of their businesses.

Stranded asset issues are going to grow, as well, as fewer customers deliver revenue to support fixed costs.

So it is not a surprise that CenturyLink is trying to reduce its operating costs. It has to do so. A shift to greater reliance on channel partners would not be surprising, eventually.

Productivity is a Devilish Thing

Will automated trucking create more jobs than it destroys? Nobody knows yet. What is easier to say is that some jobs will be destroyed, and others created elsewhere. That tends to be the pattern for major waves of technology-driven change.

Driverless vehicles might directly affect millions of people.

In the United States, there were about 1.8 million heavy-truck and tractor-trailer long-haul drivers in 2014, with employment growing about four percent a year, according to the Bureau of Labor Statistics.

So there might be nearly two million long-haul truck drivers working in 2016.


There might also be 1.44 million delivery truck drivers.

There might also be some 233,700 taxi drivers and chauffeurs, not including ride hailing drivers associated with firms such as Uber and Lyft, who could number as many as half a million.

Commercially-available autonomous vehicles should eliminate many of those jobs, while potentially creating new jobs in new areas. The issue is where the new jobs are created, and in what quantities.

It is almost certain that there will be no one-to-one correspondence between the specific jobs lost by specific people and the new jobs created elsewhere.

Productivity might be a good thing, but it also is a disruptive--and hurtful--thing for specific industries, firms and workers. Hence the need for adjustment mechanisms.

That is never easy.

One reason it often is “so hard to get things done” in politics is that it often is easier to identify “losers,” who are quite aware of their losses, than “winners” throughout the economy who cannot specifically identify the gains they make.

That means opposition to change is highly organized, while support for changes that benefit everyone, or most people, can be diffuse.

That has analogies in other areas. Productivity is good for nations and government revenues generally, but nearly always has negative consequences for highly-specific groups.

The ability to “do more with less” might be green, and it nearly always boosts general consumer economic well-being, but it often is harmful for specific groups of workers or industries.

Lower prices for all manner of computing-assisted products benefit consumers and enable many new industries, such as e-commerce and digital and social media. Conversely, suppliers of computing products and platforms find that prices always decline.

And many industry segments, and the people who work in those segments, can find that higher productivity also means lower revenues, fewer jobs and lower wages.

At a high level, it can be argued that major technology innovations eventually create new jobs at higher levels than destroyed jobs. But the gains and losses tend to fall on different industries, firms and people. Again, losses are clear, gains are diffuse.

Zero Rating, Lower Prices Change U.S. Mobile Customer Behavior

source: P3
Zero rating does change user behavior. As Facebook has argued, its Free Basics program, which allows mobile users access to a bundle of apps without the need for a data plan, dramatically increases use of mobile Internet apps and creates demand for mobile Internet access.

It now appears that similar programs, such as the T-Mobile US “Binge On” program, which zero rates use of video streaming services, also increase usage of streaming apps.

As economic theory also suggests, lower prices for some product in demand will stimulate usage. And that seems to be happening in the U.S. mobile market, as the return of "unlimited usage" plans and bigger usage buckets at lower prices per gigabyte seem to be spurring customers to use more mobile data.

That can be seen most clearly in new research about the amount of mobile app use happening when connected to mobile networks, compared to Wi-Fi networks. Simply, U.S. mobile users are showing more preference for accessing mobile apps when on the mobile network, than when on the Wi-Fi network.

That could indicate less reluctance to use favorite apps on the go, since data consumption now is less an issue than before. For users of data plans with zero-rated video streaming, that likely also means customers are not as concerned about data charges, and so see no need to flip to Wi-Fi access to avoid such charges.

That might come as a surprise. Most projections about mobile customer use of Wi-Fi suggest the percentage of Wi-Fi connection time is growing, compared to use of mobile network connection time.

And while that might still be true as a general statement, U.S. users of mobile apps seem to be spending more time using the mobile network than Wi-Fi, according to a study conducted by P3 and commissioned by Fierce Wireless.

In the United States, Wi-Fi's share of mobile app connection time has been declining since the beginning of 2016.
source: P3

In January, some 60 percent of the time Verizon subscribers were using a mobile app, those interactions used a Wi-Fi connection.

By August, mobile app use when connected to Wi-Fi dropped to 52 percent.


In January, Sprint subscribers used Wi-Fi for apps 56 percent of the time. By the end of August, Wi-Fi was used for apps 45 percent of the time.

T-Mobile US customers had the smallest decline in Wi-Fi usage for apps.

In January, Wi-Fi connections supported 40 percent of app usage time. By August, that figure was 39 percent.
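Taken together, the carrier-by-carrier numbers show how uneven the shift has been; a small sketch tabulating the declines (figures as cited above from the P3 study):

```python
# Wi-Fi share of mobile app connection time, Jan. vs. Aug. 2016
# (percent), per the P3 figures cited above.
wifi_share = {
    "Verizon": (60, 52),
    "Sprint": (56, 45),
    "T-Mobile US": (40, 39),
}
declines = {carrier: jan - aug for carrier, (jan, aug) in wifi_share.items()}
for carrier, drop in declines.items():
    print(f"{carrier}: {drop} percentage-point decline")
```

Sprint shows the steepest drop and T-Mobile US the shallowest, consistent with T-Mobile US users already being the most mobile-network-centric at the start of the year.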

Users of all the studied networks consume more data on Wi-Fi compared to the mobile network.

Only T-Mobile US customers spend more time using apps on the mobile network. Analysts at P3 believe that is a direct result of T-Mobile US “Binge On” plans that do not count streaming consumption against a data usage plan.

Verizon users show the fewest app sessions, lowest total data consumption and least amount of usage time among the four top U.S. mobile operators. T-Mobile US subscribers are the heaviest users across these categories.

Some would suggest that is because Verizon generally is considered to have the highest prices of the four carriers, while T-Mobile US has been the most-aggressive on price over the past several years.

Basic economic theory suggests that when a supplier lowers the price of some product in demand, customers buy or use more of that product.

The price war now raging in the U.S. mobile market has meant the return of "unlimited use" and "more for your money" plans. That, in turn, seems to be changing consumer behavior, at least where it comes to use of mobile apps.

The other possible contributor to change in behavior is widespread access to faster 4G networks. Where in a 3G environment Wi-Fi tended to be faster than the mobile network, it now often is the case that 4G is faster than Wi-Fi.

To the extent that users switched to Wi-Fi for reasons of speed, that now makes less sense.

Also, to the extent that users switched to Wi-Fi to conserve usage, the unlimited features, bigger usage buckets and lower cost of mobile data mean those barriers to use of the mobile network also have fallen.

source: P3

DirecTV-Dish Merger Fails

DirecTV’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...