Wednesday, January 25, 2017

Why Subsea Networks Measure Cash Flow, Not Profit

Marginal cost pricing has long been, and remains, a huge business model problem for capital-intensive communications infrastructure providers. Marginal cost pricing means selling incremental units at the incremental cost of producing them, excluding amortization of the actual network build.

The hope is that the seller eventually can recoup sunk costs. Whether that actually happens is increasingly the issue for communications infrastructure.

Indeed, some already argue that tier-one telcos do not recover their cost of capital, perhaps an indication that marginal cost pricing is dangerous to the long-term health of the industry.

That is an issue, according to Eric Handa, APT Telecom CEO. As a rule, suppliers hope to recover their capital investments in three years. That hardly ever happens, says Handa. In other words, cash flow is the key business requirement, as most subsea networks--and possibly many other access networks--will never recover their capital investment.

Suppliers “need to recognize that loss and sell to cover future opex,” says Handa. That is why, these days, one sees the use of “earnings before interest, taxes, depreciation and amortization” (EBITDA), a measure of cash flow, rather than a measure of “profit” in the “generally accepted accounting principles” (GAAP) sense.
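To make the distinction concrete, here is a minimal sketch with hypothetical, round numbers (not APT's or any actual cable's figures): a heavily capitalized asset can show positive EBITDA, the cash-flow measure, while showing a loss once depreciation and interest on the build are counted.

```python
# Hypothetical, round numbers for one year of a subsea cable (illustration only).
revenue = 100.0                 # $ millions of capacity sales
opex = 60.0                     # operations, maintenance, sales
build_cost = 600.0              # sunk construction cost
depreciation = build_cost / 20  # straight-line over an assumed 20-year life
interest = 25.0                 # debt service on the build

ebitda = revenue - opex                      # the cash-flow measure: +40.0
operating_income = ebitda - depreciation     # after amortizing the build: +10.0
pretax_income = operating_income - interest  # closer to "profit" in a GAAP sense: -15.0

print(ebitda, operating_income, pretax_income)
```

On those assumptions the cable throws off cash to cover future opex, which is Handa's point, even though it never looks "profitable" in the accounting sense.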

That should serve as a warning for regulators: modern communications networks are expensive and might no longer be “profitable.” Policies that make it harder to sustain cash flow (never mind profits) will burden suppliers that already are not “profitable” in the old sense.

Tuesday, January 24, 2017

Applying HHI in a Triple-Play Merger Context Will be Complicated

The next antitrust review involving any of the larger fixed network providers will pose a challenge, as the evolution into a triple-play business means counting internet access accounts, video accounts and, to some extent, voice accounts. Under such conditions, figuring out actual market share, or household reach, will take some work across the three key service silos.

In the fixed networks business, antitrust reviews have included a number of tools, such as the Herfindahl-Hirschman Index (HHI). But, up to this point, regulators and antitrust officials might not have had to deal with the complexities of an industry that sells multiple products (internet access, entertainment video, voice), with different households buying different mixes of those products.
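The index itself is simple arithmetic: square each firm's market share, expressed in percentage points, and sum the results. The complication in a triple-play review is that shares differ by service silo, so the screen has to be run separately for internet access, video and voice. A minimal sketch, using entirely hypothetical shares:

```python
# HHI = sum of squared market shares (shares expressed in percentage points).
# All shares below are hypothetical, for illustration only.
def hhi(shares):
    return sum(s ** 2 for s in shares)

market_shares = {
    "internet access": {"Cable Co A": 55, "Telco B": 35, "others": 10},
    "video":           {"Cable Co A": 45, "Satellite C": 30, "Telco B": 20, "others": 5},
    "voice":           {"Telco B": 50, "Cable Co A": 35, "others": 15},
}

for silo, shares in market_shares.items():
    print(silo, "HHI =", hhi(shares.values()))
# The same two firms can look very differently concentrated silo by silo,
# which is what makes a triple-play review harder than a single-product one.
# (Lumping "others" into one share overstates concentration; a real review
# would use each firm's own share.)
```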

Also, the video entertainment share analysis is complicated by the significant presence of one independent satellite provider, though the biggest satellite provider now is owned by AT&T.

As a rough rule of thumb, past proposed horizontal cable TV mergers have used a 30-percent test: no single proposed entity could have market share of more than about 30 percent of U.S. accounts, or have networks passing more than about 30 percent of U.S. homes.

Those rules are applied differently in the mobile business, as national reach of the population is not the issue, but rather actual account share.

Some day, it might be even harder, as the difference between the mobile and fixed industry segments might blur quite a lot.

For the moment, the HHI remains a huge barrier for many of the trial balloon mega-mergers being floated.

Can Verizon Justify Much More Investment in Fixed Networks Segment?

Verizon reported revenue of $32.3 billion in the fourth quarter of 2016, including about $3.2 billion from its fixed-line mass market customers and $23.4 billion from its mobile segment. In other words, all mass market (consumer and small business) revenues represented just under 10 percent of total revenues, while mobile represented about 72 percent of total revenues.
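A quick check of that arithmetic, using the figures above:

```python
total_revenue = 32.3  # $ billions, Verizon, fourth quarter 2016
mass_market = 3.2     # fixed-line consumer and small business
mobile = 23.4         # mobile segment

print(round(mass_market / total_revenue * 100, 1))  # about 9.9 percent
print(round(mobile / total_revenue * 100, 1))       # about 72.4 percent
```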

It is quite easy to argue that fixed network operations make less sense for Verizon than for most other telcos. Some might even ask whether it would make sense to exit the fixed network business altogether, if a buyer could be found.

The corollary is that regulatory burdens in the fixed networks area probably do not help Verizon make the case for robust fixed network investment. And that case would be difficult to make under any circumstances.

Consider that Verizon, in its Fios areas, has 40 percent penetration of internet access services and about 34 percent adoption of its video entertainment services. Given that cable companies have nearly all the rest of the internet access share, while cable and satellite split the video share, it is hard to see how Verizon does much better in its Fios areas, no matter what it does.

Wireline operating income was $414 million in fourth-quarter 2016; compare that to mobile segment operating income of $6.3 billion.

Fixed networks are a small part of Verizon’s revenue, cash flow and profit (if any), while Verizon arguably does as well as it possibly could in that area. It is not a recipe for robust additional investment.


Enterprises Boosting Cloud Spending

Enterprises are increasing their spending on cloud services, analysts at Forrester say. As you would guess, that is going to decrease the amount of spending by enterprises on their own hardware and software.

Since 2013, the percentage of enterprises that have deployed cloud has risen from 10 percent to 33 percent, and 49 percent of enterprises with more than 100 employees now use public cloud, according to Forrester Research.

Analysts at 451 Research predict enterprise information technology staffs will spend an average of 34 percent of their overall budgets on hosting and cloud services in 2017, up from 28 percent last year, with increased reliance on infrastructure, applications management and security.

IDC, meanwhile, predicts that 23 percent of IT infrastructure and application workloads will reside in the public cloud in two years, up from today's 14 percent.


Hybrid cloud management, application development, the internet of things and a myriad of other software innovations are areas where existing suppliers will have to work to remain relevant, Forrester says.




Monday, January 23, 2017

Streaming is the Biggest Change in U.S. Video Entertainment Market, Says FCC

The most significant change in the status of competition in the market for the delivery of video services has been the introduction of Sling TV by Dish Network and DirecTV Now by AT&T, the U.S. Federal Communications Commission says.

Others might also note the importance of the AT&T acquisition of DirecTV, which vastly expanded the addressable universe of homes AT&T could reach from less than 21 percent to virtually 100 percent of U.S. homes.

One sign of the robustness of competition is that profit margins, which once were as high as 40 percent, had fallen to about 10 percent in 2015, and likely are a bit lower in early 2017.

The FCC says profit margins were 15 percent in 2014 and 20 percent in 2013. That might be unappetizing in one sense, but arguably is meaningful in a context where other legacy services, whatever their profit margins, do not contribute much revenue. Video services represent a huge percentage of potential access provider revenue, even if margins are not so high.

By some estimates, internet access gross margins (at least for cable operators) might be as high as 60 percent, while voice services might have profit margins near 20 percent. Telco margins likely are not that robust.

If cash flow matters--and it does--then the revenue video entertainment represents is hard to match, averaging between $80 and $110 a month, per account.

According to SNL Kagan, there were 134.2 million housing units in 2014 and 135 million housing units in 2015. The FCC therefore assumes that cable and satellite companies cover nearly every household in the country.

At the end of 2015, cable suppliers accounted for 53 percent of all subscribers, down from 53.4 percent at the end of 2014. Direct broadcast satellite (DBS) providers accounted for 33.2 percent of subscribers at the end of 2015, down slightly from 33.3 percent at the end of 2014.

Telephone companies accounted for 13.4 percent of MVPD subscribers at the end of 2015, up from 12.9 percent at the end of 2014.

Total subscribers declined in 2013, 2014 and 2015, with the suppliers losing about 1.1 million video subscribers in 2015.

But total video revenue increased from $112.7 billion in 2014 to $115.6 billion in 2015, partly because of rate increases and partly because of subscribers upgrading to higher levels of service.

What Happens at FCC in 2017?

Though it is likely network neutrality rules will be changed under a new Federal Communications Commission, we are likely to see mostly a return to “internet freedoms” rules that emphasize access to all lawful apps, a position we might call “weak” network neutrality, compared to the strong form of net neutrality (best-effort-only access, no quality-of-service mechanisms) that has been the recent policy.

At the same time, much of the change will come in the area of “why and how” network neutrality rules are justified, with a likely move away from Title II common carrier regulation and from FCC action justified by Section 706 of the Communications Act.

Separate from the specific network neutrality provisions themselves, there has been debate over the imposition of such rules using either common carrier or “report-making” authority that becomes “regulating” authority.

The other change is institutional. Unlike the recent FCC, the new FCC likely will prefer that consumer protection and potential antitrust issues be overseen by the Federal Trade Commission, not the FCC.

Expect the FCC to undo both claims of legal authority underlying the FCC’s net neutrality regulations: Title II and Section 706. What has fueled the fight over the last decade is the FCC’s authority, not the core of “net neutrality” itself, says Berin Szóka, TechFreedom president.

We might also see more spending on universal access and support for broadband deployment, with new incentives for deployment in urban areas.

HHI is a Major Reason Why a Sprint and T-Mobile US Merger Would Not be Approved

JP Morgan Securities sees a 90 percent chance of T-Mobile US being acquired over the next five years. That would be part of a consolidation of the U.S. telecom business some believe will be more vertical than horizontal (access providers combining with app providers, for example, more than mobile or fixed operators getting significantly bigger).

The problem with horizontal mergers always is the resulting market concentration. As a rule of thumb, any access provider (mobile, fixed, cable TV) combination that would reach more than about 30 percent of U.S. homes, or hold more than about 30 percent market share, has been denied.

So either regulators will have to argue that cable TV companies and telcos are not in the same business, or major asset divestitures will have to happen, or regulators will have to scrap the historic screening tools they have been using for antitrust reviews. The U.S. mobile market, using the standard screening formulas, already is too concentrated.

So one of the hoped-for mergers--between Sprint and T-Mobile US--would lead to even higher concentration, beyond the level regulators have approved in the past.

To be sure, many would argue that the U.S. market can sustainably support only three leading providers, not four. That has been the issue for European and Asian regulators as well. There is a clear preference for maintaining four leading suppliers, rather than allowing the market to consolidate to three big providers.

That is why some believe the more-likely mergers will be vertical, not horizontal. That would imply a cable company acquiring either Sprint or T-Mobile US--or both--as those transactions would not further concentrate the mobile market. For the same reasons, a sale of either Sprint or T-Mobile US to foreign buyers or Dish Network would have a much easier time gaining antitrust approval, as those transactions would not further concentrate market power.

That noted, some equity analysts think the odds of a Sprint and T-Mobile US merger now stand at more than 35 percent, up from 10 percent in September 2016, with a 70 percent chance of approval if announced, JP Morgan analyst Philip Cusick said.

Since equity analysts always are looking for profits to be made from big deals, such speculation is to be expected. But it might strike some observers as fanciful. The Herfindahl-Hirschman Index is used globally by regulators to measure market concentration.

To approve a Sprint merger with T-Mobile US, U.S. regulators would have to ignore the HHI, a test of market power that historically has been used in the U.S. and other markets. Since the purpose of a horizontal merger is to gain scale, it is hard to see how any of the big mobile companies could merge with each other. The HHI screen would be violated.

Vertical mergers (cable TV plus mobile), or acquisitions of any big mobile company by international buyers, would not inherently violate the HHI screens.
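To see why the screen bites, run the standard calculation on a stylized four-carrier mobile market (the shares below are illustrative round numbers, not exact U.S. figures). Under the 2010 Horizontal Merger Guidelines, a market with an HHI above 2,500 is considered highly concentrated, and a merger that raises the index by more than 200 points in such a market is presumed likely to enhance market power.

```python
def hhi(shares):
    # HHI = sum of squared market shares, in percentage points.
    return sum(s ** 2 for s in shares)

# Illustrative subscriber shares for a four-carrier national mobile market.
pre_merger = [35, 32, 17, 15]    # four carriers
post_merger = [35, 32, 17 + 15]  # the two smallest combine

pre, post = hhi(pre_merger), hhi(post_merger)
print(pre, post, post - pre)     # 2763, 3273, an increase of 510
# The market is "highly concentrated" even before the deal, and the merger
# adds far more than the 200-point increase that triggers the presumption.
```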

Sunday, January 22, 2017

Telecom is Dying; Distributed Computing is What Comes Next

Telecoms is dying, says consultant Martin Geddes. “The industry that acquires the name ‘telecoms’ is slowly going out of business.”

That does not mean physical infrastructure is going away. “We still need physical infrastructure,” says Geddes.

“It is the active layer that is in the process of being absorbed by the computational cloud borg,” Geddes says, while access and transport are being commoditized. To put it another way, telecom is a form of computing.

“What we are really building is not a ‘telecoms’ network any longer, but an ‘ultracomputer’ or ‘hypercloud,’” Geddes argues.

“Unfortunately for investors, the telecoms business is at the losing end of this change.” Though voice and messaging have been the prime examples so far, there is more to come, he argues.

The next decade will also find “most enterprise access revenue going down the drain.”
 
What is needed are varying and segmented levels of data service resilience and performance that can be tied (loosely or tightly) to delivery of some kind of application outcome or experience, a theme Geddes has emphasized in the past.

Cloud computing also will have a key impact. “We are seeing rapid growth of the scale, scope and value of giant cloud platforms,” he says.

“Payment for communications is going to increasingly come from the cloud providers and their customers, via wholesale mechanisms,” he says.
 
“This blows up the financial model on which investments in telecoms are presently made,” says Geddes. “The tragedy about to unfold is that telecoms business asset values price in neither the downside risks (you’re now the Uber driver), nor the upside opportunities (you’re the restaurant whose high-margin alcohol sales go up as posh people don’t need to drive home or slum it in a taxi).”
 
“If that wasn’t bad enough, the regulatory system appears determined to pretend that this all isn’t happening,” says Geddes. “We are seeing in a variety of cases where common carriage (and the circuit mentality) is being misapplied to a distributed computing system.”

“The result is that regulators are tasked with regulating an industry that is disappearing and being subsumed into another,” Geddes says.

What replaces telecoms (and cloud) is a distributed computing industry.

Saturday, January 21, 2017

New Business Models are the Biggest Issue of the New Era of Communications

New business models are at the heart of most developments in telecommunications. It is no secret that value within the internet value chain has moved to app providers and away from access, whether you measure by annual revenue, profit margins or equity values.

That automatically raises the issue of whether telcos, cable TV companies and internet service providers can create new business models. Many would argue that profit already has been vaporized in the undersea transport business, the long distance voice business and fixed-line services in general.

That same process is being seen in the mobile segment, as voice and text messaging generate less gross revenue and profit in many markets, supplanted by mobile data services that, in turn, are becoming problematic in many markets that adopted them early.

"The hyperscale guys are so much more efficient" than the access providers," argues Tim Horan, Oppenheimer analyst. "I kind of think everything goes to LTE (mobile) and cannibalizes wireline."

"Do telcos become commodity infrastructure providers?" asks Pierre de Vries, co-director of the Silicon Flatirons Center for Law, Technology, and Entrepreneurship at the University of Colorado. Basically, some might argue, that is the present danger.



If big data ownership is the oil of the coming age of the internet, then it might also be true that spectrum is the beachfront property of the communications networks that support the internet. But there remains huge disagreement about the most-fundamental aspects of the wireless and mobile networks that serve most consumers globally.

For example, there is not complete agreement about whether there is scarcity of available spectrum, and therefore, how important additional spectrum allocations might be. Demand for mobile spectrum has not been as extensive as the Federal Communications Commission predicted in 2010, for example, says Armand Musey, Summit Ridge Group founder.

To be sure, there are discrete industry viewpoints. Satellite industry supporters almost always argue there is no shortage of mobile spectrum. Virtually all supporters of the mobile industry argue much more spectrum is required.

Likewise, there remains disagreement about the various ways any existing spectrum should, or can, be allocated. Allocation of licensed spectrum (exclusive use), with or without auctions, has been the preferred method.

But now shared access (dynamic access) is emerging as a new choice, especially where the objective is to maximize the use of already-licensed spectrum, beginning with the 3.5-GHz bands in the U.S. market, and 2.3 GHz in Europe, for example.

Dynamic access is the key to alleviating spectrum shortages, particularly in the mid-bands, argues Kalpak Gude, Dynamic Spectrum Alliance president. “I’m not sure people understand the potential,” says Gude.

Using several techniques, it is possible to protect existing licensed users while allowing opportunistic use when the licensed spectrum is not actually in use, thus using existing spectrum resources more efficiently, without the expense and time required to move existing users out of their current bands. Though Musey does not believe there is a mobile spectrum scarcity, he does agree that a technology revolution is coming, in the form of shared access capabilities.
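A minimal sketch of the sharing idea (a toy model of the concept, not the actual 3.5-GHz SAS rules): a coordinator grants opportunistic access only on channels where no higher-priority user is active, and denies it where an incumbent or a priority licensee is present.

```python
# Toy three-tier sharing model: incumbents > priority licensees > opportunistic users.
# Channel states below are hypothetical.
channels = {
    "channel 1": {"incumbent_active": True,  "priority_licensee": None},
    "channel 2": {"incumbent_active": False, "priority_licensee": "Operator X"},
    "channel 3": {"incumbent_active": False, "priority_licensee": None},
}

def grant_opportunistic_access(name):
    state = channels[name]
    if state["incumbent_active"]:
        return "denied: protect the incumbent"
    if state["priority_licensee"]:
        return "denied: licensed to " + state["priority_licensee"]
    return "granted: idle channel, vacate when a higher tier returns"

for name in channels:
    print(name, "->", grant_opportunistic_access(name))
```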

Still, scarcity might matter greatly. “Without scarcity, there is no business,” says Tim Horan, Oppenheimer analyst.

At the same time, big new moves are being made to enable sharing of licensed and unlicensed spectrum, including 4G Long Term Evolution and Wi-Fi, for example, or to enable use of LTE in unlicensed bands alone. In the U.S. market, some seven gigahertz of new unlicensed spectrum is going to be made available.

And while some argue for more use of unlicensed spectrum, others believe that without exclusive-use licenses, investment will not be made. “Where is the revenue, if spectrum is unlicensed?” Musey asks. “How do you finance the construction of networks?”

Actually, it is quality of service that has to be assured if investments are made, argues Bob Pepper, Facebook global director.

As a practical matter, regulators globally believe much more mobile spectrum is required, and much more spectrum is coming. Exclusivity of rights is the issue, not so much scarcity of bandwidth, argues Pierre de Vries, co-director of the Silicon Flatirons Center for Law, Technology, and Entrepreneurship at the University of Colorado.

The issue is the right to exclude versus the right to be protected from interference, says De Vries. And under any system of licensed and unlicensed access, we still would have regulation, says Pepper. But the amount of regulation could be reduced if we relied more on technology standards to protect users from interference, says Gude.

Push on Steroids: the Next Era of the Internet

The internet has moved, over time, from push to pull, and then back to push. In the next evolution, push might be even more important.


AOL, the big U.S. ISP in the early days, largely relied on a push model, aggregating content it believed most people would be interested in. Then, with the World Wide Web, the internet moved to a “pull” model, where people knew what they wanted, and asked for it.


In the next big era, push likely will become even more powerful.


Pull is user-initiated. The best example is “search,” where a user seeks information, usually an answer to a question. Google provides the best example.


Push is internet app initiated, where an app sends you information the app believes you value, without any action on your part. Facebook provides a good example of push.


The killer app for push is the social network. Information is pushed from user to user using likes, shares or tweets. Today, people push items to each other. In the next wave, powerful artificial intelligence engines will scour vast data stores to figure out what each user likes, values and wants, and then deliver it, with no direct action on the part of any user.


In a sense, apps on the internet can initiate action and push content and items to each user, based on data mining that uses machine learning and artificial intelligence. Push and pull have analogies in marketing, and so might change or create new business models.
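The difference is easy to see in a toy sketch (hypothetical names and data, not any particular platform's API): pull does nothing until the user asks; push decides what each subscriber should see and sends it unprompted.

```python
# Toy contrast of the two interaction patterns; names and data are hypothetical.
subscribers = {"alice": {"interests": ["networking"]}}
content_feed = [
    {"title": "router teardown video", "tags": ["networking"]},
    {"title": "league highlights", "tags": ["sports"]},
]

def pull(query):
    # User-initiated: nothing happens until someone asks (search-style).
    return [c["title"] for c in content_feed if query in c["title"]]

def push():
    # App-initiated: the platform decides what each subscriber sees and sends it.
    for user, profile in subscribers.items():
        for c in content_feed:
            if set(c["tags"]) & set(profile["interests"]):
                print("notify", user + ":", c["title"])

print(pull("router"))  # the user asked for this
push()                 # the user asked for nothing; the app pushed anyway
```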


Characteristics               Pull                      Push
dominant platform             Search                    Social
dominant platform company     Google                    Facebook
growth era                    2000s                     2010s
successful content type       Utilities                 Media
marketing activity            links and algorithms      shares and people

source: cdixon.org



The Next Generation of the Internet is Coming

You might argue the internet has evolved since its inception. Originally a narrowband tool for researchers, it now is a broadband tool used widely by most consumers and businesses. Early in the development of the World Wide Web, people gained the ability to publish, says Charles Fan, Cheetah Mobile CTO. Then, with the emergence of search, we gained the ability to find the world's information, he says. The problem is that you have to know what you are looking for in order to find it. That limits how useful search can be.

In the coming wave, content relevant to a consumer will be found and then "pushed" to each user. We already see glimmers of that in the high use of social apps, where people now find "information" useful and relevant to them. Also, "news" is redefined less as what is happening in the broader world, and more what is happening with your friends, family and social circles.

To a large extent, that means we are using an algorithm that essentially assumes "what is interesting to your friends is interesting to you." That is correct, up to a point. The next wave will involve use of artificial intelligence, coupled with big data stores, to actually predict what you like.

In the next generation of the internet, machine learning will be better than knowing your social graph, as a way of connecting you with things you are interested in. That AI-driven model might also lead to creation of huge new business models to replace existing and older models (advertising, e-commerce or peer trading mechanisms like Uber), says Fan.
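A stripped-down illustration of that shift (entirely hypothetical data, not a production recommender): the social-graph heuristic scores an item by how many friends engaged with it, while a learned model scores it against a preference profile fitted from the user's own behavior.

```python
# Hypothetical items, friend activity and learned preferences; a toy contrast only.
items = {
    "spectrum auction analysis": {"telecom": 0.9, "celebrity": 0.0},
    "celebrity gossip piece":    {"telecom": 0.0, "celebrity": 0.9},
}
friend_likes = {"spectrum auction analysis": 1, "celebrity gossip piece": 8}
learned_preferences = {"telecom": 0.95, "celebrity": 0.05}  # fitted from this user's history

def social_graph_score(item):
    # "What is interesting to your friends is interesting to you."
    return friend_likes.get(item, 0)

def learned_score(item):
    # Score the item against the user's own learned preference profile.
    return sum(learned_preferences.get(k, 0) * w for k, w in items[item].items())

print(max(items, key=social_graph_score))  # the gossip piece: friends liked it
print(max(items, key=learned_score))       # the auction analysis: this user would
```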

That reliance on big data stores might have huge implications. On one hand, the algorithms will have huge amounts of new data to work with. On the other hand, algorithms will be commoditized, democratized or “made somewhat obsolete.” In a world where everyone has access to good algorithms, value will shift to access to huge data stores.

“Who has the better quality of data wins,” Fan says. “How do you get better at collecting data?” Fan asks. “Big data ownership is going to be the oil of the modern age.” But that will require artificial intelligence or machine learning.

For most of us, artificial intelligence (AI) has been a science project for the past few decades: interesting and provocative, but not something that actually affects the businesses most of us deal with on a daily or even annual basis.

There are reasons to believe that is changing. Charles Fan, Cheetah Mobile CTO, said it might seem odd for a mobile app and tools company such as Cheetah Mobile to be seriously evaluating AI. Actually, it turns out to be most practical, as Cheetah Mobile launches new applications in the news aggregation area.

Eventually, the ability to personalize and then predict what a particular person might like will require AI to mine and then predict and deliver “suggested items” to individual people.

Your social profile helps, but only so much. Content providers or advertisers can assume you are somewhat like your “friends” in terms of interests. But that is correct only up to a point. Each individual actually is quite different, at a more granular level. And it will take AI to rapidly process all the data used to assemble a highly personalized set of content and then match people with highly targeted offers and ads.

Incentive Auction a "Massive Disappointment?"

One conclusion to draw from the Federal Communications Commission’s 600-MHz two-stage “incentive auction” is that low-band mobile spectrum, in the U.S. market, is worth less than has long been argued.

In terms of revenue raised by the new way of auctioning spectrum, some might call the outcome a massive disappointment.

In January 2016, FCC Chairman Tom Wheeler claimed the auction would be the “world's largest spectrum auction that has ever taken place.” Not so, as it turns out. In fact, the auction likely will raise less than half of the $44.9 billion raised by auctioning AWS-3 spectrum.

By way of comparison, the AWS-3 spectrum (in the 1.7 GHz and 2.1 GHz ranges) raised $44.9 billion for 65 MHz of total spectrum.

The 600-MHz auction will raise perhaps $18 billion for 84 MHz of spectrum, in sharp contrast to the FCC’s initial expectation of at least $60 billion in proceeds. Broadcast spectrum owners had expected as much as $86 billion in sales.
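Comparing the two auctions on the crude metric of dollars raised per megahertz of spectrum sold (a rough check using the figures above, ignoring differences in band quality, license structure and coverage):

```python
aws3_proceeds, aws3_mhz = 44.9, 65            # $ billions and MHz, AWS-3 (2015)
incentive_proceeds, incentive_mhz = 18.0, 84  # $ billions and MHz, 600 MHz (estimated)

print(round(aws3_proceeds / aws3_mhz, 2))            # about 0.69 ($ billions per MHz)
print(round(incentive_proceeds / incentive_mhz, 2))  # about 0.21 ($ billions per MHz)
# Roughly a third of the AWS-3 price per megahertz, for lower-frequency
# "coverage" spectrum that long was assumed to command a premium.
```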

Arguably, AT&T and Verizon believe they have other ways to satisfy any “coverage” capacity they might require in the future, while the “reserve spectrum” conditions that barred both from bidding on much of the spectrum (about a third) might have contributed to the lack of enthusiasm as well.

Some argued that the "reserve spectrum" feature, as always, produces distortions in spectrum markets. Among those distortions are price and demand impacts.

On the other hand, some bidders, such as T-Mobile US, will pay less for their spectrum, and likely will get some of that spectrum simply because competitors were forbidden to bid on it.

There are a huge number of ways mobile operators will be increasing capacity in coming years. There is an astounding amount of new millimeter wave spectrum coming that dwarfs all current authorizations of communications spectrum.

Offload of demand to Wi-Fi also will help. So will the small cell networks operators will build to deploy the new millimeter wave assets. As much as seven gigahertz of new unlicensed spectrum also will be released by the FCC, allowing use without payment of license fees.

At the same time, better radios also will help boost capacity, using any specific set of frequencies. And some potential bidders, such as cable companies, also can buy existing companies and thereby acquire their spectrum assets.

We will have to wait to see whether other large auctions of new spectrum (India and Egypt also have found far less demand than anticipated) continue to show lower prices.

But it seems markets are rational. With vast increases in supply now possible, and more supply coming, prices should fall.

Yes, Follow the Data. Even if it Does Not Fit Your Agenda

When people argue we need to “follow the science” that should be true in all cases, not only in cases where the data fits one’s political pr...