Wednesday, May 30, 2018

Colocation, Data Center Market Still Fragmented

Equinix, Digital Realty and NTT are market share leaders in the colocation market, according to Synergy Research. But the market arguably remains fragmented, globally.

The three now control nearly 28 percent of the worldwide market and all have grown their market share over the last four quarters, both organically and through acquisitions.

Equinix had a 13 percent share of the overall market in the first quarter of 2018, and a 17 percent share of retail colocation, while Digital Realty had a 28 percent share of the wholesale (business-to-business colocation) segment.

Equinix sells more to enterprises that need private cloud computing facilities.

source: Synergy Research

Tuesday, May 29, 2018

Will 5G Capex Be Less, the Same, or More than 4G?

Many critics of 5G argue that it will cost mobile operators too much to build the networks or that incremental new revenue will be insufficient to support the networks, or both. The argument often is accompanied by the observation that some other approach (platform, network or business model) would work better.

But the cost of building mobile networks is not growing. Mobile operator capital investment has been flat even as capacity supplied has grown as much as seven times over the last four years, argue analysts at Rewheel.

To be sure, the analysis requires careful thinking.

“Our calculations show that the doomsayers will again be proven wrong,” Rewheel says. “Mobile network capex will stay flat the next five years with the help of 3.4-3.8 GHz spectrum and massive MIMO even if data traffic grows another 10-fold (from 20 GB in 2016 to 200 GB per unique user per month in 2021) as forecasted by Finnish operators.”

Rewheel also notes that “the annual cost of expanding a 4G network’s aggregate capacity by a gigabit per second is as low as a few hundred thousand euros, which is roughly the equivalent of €0.1 per GB, a near-zero marginal mobile data cost if one considers that consumers are paying a few hundred EUR per year for their smartphone or mobile broadband plans.”
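As a rough sanity check on that figure, the arithmetic can be reproduced with illustrative numbers (the cost, capacity and utilization inputs below are assumptions for the sketch, not Rewheel’s actual inputs):

```python
# Rough reproduction of the marginal-cost-per-GB arithmetic.
# All inputs are illustrative assumptions, not figures from the report.
annual_cost_eur = 300_000  # annual cost of adding 1 Gbps of aggregate capacity
capacity_gbps = 1.0        # added capacity
utilization = 0.75         # assumed average utilization of that capacity

seconds_per_year = 365 * 24 * 3600
# gigabytes delivered per year by the added capacity
gb_delivered = capacity_gbps / 8 * seconds_per_year * utilization
cost_per_gb = annual_cost_eur / gb_delivered
print(f"{gb_delivered:,.0f} GB/year, about €{cost_per_gb:.2f} per GB")
```

At those assumed inputs the result lands right around the €0.1 per GB Rewheel cites; lower utilization or higher annual cost would push it up proportionally.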

The key word in that statement is “marginal” cost, however. Once built, a modern mobile network can indeed add a unit of supply affordably. But that does not speak to the cost of recovering the sunk investment in the full network; only the cost of the incremental supply.

The trick is that “flat capex” has to include both marginal increases in capacity and the full recovery of the cost of building the network in the first place.
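A toy comparison, using entirely hypothetical network economics, shows how far the average cost per gigabyte (which must recover sunk capex plus opex) can sit above the marginal figure:

```python
# Marginal vs. average cost per GB. Every number here is hypothetical,
# chosen only to illustrate the gap the text describes.
network_capex = 2_000_000_000      # sunk cost of building the network (EUR)
recovery_years = 10                # straight-line cost-recovery period
annual_opex = 300_000_000          # yearly operating cost (EUR)
annual_traffic_gb = 3_000_000_000  # GB carried per year
marginal_cost_per_gb = 0.10        # incremental-capacity cost, per the text

annual_full_cost = network_capex / recovery_years + annual_opex
average_cost_per_gb = annual_full_cost / annual_traffic_gb
print(f"average €{average_cost_per_gb:.2f}/GB vs marginal €{marginal_cost_per_gb:.2f}/GB")
```

The point of the sketch: even generous traffic assumptions leave the full-recovery average well above the marginal cost of adding one more unit of supply.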


Can Mobile-Only Succeed? If So, Where and Why?

Can a “mobile-only” business strategy succeed when markets reach saturation? In other words, once every potential customer already buys the service, can some combination of higher usage, new products or vertical integration offset pricing and profit pressures in zero-sum markets where market share gains can only come at the expense of other mobile operators?

It is, at present, an open question. Optimists argue that mobile operators will grow revenue from new customers, new applications, new roles in the ecosystem and even cannibalization of fixed network market share.

Pessimists likely will argue that while this is a possibility for the best-capitalized and largest providers in some markets, in most markets opportunities to grow roles, value, revenues and revenue sources are quite limited.

Still, there are some examples of mobile-only financial performance beating the financial performance of converged suppliers who own both mobile and fixed assets. In the quite-mature European markets, for example, Rewheel analysts say mobile-only providers have outperformed converged suppliers.


Some caveats are in order. Generally speaking, mobile-only attackers have no legacy to protect, and the converged operators include both the former incumbent fixed operators and some attackers who have chosen to be in both mobile and fixed markets. The point is that attackers have more choices.

Also, there are structural issues. Fixed network suppliers have lost share in voice, while mobile operators have gained share. Mobile operators have been the exclusive providers of messaging. Now something similar is happening in the internet access market, as mobile alternatives are starting to compete with fixed networks.

When utilization of any network drops--when there are fewer paying customers--per-customer costs obviously grow. Conversely, on any network with higher utilization--more paying customers--such costs drop.

In other words, all other things being equal (they are not equal), any network that earns revenue from only half of the locations it passes, for example, will have per-customer costs that are literally double the per-passing figure.
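That take-rate arithmetic is simple enough to sketch (the per-passing cost below is a made-up figure):

```python
# Per-passing vs. per-customer cost at different take rates.
# The cost per passing is hypothetical; only the ratio matters.
cost_per_passing = 1_000  # cost to pass one location

for take_rate in (1.0, 0.5, 0.25):
    cost_per_customer = cost_per_passing / take_rate
    print(f"take rate {take_rate:.0%}: {cost_per_customer:,.0f} per customer")
```

At a 50 percent take rate the per-customer cost is exactly double the per-passing cost, and it doubles again at 25 percent.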

Selling more products to a smaller number of customers helps, since revenue per account is higher. That is why triple-play bundles are so popular.

The point is that we need to be careful when analyzing strategies. There are multiple reasons why particular suppliers, in particular markets, might achieve higher or lower revenue, revenue growth, profitability and cost-per-customer metrics.

Lower-cost platforms, as a rule, will help, and mobile networks are lower cost than fixed networks. Fixed networks, on the other hand, have had strategic advantages where it comes to the types of products that can be sold. Video entertainment is bandwidth-intensive, so in markets where linear or on-demand video is popular, fixed networks have had a price advantage.

That also has been true for internet access charges. Where demand is heavy, fixed networks have a cost of supply advantage. Generally speaking, older mobile networks have featured costs per gigabyte as much as an order of magnitude higher than fixed networks. That explains the widespread use of offload to Wi-Fi.


For most fixed networks, the key problem is product substitution. In the past, voice revenues have driven financial results, and customers have fled for mobile alternatives. Fixed internet access and video services have helped compensate for voice losses.

The problem is that growth is tough when network utilization keeps falling. That is why some have called the fixed network business one of terminal decline.

At the same time, fixed network suppliers face not only mobile substitution, but also facilities-based new fixed network competitors in some markets. So revenue growth and profit now are bigger issues in some markets.  

In other words, fixed networks now are in the midst of a key business model change, where customers, products and roles are evolving. It is conceivable that capex and opex profiles have to adjust to lower potential revenues, no matter how customer sets and perceived value change. Some argue that fixed network backhaul will drive more value in the 5G era.

It might be tempting to argue that mobile-only business models still make more sense than integrated models combining ownership of both fixed and mobile assets. But many argue that converged business models actually have performed better, financially, than mobile-only operations, across much of Asia, for example.

The issue is the growing cost of supplying incremental bandwidth. To be sure, there are three basic positions on this matter. Some believe costs will grow unmanageably. Others believe costs can be managed to match demand. Yet others believe costs will drop.

The outcome likely will be different in various markets, and by firms within those markets. Low-cost or lower-cost providers will generally win, all other things being equal. Of course, all things rarely are truly equal.

App Providers Dominate Global "Brand Value" Rankings

With the caveat that evaluating the value of a brand hinges on valuation assumptions and methodology, as well as one’s estimation of the value of brands in a market where consumers arguably rely less on brands to drive purchasing decisions, the latest BrandZ study illustrates the strength and growth of tier-one app provider brands. It also highlights the challenges facing telecom service providers, as both industries arguably are becoming parts of one broader category.

The biggest takeaway is that although one telecom service provider appears in the top ten of “most valuable” global brands, the top six spots are all app providers. Significantly, app provider brand value is growing at rates from 23 percent to 92 percent, while AT&T, the lone telecom service provider to make the top ten, shrank seven percent over the last year.

Orange led the telecom group with a 14-percent annual gain in brand value, NTT added 10 points, and the Comcast Xfinity, Deutsche Telekom and Movistar brands gained between one percent and four percent. All other telecom service providers on the list saw brand value decline over the last year.

So while the technology segment grew value 23 percent last year, the telecom service provider segment grew only about two percent.

Assuming the researchers still are measuring what matters, the latest BrandZ report suggests that value in the internet ecosystem is shifting from access to applications, platforms and devices.

It is fair to say that change is happening, as well. While most marketers would likely argue that brand reputation does matter, recent studies suggest that perhaps half of Millennials, the most-important buying cohort, have no use for brands.

Others might argue that Millennials still have brand preferences, but that their criteria are different.   

Amazon is the most valuable retailer brand on the planet, according to BrandZ.  

Sunday, May 27, 2018

Spectrum Supply and Demand are About to Go "Unstable"

It always is possible to get a robust debate about "whether enough spectrum is available." It might soon be possible to get a robust debate on whether spectrum prices will drop, based on increases in supply.

On one hand, demand keeps growing, so even if orders of magnitude new supply are added, supply and demand should remain in equilibrium, where prices are stable and supply matches demand, as economists like to say. 

On the other hand, many of you might look at your own experience in the communications business and not agree that the business is in equilibrium, and that applies to the value and price of acquired spectrum, as well as expectations about value and price as markets evolve. 

For example, though it is hard to place a financial value on Wi-Fi access for mobile operators, businesses or consumer end users, the ability to offload huge amounts of mobile internet access demand to Wi-Fi has clear value for network operators, who do not have to invest as much in capacity as they otherwise would.

In the U.S. mobile market, various participants also see multiple ways to increase the effective amount of capacity they can use. Buying new spectrum licenses is but one way to do so. In other cases a shift to small cells is seen as a reasonable alternative to acquiring new spectrum. That is basically Verizon's stated position.

In other cases, firms can buy companies that own licenses, as both Verizon and AT&T have done, and others might do. Dish Network has a trove of spectrum which must be put to commercial use or the licenses are lost. So many believe Dish ultimately will sell the licenses to a firm that can do so, fast. 

Mobile service provider mergers or acquisitions are another way to acquire additional spectrum.

But new techniques are coming, including the ability to aggregate unlicensed spectrum with licensed spectrum; access to shared spectrum that might cost less, or be accessible in unlicensed mode; plus huge increases in the amount of licensed and unlicensed spectrum available for mobile and other uses. 

So a good argument can be made that spectrum equilibrium is less likely. 

Recent spectrum auctions have diverged from expected sales values. In the past, mobile operators also have paid too much for spectrum.  

Recent U.S. spectrum auctions show mobile service providers being much more cautious about what they are willing to spend on buying spectrum licenses. The same trend was evident in recent spectrum auctions in India as well.  

In part, that is likely due to a perception that there are other ways of sourcing additional capacity, from aggregating unlicensed spectrum to use of smaller cells to shared spectrum or acquiring assets already awarded, but not yet in use. In some markets, spectrum trading also is a solution.  

But it also is possible that the perceived value of spectrum--still high--also has to match with expectations about the amount of revenue incremental spectrum can generate. If operators believe 100 new units will not drive the same amount of revenue as in the past, then their willingness to invest in spectrum will be less, on a per-unit basis.

Also, coming physical supply is disruptive, to say the least. All presently-licensed mobile spectrum, plus all Wi-Fi spectrum, plus new shared spectrum, amounts to about 2,600 MHz in the U.S. market; the mobile and Wi-Fi spectrum actually in use today is closer to 800 MHz to 1,000 MHz.

But the Federal Communications Commission is releasing an order of magnitude more physical spectrum, much of it unlicensed. That, combined with possibly two orders of magnitude of virtual capacity increases, plus spectrum sharing, small cells and better radios, is bound to be disruptive.
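Taking the figures above at face value, and assuming a tenfold reuse gain from small cells, sharing and better radios (an illustrative multiplier, not a measured one), the back-of-envelope looks like this:

```python
# Back-of-envelope effective-capacity change from the figures in the text.
# The coming-spectrum and reuse-gain multipliers are illustrative assumptions.
current_mhz = 900   # midpoint of the 800 MHz to 1,000 MHz in use today
coming_mhz = 9_000  # "order of magnitude more physical spectrum" (assumed 10x)
reuse_gain = 10     # assumed multiplier from small cells, sharing, better radios

physical_increase = coming_mhz / current_mhz
effective_increase = physical_increase * reuse_gain
print(f"physical supply: {physical_increase:.0f}x, effective capacity: {effective_increase:.0f}x")
```

Under those assumptions, a tenfold physical increase compounds to roughly two orders of magnitude of effective capacity, which is the disruption the text describes.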

Supply and demand is at work, in other words. And if supply increases by an order of magnitude, prices seem bound to fall.

So how much will 5G change service provider spectrum valuation and asset models? Quite a lot. In fact, consultants at Deloitte say “5G changes everything.” That might be a bit of hyperbole, but the point is that there is greater uncertainty, for several reasons.

For starters, it is an underappreciated fact that the value of spectrum licenses is part of the equity value of any public mobile service provider company.

Spectrum licenses account for “an average 35 percent of the assets of US WSPs (wireless service providers), and close to 20 percent of WSPs elsewhere,” according to consultants at Deloitte.

But present valuations are assigned at original purchase value, and therefore might actually be different in an era of growing spectrum need and supply. At one level, the potential mismatch is easy to illustrate.

The value of assets for which an operator overpaid represents more value than similar assets for which an operator paid less, even if the assets acquired at lower cost might be equally, or more, valuable. So accounting “fiction” is at work.

Still, historically, rights to use mobile spectrum have been fundamental drivers of the ability to be in the business and earn revenue. But there are new questions in the 5G and coming eras, as the supply of spectrum (physical and virtual) is changing by orders of magnitude.

And how does one account for the value of being able to offload traffic to Wi-Fi? That avoided capital investment is worth something, but how much? And even if valuable, can it be reflected in an assessment of equity value?

Scarcity also matters. Historically, mobile spectrum has had value in two or more ways. It has been the necessary precondition for conducting business and satisfying demand. But it also has been a means of denying competitors access.

Licensed spectrum has been a driver of scarcity, and therefore equity value.

Deloitte argues that spectrum is presently undervalued. On the other hand, one might argue that so much new spectrum is coming, and the ways to use unlicensed spectrum also are multiplying, that old rules of thumb about value and pricing do not work so predictably.

Cable operators, for example, clearly see lots of value in using their distributed public Wi-Fi nodes as infrastructure for their new mobile services. The “Wi-Fi first” access model does reduce either capex or wholesale capacity purchases or both.

And the amount of spectrum a mobile operator can deploy is linked to the amount of revenue it earns, though the relationship is neither linear nor strictly causal: mobile operators can increase capacity in other ways, and each contestant has other assets to deploy (capital, brand, scale).

In each market, some operators earn more revenue than others, for reasons including, but not limited to, the amount of spectrum they can deploy.

The point is that it is not clear whether spectrum presently is undervalued or not. The harder question is how to value such assets in the future, when the amount of supply--ignoring quality issues--is going to increase by an order of magnitude, and the effective capacity is going to increase by possibly two orders of magnitude.

Qualitative changes also will matter. Most internet of things apps will not require much bandwidth. And much bandwidth presently consumed across the backbone might in the future be cached and processed at the edge of the network. That will shift the bandwidth demand curve in significant ways.

On the other hand, if mobile networks are to challenge fixed networks as platforms for consumer internet access, then lots of cheap new bandwidth will be necessary, so mobile alternatives can offer comparable bandwidth and prices. Lower bandwidth costs are coming, in the mobile area, driven by platform improvements, more and more-efficient spectrum assets, use of small cells and shared, unlicensed and aggregated spectrum options.  

If mobile bandwidth traditionally has been an order of magnitude more expensive than fixed network bandwidth, then it is obvious that, to compete, mobile bandwidth has to be as capacious and affordable as fixed network bandwidth.


Up to this point, mobile cost per gigabyte has been as much as an order of magnitude more costly than fixed network cost per gigabyte. That is going to change.

Friday, May 25, 2018

No Demand for Gigabit?

Among the most-dangerous of statements is that something cannot be done, violates the laws of physics, costs too much, is too difficult to manage, or is not wanted by consumers or other users.

The reason such statements can be quite dangerous is that they are sometimes spectacularly wrong, in ways that dramatically affect whole industries.

I can remember being at a meeting at the headquarters of the National Cable Television Association, in the earlier days of high definition television discussions, where it was proposed that a full HDTV signal could be squeezed from about 45 Mbps of raw bandwidth to the 6-MHz channelization used by the North American television industry.

The room essentially exploded, as the attendees, mostly vice presidents of engineering from the largest cable TV and broadcast firms, disagreed with the sheer physics of the proposal. Later, the executive who suggested HDTV in 6 MHz was indeed possible talked with his firm’s engineering vice president about the science, to reaffirm that such a thing actually could be done. “Are you sure about this?” was the question, given the magnitude of the opposition.

To make a longer story short, it did prove feasible to compress a full HDTV signal into just 6 MHz of bandwidth, making for a much-easier financial transition to full HDTV broadcasting, as well as an ability for cable TV operators to support the new format.
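The arithmetic behind the skepticism is easy to see. A 6 MHz broadcast channel using ATSC 8-VSB modulation carries about 19.4 Mbps of payload, and a 6 MHz cable channel using 256-QAM about 38.8 Mbps, so the cited 45 Mbps signal needed roughly 2:1 further compression, or better (rates approximate):

```python
# How much further compression a 45 Mbps HDTV signal needed to fit a
# 6 MHz channel, using approximate per-channel payload rates.
hdtv_raw_mbps = 45.0        # raw HDTV rate cited in the text
atsc_channel_mbps = 19.39   # ~payload of a 6 MHz channel with 8-VSB (broadcast)
qam256_channel_mbps = 38.8  # ~payload of a 6 MHz channel with 256-QAM (cable)

broadcast_ratio = hdtv_raw_mbps / atsc_channel_mbps
cable_ratio = hdtv_raw_mbps / qam256_channel_mbps
print(f"broadcast needs {broadcast_ratio:.1f}:1, cable needs {cable_ratio:.1f}:1")
```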

Similarly, when the U.S. cable TV industry began to ask for analog optical transmission systems capable of carrying 20 channels of standard definition video without complicated channel-by-channel coding and decoding, a distinguished engineer from Bell Laboratories privately assured me that such a thing was in fact not possible, and that people who claimed it was possible were simply wrong.

To make a longer story short, it did indeed prove possible to take a full complement of analog video signals (40 channels, as it turned out), convert the full set of broadband signals to analog optical format, and deliver them over distances useful for cable TV purposes.

On another occasion, the vice president of one of the world’s biggest suppliers of equipment said privately that “digital subscriber line does not work” as a platform for high speed Internet access, even at relatively low speeds. Ultimately, that also proved incorrect. Over time, DSL performance was not only proven to be commercially viable, but also delivered much-faster speeds, over longer distances, as experience was gained.

The point is that when a smart, experienced, thoroughly-knowledgeable executive says that something “cannot be done,” one has to translate. What the statement means is only that, at a given point in time, before the application of effort and ingenuity, a given entity has not been able to do something.

That does not actually mean something literally “cannot be done.” Quite often, formerly impossible things actually are made possible, after dedicated investigation and development.

That applies to consumer demand for internet access, as well. It might well have been true two decades ago, or a decade ago, that there was no appreciable consumer demand for gigabit internet access, if such services could have been provided, and at retail prices those services would have cost, back then.

In 1975, when Microsoft was founded, the computing power of an Apple iPad 2 would have cost between US$100 million and $10 billion, in constant dollars.

But when gigabit internet access is priced at a relatively modest premium to the standard offers (gigabit for $100, standard 200 Mbps for possibly $50 to $70), then there is much more demand.

“There is no demand for gigabit internet access” is a conditional statement. At some point, there will be lots of demand, on both mobile and fixed networks. All that has to happen is that price changes.

Comcast, for example, has increased the highest offered internet access speed at nearly Moore's Law rates. So the unstated qualifiers “at this price, at this time,” have to be kept in mind.
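To illustrate the “at this price, at this time” point: if top advertised speeds double roughly every 18 months, a Moore’s Law-like rate, the climb from a hypothetical 25 Mbps top tier to a gigabit takes surprisingly few doublings (both the starting tier and the doubling period below are assumptions):

```python
import math

# Years to reach gigabit speeds under an assumed doubling rate.
start_mbps = 25       # hypothetical top consumer tier at the start
target_mbps = 1_000   # gigabit
doubling_months = 18  # assumed Moore's Law-like doubling period

doublings = math.log2(target_mbps / start_mbps)
years = doublings * doubling_months / 12
print(f"{doublings:.1f} doublings, about {years:.0f} years")
```

A different starting tier or doubling period shifts the answer, but the logarithmic point stands: sustained doubling makes gigabit tiers arrive within a decade or so, not generations.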

There will be a time, and a price, that has many, if not most, consumers buying gigabit rate internet access.
