Saturday, September 29, 2018

Could Edge Computing Change Smartphone Design and Cost?

Edge computing is almost always touted as a necessity in the 5G era to support ultra-low-latency services. The typical examples are autonomous vehicles, remote surgery or even prosaic requirements such as channel changes on screens displaying ultra-high-definition TV (4K, 8K, virtual reality).

But are there other possibilities? Consider the advent of the Chromebook, a “PC” that essentially conducts all computing activities at a remote cloud data center. The advantage is lower-cost customer premises equipment (CPE).

Sure, one needs a screen, power supply, keyboard and some amount of on-board memory and processing. But not so much. It often is said, with a good measure of truth, that a Chromebook is a device supporting a browser, and not much more.

So can edge computing support a similar approach to the design of smartphones, essentially creating a device that resembles earlier efforts to create network-centric computing devices? Maybe, some think.

Could edge computing create new opportunities for access providers supplying phone services? AT&T believes that could happen.

AT&T plans to build thousands of small edge computing data centers in central offices and other locations across the United States. So could a big edge computing network affect mobile phone design as much as cloud computing has affected the design and use of computing devices? AT&T VP Mazin Gilbert thinks that is a possibility.

Edge computing could create the conditions for really cheap smartphones. “Can my $1,000 mobile phone be $10 or $20, where all the intelligence is really sitting at the edge?” Gilbert asks. “It’s absolutely possible.”

That obviously would dramatically reduce barriers to smartphone use by everyone, while providing some means of differentiation for access services provided by AT&T. Both trends would provide more reasons for consumers or businesses to use the AT&T network, instead of rival networks.

It has been decades since tier-one telcos had a significant role in the customer premises equipment business. Back in the monopoly era, telcos actually made and sold the phone devices people used. In fact, it was illegal to use any phone not manufactured by the service provider.

In the competitive era, service providers have been irrelevant as suppliers of CPE, as that role was ceded to device suppliers active in the consumer electronics space.

Edge computing could change those assumptions. Perhaps a firm such as AT&T licenses the building of cheap smartphones that rely extensively on edge computing and are designed to work on AT&T’s network.

As always, that approach would start out as “useful for many people,” not a “full and complete substitute” for standard smartphones able to work globally. But not every customer requires global roaming. For most customers, coverage in most places in the United States will suffice.

As any Chromebook user will attest, the “connect to Wi-Fi or you cannot do much” approach is not perfect. One cannot “compute” anywhere (beyond offline transactions or activities). But it works, especially if one can tether to a smartphone.

Something like that could be possible once edge computing is fully built out.

U.S. Device Adoption is Near Saturation

Use of communications-dependent devices obviously has direct implications for communications service demand. So it matters that U.S. consumers now are reaching--or already have reached--saturation levels of device use.

Not to belabor the point, but device and account saturation strongly suggests that demand for new services and apps has to be created, beyond current levels of functionality for devices and connections.

That is one reason why many believe 5G is going to be different from all prior generations of mobile platforms. It will be the first platform where brand-new value, and therefore new revenue opportunities, will be created by enterprises. Consumer demand for phone functions and connecting other devices is fairly well saturated.

source: Pew Research Center

Thursday, September 27, 2018

Why Nobody Releases Gigabit Take Rates, Yet

Not one U.S. internet service provider publicizes the take rates it gets for gigabit internet access. Historically, no ISPs have done so for their fastest tiers of service, either. The reason, as you might suspect, is that take rates for such tiers of service are highly likely to be rather modest, and that such tiers tend to be purchased by businesses rather than consumers.

Eventually that could change, but only when gigabit access becomes the mid-tier offer.

Back in the days when cable TV operators first rolled out consumer internet access at speeds of 100 Mbps, it was virtually impossible to get subscriber numbers from any of the providers, largely because take rates were low.

In the United Kingdom, which was then planning to upgrade consumer internet access speeds to “superfast” 30 Mbps, officials complained about low demand. In fact, demand for 40 Mbps service was less than expected.

So “gigabit” internet access remains mostly a marketing platform, not an indicator of what services people actually buy, when they have access to gigabit services.

Value versus price is the likely reason for that consumer behavior. Value (performance for the price) seems to be judged best in the mid ranges of internet access service, not the “fastest” grades of service. Nor is that an unusual situation for most product categories.

In Australia, in 2016, for example, perhaps 15 percent of consumers purchased the then-fastest speed tier of 100 Mbps. Some 47 percent bought the mid-range service at 25 Mbps. Some 33 percent of buyers were content with service at the slowest speed of 12 Mbps.
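To make the skew concrete, here is a back-of-the-envelope calculation of the share-weighted average purchased speed (a minimal sketch; renormalizing the published shares, which sum to roughly 95 percent, is an assumption made only for illustration):

```python
# Share-weighted average purchased speed for the 2016 Australian tiers cited
# above. The published shares sum to ~95 percent; they are renormalized here
# so the average reflects only the three listed tiers (an illustrative
# assumption, not part of the source data).

tiers_mbps_share = {100: 0.15, 25: 0.47, 12: 0.33}  # speed (Mbps) -> share

total_share = sum(tiers_mbps_share.values())
weighted_avg = sum(s * w for s, w in tiers_mbps_share.items()) / total_share
print(f"average purchased speed: ~{weighted_avg:.0f} Mbps")  # ~32 Mbps
```

The average lands near the 25-Mbps mid tier, not the 100-Mbps headline tier.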

Likewise, even where fiber-to-home connections are available, that does not mean most consumers will buy such service, if other options also are available. Data from New Zealand suggests take rates might be 33 percent where FTTH is sold.

Price has much to do with those choices, as do perceptions of value. The safest assumption is that multi-user households are most likely to buy faster tiers of service, reasoning that the connection bandwidth has to be shared by all members of the household.
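A rough sizing sketch illustrates that multi-user logic; the per-activity rates and headroom factor below are illustrative assumptions (roughly 5 Mbps for an HD stream, 25 Mbps for 4K), not measured figures:

```python
# Back-of-the-envelope peak-demand sizing for a household, illustrating why
# multi-user households skew toward faster tiers. All per-activity rates and
# the 25 percent headroom factor are illustrative assumptions.

activity_mbps = {"4k_stream": 25, "hd_stream": 5, "video_call": 3, "gaming": 3}

def peak_demand_mbps(concurrent_activities):
    """Sum the concurrent activity rates, then add 25 percent headroom."""
    base = sum(activity_mbps[a] for a in concurrent_activities)
    return base * 1.25

print(f"single user: ~{peak_demand_mbps(['hd_stream']):.0f} Mbps")      # ~6
four_user = ["4k_stream", "hd_stream", "video_call", "gaming"]
print(f"four-user household: ~{peak_demand_mbps(four_user):.0f} Mbps")  # ~45
```

On those assumptions, a single user is comfortable on a slow tier, while a four-person household starts to justify a mid-tier purchase.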

Also, since purchases of internet access generally correlate directly with income, we should not be surprised if cost-conscious consumers opt for less-expensive packages, while higher-income consumers are most likely to buy the most-expensive packages, which also are the fastest.

The takeaway is that most consumers buy the mid-tier offers. According to Federal Communications Commission data, in 2015 the most popular speed plans purchased from cable providers clustered around 100 Mbps.

AT&T U-verse plans generally were in the 45 Mbps range in 2015, while DSL speeds (all-copper access) were quite low, in comparison. Verizon FiOS speeds were generally in the 80-Mbps range.

Over time, as speeds increase, consumers have tended to keep upgrading, but they generally have bought the mid-tier services. That is what AT&T has found as it increases the top speeds available; CenturyLink also found that to be the case.

In 2010, for example, about 40 percent of U.S. consumers were buying Internet access at about 6 Mbps. You might wonder why, but the answer is simple. In 2010, the 6-Mbps service offered what consumers then considered the best value for the money paid.

Wednesday, September 26, 2018

U.S. Internet Access is Not "Expensive"

One always can get a good argument about whether internet access markets in the United States are getting less competitive or more competitive. What often gets lost in such discussions are facts. Everyone is entitled to an opinion, but not to their own facts.

And there are several ways to look at internet access services, in the United States or anywhere else. For starters, there is a difference between mobile access and fixed network access. Most studies of internet access globally tend to focus on “fixed network” access, even when, in many markets, most people only use mobile internet access.

Availability is one important metric: can consumers buy service? Take rates are a different matter. Even where available, not every consumer wants to buy a fixed network service. Nor do consumers tend to buy the fastest service available. Instead, they compare value with price, and almost always buy services that are “good enough,” and neither the fastest nor slowest options available.

Speeds also vary from country to country, and within countries (urban, rural), and by provider (telco, cable, satellite, fixed wireless, mobile). We always can argue about what speeds are “good enough.”


Finally, there is the matter of price. Many only look at price in absolute terms, not relative terms. In other words, they look at total price, not the price as a percentage of buyer income. That matters whenever one is making international comparisons.

To state what should always be obvious, prices are higher in more developed economies, and that applies to internet access prices as well. Consider both fixed and mobile broadband.

Fixed network internet access prices in developed nations--measured as a percentage of gross national income per capita--are quite low, less than one percent.

Prices for mobile internet access, as a percentage of gross national income, are even lower. The point is that U.S. internet access prices, as a percentage of household income or per-capita gross domestic product, are quite low by global standards.

In other words, U.S. internet access is not expensive.
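A minimal sketch of that relative-price arithmetic, using round, purely illustrative prices and income figures (not measured data):

```python
# Monthly internet access price expressed as a share of gross national
# income per capita. Both the prices and the GNI figures below are round,
# illustrative assumptions used only to show the comparison method.

def price_as_pct_of_gni(monthly_price_usd, gni_per_capita_usd):
    """Annualized access spend as a percentage of GNI per capita."""
    return 12 * monthly_price_usd / gni_per_capita_usd * 100

examples = {
    "high-income economy":   (60, 60_000),  # $60/month, $60,000 GNI per capita
    "middle-income economy": (25, 8_000),   # $25/month, $8,000 GNI per capita
}

for label, (price, gni) in examples.items():
    print(f"{label}: {price_as_pct_of_gni(price, gni):.1f}% of GNI per capita")
```

The absolute price is higher in the high-income economy, but the relative burden (1.2 percent versus 3.8 percent of income, on these assumed numbers) is far lower, which is the point of the percentage-of-income comparison.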

Tuesday, September 25, 2018

Cable TV Operators Gradually Start to Compete with Each Other

Historically, cable TV companies have not competed directly with each other in the same geographic areas. That is changing a bit, though. In the United Kingdom, if Comcast completes its purchase of Sky, Sky and Liberty Global (Virgin) will compete head to head for the first time in the U.K. market.

That is something that has happened in telecom markets, both mobile and fixed, and some have wondered how long it would be until cable companies began to compete in such a manner as well. We appear to be one step closer, in the U.K. market.

In the U.S. market, such head-to-head competition is more likely to come as cable TV companies get into the mobility business, as has been the case for U.S. telcos generally. Even though firms such as AT&T, Verizon and CenturyLink mostly have not competed against each other in the fixed network area, there has been no way to limit competition when mobile networks operate ubiquitously across the country.

That means AT&T and Verizon, for example, were early on forced to compete against each other nationwide, in the mobile arena. In the fixed networks area, they have not competed in the same territories.

That now is changing as Verizon plans a 5G fixed wireless attack in AT&T areas (out of region). But Liberty Global and Comcast now will face each other as direct competitors in the U.K. market as well. That is new.

Revenue Upside and Cost Reduction Will Drive Networks towards Edge Computing

There are three major reasons why edge computing is going to reshape networking architectures: revenue, cost and functionality.

On the cost and functionality fronts, access networks in the 5G era are shifting toward edge computing. For starters, centralizing radio processing deeper in the network reduces radio cell site costs, in addition to improving flexibility.

On the revenue side, core networks will evolve toward edge computing to reduce latency, a primary requirement for creating new applications that need latency of one millisecond to just a few milliseconds.
source: Nokia
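A back-of-the-envelope latency sketch shows why such applications force computing toward the edge. It uses the standard approximation of roughly 5 microseconds per kilometer for light in optical fiber; the processing-overhead figure is an illustrative assumption:

```python
# How far away can an edge data center be for a given round-trip latency
# budget? Assumes ~5 us/km one-way propagation in fiber (light travels at
# roughly 200,000 km/s in glass); the fixed processing overhead is an
# illustrative assumption, not a measured figure.

FIBER_US_PER_KM = 5.0  # one-way fiber propagation delay, microseconds per km

def max_edge_distance_km(rtt_budget_ms, processing_overhead_ms=0.4):
    """Maximum one-way fiber distance that fits inside the round-trip budget."""
    propagation_budget_us = (rtt_budget_ms - processing_overhead_ms) * 1000
    if propagation_budget_us <= 0:
        return 0.0
    return propagation_budget_us / (2 * FIBER_US_PER_KM)  # round trip = 2x

for budget_ms in (1, 5, 20):
    print(f"{budget_ms} ms budget -> edge within ~{max_edge_distance_km(budget_ms):.0f} km")
```

On those assumptions, a one-millisecond budget keeps the edge within a few tens of kilometers, roughly central-office distance, while a 20-millisecond budget can be met from a regional data center.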

Sunday, September 23, 2018

Disintermediation in the Subsea Business

“Disintermediation” is a term some attendees at the PTC Academy event in Bangkok, Sept. 20 and 21, 2018, heard for the first time. The term simply means that product and service providers go direct to end users and customers, rather than using distributors.

Since communications service providers are distributors, that has key implications. Think “over the top” and you get the point: apps go direct to customers and end users with no direct business relationship between the app/platform and the user.

To an astonishing degree, market demand for wide area communications has shifted away from telcos and to application and platform providers.

The amount of undersea capacity used by the largest U.S. application and platform providers grew to 339 Tbps between 2013 and 2017. International capacity supplied by internet transport companies grew to 350 Tbps over the same period.

“Fifteen years ago, 100 percent of my clients were telcos,” said Sean Bergin, APTelecom president. “Now 80 percent of my customers are OTTs.”


So platform and app companies Google, Facebook, Microsoft and Amazon do not yet move more bits than service providers do, but arguably will do so in the future. And that “function substitution” has happened in telecommunications before.

Though you are familiar with mobile substitution--the use of mobile networks to displace use of fixed networks--the substitution happening elsewhere is “over the top” substitution for carrier services and value.

In the undersea and wide area network business, that means enterprises of a particular type (tier-one application and platform suppliers) are creating and owning their own transmission networks, and no longer buying capacity from transport providers. And that also means disintermediation of the communications service provider.


Put another way, wide area networks now are experiencing product substitution, as fixed network service providers did when mobile services came to be preferred to fixed services. As “over the top” apps, platforms and services often displace carrier services and apps, so enterprises (app, platform and device providers) increasingly have found it makes sense to own their own global networks.


And that means the demand for capacity services from "public" networks (telcos) is diminished. In other words, as bandwidth demand grows, the amount of growth available as "revenue for service providers" diminishes. 


That trend can be seen clearly in the growth of transoceanic capacity that is supplied internally by app and platform providers, on their own private networks.

In other words, OTT now covers a much-wider range of business cases, all based on disintermediation, where producers go straight to their customers or users, without relying on distribution partners. 


Intel Follows Pattern: Replace 1/2 of Current Revenue Sources Every Decade

One rule of thumb I use when looking at business model change is to assume that a tier-one service provider will have to replace half its current revenue with new sources every decade. And that might be a reasonable rule for suppliers of apps, platforms, devices and components as well.

In 2012, for example, Intel earned nearly 70 percent of revenue from “PC and mobile” platforms. By 2018, PC/mobile had dropped to about half of total revenue. By 2023 or so, Intel should generate 60 percent or more of total revenue from sources other than PC/mobile.
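As a rough arithmetic check of that projection, assume the legacy share declines at a constant annual rate (a simplifying assumption; Intel's actual mix shifts will not be this smooth):

```python
# Quick check of the "replace half your revenue every decade" rule of thumb,
# using the Intel shares cited above. Assumes the PC/mobile share declines
# at a constant annual rate, which is a simplifying assumption.

share_2012 = 0.70  # PC/mobile share of revenue in 2012 (from the text)
share_2018 = 0.50  # PC/mobile share of revenue in 2018 (from the text)

# Constant annual decay factor implied by the 2012-2018 decline.
annual_factor = (share_2018 / share_2012) ** (1 / 6)

share_2023 = share_2018 * annual_factor ** 5  # project five more years
print(f"implied annual decline: {1 - annual_factor:.1%}")            # ~5.5%
print(f"projected PC/mobile share, 2023: {share_2023:.0%}")          # ~38%
print(f"projected non-PC/mobile share, 2023: {1 - share_2023:.0%}")  # ~62%
```

Extrapolated this way, the cited figures do land at roughly 60 percent or more of revenue from sources other than PC/mobile by 2023.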


If you hear executives talking so much about innovation and new services, that is why: companies need to replace half their current revenue every decade, in every decade, from now on.


The good news is that, as tough as that sounds, firms have shown they can do so.

Tuesday, September 18, 2018

Verizon as Disruptor

As accustomed as we might be to seeing Google, Netflix, Amazon, Facebook, cable TV companies, wireless internet service providers, metro fiber specialists or T-Mobile US as market attackers and share takers, we are unaccustomed to seeing either AT&T or Verizon in such roles.

But Verizon is about to take that role, in fixed networks.

Verizon is launching Verizon 5G Home, its 5G fixed wireless service, on October 1, 2018 in parts of Houston, Indianapolis, Los Angeles and Sacramento, providing the first U.S. real-world test of customer demand for 5G fixed wireless.


And Verizon has specific business reasons for doing so. Simply put, footprint, or homes passed, is a key driver of its fixed networks business, and Verizon has far fewer homes passed than its major fixed network competitors.

Comcast has (can actually sell service to) about 57 million homes passed. Charter Communications has some 50 million homes passed.


AT&T’s fixed network represents perhaps 62 million U.S. homes passed. Verizon, on the other hand, passes perhaps 27 million homes.


As dominant as Verizon is in the mobile services segment, it lacks scale in the fixed networks segment. And that means Verizon can gain revenue by taking market share in the fixed network business.


The companion issue is simply that, similar to Sprint and T-Mobile US, Verizon’s revenue is heavily weighted toward mobile services. As much as 69 percent of Verizon’s revenue is earned from mobility services. That is less than Sprint or T-Mobile US earn from mobile services, but is highly significant, as it means Verizon, the biggest revenue producer in U.S. mobile, has less room to grow.


As cable companies have fueled growth by taking market share in voice services, business services and internet access, so Verizon expects to take share in fixed network internet access.

Thursday, September 13, 2018

Would a U.S. Mobile Market with Merged T-Mobile/Sprint be Stable?

Assume a merger between Sprint and T-Mobile US is approved by U.S. regulators, and the tier-one mobile service provider business becomes a contest of three relatively equally-situated contestants, in terms of market share. Is the market stable, long term?

I would argue it remains unstable, even with new T-Mobile US at 31 percent share, AT&T at 30 percent share and Verizon at 36 percent (share of accounts). The rationale is partly strategic and partly historical.

The immediate rationale for the merger is that a bigger T-Mobile US will be better able to compete with Verizon and AT&T, and the rearranged market would arguably feature three firms with roughly similar mobility market shares. But that almost certainly creates a new market that is as unstable as the four-provider market it replaces.


The strategic rationale for an unstable market is that, if one assumes the “service provider of the future” owns both fixed and mobile assets, then new T-Mobile US is still halfway through a repositioning exercise.

The larger T-Mobile US would remain “mobile only” in an industry that is moving rapidly toward “integrated” operations involving ownership of both fixed and mobility assets. And that suggests yet one more big transaction where T-Mobile US and some other entity merge again, to create an integrated competitor. But the size of new T-Mobile US narrows the list of potential acquirers.

On the other hand, to the extent 5G makes a mobile platform a more-perfect substitute for fixed networks, and if the backhaul issues can be finessed, then it is even more likely that a non-traditional buyer of T-Mobile US assets could emerge, as there would be no need for such a non-traditional buyer to worry about ownership of the fixed network.

In general, such potential acquirers might be tier-one platform, device or app providers.


There also is an historical argument for further instability.  Established (oligopoly) markets tend to feature a structure with disparate and unequal market shares that encourage suppliers not to launch disruptive attacks.

A ratio of 2:1 in market share between any two competitors seems to be the equilibrium point at which it is neither practical nor advantageous for either competitor to increase or decrease share.

A market with three roughly equally-situated contestants means there always will be a temptation to launch disruptive attacks, especially if one of the three has such a strategy already.

Some studies suggest a stable market of three firms features a market share pattern of approximately 4:2:1, where each contestant has double the market share of the following contestant.

The hypothetical stable market structure is one where market shares are unequal enough, and the leader financially strong enough, to weather any disruptive attack by the number-two or number-three providers.

In a classic oligopolistic market, one might expect to see an “ideal” (normative) structure something like:

Oligopoly Market Share of Sales
Number one: 41%
Number two: 31%
Number three: 16%

As a theoretical rule, one might argue, an oligopolistic market with three leading providers will tend to be stable when market shares follow a general pattern of roughly 40 percent, 30 percent and 20 percent held by the three contestants.
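A minimal sketch makes the arithmetic behind the instability argument concrete, applying the 2:1 heuristic to the normative structure above and to the hypothetical post-merger shares cited earlier (Verizon 36 percent, new T-Mobile US 31 percent, AT&T 30 percent):

```python
# Ratio of each competitor's market share to the next-largest competitor's.
# Shares are taken from the text; the 2:1 figure is the equilibrium
# heuristic described above.

def successive_ratios(shares):
    """Share ratios between successive competitors, largest first."""
    ordered = sorted(shares, reverse=True)
    return [round(ordered[i] / ordered[i + 1], 2) for i in range(len(ordered) - 1)]

normative = [41, 31, 16]    # "stable" oligopoly pattern from the table above
post_merger = [36, 31, 30]  # hypothetical three-way U.S. mobile market

print("normative ratios:  ", successive_ratios(normative))    # [1.32, 1.94]
print("post-merger ratios:", successive_ratios(post_merger))  # [1.16, 1.03]
```

The post-merger ratios sit near 1:1, far from the 2:1 equilibrium point, which is the arithmetic core of the instability argument.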

Under most circumstances, firms that have a higher share of the markets they serve are considerably more profitable than their smaller-share rivals, according to the Marketing Science Institute and Profit Impact of Market Strategies (PIMS) database.

And it is that disparity in profitability that allows the leader to weather pricing attacks, while making disruptive attacks often perilous.

To be sure, most financial observers believe new T-Mobile US will have reduced incentives to launch more disruptive attacks, allowing general price levels and profit margins to rise across the whole tier-one part of the industry.

Still, there is little historical precedent for stability in a three-competitor market that is roughly equally balanced in terms of market share. And mobility is but one element of competition.

Will AI Actually Boost Productivity and Consumer Demand? Maybe Not

A recent report by PwC suggests artificial intelligence will generate $15.7 trillion in economic impact to 2030. Most of us, reading, seein...