Monday, October 23, 2017

AI Will Take Decades to Produce Clear Productivity Results

General purpose technologies (GPTs) tend to be important for economic growth because they transform how consumers and businesses do things. The issue is whether artificial intelligence is going to be a GPT.

The steam engine, electricity, the internal combustion engine, and computers are each examples of important general purpose technologies. Each increased productivity directly, but also led to important complementary innovations.

The steam engine initially was developed to pump water from coal mines. But steam power also revolutionized ship propulsion, enabled railroads and increased the power of factory machinery.

Those applications then led to innovations in supply chains and mass marketing, and to the creation of standard time, which was needed to manage railroad schedules.

Some argue AI is a GPT, which would mean multiple, significant layers of impact.

Machine learning and applied artificial intelligence already show operational improvements in all sorts of ways. Error rates in labeling the content of photos on ImageNet, a collection of more than 10 million images, have fallen from over 30 percent in 2010 to less than five percent in 2016, and most recently to as low as 2.2 percent, according to Erik Brynjolfsson, MIT Sloan School of Management professor.


Likewise, error rates in voice recognition on the Switchboard speech recording corpus, often used to measure progress in speech recognition, have improved from 8.5 percent to 5.5 percent over the past year. The five-percent threshold is important because that is roughly the performance of humans at each of these tasks, Brynjolfsson says.

A system using deep neural networks was tested against 21 board-certified dermatologists and matched their performance in diagnosing skin cancer, a development with direct implications for medical diagnosis using AI systems.

On the other hand, even if AI becomes a GPT, will we be able to measure its impact? That is less clear, as it has generally proven difficult to quantify the economic impact of other GPTs, at least in year-over-year terms.

It took 25 years after the invention of the integrated circuit for U.S. computer capital stock to reach ubiquity, for example.

Likewise, at least half of U.S. manufacturing establishments remained unelectrified until 1919, about 30 years after the shift to alternating current began.

The point is that truly fundamental technologies often take decades to reach mass adoption levels.

In some cases, specific industries could see meaningful changes in as little as a decade. In 2015, there were about 2.2 million people working in over 6,800 call centers in the United States, and hundreds of thousands more worked as home-based call center agents or in smaller sites.

Improved voice-recognition systems coupled with intelligent question-answering tools like IBM’s Watson might plausibly be able to handle 60 percent to 70 percent or more of the calls. If AI reduced the number of workers by 60 percent, it would increase U.S. labor productivity by one percent over a decade.
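
As a back-of-envelope sketch: the worker and automation figures come from the text above, while the roughly 150-million U.S. employment base is an added assumption for illustration.

```python
# Back-of-envelope: productivity effect of automating call-center work.
# Worker and automation figures are from the text; the U.S. employment
# base is an added assumption for illustration.

call_center_workers = 2_200_000   # U.S. call-center employment, 2015
automation_share = 0.60           # share of work AI might plausibly handle
us_employment_base = 150_000_000  # rough U.S. employment (assumption)

workers_freed = call_center_workers * automation_share
# Holding output constant while removing that labor raises aggregate
# labor productivity by roughly the freed share of total employment.
productivity_gain = workers_freed / us_employment_base

print(f"Workers freed: {workers_freed:,.0f}")
print(f"One-time productivity gain: {productivity_gain:.1%}")
# ~0.9%, i.e. about one percent, realized over the decade of adoption.
```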

But it also is quite possible that massive investment in AI could fail to show any correlation with higher productivity over a decade or so.

It might well be far too early to draw conclusions, but labor productivity growth rates in a broad swath of developed economies fell in the mid-2000s and have stayed low since then, according to Brynjolfsson.

Aggregate labor productivity growth in the United States averaged only 1.3 percent per year from 2005 to 2016, less than half of the 2.8 percent annual growth rate sustained over 1995 to 2004.

Fully 28 of 29 other countries for which the OECD has compiled productivity growth data saw similar decelerations.

So some will reach pessimistic conclusions about the economic impact of AI, generally. To be sure, there are four principal candidate explanations for the discontinuity between advanced technology deployment and productivity increases: false hopes; mismeasurement; concentrated distribution and rent dissipation; and implementation and restructuring lags.

The first explanation, false hopes, is that new technology simply will not be as transformative as expected. The second explanation is that productivity has increased, but we are not able to measure it. One obvious example: as computing devices have gotten more powerful, their cost has decreased. We cannot quantify the qualitative gains people and organizations enjoy; we can only measure the retail prices, which are lower.

The actual use cases and benefits might come from “time saved” or “higher quality insight,” which cannot be directly quantified.

The third explanation combines concentrated distribution (benefits are reaped by a small number of firms) and rent dissipation (where everyone investing to capture gains is inefficient, as massive amounts of investment chase incrementally smaller returns).

The final explanation is that there is a necessary lag between the introduction of a disruptive technology and all the other changes in business processes that allow the new technology to effectively cut costs, improve agility and create new products and business models.

Consider e-commerce, which was recognized as a major trend before 2000. In 1999, though, its actual share of retail commerce was trivial: 0.2 percent of all retail sales. Only now, after 18 years, have significant shares of retailing shifted to online channels.

In 2017, retail e-commerce might represent eight percent of total retail sales (excluding travel and event tickets).


Two decades; eight percent market share. Even e-commerce, as powerful a trend as any, has taken two decades to claim eight percent share of retail commerce.  
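
The implied growth rate is steeper than the end state suggests. A quick sketch, using only the share figures above:

```python
# Implied annual growth in e-commerce's share of retail, 1999 to 2017.
share_1999 = 0.002   # 0.2% of all retail sales
share_2017 = 0.08    # ~8% of retail sales (excluding travel and event tickets)
years = 2017 - 1999

cagr = (share_2017 / share_1999) ** (1 / years) - 1
print(f"Implied annual growth in share: {cagr:.1%}")
# ~23% a year, compounded for nearly two decades, just to reach 8% share.
```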

Something like that is likely to happen with artificial intelligence, as well. If AI really is a general purpose technology with huge ramifications, it will still take decades for the full benefits to be seen.

It will not be enough to apply AI to “automate” existing business processes and supply chains. Those processes and supply chains have to be recrafted fundamentally to incorporate AI. Personal computers could only add so much value when they were substitutes for typewriters. They became more valuable when they could run spreadsheets to model outcomes based on varying inputs.

Computing devices arguably became more valuable still when coupled with the internet, cloud-based apps, video, rich graphics, transaction capability and a general shift to online retailing.

Sunday, October 22, 2017

Can AI Help Move Beyond "Something Happened" to "Something Happened and the Network Fixed Itself"?

To say artificial intelligence is trendy is an understatement. A growing number of consumers experience it routinely in their smartphones and voice-activated assistants, and invisibly in their consumption of content.

Investments in artificial intelligence have been highest, to date, in banking, retail, healthcare and manufacturing, IDC estimates. In the communications business, AI use cases arguably have been most pronounced in smartphones, customer service automation and possibly billing.


But it is logical to ask whether AI should also come to play a role in network operations and marketing, among other basic functions of communications networks.


Can AI be used by networks to make decisions based on customer activities or location? And does that create incremental value and revenue opportunities?


Can AI help network supervisors move beyond “I know what happened” to “I know what will happen” to “something happened and the network fixed itself?” That is not a terribly new idea, as the notion of “self-healing” networks has been around for some time in the form of ring networks that switch to backup facilities in the event of a primary ring failure.
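
That legacy pattern is simple enough to sketch. The following toy loop shows the detect-and-switch logic; all names and the simulated telemetry are invented for illustration.

```python
# Toy illustration of "something happened and the network fixed itself":
# a monitor that detects a primary-path failure and switches traffic to
# backup, the ring-protection pattern described above. All names and the
# simulated telemetry are invented for illustration.
import random
import time

def path_is_healthy(path: str) -> bool:
    # Stand-in for real telemetry (alarms, loss, latency).
    return random.random() > 0.1   # 10% chance of a simulated failure

def self_healing_loop(primary: str, backup: str, checks: int = 20) -> None:
    active = primary
    for _ in range(checks):
        if not path_is_healthy(active):
            active = backup if active == primary else primary
            print(f"failure detected; traffic switched to {active}")
        time.sleep(0.1)   # polling interval (illustrative)

self_healing_loop("primary-ring", "backup-ring")
```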


The promise of AI is the ability to extend self-healing to more parts of the network and its functions.


On the operations side, that might take the form of informing the creation and tear-down of actual connections and features, and provisioning and monitoring of actual network requirements, internally and on behalf of customers. On the marketing side, it might take the form of important insights such as figuring out which customers are about to churn, and then matching new offers to them to address the churn drivers.
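
A minimal sketch of the churn-scoring idea, assuming scikit-learn is available; the features and training data are invented for illustration.

```python
# Minimal sketch of churn scoring, assuming scikit-learn is available;
# the features and training data are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy features per account: [monthly_spend, support_calls, tenure_years]
X = np.array([[80, 5, 1], [120, 0, 6], [60, 7, 1],
              [100, 1, 4], [70, 6, 2], [110, 0, 5]])
y = np.array([1, 0, 1, 0, 1, 0])   # 1 = churned

model = LogisticRegression().fit(X, y)
prospect = np.array([[75, 4, 1]])
print(f"Churn probability: {model.predict_proba(prospect)[0, 1]:.0%}")
# A score like this decides which accounts get matched to retention offers.
```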


In other cases, AI arguably should help determine which customers, devices and services need upgraded features, and query those customers about the upgrades, without human intervention.


In other cases, AI should help inform service providers about which customers need additional products and what solutions are appropriate, then pitch and provision those solutions without human intervention.


AI should play a role in security as well, but the broader issue is how many mundane, necessary activities could be enhanced by AI in ways that not only reduce costs and waste, but also allow the network to learn to operate more effectively. Right now, almost nothing can be done autonomously.


Ideally, AI would uncover new needs that the network actually can create and then deliver.


In other words, AI should help service providers with the long-held goal of virtualizing the network and enabling instant changes.


In its data centers, Google has used DeepMind to reduce the energy used for cooling by 40 percent. Similar benefits should be wrung from AI as applied to the operations of networks and the creation and marketing of network services. The issue is how much more self-optimization is possible.

Given the need to continue reducing network operations costs, the use of AI would seem an almost-inevitable outcome.

Friday, October 20, 2017

How Much Can a Telco Afford to Invest in Faster Internet Access?

How much should any tier-one service provider invest in its internet access capabilities?

Much depends on the market dynamics: whether that firm’s role is wholesale-only; wholesale and retail; or retail only or mostly.

But in every case, the fundamentally sound position is to invest only to the point that an adequate return on capital can be made. The level of return might be dictated wholly, or in part, by a government entity that caps the rate of return. In other markets the rate of return is limited by the amount of competition and the risk of stranded assets.

In the U.S. market, some are not optimistic. Jonathan Chaplin, New Street Research equity analyst, believes cable companies could have 72 percent market share in 2020, rising to as much as 78 percent of the internet access market.

Some might argue, given such trends, that telcos should simply harvest their internet access customer base. Of course, such forecasts likely include an assumption that telcos must either upgrade to fiber to home or stay with copper access of some sort, and also assume that, for capital availability reasons, the upgrades will occur relatively slowly.

The other assumption is that “telcos” are not the same as “AT&T and Verizon,” which actually are seeing very-modest declines in internet access share, with most of the losses coming from other telcos, especially the large former-rural-carrier ranks (CenturyLink, Windstream, Frontier).

AT&T and Verizon have other options, including fiber to home, mobile substitution and fixed wireless access options that will improve dramatically with 5G. In fact, in most cases, AT&T and Verizon are likely to find the business case for mobile or fixed wireless much more compelling.


The point is that a service provider has to invest enough in its internet access capabilities to remain competitive in the market, but not more than that level. There are, in some markets, good reasons why the upside is limited.

Consider the U.S. market, where a cable operator is the market share leader, approaching 60 percent share in most instances. That leaves a bit less than 40 percent share for the local telco.

Ignore for the moment the growing cable share, a situation many would argue exists since some key telco providers rely mostly on less-capable digital subscriber line platforms. In the second quarter of 2017, for example, internet access account losses by AT&T and Verizon were infinitesimal (on the level of hundredths of a percent of their installed base).

The logical investment criterion, then, is, at a minimum, whatever is necessary to hold 40 percent market share.

The “maximum” position is a bit less clear, namely the level of investment that could allow either firm to take market share away from competitors. It is not so clear that taking share is possible, no matter what the level of investment, some might argue. Others might argue that this is possible, if mobile and fixed wireless offers can be used to create a superior value proposition, compared to cable.

Ironically, that might be especially true if cable companies start to raise prices to as much as double current rates. That would create a higher pricing umbrella underneath which telco offers could operate.

The “take share” position is complicated, as the value proposition includes a range of value (types and quality of services, plus price, plus bundling effects, plus threat of new entrants). The “hold share” position is easier, as it mostly involves offering packages that are roughly competitive with what cable offers, in terms of speed, price, value and role in bundles.

The point is that some telcos might not be able to do much to prevent lost market share. AT&T and Verizon have other options, based on their coming 5G profiles.  Even in its FiOS areas, Verizon tends to get only about 40 percent share. Perhaps that is as good as it gets.

"Insight" is the Outcome AI Delivers

All of us have heard the phrases “data-driven business” and “digital transformation” as hallmarks of the way firms will have to evolve.


Add “insights-driven” to that list. Though we are in the early days, that phrase is supposed to refer to the way firms mine the data they own to develop insights about customer behavior that can, in turn, be used to drive sales, retention and profit margins.




“Insight” is another way of saying “knowledge” or “understanding” about actual patterns in customer and prospect behavior, with the ability to apply such understanding to actual product features, processes and delivery, in a predictive way.


And without belaboring the point, such insights, the result of data mining using artificial intelligence or machine learning, already have been deployed in some business processes such as customized content, search, customer service operations and e-commerce.


Some firms have an advantage, though. Etsy, the e-commerce site, created a “dedicated research department to blend quantitative and qualitative insight and embed customer insights into every department, leading to high levels of user satisfaction and smarter product decisions,” says Brian Hopkins, Forrester VP. “Insights-driven businesses not only excel at data analytics, but also bring quantitative insight to bear on problems then embed insights in their business models, operations, and company culture.”

Tesla auto performance data likewise is streamed in real-time to Tesla’s data scientists, who build models to help diagnose driving issues and provide software or firmware updates over the air.

Thursday, October 19, 2017

One Interesting Factoid from Verizon's 3Q 2017 Report

Just one interesting observation from Verizon’s third quarter earnings report, which probably was better than most had expected. Note just one indicator, voice connections, which shrank seven percent. At that rate, Verizon loses half its voice revenue in a decade.
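
The “half in a decade” claim is simple compounding, as a quick check shows:

```python
# A 7% annual decline in voice connections, compounded over ten years.
annual_decline = 0.07
remaining = (1 - annual_decline) ** 10
print(f"Share of the voice base remaining after a decade: {remaining:.0%}")
# About 48%: at that rate, roughly half the voice revenue is gone in ten years.
```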

That is one illustration of the argument that tier-one service providers must replace half their current revenue every decade.


Despite the shift to unlimited plans and heightened competition in the mobile services market, Verizon managed to add a net 603,000 mobile connections, 486,000 of those being highly regarded postpaid accounts.

Operating revenues also were up, year over year. Even Verizon’s wireless segment posted higher revenue, year over year.



Wednesday, October 18, 2017

Massive MIMO Deployments are Inevitable

It is not hard to predict that use of massive multiple input-multiple output radio technologies is going to grow, as advanced 4G and 5G networks are built. Massive MIMO is required to make use of vast new spectrum resources to be released in the millimeter wave region to support 5G.

In fact, massive MIMO is intrinsically related to use of small cells, ultra-dense cell networks and millimeter wave frequencies.

Massive MIMO trials or limited deployments in 2017 were undertaken by Sprint, Deutsche Telekom, China Mobile, China Telecom, China Unicom, Singtel, T-Mobile Netherlands, Vodafone Australia, Optus, and Telefónica. Massive MIMO also is being developed by Telecom Infra, the open source telecom infrastructure effort.

The spectrum bands at which many of these trials have taken place include 2.5 GHz, 2.6 GHz, 3.5 GHz, 1.8 GHz, and 2.3 GHz. Except for 3.5 GHz, the remaining frequencies are also allocated for LTE in many countries. Telecom Infra is testing much-higher frequencies (60 GHz), designed in the U.S. market to use unlicensed spectrum.

MIMO antenna technology has been in use since the launch of 802.11n WiFi systems, but was first ratified for use in cellular systems in 3GPP’s Release 7 in 2008.

Deployments below 1 GHz are most likely to support eight or 16 antenna elements at most. Very high frequencies above 30 GHz can support hundreds of antenna elements, with some research citing just under 500 antennas as an upper limit.

Why a Massive New Gigabit Upgrade, Instead of DirecTV Acquisition, Made No Sense

Two years ago, when DirecTV was acquired by AT&T, it would have been easy to find detractors arguing that AT&T should have spent that money investing in fiber to home infrastructure. With linear video cord cutting possibly accelerating, the new version of that story is being heard again.


So what should AT&T have done with $67 billion, assuming a 4.6 percent cost of capital? Cost of capital is the annualized return a borrower, or an equity issuer paying a dividend, must generate simply to cover the cost of the funds.


In AT&T’s case, the breakeven rate is 4.6 percent, the cost of the borrowing itself. To earn an actual return, AT&T has to generate returns above 4.6 percent.


First of all, AT&T would not have needed to borrow $67 billion to add about three million new fiber to home locations per year. Assume that was all incremental capital, above and beyond what AT&T normally spends for new and rehab access facilities.


Assume that for logistical reasons, AT&T really can only build about three million locations each year, gets a 25-percent initial take rate, spends $700 to pass a location and then $500 to activate a customer location. Assume account revenue is $80 a month.


AT&T would spend about $2.1 billion a year to pass three million new FTTH locations. At a 25-percent initial take rate (750,000 accounts), it would spend about $375 million more to activate customers. So annual cost is about $2.5 billion, to earn about $720 million in new revenue (not all of which is incremental, as some of the new FTTH customers are upgrading from DSL).


The simple point is that building three million new FTTH locations per year, and immediately selling $80 a month in services to a quarter of those locations, does not recover the cost of capital.
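
A sketch of that first-year math, using only the assumptions stated above:

```python
# First-year economics of a hypothetical 3M-location-per-year FTTH build,
# using the assumptions stated in the text above.

locations_per_year = 3_000_000
cost_to_pass = 700        # $ per location passed
cost_to_connect = 500     # $ per activated customer
take_rate = 0.25          # initial penetration
monthly_arpu = 80         # $ per account per month
cost_of_capital = 0.046   # AT&T's assumed borrowing cost

build_capex = locations_per_year * cost_to_pass                    # $2.1B
connect_capex = locations_per_year * take_rate * cost_to_connect   # $375M
total_capex = build_capex + connect_capex

annual_revenue = locations_per_year * take_rate * monthly_arpu * 12  # $720M
capital_charge = total_capex * cost_of_capital

print(f"Capex: ${total_capex / 1e9:.2f}B")
print(f"New gross revenue, first year: ${annual_revenue / 1e6:.0f}M")
print(f"Annual capital charge on the spend: ${capital_charge / 1e6:.0f}M")
# The $720M is gross revenue, not profit; after operating costs, and with
# part of it merely migrated from DSL, the build does not quickly earn
# back its cost of capital.
```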


The DirecTV acquisition, on the other hand, boosted AT&T’s cash flow by about 40 percent.


Basically, even if AT&T had not purchased DirecTV, and had instead begun a new program adding three million new gigabit passings per year, those investments would not have increased AT&T cash flow in the near term, and possibly never would have done so.

The same logic applies to the Time Warner acquisition, which not only moves AT&T into new segments of the content ecosystem, but also boosts cash flow and profit margins.

But the linear video business is declining, many obviously will note. All true. But linear assets create the foundation for over-the-top assets, which also come with lower operating costs (much lower fulfillment and provisioning costs, for example).

Also, no matter what the long-term impact might be, a huge boost in free cash flow still matters, as markets evolve.

The point is that the alternative of plowing capital into faster gigabit upgrades sounds reasonable, but simply fails to move the needle on cash flow.

SD-WAN Growing 70% Annually, MPLS 4%

Only one fact about software-defined wide area network (SD-WAN) services is incontestable: their growth rates dwarf those of MPLS, the service some believe SD-WAN eventually could replace for some portion of demand.

A new forecast from International Data Corporation estimates that worldwide SD-WAN infrastructure and services revenues will see a compound annual growth rate of 69.6 percent and reach $8.05 billion in 2021.

MPLS, on the other hand, will grow at about four percent annually through 2021.
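
Working backward from the IDC forecast shows how small the SD-WAN base still is. A quick sketch, assuming 2016 is the base year of the five-year CAGR:

```python
# Back out the implied SD-WAN base-year revenue from the IDC forecast.
revenue_2021 = 8.05e9   # forecast worldwide SD-WAN revenue, 2021
cagr = 0.696            # forecast compound annual growth rate
years = 5               # assuming 2016 as the base year

base_revenue = revenue_2021 / (1 + cagr) ** years
print(f"Implied 2016 SD-WAN revenue: ${base_revenue / 1e9:.2f}B")
# Roughly $0.6B growing ~70% a year, against an MPLS base growing ~4% a
# year: explosive growth, but from a very small base.
```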


SD-WAN growth over the next five years will be driven most significantly by increased reliance on cloud computing, big data analytics, mobility, and social business, IDC says.

Use of those tools generally increases network workloads and elevates the network's end-to-end importance to business operations, including support at all branch locations.

Tuesday, October 17, 2017

Which Way for Retail Internet Access Pricing?

We are about to see an unusual test of internet access pricing; unusual only in the sense that the direction of retail pricing in the internet era has been down, on a cost-per-bit basis and generally even on an absolute cost basis.

The test is a thesis some advance that U.S. cable companies--especially Comcast--will be able to boost retail internet access prices dramatically in coming years. That would run counter to past trends, and assumes that competition in the internet access space will not increase.

Prices are complicated, though, as one broad pattern has been for prices to remain roughly flat while speeds have grown dramatically, in some cases as fast as Moore’s Law might predict, at the high end of the market (the fastest tier a consumer can buy from an internet service provider such as Comcast).

That means sharp declines in cost per megabit per second of speed might not be seen in posted retail prices. Pricing trends also reflect consumer decisions to spend more for access, in the form of faster tiers of service.


According to the International Telecommunication Union, broadband prices declined as much as 50 percent in developing nations between 2008 and 2010, for example, and about 35 percent in developed nations.



Granted, as a supplier, Comcast might “need” to raise prices to counter lost video revenues. That effort would aim to sustain average revenue per account as the linear video business declines.

But supplier “need” for higher prices always must contend with market dynamics. And there, one might well question whether Comcast will be able to maintain pricing power. Not only will mobile alternatives become more capable in the 5G era, but new suppliers are entering the market (both new retail providers, such as Google Fiber, Ting Internet and others, as well as enterprises building their own networks and removing demand from the market).

So the interesting test is whether Comcast and cable can maintain pricing based on market power, in the 5G era, or whether competition will escalate, diminishing both market power and the ability to raise prices.

Netflix, AT&T, Comcast: Same Strategy?

Netflix’s strategy has been clear for some time: “Become HBO faster than HBO becomes us.”


By that, Netflix means becoming a source of original programming, not a retransmission or distribution vehicle for third-party content.


Not just incidentally, the Netflix strategy sheds light on what AT&T and Comcast have done, despite criticisms of AT&T for doing so. Netflix occupies what formerly were a few distinct roles in the video entertainment ecosystem: content owner, program network and content distributor.




In the old model, programming networks were one segment of the business, while distributors (broadcast TV, cable TV, telco and satellite TV) were in a distinctly different part of the ecosystem. The content creation business (studio function) was a third role.


But what Netflix has done is create a new position that simultaneously combines those several former roles.


Its investments in original programming make it a content owner, like a studio. Its distribution function (streaming) makes it a distributor. But its total content assets also make it a programming network.


“Our future largely lies in exclusive original content,” says Reed Hastings, Netflix CEO. “Our investment in Netflix originals is over a quarter of our total P&L content budget in 2017 and will continue to grow.”


Netflix already has $17 billion in content commitments “over the next several years” and a growing library of owned content ($2.5 billion net book value at the end of the quarter).


Netflix expects to spend $7 billion to $8 billion on content in 2018.


“Just as we moved from second-run content to licensed originals and then to Netflix-produced originals, we are progressing even further up the value chain to work directly with talented content creators,” Hastings says.

In other words, Netflix, aside from being a content network and a content distributor, now is moving in the direction of becoming a content owner and developer, like a studio.

By buying Time Warner, AT&T is using the same strategy, moving from one role to several: distribution to content creation; distribution to content network.

AT&T Critics are Simply Wrong About Linear Video

Inevitably, aside from claims that the “wheels are coming off” the linear video business, there will be renewed criticism that AT&T should instead have spent the capital used to acquire DirecTV, and then (if approved by regulators) Time Warner, to upgrade its consumer access networks.

The critics are wrong; simply wrong, even if it sounds reasonable that AT&T could have launched a massive upgrade of its fixed networks, instead of buying DirecTV or Time Warner (assuming the acquisition is approved).

AT&T already has said it had linear video subscriber losses of about 90,000 net accounts in the third quarter. In its second quarter, net losses from U-verse and DirecTV amounted to about 351,000 accounts.

Keep in mind that, as the largest U.S. linear video provider, AT&T will lose the most customers, all other things being equal, when the market shrinks.

Some have speculated that AT&T potential losses could be as high as 390,000 linear accounts.

Such criticisms about AT&T video strategy might seem reasonable enough upon first glance.

Sure, if AT&T is losing internet access customers to cable operators because it only can offer slower digital subscriber line service, then investing more in internet access speeds will help AT&T stem some of those losses.

What such criticisms miss is that such advice essentially is an admonition to move further in the direction of becoming a “dumb pipe” access provider and, increasingly, a “one-service” provider in the fixed business.

That key implication might not be immediately obvious.

But with voice revenues also dropping, and without a role in linear or streaming subscription businesses, AT&T would increasingly be reliant on access revenues for its revenue.

Here is the fundamental problem: in the competitive era, it has become impossible for a scale provider (cable or telco) to build a sustainable business case on a single anchor service: not video entertainment, not voice, not internet access.

In fact, it no longer is possible to sustain profits without both consumer and business customers, something the cable industry is finding.

So the argument that AT&T “should have” invested in upgraded access networks--instead of moving up the stack with Time Warner and amassing more accounts in linear video with the DirecTV buy--is functionally a call to become a single-service dumb pipe provider.

That will not work, and the problem is simple math. In the fiercely-competitive U.S. fixed services market, any competent scale player is going to build a full network and strand between 40 percent and 60 percent of the assets. In other words, no revenue will be earned on up to 60 percent of the deployed access assets.

No single service (voice, video, internet access) is big enough to support a cabled fixed network. Period.

That is why all scale providers sell at least three consumer services. The strategy is to sell more units to fewer customers. Selling three services per account is one way to compensate for all the stranded assets.

Assume revenue per unit is $33 a month. If one provider had 100-percent adoption, 100 homes would produce $3,300 in gross revenue per month. At 50 percent penetration (half of all homes passed are customers), just $1,650 in gross revenue is generated.

At 40-percent take rates, gross revenue from 100 passed locations is $1,320.

But consider a scenario where--on average--each account buys 2.5 services. Then, at 50-percent take rates, gross revenue is $4,125 per month. At 40-percent adoption, it is $3,300 per month. You get the point: selling more products (units) to a smaller number of customers still can produce more revenue than selling one product to all locations passed.
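
That arithmetic as a small sketch, where the $33 per-unit price and the take rates are the illustrative assumptions above:

```python
# Monthly gross revenue per 100 homes passed, under the take rates and
# the $33 per-unit price assumed above.

arpu_per_unit = 33   # $ per service, per month
homes_passed = 100

def monthly_revenue(take_rate: float, units_per_account: float) -> float:
    accounts = homes_passed * take_rate
    return accounts * units_per_account * arpu_per_unit

for take, units in [(1.00, 1), (0.50, 1), (0.40, 1), (0.50, 2.5), (0.40, 2.5)]:
    print(f"take rate {take:.0%}, {units} units/account: "
          f"${monthly_revenue(take, units):,.0f}/month")
# 2.5 units at 40% penetration ($3,300) matches one unit at 100% penetration:
# selling more units to fewer customers offsets the stranded assets.
```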

The point is that it is not clear at all that AT&T could have spent capital to shore up its business model any more directly than by buying DirecTV and its accounts and cash flow.

That the linear model is past its peak is undeniable. But linear assets are the foundation of the streaming business, and still throw off important cash flow that buys time to make a bigger pivot.

One might argue AT&T could have purchased other assets, though it is not clear any other assets would have boosted the bottom and top lines as much as did DirecTV.

What is relatively clear is that spending money to become a dumb pipe internet access provider will not work for AT&T, even if all the DirecTV capital had been invested in gigabit networks. At best, AT&T might have eventually slowed the erosion of its dumb pipe internet access business. It would not have grown its business (revenue, profits, cash flow) enough to justify the diversion of capital.

Would AT&T be better off today, had it not bought DirecTV, and invested that capital in gigabit internet access? It is hard to see how that math would play. Just over two years after the deal, AT&T would not even have finished upgrading most of the older DSL lines, much less have added enough new internet access accounts to justify the investment.

AT&T passes perhaps 62 million housing units. In 2015, it was able to deliver video to perhaps 33 million of those locations. Upgrading just those 33 million locations would take many years. A general rule of thumb is that a complete rebuild of a metro network takes at least three years, assuming capital is available to do so.

Even if AT&T were to attempt a rebuild of those 33 million locations, and assuming it could build three million units every year, it would still take more than a decade to finish the nationwide upgrade.

In other words, a massive gigabit upgrade, nationwide, would not have generated enough revenue or cash flow to justify the effort, one might well argue.

Assume AT&T has 40 percent share of internet access accounts in its former DSL markets. Assume that by activating that network, it can slow the erosion of its internet access base. AT&T in recent quarters has lost perhaps 9,000 accounts per quarter. If the upgrade saved 10 percent of those losses, that would amount to only about 900 accounts per quarter, nationwide.

That is not enough revenue to justify the effort, whatever the results might be after a decade, when all 33 million locations might be upgraded.
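
Putting that in revenue terms, as a sketch that borrows the $80 monthly ARPU assumed in the earlier FTTH example (the loss and save-rate figures are the illustrative ones above):

```python
# Revenue defended by the hypothetical upgrade, using the illustrative
# figures above and the $80 monthly ARPU assumed earlier.
accounts_lost_per_quarter = 9_000
save_rate = 0.10          # share of losses the upgrade prevents (assumption)
monthly_arpu = 80

saved_per_quarter = accounts_lost_per_quarter * save_rate      # ~900
annual_defended_revenue = saved_per_quarter * 4 * monthly_arpu * 12
print(f"Accounts saved per quarter: {saved_per_quarter:,.0f}")
print(f"Annualized revenue on a year of saves: ${annual_defended_revenue:,.0f}")
# Even counting four quarters of saves (3,600 accounts) at full-year ARPU,
# defended revenue is ~$3.5M per year, against a multi-billion-dollar build.
```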


The simple point is that AT&T really did not have a choice to launch a massive broadband upgrade program, instead of buying DirecTV, and instead of buying Time Warner. The financial returns simply would not have been there.

On the Use and Misuse of Principles, Theorems and Concepts

When financial commentators compile lists of "potential black swans," they misunderstand the concept. As explained by Nassim Taleb ...