Thursday, June 28, 2018

Where is the Edge and What Does One Do There?

One practical issue with edge computing is that not everybody uses the same definition of “edge,” even if everyone seems to agree edge computing means putting compute capabilities at the logical edge of a network.

As with many earlier architectural decisions app providers and transport providers have had to address, the issue boils down to a choice between “intelligence at the network edge” and “intelligence in the network core,” even if some functions logically need to be centralized while others can be distributed.

Consider even the hyperscale data center. Is such a facility, which by definition sits at some network edge, functionally a “core” or an “edge” element? Many would have to agree it functions as a “core” element, not an edge element, even though the webscale data center actually resides at a particular edge.

The important point is that, for all other endpoints, any particular data center “acts” like a “core” element, because it is remote.

The Linux Foundation defines edge computing as “the delivery of computing capabilities to the logical extremes of a network in order to improve the performance, operating cost and reliability of applications and services.” The key concept there is the plurality of “extremes.”

“Edge” only makes logical sense if “you” and your device are in the same local area as your compute facilities. In other words, your use of cloud computing happens much as it would over a localized (metro) network, not over a wide area network.

In a real sense, you move from reliance on WAN-based hyperscale cloud centers to use of metro-level cloud facilities, and the issue is latency performance.

“By shortening the distance between devices and the cloud resources that serve them, and also reducing network hops, edge computing mitigates the latency and bandwidth constraints of today's Internet, ushering in new classes of applications,” the Linux Foundation definition states.


In practical terms, this means distributing new resources and software stacks along the path between today's centralized data centers and the growing number of devices in the field, concentrated in particular, though not exclusively, in close proximity to the last mile network, on both the infrastructure and device sides.
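To see why proximity matters, consider propagation delay alone. Here is a rough sketch, using illustrative distances and the standard approximation that light travels through fiber at about two thirds its speed in a vacuum:

```python
# Rough round-trip propagation delay over fiber (illustrative distances only;
# real latency adds serialization, queuing and processing delays on top).
FIBER_KM_PER_SEC = 200_000  # ~2/3 the speed of light in a vacuum

def rtt_ms(distance_km: float) -> float:
    """Round-trip fiber propagation delay, in milliseconds."""
    return 2 * distance_km / FIBER_KM_PER_SEC * 1_000

print(f"2,000 km path to a remote hyperscale center: {rtt_ms(2_000):.1f} ms")  # 20.0 ms
print(f"50 km path to a metro edge facility:         {rtt_ms(50):.2f} ms")     # 0.50 ms
```

Even before counting router hops, the metro edge path starts with a latency budget roughly 40 times smaller.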

That likely has implications for where and how augmented intelligence (artificial intelligence) gets implemented in wide area networks. Just how much applied machine learning or augmented intelligence gets deployed in the core, as opposed to the edge, to create new service capabilities or gain operational advantages remains an open question.

By definition, the whole advantage of edge computing is avoiding “hairpinning” (long transits of core networks) when local access can be provided. If so, once edge computing is widely deployed, the upside of using AI in the core to groom traffic or reduce latency shrinks.

Nor is it entirely clear which new network-based capabilities can be created in the core network using AI (for example), and which AI-based features are possible only, or are easier to implement, at the edge. Some security or location features might be possible only at the edge, some argue.


Decisions about how to apply new technologies such as AI will remind you of earlier debates about “making core networks smarter” versus relying on a “smart edge” and simply building high-performance but “dumb” core networks. Those of you with long memories will recall precisely those debates around the use of ATM versus IP in core networking.

Wednesday, June 27, 2018

Big Assumptions Underlie Big Forecasts for 5G

How big 5G revenue streams could be depends largely on how big an impact 5G can have in enabling other businesses that require 5G connectivity and features (low latency, ultra-high bandwidth, retail costs as low as those of fixed networks).

What 5G can provide in terms of connectivity revenue hinges largely on incremental new activity, above and beyond what retail customers are willing to spend on 4G.

If most customers wind up substituting 5G for 4G mobile internet access, there will be some incremental revenue potential, but not much incremental revenue growth. If 5G winds up supporting many new use cases (sensors, for example), then brand new revenue will be created.

Just how much revenue depends on connection volume more than per-connection prices, which might be quite low.
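A hypothetical comparison, using assumed (not forecast) connection counts and prices, shows how volume can offset low per-connection prices:

```python
# Hypothetical comparison: connection volume versus per-connection price.
# All figures below are illustrative assumptions, not forecasts.

# Scenario A: 5G substitutes for 4G access at a modest price premium.
upgraded_accounts = 100_000_000       # assumed accounts upgrading to 5G
incremental_arpu = 5.00               # assumed extra revenue per account/month

# Scenario B: brand-new sensor connections at very low per-unit prices.
sensor_connections = 1_000_000_000    # assumed new IoT connections
sensor_price = 0.50                   # assumed revenue per connection/month

print(f"Substitution upside: ${upgraded_accounts * incremental_arpu / 1e9:.1f}B per month")
print(f"New sensor revenue:  ${sensor_connections * sensor_price / 1e9:.1f}B per month")
# At a tenth the price but ten times the connections, the sensor scenario
# matches the substitution scenario: volume does the work.
```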

On the other hand, perhaps 60 percent to 90 percent of total 5G-enabled revenue might come from applications, platforms, devices and other services used to implement 5G-based applications for business or consumer markets.

Truly significant revenue generation might well depend on whether 5G enables brand-new internet of things and virtualization use cases, automated vehicles and processes, or solutions based on artificial intelligence and machine learning.

Those are very big assumptions.

Some researchers predict that 5G will become a “general purpose technology,” and that matters because GPTs are the foundation for huge waves of economic growth. Some past GPTs are said to include:
  • Interchangeable parts and mass production
  • Military and commercial aircraft
  • Nuclear energy
  • Computers and semiconductors
  • The Internet
  • The space industries

Others might be more selective and cite electricity and information technology as the general purpose technologies that have mattered. Still others see the steam engine and electricity as GPTs, or count the internal combustion engine among them.

Spoken language, the wheel, written language, printing, railways, automobiles and mass production are other often-mentioned GPTs.


Few observers seem to count “communications” as a GPT, though some have considered the telegraph or telephone a GPT.

So it is not clear whether 5G will produce as much economic activity as some predict.

source: Graeme Chamberlin, Linda Yueh  

Tuesday, June 26, 2018

FairlawnGig Claims 50% Take Rates for its Municipal Broadband Service

FairlawnGig, an open access internet access network serving the city of Fairlawn, near Akron, Ohio, now is poised to expand outside Fairlawn. The gigabit service is sold to consumers for $75 a month. Prior to FairlawnGig’s launch, the fastest internet access speed sold in the area was 50 Mbps.

It appears take rates are highest for the 300-Mbps speed package, costing $55 per month (boosted from the initial 150-Mbps level). Business services sell for $500 a month, with speeds up to 100 Gbps.

FairlawnGig says it gets a 50-percent “sign up” rate, which might mean the initial take rate is quite high, compared to most other new internet access services launched by a competitor in any U.S. market.

FairlawnGig also sells phone service for $25 a month.

The city has 3,579 houses, of which 3,320 are occupied, and counts about 800 businesses. Assuming it cost $10 million to build the network, and that the full $10 million in debt raised by the city went into construction, that implies a cost of perhaps $2,284 per location passed (4,379 homes and businesses).

The addressable consumer market is about 3,320 locations, implying FairlawnGig could have as many as 1,660 paying consumer accounts (not including business accounts).


It is possible that activated drops add as much as $2,000 per consumer customer location in additional spending.
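The arithmetic above is easy to check. A quick sketch, using the article's own figures (the $10 million build cost and $2,000 drop cost are assumptions, not audited numbers):

```python
# Checking the FairlawnGig per-location arithmetic from the figures above.
network_cost = 10_000_000   # assumed build cost, equal to the city's debt ($)
houses = 3_579
occupied_houses = 3_320
businesses = 800            # approximate

locations_passed = houses + businesses                           # 4,379
cost_per_location = network_cost / locations_passed
print(f"Cost per passed location: ${cost_per_location:,.0f}")    # ~$2,284

take_rate = 0.50            # FairlawnGig's claimed sign-up rate
print(f"Potential consumer accounts: {occupied_houses * take_rate:,.0f}")   # 1,660

drop_cost = 2_000           # possible additional drop cost per customer ($)
print(f"All-in cost per customer: ${cost_per_location + drop_cost:,.0f}")   # ~$4,284
```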

Monday, June 25, 2018

Telco Industry's Existential Problem

The biggest single problem telecom service providers face--bar none, including the threat of government regulation--is that connectivity prices in the digital era have shown a “disturbing” tendency to drop relentlessly lower, in some cases even trending towards zero.

And any industry facing near-zero pricing faces a big existential (“concerned with existence”) problem. Unless that industry does something radically different, it is virtually guaranteed to become extinct. That does not mean the function goes away, only that the current industry supported by connectivity revenues could go away.

The illustrative cases are easy to name: domestic U.S. calling rates, international calling rates, internet transit pricing, mobile text messaging, mobile voice prices, voicemail and other calling features.

Consider the impact of just one form of product substitution. In 1998, AT&T launched “Digital One Rate,” conceived by Dan Hesse (later Sprint CEO), who was at the time in marketing at AT&T. The plan essentially eliminated the distinction between local calling and long distance calling. Within several years, both purchases of local voice lines and long distance revenues began a long plunge.


In all those cases, products that once cost quite a lot now face substitutes literally offered for free, or for very little. So the legacy carrier products now have to contend with that product substitution, meaning lower prices and, in many cases, unlimited or virtually unlimited use.

Add the changes in user behavior--less calling, less texting, less buying of linear video services, ability to use Wi-Fi or other access mechanisms, substitution of mobile for fixed internet access--and one faces a potentially toxic mix.

Under such conditions, one can argue that surviving tier-one service providers must move up the stack, take on additional roles in the value chain and develop big new revenue sources beyond connectivity, as connectivity unit prices are going to keep dropping.

To reiterate, that belief flows directly from marginal-cost, near-zero pricing for connectivity services. Pricing that falls toward zero is an obvious problem for telecom, or any other industry, selling connectivity products as its main revenue source.

Operating cost and some capex reductions are necessary, but not sufficient, to remedy the near-zero pricing problem. The existential problem is that connectivity prices will continue to trend lower; not even “higher consumption” will fix that, and demand for most of the products also is dropping as consumers switch to substitutes.

The virtually universal set of solutions must include participation in much wider parts of the ecosystems enabled by communications, as hard a challenge as that has proven, and will prove.

Sunday, June 24, 2018

Does Cost Per Bit, Revenue Per Bit Still Matter?

Usage (data consumption) of communications networks is not related in a linear way to revenue or profit, all observers will acknowledge.

And that fact has huge implications for business models, as virtually all communication networks are recast as video transport and delivery networks, whatever other media types are carried.

Video will represent something on the order of 75 percent of total mobile network traffic in 2021, Cisco predicts. Globally, IP video traffic will be 82 percent of all consumer internet traffic by 2021, up from 73 percent in 2016, Cisco also says.

The basic problem is that entertainment video generates the absolute lowest revenue per bit, yet entertainment video will dominate usage on all networks. Conversely, while narrowband services generate the highest revenue per bit, the “value” of narrowband services, expressed as retail price per bit, keeps falling, and usage actually is declining in mature markets.

Some even argue that cost per bit exceeds revenue per bit, a long-term recipe for failure. That has been cited as a key problem for emerging market mobile providers, where retail prices per megabyte must be quite low, requiring cost-per-bit levels of perhaps 0.01 cents per megabyte.
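The revenue-per-bit gap is easy to illustrate with a rough sketch (the prices below are hypothetical, chosen only to show orders of magnitude):

```python
# Illustrative revenue-per-bit comparison; prices are hypothetical assumptions.

def revenue_per_mb(price_dollars: float, megabytes: float) -> float:
    """Revenue earned per megabyte delivered."""
    return price_dollars / megabytes

# A legacy text message: ~$0.10 for 160 bytes (0.00016 MB).
sms = revenue_per_mb(0.10, 160 / 1_000_000)

# Streaming video: a $10/month plan driving ~100 GB (100,000 MB) of usage.
video = revenue_per_mb(10.00, 100_000)

print(f"SMS:   ${sms:,.2f} per MB")      # $625.00 per MB
print(f"Video: ${video:.6f} per MB")     # $0.000100 per MB
# Narrowband revenue per bit exceeds video revenue per bit by more than
# six orders of magnitude, yet video dominates what the network carries.
```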

Of course, we have to avoid thinking in a linear way. Better technology, new architectures, huge new increases in mobile spectrum, shared spectrum, dynamic use of licensed spectrum and unlicensed spectrum all will change revenue per bit metrics.

Yet others argue that revenue per application now is what counts, not revenue per bit or cost per bit. In other words, as with products sold in a grocery store, each particular product might have a different profit margin, and what really matters is overall sales and overall profit, not the specific margins of the products sold.

So the basic business problem for network service providers is that their networks now must be built to support low revenue-per-bit services. That has key implications for the amount of capital that can be spent on networks, given the expected number of customers, average revenue per account and the amount of stranded assets.

Operating costs also become a continuing issue: the cost per customer is high and getting higher, as competition shrinks the market share any proficient provider can expect to obtain.
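The stranded-asset effect is simple to quantify. A sketch, assuming a hypothetical $1,500 cost per location passed: the same network investment gets spread over fewer customers as share shrinks.

```python
# Illustrative: how shrinking market share raises network cost per customer.
cost_per_passing = 1_500   # hypothetical network cost per location passed ($)

for take_rate in (0.50, 0.33, 0.25):
    per_customer = cost_per_passing / take_rate
    print(f"Take rate {take_rate:.0%}: ${per_customer:,.0f} per customer")
# 50% -> $3,000; 33% -> $4,545; 25% -> $6,000
```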

As always, the problem is that propensity to spend is fairly linear, while data consumption and demand are non-linear. So the solution to maintaining a positive revenue-cost relationship is to reduce the cost of supplying a bit, add new revenue sources higher in the stack, add new geographies and accounts, or otherwise gain scale.


Not many who were in the communications business 50 years ago would have believed that would be the case, or how dramatically necessary those moves would become.

Friday, June 22, 2018

S&P 500 Changes Reflect Reality: Media, Communications and Internet Apps Increasingly are One Industry and Market

The S&P 500 telecom services sector index will change in September, and be renamed “communications services.” And that is the least of the changes.

Internet app firms Facebook, Netflix and Google owner Alphabet, plus Disney and Comcast, will be in the index as well.

Irrespective of your views on the wisdom of putting traditional value plays into the same index as media and internet apps, the new index might lead to a reevaluation of the price-earnings multiple for telecom assets.

The new communications sector index implies a price-to-earnings ratio of perhaps 19 times expected earnings (the impact of adding the internet app firms), nearly double the sector’s current multiple, according to Thomson Reuters, reflecting the different profiles of internet sector firms.
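Mechanically, an index multiple is roughly total constituent market value divided by total constituent earnings, so adding high-multiple firms lifts the blend. A sketch with hypothetical figures (not actual index data):

```python
# Hypothetical: how adding high-multiple firms lifts a blended index P/E.
# Market caps and earnings are illustrative assumptions, not actual index data.
telecom_cap, telecom_earnings = 300.0, 30.0      # $B; standalone P/E = 10
internet_cap, internet_earnings = 700.0, 22.6    # $B; standalone P/E ~ 31

blended_pe = (telecom_cap + internet_cap) / (telecom_earnings + internet_earnings)
print(f"Blended index P/E: {blended_pe:.1f}")    # ~19.0
```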

That does not mean an automatic reevaluation of the underlying values of assets in the index, which will continue to feature some slow-growing, high-dividend firms, plus some fast-growing, low-dividend or no-dividend growth plays.

Still, many observers believe there is some potential lift for “telecom” asset values, if the mix of assets held by the constituent firms continues to evolve, producing companies with a mix of assets and growth profiles.


Some might argue the index changes do reflect a larger underlying transformation of the media, communications and internet application industries, though. The new index assumes connectivity services, content and applications are logically part of a single ecosystem.

The boundaries between media, content, apps and access are porous. Firms such as Comcast and AT&T already derive significant percentages of core revenue outside the realm of connectivity services.

In fact, some might accuse Comcast of running away from its core business as it contemplates buying Twenty-First Century Fox assets, a move that, if successful, would further diversify Comcast away from connectivity revenues.

No matter. The growing reality is that the formerly separate media, internet app and access businesses are “converging and fusing.” Any particular firm is going to own a mix of such assets.


And so the creation of a new communications index is suggestive of the new industry structure that is emerging. In the immediate future, the talk is going to be of big mergers in the media space. That is going to fuel even more talk--informed or uninformed--about the dangers of “bigness” in all of these formerly separate spaces.

That brings dangers as well. We thought breaking up the AT&T system would enhance innovation and competition. It has, but perhaps less so than expected. We thought the Telecommunications Act of 1996 would also help. It has, though perhaps much less than was expected.

We might argue about whether the antitrust actions taken against Microsoft actually were effective, or whether the shift from “personal computing” to an “internet-led” industry would have led to a weakening of a potential Microsoft monopoly in any case.

Now we have some observers arguing that traditional notions of monopoly are wrong: that even when there is lots of innovation, lots of new product commercialization, declining prices and higher value, consumer welfare might still be harmed.

That is a potential danger. Economists cannot measure everything that matters, and not all that matters can be measured. But numbers still matter when trying to assess consumer welfare and consumer harm. Perhaps the new thinking is that more weight needs to be placed on externalities.

But even externalities have to be quantified to assess potential benefit and risk. And regulatory history should not lead us to be too optimistic about the intended and unintended consequences of our actions.

What is clear is that content, media, apps, internet and communications can be viewed as aspects of the current reality of “computing.” And that means change is coming. No computing era lasts forever. In fact, a new era seems to arrive every couple of decades, just as a new mobile generation arrives every decade or so.

The computing (and content, and app, and access) industry will evolve in the next era. We have no idea what we will call it. We have no idea how to quantify the impact on industries, firms and revenue models. We can predict that--based on history--not even Facebook or Google can dominate the next era as they dominate the present.

No firm that led in one era has ever led in the succeeding era. The implication is that regulating the leaders of the present era is less important than watching for, and promoting, the next era, which naturally will produce new leaders.

We can debate the value of throttling today’s leaders to allow tomorrow’s leaders to emerge. History suggests those leaders are going to emerge anyhow, and that our policy actions might, or might not, help. In fact, they might well harm as much as they help.

source: Reuters

Thursday, June 21, 2018

67% of U.K. Fixed Network Consumers Buy a Dual or Triple Play Bundle

Dual-play and triple-play offers now are the foundation of the U.K. fixed network consumer business, with 34 percent of consumers buying a package of voice and internet access, while 33 percent buy a triple-play bundle of voice, video entertainment and internet access, according to Ofcom.
source: Ofcom

AI Will Improve Productivity, But That is Not the Biggest Possible Change

Many would note that the internet impact on content media has been profound, boosting social and online media at the expense of linear form...