Saturday, January 6, 2018

Share of Wallet Shifting to Devices, Apps

One consequence of the “telecom” industry now being part of the broader internet ecosystem is a shift in industry share of profits. In other words, within consumer spending on “telecom” products and services, share of wallet has moved to devices, apps and services.

And most of the consumer spending growth has shifted to devices and over-the-top apps (such as Apple and Netflix). In at least a few markets, the share gains by OTT services have been dramatic.

The other change, in some markets such as the United States, is a shift of market share from legacy providers to newer challengers (such as cable TV operators).




Era of Pervasive Computing Shapes Communications Revenue Drivers

Eras of computing matter for telecom professionals and the broader telecom industry for any number of reasons, but chief among the implications is that computing eras create, shape and form demand for communications.

The era of pervasive computing, which is likely to supplant the era of mobile computing, provides an example. At one level, the idea that computing devices will be embedded all around people implies communication as well. And since sensors and pervasive computing devices (things) will vastly outnumber people, that suggests a lot more communication connections.

But computing eras also shape other parts of life, such as who and what needs to communicate, over what distances, in what manner, how often and with what bandwidth requirements. Those issues in turn create potential demand for revenue-generating services, features and apps.

There are many ways to characterize eras of computing, but it is safe to say that the present era is the second of perhaps five eras where communications is essential for computing, since computing is largely accomplished remotely.

In other words, “everything” is networked and connected.



In the era of personal computing and use of the web, that mostly meant connecting PCs with remote computing facilities. In the cloud era, we reach a new stage where “most” applied computing tasks are partially, substantially or nearly-completely conducted remotely, making communications a necessary part of computing.

In the present era, demand for communications to support computing has been driven by global adoption of mobility, plus mobile data, plus video and other internet apps.

In the next era, communications demand will be driven by internet of things sensors and other forms of pervasive computing.  For communications providers, that is the good news.

The bad news is that in the era of pervasive computing, not every instance of communications necessarily generates incremental revenue. We already see that with Wi-Fi, Bluetooth and other forms of local and short-distance communications.

Nor, in the pervasive era, is it possible for any access provider to directly profit from most of the applications that use a network. Potential revenue exists in increased demand for wide area communications and therefore local connections to such networks.

But the relationships are far from linear. Basically, incremental revenue grows less robustly than increased data usage, and threatens to grow far more slowly than network capital investment.

That is among the key challenges for the “dumb pipe” internet access function. That is not to say the only revenue drivers are dumb pipe internet access. Access providers do provide owned applications (messaging, voice, video). But those legacy sources are either declining or morphing, with new suppliers providing effective substitutes.

That is why surviving retail suppliers must “move up the stack” into owned apps, platforms and services.

Was Negroponte Wrong?

Lots of things can change in four decades. The Negroponte Switch, for example, suggested that legacy networks made inefficient use of bandwidth. Broadband signals (at that time television) were moved through the air using wireless, while narrowband signals (at that point mostly voice) were moved using cables.

There was another angle, namely that mobile and personal endpoints (phones, at that time) were perversely limited to static fixed network connections, while devices that functioned in a “place-based” manner (television sets) were connected using wireless.

Prof. Negroponte argued we should do just the opposite, namely move narrowband and mobile signals over the air, and confine broadband to cables.

These days, the switch is really from cabled to wireless and mobile, since most traffic now is broadband, and increasingly that traffic is mobile and personal. By perhaps 2019, as much as two thirds of all data will use some form of untethered access (mobile, Wi-Fi or other wireless networks, short range or long range), Cisco has predicted.


Of course, assumptions matter. In the 1980s, it would have been impossible to foresee the huge explosion of mobile usage; the shift of TV consumption from place-based to mobile (and from linear to on-demand); the decline of fixed network voice; or the rise of the internet itself.

Nor would it have been possible to accurately foresee the impact of orders of magnitude decreases in the cost of computation and communications. Rather than a shift, we now see a move of virtually all communications to untethered modes.

These days, Wi-Fi is the preferred local connection technology in the office, home and indoor venues. Outdoors and on the go, mobile connections are the norm.

In the new developing areas, such as internet of things apps and sensors, untethered access also is expected to be the norm, not fixed access.

Negroponte was correct--within the limits of networks and costs at the time--in suggesting a shift of broadband to cables and narrowband to wireless.  

Some 40 years later, everything--all media types--is moving to untethered access. That is the result of mobile phones emerging as the dominant end user devices, the growth of Wi-Fi as the universal endpoint connection method, the impact of Moore’s Law on computing and communications costs, the growth of the internet and ever-new ways to use communications spectrum more efficiently.

In the case of millimeter wave and spectrum aggregation, cheap computing means we can use bandwidth assets that were impractical in the past.

Computing power that would have cost $100 million in 1980 cost about $100 in 2010, and less than that in 2017. In other words, costs have dropped at least six orders of magnitude.
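The arithmetic behind that claim, using only the figures cited above, is straightforward:

```python
import math

# Figures cited in the text: the same computing power cost
# $100 million in 1980 and about $100 in 2010.
cost_1980 = 100_000_000  # dollars
cost_2010 = 100          # dollars

# A million-fold decline, i.e. six orders of magnitude
# (and somewhat more by 2017, per the text).
orders_of_magnitude = math.log10(cost_1980 / cost_2010)
print(orders_of_magnitude)  # 6.0
```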

Predictions about the future always are perilous. What we have seen is partly a switch, but more profoundly an upheaval. Increasingly, the untethered and mobile networks are for actual access. The fixed network is--in the realm of consumer services--a backhaul network.

Friday, January 5, 2018

Technologies of Freedom

It now is a given that media, communications, content and delivery are converging, erasing former clear lines between industries and functions. That has important consequences.

And some ideas, even if abstract, really do matter, in that regard. Freedom, responsibility and fairness always come to mind, given my academic training in journalism. It now appears, for a variety of reasons having nothing much to do with those ideas, that freedom is imperiled.

Ironically, it is the leading app providers that now face threats to their freedom, as there are growing calls, globally, to “regulate” them more heavily.

Let me be clear: my own position has for decades been that more freedom for all in the ecosystem works best, and is the preferred approach to policy. Those of you who ever have read Technologies of Freedom will understand why.

Responsibility and fairness also are requirements, but something that has to happen at the personal, firm and industry level. Yes, this can be done “to” people, firms and industries, by government fiat. But freedom is the preferred course.

In a world where formerly-distinct endeavors and industries really are converging, we have a choice: extend freedom to former highly-regulated entities who now operate in entirely-new realms where freedom is the policy (First Amendment protections), or remove freedom from content providers and make them “utilities.”

The bigger challenge right now is getting the transition right. Somehow, we need to rebalance regulatory models: away from “utility” and “common carrier” regulation for app providers, and also away from such regulation for firms whose activities increasingly are inseparable from traditional First Amendment protected ideas, content and media.

At the same time, major app providers already operate as “access providers,” though without the obligations imposed on only a few access providers.

Some now are arguing essentially for “less freedom” for Facebook, Google, Amazon and others, and “even less freedom” for access providers who--despite becoming content providers at their core--deserve freedom no less than any other content provider.

The better policy is to extend the realm of freedom further, not restrict it. In other words, when harmonization is required, it is better to extend freedom to formerly-distinct industries (broadcast TV and radio, cable TV and other distribution entities, even telcos).

Yes, debates about First Amendment protections are abstract. But they are fundamental and consequential, when our old ways of regulating (freedom for media; some regulation for broadcast; common carrier for telcos) need changing, as the spheres converge.

We can take away freedom, or we can extend it. As argued in Technologies of Freedom, more freedom is the better course.

Thursday, January 4, 2018

Fort Collins Colo. to Build Own Gigabit Network

The City of Fort Collins will build its own retail municipal broadband network. The city expects to build the entire network over three to four years.

Target residential pricing is $50 per month for 50-Mbps service, and $70 per month for 1-Gbps service.

An “affordable Internet” tier also will be offered, the business plan says. The city expects to borrow between $130 million and $150 million to fund network construction and activation.

The city estimates a cost per passed home to be $984, with the cost to connect a customer location at about $600 each.
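As a rough sketch of what those figures imply (using only the numbers in the plan above; the 30 percent penetration and the $60 blended revenue figure are illustrative assumptions, not city projections):

```python
# Illustrative economics for the Fort Collins plan. The borrowing,
# cost-per-passing and connection-cost figures come from the text;
# penetration and blended revenue are hypothetical assumptions.
borrowed = 140_000_000          # midpoint of the $130M-$150M borrowing range
cost_per_passed_home = 984      # network cost per home passed
cost_to_connect = 600           # cost to connect a customer location

# Upper bound on passings the loan could fund, ignoring connection costs.
homes_passed = borrowed / cost_per_passed_home

penetration = 0.30              # consultants' estimated market share (assumed here)
arpu = 60                       # hypothetical blend of the $50 and $70 tiers

# At 30% take, each subscriber effectively carries the passing cost
# of roughly 3.3 homes, plus its own connection cost.
capital_per_subscriber = cost_per_passed_home / penetration + cost_to_connect
months_to_recover_capital = capital_per_subscriber / arpu  # gross of operating costs

print(round(homes_passed))               # 142276 passings (upper bound)
print(round(capital_per_subscriber))     # 3880 dollars per subscriber
print(round(months_to_recover_capital))  # 65 months, before operating expense
```

The point of the sketch is only that the payback period is quite sensitive to the penetration assumption, which is why the measurement question below matters.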

It is obvious that most of the customers will come from one of the two dominant providers, Comcast and CenturyLink, as more than 91 percent of households already buy a fixed network internet connection.

Comcast has about 57 percent market share, while CenturyLink has about 37 percent share, the city says.

Comcast already has launched gigabit services in Fort Collins, ahead of the municipal network launch.

City consultants estimate the new municipal network could get as much as 30 percent share of market. That is based, in large part, on experience. Other municipal networks have gotten share in about that range.

One caveat is that it is unclear how the other networks measure penetration. One way is to count by connected homes. The other method, where a network offers multiple services, is to count “units sold” and then divide by the number of households.


In such cases, the actual number of connected homes is less than the penetration figures would suggest, as a single home, buying three services, generates three revenue units. When measuring penetration rates, that has the same impact as three homes buying one service.
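A toy example makes the difference between the two counting methods concrete (the household mix here is hypothetical):

```python
# Two ways of measuring "penetration," per the discussion above.
households = 100

# Hypothetical take: 20 homes buy all three services, 10 homes buy one.
connected_homes = 20 + 10
units_sold = 20 * 3 + 10 * 1

penetration_by_homes = connected_homes / households  # 0.30 -> 30% of homes connected
penetration_by_units = units_sold / households       # 0.70 -> "70%" by revenue units

print(penetration_by_homes, penetration_by_units)
```

The same network, measured two ways, reports very different "penetration," which is why comparisons with other municipal networks require care.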

So some of us would guess that the actual household penetration can range from less than 20 percent to perhaps 35 percent.

Much also will hinge on what Comcast and CenturyLink decide to do to hang on to existing customer accounts.

Comcast’s gigabit pricing originally was set at $159.95 per month without a contract, and $110 per month with a one-year contract.

But few would predict that Comcast is willing to lose huge chunks of market share rather than lower its prices to about $70 a month (or whatever level is needed to remain competitive with the municipal network).

Comcast has offered $70 a month pricing in other markets where it faces serious competition for gigabit internet access.



Some idea of operating costs (exclusive of marketing) can be seen in estimates for personnel.

The larger point is that more competition in the internet access space keeps coming, despite fears of a duopoly and limited consumer benefits. For most potential consumers, the real options are going to be mobile services, though, as 5G services are launched nationwide.

Mobile Substitution for Video and Internet Access is Coming

Mobile substitution, initially cannibalizing fixed network voice, now is on the cusp of taking usage away from subscription video and internet access as well.

T-Mobile could bypass the fixed broadband provider in the home and also enhance indoor wireless coverage, BTIG analyst Walter Piecyk thinks, as T-Mobile US launches a proposed video subscription service.

Mobile substitution has been growing for decades. Few now remember it, but the AT&T Digital One Rate plan--which abolished the difference between domestic local and long distance calls--fueled the switch to mobile voice, which already had been underway.


It can also be argued that Digital One Rate eliminated the difference between fixed and mobile voice usage entirely.

And one can argue that mobile substitution was precisely the plan.

Some at the time criticized the plan, but the elimination of distance as a cost barrier to domestic voice communications further boosted the value of mobile voice. Notably, consumers liked the service so much they temporarily boosted usage enough to cause network congestion.

Recall that also happened when AT&T launched the Apple iPhone, and more recently as demand for its DirecTV Now service has caused congestion issues.

It is no coincidence that long distance minutes of use and use of local networks for consumer calling began to fall right around the time Digital One Rate was launched.

In the U.S. market, domestic long distance began to fall in 2001, about two years after Digital One Rate was introduced, and then matched by the other mobile service providers. Voice revenue fell in tandem.

The key observation is that, every now and then, a huge shift in technology, retail pricing and packaging, new devices and new application use cases can radically reshape communications markets. As mobility has become the preferred way for consumers to use voice, it might increasingly become a preferred way of consuming subscription video and internet access as well.

5G Marketing Wars Heat Up

AT&T says it will be the first U.S. mobile operator to launch mobile 5G--in a dozen U.S. cities--in 2018.

T-Mobile US says it will launch the first commercial mobile 5G network in 2020, when, the company argues, AT&T and Verizon will still be focused on fixed implementations of 5G.

Verizon, for its part, plans to launch fixed 5G in several U.S. cities in 2018.

Some observers say no 5G networks going commercial before 2020 are “true standards-based 5G.” Consumers will not care, of course, so long as there are performance advantages. And even claims of “non-standard” implementations are judgment calls.

International standards bodies have authorized the 5G NR systems AT&T will activate in 2018.

Recall similar arguments about 4G, when there were both “global standard” Long Term Evolution and WiMAX networks in operation, and some might have quibbled about whether WiMAX was really 4G.

But marketing wars always are fought over concepts such as “which network is fastest?” They also are fought over “which firm operates the most-advanced network?” Hence the marketing claims being made about the timing of 5G launches in the U.S. market.

In the end, none of this will matter much, however important claims of “leadership” now seem, just as nobody now cares who was “first” to deploy 4G (Verizon), or how the implementation began (Verizon and others started with data cards only, as no mobile phone devices initially were available).

Sprint, for its part, used WiMAX. Customers still bought it, as it represented faster speeds, compared to 3G.  

As always, there are commercial drivers of all such claims. For starters, it is not clear at all that Verizon and AT&T will not have launched mobile 5G services by 2020, even if their 2018 and 2019 efforts might focus on fixed implementations.


Also, different contestants have assets that lead them to deploy in certain ways. As T-Mobile US says, it has lots of new 600-MHz spectrum that can underpin a 5G launch. Verizon arguably is the most capacity-constrained contestant, and is relying on new troves of millimeter wave spectrum (28 GHz and 39 GHz), which are better suited to small cell deployments, and therefore fixed implementations.

T-Mobile US, on the other hand, most likely will launch mobile 5G only in areas of the country where its spectrum position allows, and not nationwide.

As most spectrum-related business issues revolve around whether new capacity is used to support coverage or capacity, it might be argued that T-Mobile US is going to focus on coverage, while Verizon is going to focus on capacity.

In substantial part, those decisions are based on physical signal propagation characteristics of radio waves. Lower frequencies have better reach, but offer less capacity; higher frequencies propagate less well, but support higher bandwidth.
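The free-space path loss formula gives a first-order sense of that tradeoff. This is a standard textbook relation, not anything specific to these carriers’ deployments, and real-world losses (buildings, foliage, rain at millimeter wave) are considerably worse:

```python
import math

# Free-space path loss in dB, for distance in km and frequency in MHz.
# FSPL grows with frequency at a fixed distance, which is the first-order
# reason low-band spectrum "reaches" farther than millimeter wave.
def fspl_db(distance_km: float, freq_mhz: float) -> float:
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

loss_600mhz = fspl_db(1.0, 600)      # low-band, like T-Mobile's 600 MHz
loss_28ghz = fspl_db(1.0, 28_000)    # millimeter wave, like Verizon's 28 GHz

print(round(loss_600mhz, 1))  # 88.0 dB at 1 km
print(round(loss_28ghz, 1))   # 121.4 dB at 1 km, roughly 33 dB worse
```

A 33 dB gap is more than a 1,000-fold difference in received power, which is why millimeter-wave spectrum pushes operators toward small cells and fixed implementations.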

Verizon and AT&T also will be refarming 2G and 3G spectrum to support their coverage and capacity. In that regard, both AT&T and Verizon will be boosting 4G capacity, to support higher consumer access speeds.

That, in turn, represents one of the key 5G business model issues: for consumer smartphone end users, it will not matter whether 4G or 5G is the platform, so long as faster internet access is possible.

Ultimately, marketing claims aside, AT&T, Verizon and T-Mobile US will carefully deploy new capacity, whether augmenting 4G or launching 5G. For consumer smartphone users, the advantages of 5G as a platform are likely to be quite subtle, or perhaps non-existent, where it comes to experienced speed.

Better 5G latency performance might not be noticeable or valuable in most instances, as 4G latency performance should get better as well.

The bottom line: each carrier will deploy new assets in the way that drives most incremental value, and likely not on a ubiquitous basis, initially.

Cost of Creating Machine Learning Models is Up Sharply

With the caveat that we must be careful about making linear extrapolations into the future, training costs of state-of-the-art AI models hav...