Monday, January 8, 2018

Is Architecture Destiny?

“Architecture is destiny” is one way of describing how networks are able to support--or not support--particular use cases. Coverage, latency and capacity always are key issues. So one reason low earth orbit satellite constellations are important is that they change the architecture, relaxing the latency and capacity constraints that traditionally have limited the use of satellite networks as point-to-point networks.

One-to-many use cases, by contrast, are the classic advantage of broadcast networks (TV, radio, satellite broadcasting), in terms of efficient use of capacity. It is hard to beat the cost per delivered bit of a multicast (broadcast) network that is optimized for one-to-many use cases.

But architecture shapes other potential use cases as well, beyond the matter of bandwidth efficiency.

Geosynchronous satellite networks have round-trip latency of about 500 milliseconds. That means geosynchronous satellites are not appropriate for real-time apps that require low latency (less than 100 milliseconds).
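The physics alone accounts for most of that figure. Here is a minimal back-of-envelope sketch (Python), assuming only the standard geosynchronous altitude and the speed of light:

# Rough check on geosynchronous round-trip latency (illustrative only)
C_KM_PER_S = 299_792          # speed of light in vacuum
GEO_ALTITUDE_KM = 35_786      # geosynchronous orbit altitude

one_hop_s = 2 * GEO_ALTITUDE_KM / C_KM_PER_S   # ground -> satellite -> ground
round_trip_s = 2 * one_hop_s                   # request up and down, reply up and down

print(f"one hop: {one_hop_s * 1000:.0f} ms")        # ~239 ms
print(f"round trip: {round_trip_s * 1000:.0f} ms")  # ~477 ms

Low earth orbit constellations, at altitudes of hundreds of kilometers rather than roughly 36,000 kilometers, cut that propagation delay by more than an order of magnitude, which is why the change in architecture matters.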

Where neither latency nor bandwidth is a particular concern, however, most two-way networks could find roles supporting sensor communications, which are almost exclusively many-to-one (each link a point-to-point connection from sensor to server).

In other words, most two-way networks--excluding TV or radio broadcast networks and simple bent-pipe uplink networks, such as satellite networks supporting TV distribution--can theoretically support some internet of things and machine-to-machine sensor networks.

Many of those apps are not latency dependent, nor do they require much bandwidth. Instead, the key real-world constraints are likely to be sensor network element cost and bandwidth cost (the cost to move megabytes).

That, in fact, is the battleground between mobile and low-power wide area networks (LPWANs). The argument has been that LPWANs can move sensor data at far lower cost than mobile networks, in addition to having a transponder cost advantage. Some note that is likely to change over time, with cost differentials narrowing substantially, if not disappearing entirely.
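To make the cost argument concrete, here is a minimal sketch; the traffic profile and both per-megabyte tariffs are hypothetical placeholders, not actual operator pricing:

# Illustrative annual data cost per sensor (all figures hypothetical)
BYTES_PER_REPORT = 500
REPORTS_PER_DAY = 24                 # assume one 500-byte report per hour
mb_per_year = BYTES_PER_REPORT * REPORTS_PER_DAY * 365 / 1e6   # ~4.4 MB

LPWAN_USD_PER_MB = 0.10      # placeholder tariff
CELLULAR_USD_PER_MB = 1.00   # placeholder tariff

print(f"traffic:  {mb_per_year:.1f} MB per sensor per year")
print(f"LPWAN:    ${mb_per_year * LPWAN_USD_PER_MB:.2f}/year")
print(f"cellular: ${mb_per_year * CELLULAR_USD_PER_MB:.2f}/year")

At such small volumes per device, the absolute amounts are modest either way; module cost and tariff structure, more than raw network capability, decide the economics at the scale of millions of sensors.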

One way to describe the unique role for 5G is to say that 5G will have unique advantages for real-time apps requiring ultra-low latency or ultra-high bandwidth. Autonomous driving is a good example of the former use case, while augmented reality and virtual reality apps are good examples of use cases requiring both ultra-low latency and ultra-high bandwidth.

Mobile cloud-based enterprise apps might be an example of new use cases where ultra-high bandwidth is a requirement.

The point is that 5G and IoT use cases will hinge--as all apps running at scale do--on the architectural capabilities of various networks and the cost of communicating over those networks.

Non-real-time apps of any bandwidth can be handled by any number of networks. Content distribution arguably can be supported by both point-to-point and multicast (broadcast) networks.

But ultra-low-latency apps or ultra-high-bandwidth apps arguably require 5G (advanced 4G might work as well).

Low-bandwidth sensor networks arguably can be supported by almost any two-way network in a technology sense, but might vary based on cost-to-deploy and cost-to-use dimensions.

High-bandwidth uplinks will work best on bidirectional networks with lots of upstream capacity, at least when such networks operate at scale. So long as actual demand is low or highly distributed, many networks could work.


Sunday, January 7, 2018

Telcos and Fintech, Blockchain

Caution is a reasonable attitude for most communications service providers to take toward blockchain-related or other fintech ventures, though baby steps already seem to be underway.

The best reasons for caution are based on history. “Telcos” in general have a poor track record of creating sustainable new app or platform businesses with scale, beyond their core access operations.

Also, blockchain is a potentially transformative financial technology (fintech) development, and tier-one telcos in recent years have tried to create a role for themselves in retail mobile payments, without much success.

Fintech generally includes a huge range of functions and applications, any of which essentially could disrupt banking and financial services:
  • Payments
  • E-commerce
  • Credit
  • Ordering
  • Insurance
  • Savings
  • Banking
  • Risk assessment
  • Accounting
  • Remittances
  • Corporate finance
  • Investing
  • Consumer lending
  • Mortgages
  • Cryptocurrency
  • Mobile wallets

That noted, some mobile payments and banking services have achieved moderate success. Mobile banking has proven sustainable in several segments (small business lending, consumer remittances, payments) and several regions: Africa, Scandinavia, Eastern Europe, India and Mexico are among the places where mobile operators have had success with mobile banking and payments.



But there have been big failures, too--mostly in developed countries, where telcos have failed in recent years to get much traction in mobile payments.

All that noted, for access providers who wish to survive and thrive, moving up the stack into new platforms, apps and services beyond connectivity is essential. If fintech, like the internet of things, proves to be a huge growth area, telcos are almost forced to consider how they will become a bigger part of those ecosystems.



Saturday, January 6, 2018

Share of Wallet Shifting to Devices, Apps

One consequence of the “telecom” industry now being part of the broader internet ecosystem is a shift in industry share of profits. In other words, within consumer spending on “telecom” products and services, share of wallet has moved to devices, apps and services.

And most of the consumer spending growth has shifted to devices and over-the-top apps (from suppliers such as Apple and Netflix). In at least a few markets, the share gains by OTT services have been dramatic.

The other change, in some markets such as the United States, is a shift of market share from legacy providers to newer challengers (such as cable TV operators).




Era of Pervasive Computing Shapes Communications Revenue Drivers

Eras of computing matter for telecom professionals and the broader telecom industry for any number of reasons, but chief among the implications is that computing eras create and shape demand for communications.

The era of pervasive computing, which is likely to supplant the era of mobile computing, provides an example. At one level, the idea that computing devices will be embedded all around people implies communication as well. And since sensors and pervasive computing devices (things) will vastly outnumber people, that suggests a lot more communication connections.

But computing eras also shape other parts of life, such as who and what needs to communicate, over what distances, in what manner, how often and with what bandwidth requirements. Those issues in turn create potential demand for revenue-generating services, features and apps.

There are many ways to characterize eras of computing, but it is safe to say that the present era is the second of perhaps five in which communications is essential to computing, since computing largely is accomplished remotely.

In other words, “everything” is networked and connected.



In the era of personal computing and use of the web, that mostly meant connecting PCs with remote computing facilities. In the cloud era, we reach a new stage where “most” applied computing tasks are partially, substantially or nearly-completely conducted remotely, making communications a necessary part of computing.

In the present era, demand for communications to support computing has been driven by global adoption of mobility, plus mobile data, plus video and other internet apps.

In the next era, communications demand will be driven by internet of things sensors and other forms of pervasive computing.  For communications providers, that is the good news.

The bad news is that in the era of pervasive computing, not every instance of communications necessarily generates incremental revenue. We already see that with Wi-Fi, Bluetooth and other forms of local and short-distance communications.

Nor, in the pervasive era, is it possible for any access provider to directly profit from most of the applications that use a network. Potential revenue exists in increased demand for wide area communications and therefore local connections to such networks.

But the relationships are far from linear. Basically, incremental revenue grows less robustly than increased data usage, and threatens to grow far more slowly than network capital investment.
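A toy calculation shows the squeeze; both growth rates here are hypothetical placeholders, not measured figures:

# Hypothetical growth rates: why revenue per bit keeps falling
usage_growth = 0.40     # assumed annual traffic growth
revenue_growth = 0.04   # assumed annual access revenue growth

rev_per_bit = 1.0       # index, starting level = 1.0
for year in range(1, 6):
    rev_per_bit *= (1 + revenue_growth) / (1 + usage_growth)
    print(f"year {year}: revenue per bit index = {rev_per_bit:.2f}")
# Under these assumptions, revenue per bit falls below a quarter of its
# starting level within five years.

If capital investment tracks usage more closely than it tracks revenue, margins on the access function compress accordingly.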

That is among the key challenges of the “dumb pipe” internet access function. That is not to say internet access is the only revenue driver: access providers also supply owned applications (messaging, voice, video). But those legacy sources are either declining or morphing, with new suppliers providing effective substitutes.

That is why surviving retail suppliers must “move up the stack” into owned apps, platforms and services.

Was Negroponte Wrong?

Lots of things can change in four decades. The Negroponte Switch, for example, suggested that legacy networks made inefficient use of bandwidth. Broadband signals (at that time television) were moved through the air using wireless, while narrowband signals (at that point mostly voice) were moved using cables.

There was another angle, namely that mobile and personal endpoints (phones, at that time) were perversely limited to static fixed network connections, while devices that functioned in a “place-based” manner (television sets) were connected using wireless.

Prof. Negroponte argued we should do just the opposite, namely move narrowband and mobile signals over the air, and confine broadband to cables.

These days, the switch really is from cabled to wireless and mobile, since most traffic now is broadband, and increasingly that traffic is mobile and personal. By perhaps 2019, as much as two thirds of all data will use some form of untethered access (mobile, Wi-Fi or other wireless networks, short range or long range), Cisco has predicted.


Of course, assumptions matter. In the 1980s, it would have been impossible to foresee the huge explosion of mobile usage; the shift of TV consumption from place-based to mobile (and from linear to on-demand); the decline of fixed network voice; or the rise of the internet itself.

Nor would it have been possible to accurately foresee the impact of orders of magnitude decreases in the cost of computation and communications. Rather than a shift, we now see a move of virtually all communications to untethered modes.

These days, Wi-Fi is the preferred local connection technology in the office, home and indoor venues. Outdoors and on the go, mobile connections are the norm.

In the new developing areas, such as internet of things apps and sensors, untethered access also is expected to be the norm, not fixed access.

Negroponte was correct--within the limits of networks and costs at the time--in suggesting a shift of broadband to cables and narrowband to wireless.  

Some 40 years later, all media types--everything--are moving to untethered access. That is the result of the mobile phone emerging as the dominant end-user device, the growth of Wi-Fi as the universal endpoint connection method, the impact of Moore’s Law on computing and communications costs, the growth of the internet and ever-new ways to use communications spectrum more efficiently.

In the case of millimeter wave and spectrum aggregation, cheap computing means we can use bandwidth assets that were impractical in the past.

Computing power that would have cost $100 million in 1980 cost about $100 in 2010, and less than that in 2017. In other words, costs have dropped by at least six orders of magnitude: a factor of one million or more.

Predictions about the future always are perilous. What we have seen is partly a switch, but more profoundly an upheaval. Increasingly, the untethered and mobile networks are for actual access. The fixed network is--in the realm of consumer services--a backhaul network.

Friday, January 5, 2018

Technologies of Freedom

It now is a given that media, communications, content and delivery are converging, erasing former clear lines between industries and functions. That has important consequences.

And some ideas, even if abstract, really do matter, in that regard. Freedom, responsibility and fairness always come to mind, given my academic training in journalism. It now appears, for a variety of reasons having nothing much to do with those ideas, that freedom is imperiled.

Ironically, it is the leading app providers that now face threats to their freedom, as there are growing calls, globally, to “regulate” them more extensively.

Let me be clear: my own position has for decades been that more freedom for all in the ecosystem works best, and is the preferred approach to policy. Those of you who ever have read Technologies of Freedom will understand why.

Responsibility and fairness also are requirements, but they are things that have to happen at the personal, firm and industry level. Yes, they can be imposed “on” people, firms and industries by government fiat. But freedom is the preferred course.

In a world where formerly-distinct endeavors and industries really are converging, we have a choice: extend freedom to formerly highly-regulated entities that now operate in entirely-new realms where freedom is the policy (First Amendment protections), or remove freedom from content providers and make them “utilities.”

The bigger challenge right now is getting the transition right. Somehow, we need to rebalance regulatory models: away from “utility” and “common carrier” regulation for app providers, and also away from such regulation for firms whose activities increasingly are inseparable from traditional First Amendment protected ideas, content and media.

At the same time, major app providers already operate as “access providers,” though without the obligations imposed on only a few access providers.

Some now are arguing essentially for “less freedom” for Facebook, Google, Amazon and others, and “even less freedom” for access providers who--despite becoming content providers at their core--deserve freedom no less than any other content provider.

The better policy is to extend the realm of freedom further, not restrict it. In other words, when harmonization is required, it is better to extend freedom to formerly-distinct industries (broadcast TV and radio, cable TV and other distribution entities, even telcos).

Yes, debates about First Amendment protections are abstract. But they are fundamental and consequential, when our old ways of regulating (freedom for media; some regulation for broadcast; common carrier for telcos) need changing, as the spheres converge.

We can take away freedom, or we can extend it. As argued in Technologies of Freedom, more freedom is the better course.

Thursday, January 4, 2018

Fort Collins, Colo. to Build Own Gigabit Network

The City of Fort Collins will build its own retail municipal broadband network. The city expects to build the entire network over three to four years.

Target residential pricing is $50 per month for 50-Mbps service, and $70 per month for 1-Gbps service.

An “affordable Internet” tier also will be offered, the business plan says. The city expects to borrow between $130 million and $150 million to fund network construction and activation.

The city estimates the cost per home passed at $984, and the cost to connect a customer location at about $600.
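Those figures support a rough build-cost check. In the minimal sketch below, the homes-passed count and the take rate are illustrative assumptions, not numbers from the business plan:

# Back-of-envelope Fort Collins build cost (homes passed and take rate assumed)
COST_PER_HOME_PASSED = 984    # USD, per the city's estimate
COST_PER_CONNECTION = 600     # USD, per the city's estimate
HOMES_PASSED = 70_000         # illustrative assumption
TAKE_RATE = 0.30              # consultants' upper-end share estimate

connected = HOMES_PASSED * TAKE_RATE
capex = HOMES_PASSED * COST_PER_HOME_PASSED + connected * COST_PER_CONNECTION
print(f"connected homes: {connected:,.0f}")                  # 21,000
print(f"construction plus activation: ${capex / 1e6:.1f}M")  # ~$81.5M

The gap between a figure like that and the $130 million to $150 million to be borrowed presumably covers electronics, financing costs and operations during the ramp-up period.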

It is obvious that most of the customers will come from one of the two dominant providers, Comcast and CenturyLink, as more than 91 percent of households already buy a fixed network internet connection.

Comcast has about 57 percent market share, while CenturyLink has about 37 percent share, the city says.

Comcast already has launched gigabit services in Fort Collins, ahead of the municipal network launch.

City consultants estimate the new municipal network could get as much as 30 percent share of market. That is based, in large part, on experience. Other municipal networks have gotten share in about that range.

One caveat is that it is unclear how the other networks measure penetration. One way is to count connected homes. The other method, where a network offers multiple services, is to count “units sold” and then divide by the number of households.


In such cases, the actual number of connected homes is less than the penetration figures would suggest, as a single home buying three services generates three revenue units. When measuring penetration rates, that has the same impact as three homes each buying one service.
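A minimal sketch of the difference, using hypothetical subscriber counts:

# Units-sold vs. connected-home penetration (hypothetical numbers)
HOUSEHOLDS = 10_000
triple_play_homes = 1_000     # each buys three services
single_service_homes = 2_000  # each buys one service

units_sold = triple_play_homes * 3 + single_service_homes     # 5,000 revenue units
connected_homes = triple_play_homes + single_service_homes    # 3,000 homes

print(f"units-sold penetration:     {units_sold / HOUSEHOLDS:.0%}")       # 50%
print(f"connected-home penetration: {connected_homes / HOUSEHOLDS:.0%}")  # 30%

The same subscriber base reads as 50 percent penetration on one method and 30 percent on the other.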

So some of us would guess that the actual household penetration can range from less than 20 percent to perhaps 35 percent.

Much also will hinge on what Comcast and CenturyLink decide to do to hang on to existing customer accounts.

Comcast’s gigabit pricing originally was set at $159.95 per month without a contract, and $110 per month with a one-year contract.

But few would predict that Comcast will simply surrender huge chunks of market share rather than lower its prices to about $70 a month (or whatever level is needed to remain competitive with the municipal network).

Comcast has offered $70 a month pricing in other markets where it faces serious competition for gigabit internet access.



Some idea of operating costs (exclusive of marketing) can be seen in estimates for personnel.

The larger point is that more competition in the internet access space keeps coming, despite fears of a duopoly and limited consumer benefits. For most potential customers, though, the real alternatives are going to be mobile services, as 5G is launched nationwide.
