Sunday, July 24, 2016

If Portland Does Not Get Google Fiber, the Business Model Most Likely Will be the Reason

Sometimes the business case--not “evil” Internet service providers or clueless municipal officials--is the reason a hoped-for new service, product or network never launches.

One might argue that is precisely the point when it comes to a potential Google Fiber launch in Portland, Ore. It is one thing for Google Fiber to come to market with the “only” gigabit Internet access service in a market.

It is something else again if the incumbent suppliers (cable and telco) up their game before Google Fiber can launch, and deploy their own gigabit networks.

If that happens, the suppliers of nearly 100 percent of the consumer Internet access connections will have a much more compelling value proposition, while any new Google Fiber offer--even if better--would differ only incrementally.

That might also be the case if the incumbents come up with “hundreds of megabits per second offers” that cost less than Google Fiber, and also meet virtually all present customer requirements.

It is “Marketing 101.” An attacker has to come to market with a value proposition that makes sense, has clear value and often, “costs less.”

While Google Fiber arguably has technical advantages over the current CenturyLink and cable offers (symmetrical bandwidth, for example), it is not clear that most consumers actually believe they get much incremental value from a gigabit service, compared to one operating “up to” 300 Mbps or 500 Mbps.

In fact, many would argue that, for most consumers--and most multi-person households--a 100-Mbps to 500 Mbps downstream connection does “everything” a gigabit connection does, with the possible exception of some upstream apps.

Some of us would argue that, in most cases, even a 100-Mbps connection actually supports all typical applications for a multi-person household. Beyond that, it is not clear that actual perceived value exists.

If Google Fiber increasingly finds the incumbents (telco and cable) offering gigabit connections, the business case for launching Google Fiber might not be attractive.

That is not to say a fixed wireless service has the same economics. The business case arguably will be better with the latest generation of fixed wireless platforms, and should be even better in the future.

Saturday, July 23, 2016

Why Tier One Access Providers Will Have to Create New Roles Elsewhere in the Ecosystem

Whether Verizon will succeed in the mobile advertising business remains to be seen. Whether AT&T will be a significant application or service provider in connected cars remains to be seen. Whether Orange, SK Telecom or NTT Docomo will be significant providers of Internet of Things services is an open question at present.

But there is no question those firms, and all other tier one access providers, must try to create new roles in the broader Internet ecosystem. The reason is simple enough.

As a rule, we can expect that every tier-one provider will have to replace about half its present revenue in a decade's time. Competition, over-the-top services and ever-better technology will cannibalize that much revenue.
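One way to make that concrete: if half of today's revenue must be replaced within ten years, a constant-rate erosion model implies losing roughly 7 percent of legacy revenue every year. The sketch below is purely illustrative; the constant-rate assumption and the $10 billion revenue base are mine, not figures from any carrier.

```python
# Hypothetical illustration: what "replace about half of present revenue in a
# decade" implies as a steady annual erosion rate. All figures are assumptions.

years = 10
share_remaining = 0.5  # half of today's legacy revenue still intact after a decade

# Solve (1 - r) ** years = share_remaining for the annual erosion rate r.
annual_erosion = 1 - share_remaining ** (1 / years)
print(f"Implied annual erosion: {annual_erosion:.1%}")  # roughly 6.7% per year

# For a carrier with a hypothetical $10 billion of legacy revenue, that is
# roughly $670 million of new revenue needed in year one alone, just to stand still.
legacy_revenue = 10_000_000_000
print(f"Year-one replacement need: ${legacy_revenue * annual_erosion:,.0f}")
```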

In the access business, relatively “small” changes in the number of competitors can lead to big swings in prices, packaging, gross revenue and profit. That is the case even ignoring the big changes a switch to Internet Protocol as the next generation network transport layer has brought.

In the mobile business, you can understand the dynamics through the debates about “how many service providers are necessary to maintain robust competition?” European and U.S. regulators have made clear a preference for four facilities-based providers, rather than just three.

Japanese and South Korean regulators have made clear a belief that two facilities-based competitors are not enough in the mobile market, and that three would be better.

In the Indian mobile market, mobile executives have concluded that, with the market entry of Reliance Jio, five to eight facilities-based competitors are way too many.

In the fixed networks business, we clearly see the effect of non-facilities-based competition (wholesale) as well as facilities-based competition (cable TV operators, satellite video competitors, Google Fiber) on former monopoly markets.

In virtually every U.S. fixed network locale, a telecom provider that once had 90-percent-plus take rates now has perhaps 40 percent share of the consumer services market (with a cable TV operator and two satellite providers taking shares of the Internet access, voice and video entertainment markets).

In the high speed Internet access segment, the local cable operator typically has 60 percent share, in fact, and the telco is the number-two provider with about 40 percent share. Where Google Fiber operates, incumbent shares are lower still.

Eventually, in many countries, the relevant share statistics will be mobile operator share, as the fixed network will be relatively insignificant. By 2020, in fact, mobile should be delivering more than half of all Internet access revenue, for example.


A variety of forces--deregulation, privatization, competition, new technology, mobile and Internet--have combined to radically reduce retail prices for voice services. Those same forces are reducing retail prices and sales volumes for carrier messaging and entertainment video.

Internet access speeds have gone up--dramatically--while prices per bit are falling (just as dramatically) because of competition from Google Fiber’s symmetrical gigabit services. That also means lower revenue per bit.

Up to some point, higher volume compensates for lower average revenue per bit. Still, as we have seen with voice, even that strategy has limits.

At the same time, the access business has gone from a stand-alone industry to a part of a bigger Internet ecosystem sweeping commerce, retailing, content and all communications services into one much-bigger “industry.”

In that bigger ecosystem, all access services are a smaller part of a bigger whole, and all “apps and services” conceptually can be created and delivered “without the permission” of the access provider.

And much more is coming.

The access services business has in the past been ruled by assumptions of “scarcity,” with direct implications for business models, namely high prices and high profits. Even after the advent of competition, the scarcity of mobile and Wi-Fi spectrum, as well as the huge sunk costs of fixed networks, has meant the assumption of scarcity has changed only a little.

And even if access networks remain relatively costly, we can envision a future where relative abundance (if not absolute abundance) is the rule, not the exception.

Consider only one example. In the U.S. market, less than one gigahertz of bandwidth presently is authorized for all mobile and Wi-Fi communications.

But the Federal Communications Commission is preparing to release seven gigahertz of new unlicensed spectrum, plus four gigahertz of licensed mobile and wireless spectrum at first, and then 18 GHz more spectrum in a second phase, for mobile and wireless use.

That represents at least a 36-fold increase in available spectrum. Add to that small cell architectures and higher-order modulation techniques, and one can plausibly argue that each gigahertz of new millimeter wave spectrum will deliver an order of magnitude--or perhaps two orders of magnitude--more usable capacity than existing mobile or Wi-Fi networks.
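A quick back-of-the-envelope check of the 36-fold figure, using the spectrum totals cited above and assuming roughly 0.8 GHz is authorized today (the baseline is my assumption; the text says only “less than one gigahertz”):

```python
# Rough check of the "at least 36-fold" claim, using the figures cited above.
# Assumption: about 0.8 GHz is currently authorized for mobile and Wi-Fi use.

existing_ghz = 0.8            # assumed current mobile plus Wi-Fi spectrum
new_unlicensed_ghz = 7.0      # planned new unlicensed spectrum
new_licensed_ghz = 4.0        # planned licensed spectrum, first phase
second_phase_ghz = 18.0       # additional spectrum in a second phase

new_total_ghz = new_unlicensed_ghz + new_licensed_ghz + second_phase_ghz
increase_factor = new_total_ghz / existing_ghz

print(f"New spectrum: {new_total_ghz:.0f} GHz")                 # 29 GHz
print(f"Increase over existing: about {increase_factor:.0f}x")  # about 36x
```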

Any business defined by scarcity might have high prices and high profit margins. Any business defined by abundance has the opposite problem. Get ready to solve those problems.

AT&T, Verizon Strategies Diverge

AT&T and Verizon--like most other tier one service providers--have increasingly adopted different business strategies since the era of competition began in the 1980s. Back then, most tier one providers were very similar to each other in that regard.

These days, firms' business and product strategies can be quite different. That is the case for AT&T and Verizon on the subject of product bundling. Simply put, Verizon is less enthusiastic about quadruple play offers, never having found what it believes is clear evidence of end user demand.

AT&T, on the other hand, is much more optimistic about the value of quadruple plays and bundles in general.

In that respect, AT&T holds views closer to those of most tier-one service providers in Europe, which generally believe the quadruple play is a fundamental strategy.

To a large extent, the difference in views between AT&T and Verizon also flows from their respective positioning in the market. Verizon has pitched itself as the “premium” brand and generally abhors competing on price.

Verizon sees consumer demand for quad play offers as fundamentally a matter of price savings, something that goes against the company's positioning. Also, Verizon has less fixed network revenue to protect and grow, and has focused mostly on its mobile business.

Compared to AT&T, Verizon has less to gain from bundling that lowers churn of its consumer fixed network customer base.

AT&T, on the other hand, has a much-larger fixed network profile and a correspondingly smaller--though still significant--contribution from mobile services. Quad plays arguably represent more value for AT&T, as it has more customers to potentially lose in the consumer fixed network segment.

In the second quarter of 2016, for example, AT&T--which reports its business segments differently than Verizon--said it had about 23 million video accounts and about 13 million high speed access accounts, split between its DirecTV and wireline networks.

In its first quarter of 2016, Verizon said it had about seven million FiOS Internet accounts.

So looking only at consumer high speed access, AT&T has nearly twice the number of fixed network accounts as does Verizon.

Trailing Verizon in the mobile accounts area, but leading Verizon in fixed network accounts, AT&T has more to gain in mobile, and more to lose in fixed network services, than Verizon.

The other notable divergence is international operations. While both AT&T and Verizon sell enterprise services globally, AT&T has more exposure internationally, with its Mexico mobile operations, in particular.

With the exception of its fixed network global enterprise business, Verizon remains a U.S.-focused company.

Friday, July 22, 2016

Telecom Agenda Being Set by "Newcomers," to a Large Extent

Facebook’s efforts to help extend Internet access “to everyone” now are taking several different and complementary paths. Internet.org is working on app packaging. Its regulatory teams are working to support release of more unlicensed and shared spectrum.

Its Aquila unmanned aerial vehicle program is working on backhaul. Its Terragraph 60-GHz wireless mesh network is designed to enable lower-cost Internet access in dense, urban areas, and has led to Facebook’s millimeter wave mesh network concept.

Project Aries is working on more spectrally efficient radio access, aimed in part at lower-cost access networks in rural areas, so any radio network can deliver more bits in any given amount of bandwidth.

OpenCellular aims to create open source cellular network technology that can be used by any entity wanting to build a mobile or wireless access network.

The Telecom Infra Project is an effort to create lower-cost, more-efficient telecom access networks, and is modeled on what Facebook did with its open data center efforts.

All of those efforts, as well as Google’s, suggest a coming new world where access platforms and networks are created, and perhaps operated, by any number of new providers, not just the traditional telecom access providers (cable and telco).

Google acts as an Internet service provider through Google Fiber; as a mobile operator through Google Fi; supplies the world-leading Android mobile operating system; has created its reference platform Nexus line of devices; is working on unmanned aerial vehicles and balloons for backhaul and access; has deployed Wi-Fi hotspot networks and has--many say--set the Internet agenda in the United States and Europe.


What if Social Media Were Your Neighbors? What if Operating Systems Were Airlines?

After two decades, I still find “if operating systems were airlines” hilarious. More operating systems are compared in this version. Here’s a slightly updated version.

Nobody has tried to do something similar for cloud-era processes, though “what if social media were your neighbors” has some of that flavor.


Google Applies Artificial Intelligence to Cut Data Center Cooling Power by Up to 40%

It is not yet clear how, when and how much machine learning and other forms of artificial intelligence will start to reshape the way customers buy and use communication services. For the moment, AI likely will make its mark on non-customer-facing processes.

Google’s data centers, for example, should soon be able to reduce energy consumption for cooling by up to 40 percent by applying machine learning.

Servers generate lots of heat, so cooling drives power consumption requirements. But peak heat dissipation requirements are highly dynamic, complex and non-linear, Google DeepMind says.

The machine learning system was able to consistently achieve a 40 percent reduction in the amount of energy used for cooling, which equated to a 15 percent reduction in overall power usage effectiveness (PUE) overhead.
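PUE is total facility energy divided by IT equipment energy, so the overhead is PUE minus 1. The sketch below uses hypothetical numbers, a baseline PUE of 1.12 and cooling assumed to be about 37.5 percent of the non-IT overhead, to show how a 40 percent cooling cut can map to a roughly 15 percent overhead reduction.

```python
# Minimal sketch of the PUE arithmetic, using hypothetical numbers.
# PUE = total facility energy / IT equipment energy; overhead = PUE - 1.
# Assumptions: baseline PUE of 1.12, cooling is ~37.5% of the non-IT overhead.

it_energy = 100.0                          # normalize the IT load to 100 units
baseline_pue = 1.12                        # assumed facility-wide PUE
overhead = it_energy * (baseline_pue - 1)  # 12 units of non-IT energy
cooling = overhead * 0.375                 # assumed cooling share: 4.5 units

cooling_saved = cooling * 0.40             # a 40% reduction in cooling energy
overhead_after = overhead - cooling_saved
new_pue = 1 + overhead_after / it_energy

print(f"New PUE: {new_pue:.3f}")                              # ~1.102
print(f"Overhead reduction: {cooling_saved / overhead:.0%}")  # ~15%
```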

“Because the algorithm is a general-purpose framework to understand complex dynamics, we plan to apply this to other challenges in the data centre environment and beyond in the coming months,” said Rich Evans, DeepMind research engineer, and Jim Gao, Google data center engineer.

Probably few would debate the potential use of machine learning and artificial intelligence to improve industrial processes.

Sales processes, though, likely are not an area where most would expect big changes. Products sold to business customers and larger organizations generally are considered complex matters, requiring customization.

Enterprise communications requirements are more complicated than data center power consumption processes, many would argue. But are they? Google and DeepMind applied machine learning to historical operating data to develop new rules for managing a complex system.

In essence, do sales and engineering personnel not already possess, to a relatively high degree, an accumulated wisdom about the general problems, existing processes and typical software and hardware used by enterprise customers?

And where the typical solution involves recommendations for removing, adding or altering services and features to solve enterprise communication problems, are there not patterns designers and sales personnel can rely upon?

If so, might it not be possible to radically simplify the process of understanding and then “quoting” a solution? And if this cannot be done on a fully-automated basis, might it still be done on a wide enough scale to deliver business value for a communications supplier?

In other words, could AI simplify substantial parts of the enterprise solutions business? Most who do such things for a living might argue the answer is “no.” But are enterprise solutions completely unique? Are there not functional algorithms engineers and network architects work with that are, in fact, bounded in terms of potential solutions?

And, if so, could not large amounts of the analysis, design and reconfiguration be done using AI? Airline reservation systems were, and are, quite complex. And yet consumers now use tools built on those systems to buy their own tickets.

Business communication solutions are complex. But they are not unmanageably complex. People can, and do, create solutions based on the use of what effectively are algorithms. We might call it experience or skill. That it is. But it is based on rules, formal or informal.

Rules-based systems can be modeled. And that could have huge implications for how business communications solutions are designed, provisioned and sold.
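As a small illustration of what “rules-based” could mean in practice, the sketch below encodes a few design heuristics as explicit condition-and-recommendation pairs and applies them to a customer profile. Every field, threshold and recommendation here is hypothetical, invented purely for illustration and not drawn from any real provider's catalog or quoting system.

```python
# Hypothetical sketch: informal design "experience" written down as explicit
# rules a program can apply. All rules, fields and thresholds are invented.

from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class CustomerProfile:
    users: int
    sites: int
    contact_center_seats: int
    needs: List[str] = field(default_factory=list)

# Each rule pairs a condition with a recommendation, the kind of heuristic a
# sales engineer might apply from experience.
Rule = Tuple[Callable[[CustomerProfile], bool], str]

RULES: List[Rule] = [
    (lambda p: p.sites > 1, "Quote managed WAN links between sites"),
    (lambda p: p.users > 250, "Quote a hosted PBX tier with pooled trunks"),
    (lambda p: p.contact_center_seats > 0, "Add cloud contact-center seat licenses"),
    (lambda p: "video" in p.needs, "Provision QoS classes for real-time video"),
]

def recommend(profile: CustomerProfile) -> List[str]:
    """Return the recommendation from every rule whose condition matches."""
    return [advice for condition, advice in RULES if condition(profile)]

if __name__ == "__main__":
    customer = CustomerProfile(users=400, sites=3, contact_center_seats=25,
                               needs=["video"])
    for item in recommend(customer):
        print("-", item)
```

A real system would learn such rules from historical quotes and outcomes rather than hand-coding them, which is exactly where machine learning could come in.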

Thursday, July 21, 2016

Google Fiber Hits Pause in Portland Market

Google Fiber has put its plans to build a fiber access network in Portland in the fall of 2016 on at least temporary hold. “Why” is the question everyone should be asking.

In principle, Google Fiber could be wrestling with the business model, as competitors Comcast and CenturyLink have been upgrading their own networks in the Portland area. That might make for a more-difficult payback model.

CenturyLink, for example, already sells gigabit services in Portland, as does Comcast. That arguably makes the business model harder for Google Fiber or any other ISP that might be contemplating entering the Portland market.

It will not be so easy to attack the market as the only provider of 1 Gbps service if the other two leading ISPs already are doing so.

"We're continuing to explore the possibility of bringing Google Fiber to Portland and other potential cities," Google says. "This means deploying the latest technologies in alignment with our product roadmap, while understanding local requirements and challenges, which takes time."

Also, it might be reasonable to assume that Google Fiber is about to try to become “Google Internet” and use fixed wireless as the access platform, not optical fiber.

That would be a major development. Facebook, AT&T and Verizon are other entities expected to promote or use fixed wireless as a major access platform for Internet access, and eventually gigabit access services.

Personally, I think Google Fiber has looked at the numbers and concluded a gigabit network that might cost $300 million is simply too big a risk in the existing Portland market, compared to its prospects several years ago, before CenturyLink and Comcast moved to upgrade.

This could be the leading edge of a very big change in access strategy and business models.
