Tuesday, July 26, 2016

Internet Access Prices as a % of GNI Per Capita Are the Problem in Developing Countries

One frequently hears complaints that retail prices for Internet access are too high. Actually, by one common measure, Internet access prices in developed nations are quite low: less than one percent of gross national income per capita. In developing countries, by contrast, access can consume a far larger share of GNI per capita.


That is why Spectrum Futures exists. Here is the Spectrum Futures schedule, with speakers and topics.

53% of World Population Still Does Not Use the Internet

source: ITU
Global Internet access in one picture. Important: note that the figures represent mobile access to voice communications and the Internet.

Fixed access adds some additional number of connections, but essentially is irrelevant to the broad trend--either for voice communications or Internet.

Fixed-network Internet access adoption remains at below one percent in Africa and other less developed countries. Though China is driving fixed broadband in Asia, fixed-broadband penetration is just about 10 percent in 2016, according to the International Telecommunication Union.

But mobile coverage is not ubiquitous, and not all mobile networks support fast or relatively fast Internet access. In 2016, 66 percent of the population lives within an area covered by a mobile broadband network.

Seven billion people (95 percent of the global population) live in an area that is covered by a cellular network.

Mobile-broadband networks (3G or above) reach 84 percent of the global population but only 67 percent of the rural population.

LTE networks have spread quickly over the last three years and reach almost four billion people today (53 percent of the global population).

Still, 3.9 billion people, representing 53 percent of the world’s population, are not using the Internet.


That is why Spectrum Futures exists. Here is the Spectrum Futures schedule.

Monday, July 25, 2016

In U.S., Internet Access Speed Doubles or Triples Every 5 Years

Some things do not seem to change. Among them: the high end of U.S. Internet access service speeds grows roughly 50 percent every year. That implies an increase of about an order of magnitude every five to six years.


If you assume access speeds (for lead users) are somewhere north of 100 Mbps now, they will be in excess of a gigabit in five years. In a growing number of U.S. local markets, typical offers for consumers already have reached a gigabit.


But what about average speeds, for typical users? By some estimates, including those of the U.S. Federal Communications Commission, average Internet access speeds increased 300 percent in the last five years (2011 to 2016).
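The two growth claims can be reconciled with simple compound-growth arithmetic. The sketch below is illustrative only; the annual rates are assumptions for the example, not official figures.

```python
# Compound-growth sketch for the speed claims above.
# Annual growth rates here are assumptions for illustration only.

def multiple_after(years: int, annual_rate: float) -> float:
    """Total speed multiple after `years` at a fixed annual growth rate."""
    return (1 + annual_rate) ** years

# Lead users: ~50 percent per year (a Nielsen's-law-style assumption)
# compounds to roughly an order of magnitude in five to six years.
print(round(multiple_after(5, 0.50), 1))   # 7.6x after five years
print(round(multiple_after(6, 0.50), 1))   # 11.4x after six years

# Typical users: doubling or tripling in five years implies an annual
# growth rate of roughly 15 percent to 25 percent.
print(round(2 ** (1 / 5) - 1, 3))   # 0.149
print(round(3 ** (1 / 5) - 1, 3))   # 0.246
```

In other words, the headline's "doubles or triples every five years" for typical users and the much faster lead-user curve are both consistent with steady year-over-year compounding at different rates.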


source: Nielsen Norman Group

Verizon Gets Bigger Role in Media, Advertising; Yahoo Becomes an Investment Vehicle

The Verizon Communications acquisition of Yahoo’s operating business will leave Yahoo with significant cash, its shares in Alibaba Group Holdings, its shares in Yahoo Japan, Yahoo’s convertible notes, certain minority investments, and Yahoo’s non-core patents (called the Excalibur portfolio).

These assets will continue to be held by Yahoo, which will change its name at closing and become a registered, publicly traded investment company.

Combined with Verizon’s purchase of AOL less than a year ago, the move makes Verizon a more potent force in media and advertising.

Yahoo has a global audience of more than one billion monthly active users, including 600 million monthly active mobile users.

“The acquisition of Yahoo will put Verizon in a highly competitive position as a top global mobile media company, and help accelerate our revenue stream in digital advertising,” said Lowell McAdam, Verizon Chairman and CEO.

One might debate how successful Verizon will be in its new role as a mobile media and advertising services company. There is little reason to doubt the imperative of seeking such new roles in the content or other parts of the Internet ecosystem.

After 4 Decades, Business Model is Still the Issue for Next-Generation Access Networks

We now have been debating the business model for next-generation access platforms for at least four decades.

Somewhat oddly, we continue to debate whether fiber to the home or some other platform is “best” for ubiquitous next-generation networks. And though we undoubtedly will continue to debate where and when any particular platform is best, the business model increasingly is going to shape answers.

Some wonder why Google Fiber, which has been actively investigating deploying its gigabit Internet access service in Portland, Ore., has suddenly put the project on apparent hold. Others might wonder why BT is “dragging its feet” on a more-aggressive fiber-to-home build.

To paraphrase: “It’s the business model, stupid.” In Portland, since news of Google Fiber interest, both major Internet service providers--CenturyLink and Comcast--have moved to upgrade their existing networks for gigabit speeds.

That means Google Fiber now faces a key challenge. Where it might have been the “only provider of gigabit Internet access” in Portland, it now becomes “the latest of three” to do so. Granted, Google Fiber’s features (symmetrical bandwidth) and price could be differentiators.

But the big market opening--entering the market as the only provider with a disruptive gigabit offer--has substantially closed.

In the United Kingdom, BT’s “reluctance” to invest more heavily in fiber to home likewise has its roots in the payback model. As many tier-one service providers already discovered when opening their networks to wholesale customers, robust wholesale policies can lead to a loss of 60 percent or more retail market share.

The issue might be worse if the network is upgraded to fiber access. Losing 60 percent retail is compensated for by “gaining” that same percentage in wholesale customers. But wholesale customers represent less gross revenue than retail customers, and probably lower profits as well.
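A toy revenue model makes the point. The retail and wholesale prices below are invented for illustration and do not reflect any operator's actual tariffs.

```python
# Toy model: wholesale open access on a fixed network.
# Prices are assumed, illustrative numbers, not real tariffs.
homes_served = 100
retail_price = 40.0      # assumed monthly retail price per line
wholesale_price = 15.0   # assumed monthly wholesale price per line

# Before open access: all lines sold at retail.
revenue_before = homes_served * retail_price              # 4000.0

# After open access: 60 percent of lines shift to wholesale.
revenue_after = 40 * retail_price + 60 * wholesale_price  # 1600 + 900 = 2500.0

print(revenue_after / revenue_before)  # 0.625
```

On these assumptions, the network owner serves the same plant but books about 37 percent less gross revenue, before counting any extra capital invested in the upgrade.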

That revenue dilution, plus the high capital investment, could exacerbate the problem. Some see structurally-similar issues in the U.S. market, where regulators want to tighten price controls on legacy special access services. The facilities owners object to being forced to sell price-controlled services to competitors who gain most of the advantages of using the network, and none of the capital investment or risk.

At this point, it might be more fruitful to stop arguing about the technology platforms and look at the matter of incentives to invest in next-generation networks. That is the real problem.

Sunday, July 24, 2016

If Portland Does Not Get Google Fiber, the Business Model Most Likely Will be the Reason

Sometimes the business case--not “evil” Internet service providers or clueless municipal officials--is responsible for some hoped-for new service, product or network not being launched.

One might argue that is precisely the point where it comes to a potential Google Fiber launch in Portland, Ore. It is one thing for Google Fiber to come to market with the “only” gigabit Internet access service in a market.

It is something else again if the incumbent suppliers (cable and telco) up their game before Google Fiber can launch, and deploy their own gigabit networks.

If that happens, the suppliers of nearly 100 percent of the consumer Internet access connections have a much more compelling value proposition, while any new Google Fiber offer--even if better--differs mostly incrementally.

That might also be the case if the incumbents come up with “hundreds of megabits per second offers” that cost less than Google Fiber, and also meet virtually all present customer requirements.

It is “Marketing 101.” An attacker has to come to market with a value proposition that makes sense, has clear value and often, “costs less.”

While Google Fiber arguably has technical advantages over the current CenturyLink and cable offers (symmetrical bandwidth, for example), it is not clear that most consumers actually believe they get much incremental value from a gigabit service, compared to one operating “up to” 300 Mbps or 500 Mbps.

In fact, many would argue that, for most consumers--and most multi-person households--a 100-Mbps to 500 Mbps downstream connection does “everything” a gigabit connection does, with the possible exception of some upstream apps.

Some of us would argue that, in most cases, even a 100-Mbps connection actually supports all typical applications for a multi-person household. Beyond that, it is not clear that actual perceived value exists.
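A back-of-envelope tally suggests why. The per-application bitrates below are assumed, commonly cited rule-of-thumb figures, not measurements.

```python
# Assumed per-application bitrates in Mbps (illustrative, not measured).
bitrates_mbps = {
    "4K video stream": 25,
    "HD video stream": 5,
    "video call": 3,
    "web browsing / gaming": 5,
}

# A four-person household, each member running one demanding app at once:
concurrent = ["4K video stream", "4K video stream",
              "HD video stream", "web browsing / gaming"]
total = sum(bitrates_mbps[app] for app in concurrent)
print(total)  # 60 Mbps -- well within a 100-Mbps connection
```

Even with two simultaneous 4K streams, the assumed household peaks well under 100 Mbps, which is the intuition behind the "a gigabit buys little perceived value" argument.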

If Google Fiber increasingly finds the incumbents (telco and cable) offering gigabit connections, the business case for launching Google Fiber might not be attractive.

That is not to say a fixed wireless service has the same economics. The business case arguably will be better with the latest generation of fixed wireless platforms, and should be even better in the future.

Saturday, July 23, 2016

Why Tier One Access Providers Will Have to Create New Roles Elsewhere in the Ecosystem

Whether Verizon will succeed in the mobile advertising business remains to be seen. Whether AT&T will be a significant application or service provider in connected cars remains to be seen. Whether Orange, SK Telecom or NTT Docomo will be significant providers of Internet of Things services is an open question at present.

But there is no question those firms, and all other tier one access providers, must try to create new roles in the broader Internet ecosystem. The reason is simple enough.

As a rule, we can expect that every tier-one provider will have to replace about half its present revenue in a decade's time. Competition, over-the-top services and ever-better technology will cannibalize that much revenue.

In the access business, relatively “small” changes in the number of competitors can lead to big swings in prices, packaging, gross revenue and profit. That is the case even ignoring the big changes a switch to Internet Protocol as the next generation network transport layer has brought.

In the mobile business, you can understand the dynamics through the debates about “how many service providers are necessary to maintain robust competition?” European and U.S. regulators have made clear a preference for four facilities-based providers, rather than just three.

Japanese and South Korean regulators have made clear a belief that two facilities-based competitors is not enough in the mobile market, and that three would be better.

In the Indian mobile market, mobile executives have concluded that, with the market entry of Reliance Jio, five to eight facilities-based competitors are way too many.

In the fixed networks business, we clearly see the effect of non-facilities-based competition (wholesale) as well as facilities-based competition (cable TV operators, satellite video competitors, Google Fiber) on former monopoly markets.

In virtually every U.S. fixed network locale, a telecom provider that once had 90-percent-plus take rates now has perhaps 40 percent share of the consumer services market (with a cable TV provider and two satellite providers taking some Internet access, voice or video entertainment market share).

In the high speed Internet access segment, the local cable operator typically has 60 percent share, in fact, and the telco is the number-two provider with about 40 percent share. Where Google Fiber operates, incumbent shares are lower still.

Eventually, in many countries, the relevant share statistics will be mobile operator share, as the fixed network will be relatively insignificant. By 2020, in fact, mobile should be delivering more than half of all Internet access revenue, for example.


A variety of forces--deregulation, privatization, competition, new technology, mobile and Internet--have combined to radically reduce retail prices for voice services. Those same forces are reducing retail prices and sales volumes for carrier messaging and entertainment video.

Internet access speeds have gone up--dramatically--while prices per bit are falling (just as dramatically) because of competition from Google Fiber’s symmetrical gigabit services. That also means lower revenue per bit.

Up to some point, higher volume compensates for lower average revenue per bit. Still, as we have seen with voice, even that strategy has limits.

At the same time, the access business has gone from a stand-alone industry to a part of a bigger Internet ecosystem sweeping commerce, retailing, content and all communications services into one much-bigger “industry.”

In that bigger ecosystem, all access services are a smaller part of a bigger whole, and all “apps and services” conceptually can be created and delivered “without the permission” of the access provider.

And much more is coming.

The access services business has in the past been ruled by assumptions of “scarcity,” with direct implications for business models, namely high prices and high profits. Even after the advent of competition, scarcity of mobile and Wi-Fi spectrum, as well as the huge sunk costs of fixed networks, have modified the assumption of scarcity only a little.

And even if access networks remain relatively costly, we can envision a future where relative abundance (if not absolute abundance) is the rule, not the exception.

Consider only one example. In the U.S. market, less than one gigahertz of bandwidth presently is authorized for all mobile and Wi-Fi communications.

But the Federal Communications Commission is preparing to release seven gigahertz of new unlicensed spectrum, plus four gigahertz of licensed mobile and wireless spectrum at first, and then 18 GHz more spectrum in a second phase, for mobile and wireless use.

That represents at least a 36-fold increase in available spectrum. Add to that the small cell architectures and higher-order modulation techniques and one can plausibly argue that each gigahertz of new millimeter wave spectrum will deliver an order of magnitude--or perhaps two orders of magnitude--more usable capacity than existing mobile or Wi-Fi networks.
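The arithmetic behind that multiple can be sketched as follows. The current baseline is stated only as "less than one gigahertz," so the exact value used below is an assumption; the multiple lands between roughly 30x and 37x depending on the baseline chosen.

```python
# New spectrum figures from the text (in GHz).
unlicensed = 7
licensed_first_phase = 4
second_phase = 18
new_total = unlicensed + licensed_first_phase + second_phase  # 29 GHz

# Assumed current baseline; the text says only "less than one gigahertz".
baseline = 0.8

multiple = (baseline + new_total) / baseline
print(round(multiple))  # roughly 37x with this baseline; ~30x if baseline were 1 GHz
```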

Any business defined by scarcity might have high prices and high profit margins. Any business defined by abundance has the opposite problem. Get ready to solve those problems.

AT&T, Verizon Strategies Diverge

AT&T and Verizon--like most other tier one service providers--have increasingly adopted different business strategies since the era of competition began in the 1980s. Back then, most tier one providers were very similar to each other in that regard.

These days, firm business and product strategies can be quite different. That is the case for AT&T and Verizon, on the subject of product bundling. Simply, Verizon is less enthusiastic about quadruple play offers, never having found what it believes is clear evidence of end user demand.

AT&T, on the other hand, is much more optimistic about the value of quadruple plays and bundles in general.

In that respect, AT&T holds views more similar to most tier-one service providers in Europe, who generally believe the quadruple play is a fundamental strategy.

To a large extent, the difference in views between AT&T and Verizon also flows from their respective positioning in the market. Verizon has pitched itself as the “premium” brand and generally abhors competing on price.

Verizon sees consumer demand for quad play offers as fundamentally a matter of price savings, something that goes against the company positioning. Also, Verizon has less fixed network revenue to protect and grow, and has focused mostly on its mobile business.

Compared to AT&T, Verizon has less to gain from bundling that lowers churn of its consumer fixed network customer base.

AT&T, on the other hand, has a much-larger fixed network profile and a correspondingly smaller--though still significant--contribution from mobile services. Quad plays arguably represent more value for AT&T, as it has more customers to potentially lose in the consumer fixed network segment.

In the second quarter of 2016, for example, AT&T--which reports its business segments differently than Verizon--said it had about 23 million video accounts and about 13 million high speed access accounts, split between its DirecTV and wireline networks.

In its first quarter of 2016, Verizon said it had about seven million FiOS Internet accounts.

So looking only at consumer high speed access, AT&T has nearly twice the number of fixed network accounts as does Verizon.

Trailing Verizon in the mobile accounts area, but leading Verizon in fixed network accounts, AT&T has more to gain in mobile, and more to lose in fixed network services, than Verizon.

The other notable divergence is international operations. While both AT&T and Verizon sell enterprise services globally, AT&T has more exposure internationally, with its Mexico mobile operations, in particular.

With the exception of its fixed network global enterprise business, Verizon remains a U.S.-focused company.

Friday, July 22, 2016

Telecom Agenda Being Set by "Newcomers," to a Large Extent

Facebook’s efforts to help extend Internet access “to everyone” now are taking several different and complementary paths. Internet.org is working on app packaging. Its regulatory teams are working to support release of more unlicensed and shared spectrum.

Its Aquila unmanned aerial vehicle program is working on backhaul. Its Terragraph 60-GHz wireless mesh network is designed to enable lower-cost Internet access in dense, urban areas. That has led to Facebook’s millimeter wave mesh network concept.

Project Aries is working on more spectrally efficient radio access capabilities, so any radio network can deliver more bits, in any given amount of bandwidth.

OpenCellular expects to create open source cellular network technology that can be used by any entity wanting to build a mobile or wireless access network.

The Telecom Infra Project is an effort to create lower-cost, more-efficient telecom access networks, and is modeled on what Facebook did with its open data center efforts.

All of that effort, as well as Google’s efforts, suggest a coming new world where access platforms and networks are created and perhaps operated by any number of new providers, not traditional telecom access providers (cable and telco).

Google acts as an Internet service provider through Google Fiber; as a mobile operator through Google Fi; supplies the world-leading Android mobile operating system; has created its reference platform Nexus line of devices; is working on unmanned aerial vehicles and balloons for backhaul and access; has deployed Wi-Fi hotspot networks and has--many say--set the Internet agenda in the United States and Europe.


What if Social Media Were Your Neighbors, What if Operating Systems were Airlines?

After two decades, I still find “if operating systems were airlines” hilarious. More operating systems are compared in this version. Here’s a slightly more updated version.

Nobody has tried to do something similar for cloud-era processes, though “what if social media were your neighbors” has some of that flavor.


Google Applies Artificial Intelligence to Cut Data Center Power for Cooling Up to 40%

It is not yet clear how, when and how much machine learning and other forms of artificial intelligence will start to reshape the way customers buy and use communication services. For the moment, AI likely will make its mark on non-customer-facing processes.

Google’s data centers, for example, should soon be able to reduce the energy consumed for cooling by up to 40 percent by applying machine learning.

Servers generate lots of heat, so cooling drives power consumption requirements. But peak heat dissipation requirements are highly dynamic, complex and non-linear, Google Deepmind says.

The machine learning system was able to consistently achieve a 40 percent reduction in the amount of energy used for cooling, which equates to a 15 percent reduction in overall PUE (power usage effectiveness) overhead.
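The relationship between the two percentages can be illustrated with assumed numbers. Google has reported fleet-wide PUE near 1.12; the split of that overhead between cooling and everything else below is an assumption made so the arithmetic works out, not a published figure.

```python
# Illustrative PUE arithmetic; the overhead split is assumed.
it_load = 100.0          # IT load, normalized
cooling = 4.5            # assumed cooling overhead
other_overhead = 7.5     # assumed non-cooling overhead (power conversion, etc.)

pue_before = (it_load + cooling + other_overhead) / it_load   # 1.12
cooling_after = cooling * (1 - 0.40)                          # 40% cooling cut
pue_after = (it_load + cooling_after + other_overhead) / it_load

overhead_cut = ((pue_before - 1) - (pue_after - 1)) / (pue_before - 1)
print(round(pue_after, 3), round(overhead_cut, 2))  # 1.102, a 15% overhead cut
```

The key point: because cooling is only part of total overhead, a large cut in cooling energy shows up as a much smaller change in the headline PUE number.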

“Because the algorithm is a general-purpose framework to understand complex dynamics, we plan to apply this to other challenges in the data centre environment and beyond in the coming months,” said Rich Evans, DeepMind research engineer, and Jim Gao, Google data center engineer.

Probably few would debate the potential use of machine learning and artificial intelligence to improve industrial processes.

Sales processes, though, likely are not an area where most would expect big changes. Products sold to business customers and larger organizations generally are considered complex matters, requiring customization.

Enterprise communications requirements are more complicated than data center power consumption processes, many could argue. But are they? Google and Deepmind applied historical data and then AI on top, to develop new rules for managing a complex system.

In essence, do sales and engineering personnel not have an accumulated wisdom about the general problems, existing processes and typical software and hardware used by enterprise customers, to a relatively high degree?

And where the typical solution involves recommendations for removing, adding or altering services and features to solve enterprise communication problems, are there not patterns designers and sales personnel can rely upon?

If so, might it not be possible to radically simplify the process of understanding and then “quoting” a solution? And if this cannot be done on a fully-automated basis, might it still be done on a wide enough scale to deliver business value for a communications supplier?

In other words, could AI simplify substantial parts of the enterprise solutions business? Most who do such things for a living might argue the answer is “no.” But are enterprise solutions completely unique? Are there not functional algorithms engineers and network architects work with that are, in fact, bounded in terms of potential solutions?

And, if so, could not large amounts of the analysis, design and reconfiguration be done using AI? Airline reservation systems were, and are, quite complex. And yet consumers now use tools built on those systems to buy their own tickets.

Business communication solutions are complex. But they are not unmanageably complex. People can, and do, create solutions based on the use of what effectively are algorithms. We might call it experience or skill, and it is. But it is based on rules, formal or informal.

Rules-based systems can be modeled. And that could have huge implications for how business communications solutions are designed, provisioned and sold.

Thursday, July 21, 2016

Google Fiber Hits Pause in Portland Market

Google Fiber has put its plans to build a fiber access network in Portland in the fall of 2016 on at least temporary hold. “Why” is the question everyone should be asking.

In principle, Google Fiber could be wrestling with the business model, as competitors Comcast and CenturyLink have themselves been upgrading their own networks in the Portland area. That might make for a more-difficult payback model.

CenturyLink, for example, already is selling gigabit services in Portland, as is Comcast. That arguably makes the business model harder for Google Fiber or any other ISP that might be contemplating entering the Portland market.

It will not be so easy to attack as the only provider of 1 Gbps service if the other two leading ISPs already are doing so.  

"We're continuing to explore the possibility of bringing Google Fiber to Portland and other potential cities," Google says. "This means deploying the latest technologies in alignment with our product roadmap, while understanding local requirements and challenges, which takes time."

Also, it might be reasonable to assume that Google Fiber is about to try and become “Google Internet” and use fixed wireless as the access platform, not optical fiber.

That would be a major development. Facebook, AT&T and Verizon are other entities expected to promote or use fixed wireless as a major access platform for Internet access, and eventually gigabit access services.

Personally, I think Google Fiber has looked at the numbers and concluded a gigabit network that might cost $300 million is simply too big a risk in the existing Portland market, compared to its prospects several years ago, before CenturyLink and Comcast moved to upgrade.

This could be the leading edge of a very big change in access strategy and business models.

No Demand for Fractional T-1?

AT&T has asked the Federal Communications Commission for permission to stop selling fractional T-1 services, which have very little demand, in Arkansas, California, Illinois, Indiana, Kansas, Michigan, Missouri, Nevada, Ohio, Oklahoma, Texas and Wisconsin.

In fact, says AT&T, the company “has no customers subscribing to this service in Arkansas, California, Kansas, Missouri, Nevada, Oklahoma, and Texas.”

Once upon a time, a fractional T-1 (128 kbps, 256 kbps, 384 kbps, 512 kbps or 768 kbps) service was an affordable alternative to purchase of full T-1 services. In the 1990s, some of you might even have purchased a fractional T-1 service (consumer or business).

These days, even if some legacy applications remain, you would be hard pressed to point to any widely-used or mission-critical service that depends on fractional T-1, and fewer and fewer applications for full T-1 services as well.

As AT&T points out, people and businesses simply do not buy fractional T-1 anymore.

Dish Network 2Q: Revenue and Subscriber Losses

Dish Network reported second-quarter 2016 earnings that topped expectations, but also a net loss of 281,000 pay-TV subscribers, including satellite and the Sling web TV service, said to be the biggest quarterly subscriber loss in Dish Network’s history.

DirecTV, owned by AT&T, seems to have added customer accounts in the second quarter.


Here’s the importance: every legacy service provider is in a race to create new revenue streams at least as fast as each service provider loses legacy accounts. Pressure on top-line revenue and customer account attrition might mean Dish Network is losing that battle, despite the launch of Sling TV streaming services.


That leaves speculation about Dish Network entering the mobile business.


Opinions about what Dish Network might be able to do with its amassed mobile spectrum have varied. Some seem never to have believed Dish Network really would become a mobile service provider, and eventually would simply sell its spectrum.


Others believed Dish Network might well try and enter the mobile business.


The “problem” for observers is that much hinges on whether Dish Network concludes it is time to sell, time to build to create value before selling, or time to transition to a new business model and grow over the long term.

It is not clear that anybody outside Dish Network, aside from Charlie Ergen, Dish CEO, has any idea what the company will do.

Verizon Enterprise Solutions Launches "Virtual Network" for Enterprises

Verizon Enterprise Solutions is launching Virtual Network Services, allowing enterprises access to what we have long called “bandwidth on demand” features. Verizon Enterprise Solutions calls it a “virtual infrastructure model.”

Verizon says it will offer three models for deploying virtualized services: premises-based universal customer premises equipment (CPE), cloud-based virtual CPE services (available fall 2016), and hybrid services in which clients can mix premises-based and cloud-based deployment.

Service providers have wanted such capabilities since the 1980s, generally referring to the concept as bandwidth on demand, and generally believing it would happen first for business customers, especially enterprises preprovisioned with optical access.

These days those concepts are more likely to be known as “virtual infrastructure” or “virtual networks.”

Verizon’s initial Virtual Network Service packages are Security, WAN Optimization and SD-WAN services.
Verizon’s new services can be delivered across public, private and wireless networks from Verizon or other service providers, or a combination of multiple providers across multiple networks.

Access Network Limitations are Not the Performance Gate, Anymore

In the communications connectivity business, mobile or fixed, “more bandwidth” is an unchallenged good. And, to be sure, higher speeds have ...