Thursday, July 28, 2016

"Five Nines" Now is Effectively Impossible for Consumer Web Experience

It probably goes without saying that the Internet is a complex system, with lots of servers, transmission paths, networks, devices and software all working together to create a complete value chain.

And since the availability of any complex system is the product of the availabilities of all its elements, it should not come as a surprise that a complete end-to-end user experience is not “five nines.”

Consider a 24×7 e-commerce site with lots of single points of failure. Note that no single part of the whole delivery chain has availability of more than 99.99 percent, and some portions have availability as low as 85 percent.

The expected availability of the site would be 85% × 90% × 99.9% × 98% × 85% × 99% × 99.99% × 95%, or about 59.87 percent (the component figures are listed in the table below, with a quick calculation after it). Redundancy is the way availability typically is enhanced at a data center or on a transmission network.

For consumers, “hot” redundancy generally is not possible for devices. One can keep spare devices around, but manual restoration (switch to a different device, power it up) is required. Most often, “rebooting” is the restoration protocol, as “I will call you back” is the restoration protocol for a dropped mobile call.

Component      Availability
Web            85%
Application    90%
Database       99.9%
DNS            98%
Firewall       85%
Switch         99%
Data Center    99.99%
ISP            95%
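
To make the arithmetic concrete, here is a minimal sketch in Python (using the illustrative figures from the table above, not measured values) of how serial availability multiplies out:

```python
# Illustrative component availabilities from the table above.
components = {
    "Web": 0.85,
    "Application": 0.90,
    "Database": 0.999,
    "DNS": 0.98,
    "Firewall": 0.85,
    "Switch": 0.99,
    "Data Center": 0.9999,
    "ISP": 0.95,
}

# In a serial chain, every element must be up for the whole chain
# to be up, so the availabilities multiply.
availability = 1.0
for a in components.values():
    availability *= a

print(f"End-to-end availability: {availability:.2%}")  # about 59.87%
```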

Some of us are old enough to remember joking about “rebooting your TV,” a quip meant to suggest what would happen as TV signal formats switched from analog to digital, from standard to high-definition formats, from playback devices to Internet-connected devices.

Of course, we sometimes find we actually must reboot our TVs, set-top decoders, Wi-Fi and other access routers, so the quip was not without foundation.

In the past, some might have contrasted the availability (uptime) of televisions with that of computing devices. The comparison raises many issues.

Software with lots of code and little fault isolation is prone to crashing, and therefore to lower availability. Device drivers, in particular, are a known source of faults.

One study of server availability found that 58 percent of IBM servers operated at 99.999 percent availability, compared with just 46 percent of Hewlett-Packard servers and 40 percent of Oracle servers. Such issues normally are dealt with by building in automatic failover to redundant machines.

But many servers, off the shelf, offer only “two nines” (99 percent) availability.

Still, although a 79-percent majority of corporations now requires a minimum of 99.99 percent uptime or better for mission-critical hardware, operating systems and main line-of-business applications, that target obviously is less than the “five nines” standard for telecom services.

On the other hand, IBM “fault tolerant” servers are supposed to operate at “six nines” of availability, higher than the telecom standard.

Whether software is as reliable as, or less reliable than, a “five nines” network is debatable. But most would agree that software and hardware (without redundancy) operate at less than 99.999 percent availability.

There is a big difference between 99 percent availability (88 hours of downtime per year) and 99.9 percent availability (8.8 hours of downtime per year); or 99.99 percent availability (53 minutes each year) and 99.999 percent availability (a bit more than five minutes a year).
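
A quick Python check of the downtime behind each “nines” level, assuming a 365-day (8,760-hour) year:

```python
# Annual downtime implied by each availability level.
HOURS_PER_YEAR = 365 * 24

for availability in (0.99, 0.999, 0.9999, 0.99999):
    downtime_hours = (1 - availability) * HOURS_PER_YEAR
    print(f"{availability * 100:g}% available -> "
          f"{downtime_hours:.1f} hours ({downtime_hours * 60:.0f} minutes) down per year")
```

That yields roughly 88 hours, 8.8 hours, 53 minutes and 5.3 minutes of downtime per year, respectively.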

It is a myth that “five nines” remains the operational definition of availability for modern IP-based systems supporting voice, web and other over-the-top applications, even if service providers can produce reams of data proving that their core networks actually perform at that level.

In other words, even if networks are highly reliable, human beings use devices and applications that never work close to “five nines” in terms of availability.

The fundamental problem is that end user appliances, applications and operating systems cannot reach “five nines” levels of performance. And the whole calculation of availability is based on concatenated chains of devices. Element A might operate at “five nines.”

But, without redundancy, any transmission chain with three such elements would be calculated as 99.999 percent times 99.999 percent times 99.999 percent, or about 99.997 percent. By definition, the total chain inherits the downtime caused by any single element in the chain.

Traditionally, telecom networks have considered 99.999 percent availability the standard for fixed network voice services.

These days, it is hard to find anyone arguing that end user application or service experience ever approaches “five nines.” The reason is that most of the applications people want to reach on the Internet are processed in data centers whose servers cannot operate at five-nines availability.

To cope with that issue, data centers use redundancy. In other words, the issue is not how reliable any single server is. The issue is how fast an entity can detect a fault and switch to a backup server.
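
A minimal sketch of why redundancy works, assuming independent failures and instantaneous, perfect failover (an idealization; as noted above, real failover takes time to detect the fault and switch over):

```python
# N identical servers in parallel fail only if all N fail at once.
def parallel_availability(single: float, n: int) -> float:
    return 1 - (1 - single) ** n

print(parallel_availability(0.99, 1))  # 0.99     -- one "two nines" server
print(parallel_availability(0.99, 2))  # 0.9999   -- a pair reaches "four nines"
print(parallel_availability(0.99, 3))  # 0.999999 -- "six nines" on paper
```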

That same approach (redundancy) is used by transport networks and business access networks.

But many apps still are delivered over unmanaged networks, even if the part of the delivery chain any single network controls operates at “five nines.”

A new way of thinking about reliability or availability is that modern application delivery systems cannot meet the old “five nines” standard end to end: the complete systems crash often enough that five nines is not possible, even when redundant “five nines” access and transport systems are in place.

In other words, loss of local power alone is a threat to five nines for end user experience. Operating systems crash, access to websites hiccups, mobile phone calls drop. Devices run out of battery power.

Wi-Fi is the typical device connection in homes and offices, and no matter how well other elements and systems work, Wi-Fi operations alone would break “five nines” performance, in terms of the actual experience of application and service availability.


The point is that “five nines” is a myth, when considered from the standpoint of a consumer end user of any Internet service or app, on any consumer device.

Wednesday, July 27, 2016

Apple Pay Shows Some Signs of Leadership in Mobile Payments

As a young business, it remains unclear where value and leadership will develop within the mobile payments business. In principle, retailers (the immediate buyers of payment services), the processing networks, issuing banks, device suppliers, new service suppliers or access providers could emerge in a driving role.

And, so far, there have been some notable misfires. Softcard, a consortium including AT&T, Verizon and T-Mobile US, failed to gain traction. MCX, a consortium of leading retailers, likewise failed to establish leadership.

Google’s own efforts have encountered modest success, while device-oriented systems linked to Samsung phones or Apple iPhones have been launched.

There have been some notable successes, though. Starbucks might be the largest mobile payment system in regular use in the United States. Square likewise has been a success in the small business point of sale transaction processing segment of the business.

Apple Pay has been gaining traction as well. According to Apple, Apple Pay now represents 75 percent of all contactless payment transactions made in the United States.

Apple says half of the transaction value from payments made through Apple Pay is coming from non-U.S. markets.

Apple Pay is currently available in the U.S., the U.K., Switzerland, Canada, Australia, China, France, Hong Kong and Singapore.

Some evidence of Apple’s key role in the payments ecosystem can be gleaned from the actions of others in the value chain.

Four of the largest banks in Australia — Commonwealth Bank of Australia, Bendigo and Adelaide Bank, National Australia Bank, and Westpac Banking Corp — have asked the Australian Competition and Consumer Commission (ACCC) to be allowed to join forces and negotiate with Apple as a single bloc.

While Apple does allow some banking apps to be loaded on iPhones, it limits their access to the handset's hardware, such as the NFC antenna. As a result, the bank apps are more Internet banking tools than full contactless payment platforms.

Leadership still is not a settled matter in the mobile payment or contactless payments business.

Tuesday, July 26, 2016

Business Imperative: Replace 1/2 of Revenue Every 10 Years

“Over the last 16 years we have grown from approximately 25 million customers using wireless almost exclusively for voice services to more than 110 million customers using wireless for mostly data services,” said Lowell McAdam, Verizon Communications CEO, during the firm’s second quarter 2016 earnings call.

It is an instructive comment for several reasons. It illustrates Verizon’s transformation from a fixed network services company to a mobile company. But it also points to an important business model trend, namely that firms in telecom need to replace about half their current revenues every 10 years or so.

In the U.S. telecom business, for example, we already have seen that roughly half of all present revenue sources disappear, and must be replaced, about every decade.

According to Federal Communications Commission data on end-user revenues earned by telephone companies, that certainly is the case.

In 1997 about 16 percent of revenues came from mobility services. In 2007, more than 49 percent of end user revenue came from mobility services, according to Federal Communications Commission data.

Likewise, in 1997 more than 47 percent of revenue came from long distance services. In 2007 just 18 percent of end user revenues came from long distance.

That change in revenue sources is going to continue. Mobile voice and messaging already is declining, and in its place mobile Internet access is growing. For fixed network operators, video revenues are growing, while voice is shrinking, and high speed access has become the anchor service.

The point is that there is a very good reason for all service providers to assume they will have to replace half their current revenue in 10 years, and possibly for every decade thereafter. It now appears the auto industry is about to experience that same sort of change.

That is why so much of the content at the upcoming Spectrum Futures conference will focus on app development and app partnerships. Venture capitalists Wish Ronquillo and Jay Fajardo, as well as app development consultant and VC Shrinath V, will be speaking at the event.


Wish Ronquillo, Venture Partner, Ruvento Ventures, Singapore

Jay Fajardo, CEO, Launchgarage, Philippines

Shrinath V, Venture Capitalist and Google Developer Expert, India

Verizon: Multiple New Network Investments to Support Fixed Wireless

There was more evidence of Verizon’s plan to deploy fixed wireless rather widely in its network during the firm’s second quarter 2016 earnings call. Verizon CEO Lowell McAdam spoke about a number of interrelated technologies supporting its coming networks.

Millimeter wave radio, small cells, fixed wireless and optical fiber backhaul were among those technologies.

“I think of 5G initially as, in effect, wireless fiber, which is wireless technology that can provide an enhanced broadband experience that could only previously be delivered with physical fiber to the customer,” said McAdam. “With wireless fiber the so-called last mile can be a virtual connection, dramatically changing our cost structure.”

In fact, Verizon’s decision to deploy FiOS in Boston is based on creation of a single fiber optic network platform capable of supporting wireless and wireline technologies.

“Our announced agreement to acquire XO Communications will also be a key part of this strategy, providing us with the deep fiber assets, including 40 metro fiber rings in major cities and millimeter wave spectrum in a significant part of the country that will give us a critical competitive edge,” said McAdam.

The XO deal also supplies a good amount of millimeter wave spectrum.

McAdam also illustrated Verizon’s use of network architecture--rather than acquiring new spectrum--to increase capacity. “The farther we push fiber out into the network, the more small cell technology works for us,” he said.  

“The cost trade off that we expected prior to the last auction told us that we would be better off going with the small cells,” he noted.

“And then as we densify the network for 4G, it sets us up perfectly for deploying 5G with the millimeter wave technology,” McAdam added. “Now we have a clear field in front of us to not only densify with 4G, but use that same capital dollar to get the infrastructure in place for 5G. So we think we're in a very strong competitive position here.”

Verizon is not the first telecom company to tout a wireless platform as a substitute for fiber. But it might emerge as one of the most influential and widespread users of fixed wireless technology.

That is among the reasons the business implications of millimeter wave platforms and fixed wireless will be featured at the upcoming Oct. 20-21, 2016 Spectrum Futures conference in Singapore.

Reza Arefi of Intel will talk about the business implications of millimeter wave spectrum at Spectrum Futures.

At the conference, Greg Leon, Google fixed wireless product manager, will explain the role Google now sees for fixed wireless as a complement or substitute for fiber to the home.

Rajnesh Singh, Internet Society, Director, Asia-Pacific Regional Bureau, will explore the role of fixed wireless to serve rural villages in India.

Chris Weasler, Facebook, Director of Global Connectivity, likely also will talk about new platforms for fixed wireless Internet access.





Auto Insurance is About to Experience Disruption the Telecom Industry Already has Faced

The telecom business is not alone in facing huge business model disruption because of technology advances. Consider driverless cars. By some estimates, as much as $160 billion out of $200 billion in revenue (for insurance premiums) is at risk of disappearing or shifting because driverless cars will reduce accidents so much that premiums will fall.

For those of you doing the quick math, that is an 80-percent hit to existing revenues.

Deloitte, for example, forecasts that today’s $200 billion in personal-car-insurance premiums is safe for about seven or eight years, then will slide to about $40 billion by 2040.

On the other hand, Deloitte believes $100 billion could shift to product-liability insurance and coverage bought by ride-sharing businesses, for a net drop of about 50 percent in total auto insurance revenues.

Assuming that change happens over roughly a decade, it would fit a pattern of revenue shift we have seen, and likely will continue to see, in the global telecom business, where roughly half of all present revenue sources disappear, and must be replaced, about every decade.

According to Federal Communications Commission data on end-user revenues earned by telephone companies, that certainly is the case.

In 1997 about 16 percent of revenues came from mobility services. In 2007, more than 49 percent of end user revenue came from mobility services, according to Federal Communications Commission data.

Likewise, in 1997 more than 47 percent of revenue came from long distance services. In 2007 just 18 percent of end user revenues came from long distance.

You can count those as one single change, or two changes. Either way, it literally is the case that half of revenue sources changed within a decade.

That change in revenue sources is going to continue. Mobile voice and messaging already is declining, and in its place mobile Internet access is growing. For fixed network operators, video revenues are growing, while voice is shrinking, and high speed access has become the anchor service.

The point is that there is a very good reason for all service providers to assume they will have to replace half their current revenue in 10 years, and possibly for every decade thereafter. It now appears the auto industry is about to experience that same sort of change.

Internet Access Prices as a % of GNI Per Capita are the Problem in Developing Countries

One frequently hears complaints that retail prices for Internet access are too high. Actually, by one common measure, Internet access prices in developed nations are quite low, less than one percent of gross national income per capita.


In developing nations, by contrast, access prices can represent a far higher share of gross national income per capita, and that is the real barrier to adoption. That is why Spectrum Futures exists. Here is the Spectrum Futures schedule, with speakers and topics.

53% of World Population Still Does Not Use the Internet

Global Internet access in one picture (source: ITU). Important: note that the figures represent mobile access to voice communications and the Internet.

Fixed access adds some additional number of connections, but essentially is irrelevant to the broad trend--either for voice communications or Internet.

Fixed-network Internet access adoption remains below one percent in Africa and other less developed countries. Though China is driving fixed broadband in Asia, fixed-broadband penetration is just about 10 percent in 2016, according to the International Telecommunication Union.

But mobile coverage is not ubiquitous, and not all mobile networks support fast or relatively fast Internet access. In 2016, 66 percent of the population lives within an area covered by a mobile broadband network.

Seven billion people (95 percent of the global population) live in an area that is covered by a cellular network.

Mobile-broadband networks (3G or above) reach 84 percent of the global population but only 67 percent of the rural population.

LTE networks have spread quickly over the last three years and reach almost four billion people today (53 percent of the global population).

Still, 3.9 billion people, representing 53 percent of the world’s population, are not using the Internet.


That is why Spectrum Futures exists. Here is the Spectrum Futures schedule.

Monday, July 25, 2016

In U.S., Internet Access Speed Doubles or Triples Every 5 Years

Some things do not seem to change. Among them: the high end of U.S. Internet access service speeds roughly doubles every year. That compounds to more than an order of magnitude about every five years.


If you assume access speeds (for lead users) are somewhere north of 100 Mbps now, they will be in excess of a gigabit in five years. In a growing number of U.S. local markets, typical offers for consumers already have reached a gigabit.
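
A quick sketch of that compounding, assuming the 100 Mbps starting point stated above:

```python
# Top-end access speed that doubles every year, starting from 100 Mbps.
speed_mbps = 100
for year in range(1, 6):
    speed_mbps *= 2
    print(f"Year {year}: {speed_mbps:,} Mbps")
# Year 5: 3,200 Mbps -- comfortably in excess of a gigabit
```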


But what about average speeds, for typical users? By some estimates, including those of the U.S. Federal Communications Commission, average Internet access speeds increased 300 percent in the last five years (2011 to 2016).


source: Nielsen Norman Group

Verizon Gets Bigger Role in Media, Advertising; Yahoo Becomes an Investment Vehicle

The Verizon Communications acquisition of Yahoo’s operating business will leave Yahoo with significant cash, its shares in Alibaba Group Holdings, its shares in Yahoo Japan, Yahoo’s convertible notes, certain minority investments, and Yahoo’s non-core patents (called the Excalibur portfolio).

These assets will continue to be held by Yahoo, which will change its name at closing and become a registered, publicly traded investment company.

Combined with Verizon’s purchase of AOL less than a year ago, the move makes Verizon a more potent force in media and advertising.

Yahoo has a global audience of more than one billion monthly active users, including 600 million monthly active mobile users.

“The acquisition of Yahoo will put Verizon in a highly competitive position as a top global mobile media company, and help accelerate our revenue stream in digital advertising,” said Lowell McAdam, Verizon chairman and CEO.

One might debate how successful Verizon will be in its new role as a mobile media and advertising services company. There is little reason, though, to doubt the imperative of seeking such new roles in content or other parts of the Internet ecosystem.

After 4 Decades, Business Model is Still the Issue for Next-Generation Access Networks

We now have been debating the business model for next-generation access platforms for at least four decades.

Somewhat oddly, we continue to debate whether fiber to the home or some other platform is “best” for ubiquitous next-generation networks. And though we undoubtedly will continue to debate where and when any particular platform is best, the business model increasingly is going to shape answers.

Some wonder why Google Fiber, which has been actively investigating deploying its gigabit Internet access service in Portland, Ore., has suddenly put the project on apparent hold. Others might wonder why BT is “dragging its feet” on a more-aggressive fiber-to-home build.

To paraphrase: “It’s the business model, stupid.” In Portland, since news of Google Fiber interest, both major Internet service providers--CenturyLink and Comcast--have moved to upgrade their existing networks for gigabit speeds.

That means Google Fiber now faces a key challenge. Where it might have been the “only provider of gigabit Internet access” in Portland, it now becomes “the latest of three” to do so. Granted, Google Fiber’s features (symmetrical bandwidth) and price could be differentiators.

But the big market opening--entering the market as the only provider with a disruptive gigabit offer--has substantially closed.

In the United Kingdom, BT’s “reluctance” to invest more heavily in fiber to the home likewise has its roots in the payback model. As many tier-one service providers already discovered when opening their networks to wholesale customers, robust wholesale policies can lead to a loss of 60 percent or more of retail market share.

The issue might be worse if the network is upgraded to fiber access. Losing 60 percent of retail customers is compensated for by “gaining” that same percentage in wholesale customers. But wholesale customers represent less gross revenue than retail customers, and probably lower profits as well.

That revenue dilution, plus the high capital investment, could exacerbate the problem. Some see structurally similar issues in the U.S. market, where regulators want to tighten price controls on legacy special access services. The facilities owners object to being forced to sell price-controlled services to competitors who gain most of the advantages of using the network, and take on none of the capital investment or risk.

At this point, it might be more fruitful to stop arguing about the technology platforms and look at the matter of incentives to invest in next-generation networks. That is the real problem.

Sunday, July 24, 2016

If Portland Does Not Get Google Fiber, the Business Model Most Likely Will be the Reason

Sometimes the business case--not “evil” Internet service providers or clueless municipal officials--is responsible for some hoped-for new service, product or network not being launched.

One might argue that is precisely the case when it comes to a potential Google Fiber launch in Portland, Ore. It is one thing for Google Fiber to come to market with the “only” gigabit Internet access service in a market.

It is something else again if the incumbent suppliers (cable and telco) up their game before Google Fiber can launch, and deploy their own gigabit networks.

If that happens, the suppliers of nearly 100 percent of the consumer Internet access connections have a much more compelling value proposition, while any new Google Fiber offer--even if better--is only incrementally different.

That might also be the case if the incumbents come up with “hundreds of megabits per second offers” that cost less than Google Fiber, and also meet virtually all present customer requirements.

It is “Marketing 101.” An attacker has to come to market with a value proposition that makes sense, has clear value and often, “costs less.”

While Google Fiber arguably has technical advantages over the current CenturyLink and cable offers (symmetrical bandwidth, for example), it is not clear that most consumers actually believe they get much incremental value from a gigabit service, compared to one operating “up to” 300 Mbps or 500 Mbps.

In fact, many would argue that, for most consumers--and most multi-person households--a 100-Mbps to 500-Mbps downstream connection does “everything” a gigabit connection does, with the possible exception of some upstream apps.

Some of us would argue that, in most cases, even a 100-Mbps connection actually supports all typical applications for a multi-person household. Beyond that, it is not clear that actual perceived value exists.

If Google Fiber increasingly finds the incumbents (telco and cable) offering gigabit connections, the business case for launching Google Fiber might not be attractive.

That is not to say a fixed wireless service has the same economics. The business case arguably will be better with the latest generation of fixed wireless platforms, and should be even better in the future.

U.S. Consumers Still Buy "Good Enough" Internet Access, Not "Best"

Optical fiber always is pitched as the “best” or “permanent” solution for fixed network internet access, and if the economics of a specific...