Wednesday, December 9, 2015

Can Verizon Transform Itself?

For decades, observers have argued about whether telcos could fundamentally transform themselves--and their businesses--for the Internet age, meet competitive challenges and protect or grow the value of their businesses. To be sure, that is a problem for any large enterprise.


But it is a challenge all tier one telcos must master, or else. And though many would agree most cable TV companies are more nimble than most telcos, both face similar structural issues (saturation and decline of core markets). So success in one industry segment offers at least some expectation that the feat is repeatable.

Just as it no longer is possible to view Comcast as a “cable TV” company with a business model built on distribution services and communications access, so Verizon has made some efforts to shift its business model incrementally toward “over the top” services separated from Verizon’s access services.

Those Verizon initiatives include Hum, the vehicle communications service, and go90, the video entertainment service. Both are available to everyone, not just to Verizon mobile subscribers.

To be sure, neither contributes a significant portion of Verizon revenue.

So it is hard to say, yet, whether Verizon will be able, over time, to significantly diversify its revenue sources away from “captive” access and distribution services, and towards over the top apps. Few seem to question the strategy; more might worry about execution.

But Verizon is going to try. "We are disrupting ourselves," said Lowell McAdam, Verizon CEO. "We think of ourselves as a technology company with solid telecom assets."

IP Really Has "Changed Everything"

The way “communications” problems are perceived, markets are created and business models are built has undergone a fundamental shift, enabling all “over the top” business models.


That creates big new opportunities for all sorts of firms that do not “own facilities” or physical networks. The existence of mobile virtual network operators, huge Internet app businesses, wholesale-based telecom companies and platforms to support business operations on a virtual basis provide examples of the new approaches.


The shift in business possibility is hard to grasp using old categories in the value chain.


In the pre-Internet era, all networks were purpose built, and ownership of the apps and services was integrated with the ownership of the networks that delivered those services.


By definition, Internet Protocol networks are general purpose networks, and by design, IP separates the ability to create apps, services and business models from the need to tightly integrate apps with network ownership.


Think of “cloud computing” as another example of the separation of physical facilities from creation and delivery of apps and services. Wi-Fi provides yet another example of app access divorced from ownership of access facilities.


So “loosely-coupled” value chains allow value chain participants to pick and choose the places they wish to focus. That explains why Facebook, Google, Amazon and others create apps, provide Internet transport or access, build and market devices, finance the creation of content and function as content distributors.


In other words, at a fundamental level, “over the top” is the way all applications are designed to operate, even if, in some cases, the owners of physical networks also create and distribute such OTT apps.


Think of the other implications. The fundamental difference between a telco, cable TV company, satellite network or fixed wireless network, and virtually any other retail provider in the  content and communications business, is whether physical access networks are owned.

MVNO Configurations (source: Frank Rayal)


MVNOs, OTT app providers, Wi-Fi hotspot network aggregators, cloud-based services, content producers, studios, neutral co-location exchanges, competitive local exchange carriers, system integrators and others all use or buy services from networks, but do not own them.


The application layer truly is separate from the physical and transport layers.


The other obvious implication is that suppliers now have huge amounts of freedom to choose complex or specialized offers, geographic scope, customer segments, marketing strategies and business roles.


A logical corollary is that although some of the new contestants are huge and global, many more are regional or national, while most are “smaller” and more specialized. Almost by definition, specialization implies niches.


Niches, in turn, imply more-limited scale. In global markets, scale is essential; lack of scale is dangerous. In regional, national or local markets, scale is less of a requirement.


The key take-away, from a business model perspective, is that most contestants, in most markets, will be smaller, and lack scale, compared to global suppliers we used to think of as “tier one” telecom providers, or Fortune 500 companies, by way of analogy.


Though lack of scale is a weakness for a global supplier, scale is a more nuanced matter for most companies. Most OTT participants across every value chain can build a sustainable business without scale (in a global sense).


What is difficult, though, is finding a way to create more value, and to find partners, when the number of potential roles and suppliers is overwhelming. If you want to know why specialized marketplaces, conferences, trade shows and other venues exist, that is why.


Most smaller OTTs cannot find potential partners using the same venues dominated by tier one suppliers with scale.


The biggest take-away, though, is that nearly every company, in every value chain or market, now operates “over the top.”

That also implies that any participant can create new roles and enter new market segments by partnering with other OTT suppliers. Look around: that is a hallmark of business strategy for nearly all providers in the broad communications and content sphere.

Tuesday, December 8, 2015

21% of U.S. Residents are Online "Almost Constantly"

About 21 percent of Americans now report that they go online “almost constantly,” according to a Pew Research Center survey.

Overall, 73 percent of Americans go online on a daily basis. Along with the 21 percent who go online almost constantly, 42 percent go online several times a day and 10 percent go online about once a day.

Fully 36 percent of 18- to 29-year-olds go online almost constantly and 50 percent go online multiple times per day.



For Most, Gigabit Doesn't Matter; Neither Does 2 Gig Access

If you happen to want to pay for symmetrical 2-Gbps Internet access, and you live in a location served by Comcast, within about a third of a mile from the nearest optical node, you can buy it for about $300 a month, after paying the Comcast Gigabit Pro $500 activation fee, a $500 installation fee, and a $1000 early termination fee.

Here’s how Comcast displays, in a general sense, which states it serves with Gigabit Pro service.

Unless you are a business customer, someone who does not pay his or her own Internet access bills, or simply a person who wants to pay for the service for any other reason, the offer might not make any sense.

While it often is possible to detect a performance advantage (website loading, for example) when upgrading from a sub-10-Mbps connection to any faster connection, most consumers will not be able to actually experience a change when moving from any connection above perhaps 20 Mbps to any faster speed, no matter how “fast” that new connection happens to be.

It is not clear that a typical consumer end user will actually be able to detect any application performance advantage when buying the 2-Gbps service, compared to the 1-Gbps service, or even a 300-Mbps service.

According to a study by Mike Belshe, “if users double their bandwidth without reducing their Round Trip Time (RTT), the effect on Web browsing will be a minimal improvement (approximately five percent).”

“However, decreasing RTT, regardless of current bandwidth always helps make web browsing faster,” Belshe argues.
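Belshe’s point can be illustrated with a toy model (a rough sketch with made-up parameters, not Belshe’s actual methodology): treat a page as a sequence of object fetches, each costing one round trip plus transfer time.

```python
# Toy model: a page of N objects fetched sequentially, each costing one
# round trip plus its transfer time. All parameters are illustrative.
def load_time_ms(n_objects, object_kb, rtt_ms, bandwidth_mbps):
    transfer_ms = (object_kb * 8) / (bandwidth_mbps * 1000) * 1000
    return n_objects * (rtt_ms + transfer_ms)

baseline = load_time_ms(50, 40, rtt_ms=100, bandwidth_mbps=10)
doubled_bw = load_time_ms(50, 40, rtt_ms=100, bandwidth_mbps=20)
halved_rtt = load_time_ms(50, 40, rtt_ms=50, bandwidth_mbps=10)

print(f"baseline:     {baseline:.0f} ms")
print(f"2x bandwidth: {doubled_bw:.0f} ms ({1 - doubled_bw / baseline:.0%} faster)")
print(f"halved RTT:   {halved_rtt:.0f} ms ({1 - halved_rtt / baseline:.0%} faster)")
```

With these assumed numbers, doubling bandwidth improves load time by only about 12 percent, while halving RTT improves it by nearly 38 percent, directionally consistent with Belshe’s finding.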

Faster local Internet access connections do help, up to a point. After about 10 Mbps, no single user is likely to see much improvement, if at all, in page load times, for example. The U.S. Federal Communications Commission and Ofcom agree: beyond 10 Mbps per user, experience is not measurably improved--if at all--by faster Internet access speeds.

With the caveat that a business customer’s use of bandwidth differs from the pattern typical of a consumer customer, small business customers of Cogent Communications tend to use about 12 Mbps of the 100 Mbps services bought to replace T1 connections of about 3 Mbps, said BTIG researchers.

According to Cogent, only about 12 to 24 out of perhaps 17,400 customers ever have reached 50 percent utilization of the 100-Mbps pipe. Likewise, customers who buy gigabit connections have usage that does not likely differ materially from that of 100-Mbps customers, according to Cogent.

One might well argue that consumer consumption is growing faster than business customer usage, certainly. But Sandvine data suggests U.S. median household data consumption over a fixed network connection is about 20 Gb a month. Granted “Gbps” is a measure of speed, while “Gb” is a measure of consumption, but monthly consumption of 20 Gb does not suggest most households likely are taxing their access downlinks.
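A back-of-the-envelope conversion makes the point (a sketch, assuming the Sandvine figure refers to 20 gigabytes per month):

```python
# Convert 20 GB of monthly consumption into an average sustained rate.
gb_per_month = 20
bits = gb_per_month * 8 * 1e9     # gigabytes -> bits
seconds = 30 * 24 * 3600          # a 30-day month
avg_kbps = bits / seconds / 1e3
print(f"average sustained rate: {avg_kbps:.0f} kbps")
```

That works out to roughly 62 kbps on average, a tiny fraction of even a 20-Mbps downlink, though of course actual usage is bursty rather than uniform.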

To be sure, households with faster connections tend to consume more data. But that might be because households consuming more data disproportionately buy the faster connections. As more locations are able to use connections operating from 40 Mbps up to 1 Gbps, we should get a better idea of how much a “typical” user consumes, when access speeds exceed the ability of far-end servers to respond.


Bandwidth (Mbps)    Page Load Time via HTTP (ms)
 1                  3106
 2                  1950
 3                  1632
 4                  1496
 5                  1443
 6                  1406
 7                  1388
 8                  1379
 9                  1368
10                  1360


Although the early bandwidth increases produce a considerable jump, the returns diminish as the pipe gets bigger, until they are almost negligible.
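The diminishing returns can be quantified directly from the figures above (taken as given, and assumed to be in milliseconds):

```python
# Percentage improvement in page load time for each extra 1 Mbps,
# using the Belshe figures quoted above (milliseconds).
load_ms = {1: 3106, 2: 1950, 3: 1632, 4: 1496, 5: 1443,
           6: 1406, 7: 1388, 8: 1379, 9: 1368, 10: 1360}

for mbps in range(2, 11):
    prev, cur = load_ms[mbps - 1], load_ms[mbps]
    print(f"{mbps - 1} -> {mbps} Mbps: {100 * (prev - cur) / prev:4.1f}% faster")
```

The first step, from 1 Mbps to 2 Mbps, cuts load time by about 37 percent; the final step, from 9 Mbps to 10 Mbps, cuts it by only about 0.6 percent.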

The important observation is that the measure of a digital experience isn’t just--or primarily--about the speed of download.

Latency, or round trip delay, is more fundamental, beyond a minimum amount of access speed.





The point is that few consumers or even businesses will experience a perceivable benefit when upgrading to gigabit connections from connections in the 20-Mbps to hundreds of megabits per second ranges.

But lots of people also buy luxury goods for reasons other than sheer performance, eh?

Voice Down, Fixed Broadband and Cable TV Flat, Mobile Data Spending Up, Since 2010

U.S. consumer spending since 2010 shows steadily increasing spending on mobile data, slightly growing spending on cable TV and fixed network broadband, and declining spending on mobile voice and landline voice.

Mobile data alone now accounts for a third of total U.S. spending on communications services.


Not much surprise there.


source: Chetan Sharma

"All Consumer Trends Involve the Internet," Ericsson Says

“All consumer trends involve the internet,” argues Ericsson’s 10 Hot Consumer Trends. A corollary is that “prosumption,” where consumers participate in the production process, now is routine and widespread. Online user reviews, opinion sharing, petitions and instant crowd activities are becoming the norm rather than the exception.


Ericsson also predictably argues that mobile broadband and Wi-Fi mean “we may not be physically more mobile, but our online activities are less restricted by our surroundings than ever before.”


Ericsson also notes that adoption cycles are faster, with a key implication: early adopters are less important, since mass market adoption happens faster than ever, and the period where adoption is driven by early adopters is shorter than ever.

Ericsson argues that consumers expect artificial intelligence and virtual reality to become realities as soon as five years from now.

Universal Log-In Might be a Feature, Not a Service

The value of many useful innovations is hard to measure. We aren’t really able to measure the value of improving computing power in consumer devices beyond its price per unit, or price per function.

We might note that a personal computer costs N units of currency now, and used to cost more than N three decades ago. But today’s appliance is vastly better, despite lower prices.

We have similar issues for many Internet apps and services. Many apps that are quite valuable are hard to quantify in terms of direct revenue, even when they indirectly create revenue opportunities.

For example, Juniper Research estimates that mobile operators will generate $700 million  annually by 2020 from new universal log-in and mobile identity services, up from $20 million in 2015.

Under other circumstances, that forecast would simply indicate a market way too small for mobile operators to bother with.

Among the likely business model issues is the fact that device and app suppliers likely will be competitors for biometric and other forms of security. Also, it is not so clear that their business models require the traditional subscription fee that mobile operators and other service providers lean on for the overwhelming bulk of their revenue.

Juniper Research also notes that, for such an approach to achieve widespread adoption, all mobile operators in a single market might have to collaborate.

Universal sign-in using a subscriber phone number, like all other universal sign-in methods that offer better security than passwords, is useful, no doubt. What is not clear is whether it is a revenue opportunity for mobile operators, or better approached as a “feature” with indirect revenue benefits (lower churn, higher distinctiveness and value).

Monday, December 7, 2015

Dish Network Mobile Spectrum Value Will Avoid "Zero," But How and When Still are Questions

Dish Network--somehow, some way, some day--will avoid a "zero" valuation of its spectrum assets. We just do not know what will occur, with whom, or when.

Back when regulatory and antitrust authorities blocked the merger of Sprint and T-Mobile US in 2014,  some of us speculated that, ironically, that deal could happen, albeit not until one or both carriers had become more damaged than they had been when the deal was proposed.

Developments since 2014, though, suggest other alternatives would be received more favorably, such as a change of ownership that has Sprint assets going to another entity (Comcast, Charter Communications, Suddenlink or a name-brand app or device supplier), not T-Mobile US.

Policymakers and antitrust authorities would not be able to easily justify any combination of T-Mobile US and Sprint assets, when other financially viable bidders have the strategic need for such assets, the ability to finance a deal and the willingness to consider such a move.

Some, such as analyst Mark Lowenstein, might suggest such a merger, with Dish Network assets thrown in, as well, is feasible.  

Some of us might suggest the market hasn’t yet worsened enough, for Sprint, to make regulators change their minds about reducing the number of leading U.S. mobile providers from four to three. Also, T-Mobile US arguably has gotten stronger since the Sprint merger bid was dropped.

Dish Network, and possibly LightSquared, also figure into the picture. Dish Network has licenses for 50 MHz of downlink and 20 MHz of uplink spectrum to support a mobile network.

Whether Dish would do better delaying any deals regarding its spectrum until after the 600-MHz spectrum auction is unclear. But some might argue its options would be better if a clear decision about deploying those assets is made before the auction process begins, since nobody bidding in the auction would be able to move until after the auction.

On the other hand, Dish Network management obviously believes the 600-MHz auction would only boost the valuation of its spectrum, as arguably was the case for the 700-MHz auctions.

Even though nearly 100 percent of Dish Network revenue comes from its linear video business, the Dish spectrum holdings are valued at $35 billion to $50 billion, representing some 80 percent of the company’s current valuation.

Of course, failure to deploy that spectrum on a timely basis reduces its value to zero. Something will happen, but what, and when, remain unclear.

European Mobile Data Consumption to Climb to 6 GB by 2019, at 45% CAGR

The average monthly data usage for Western Europe is set to grow from less than 1 GB per month in 2014 to nearly 6 GB in 2019, a compound annual growth rate of 45 percent, according to GSMA Intelligence.  
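As a sanity check on the GSMA figure (a sketch, assuming endpoints of exactly 1 GB and 6 GB over the five years):

```python
# Compound annual growth rate implied by growing from ~1 GB/month (2014)
# to ~6 GB/month (2019).
start_gb, end_gb, years = 1.0, 6.0, 5
cagr = (end_gb / start_gb) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")  # about 43%
```

That yields about 43 percent; since the 2014 starting point is actually “less than 1 GB,” the true figure works out slightly higher, consistent with the 45 percent GSMA cites.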

Faster networks are correlated with higher data consumption. Telefónica, for example, says that its 4G customers’ usage is 60 percent higher than 3G.

Vodafone says its 4G customers across four European markets use twice as much data as its 3G customers (ranging from 1.3 times as much in the United Kingdom and 50 percent more in Germany to three times as much in Spain).

Telefónica as well has said it is “actively bundling content to drive data usage up”. Overall, data consumption by video is expected to rise to almost 75 percent of total usage in 2019 in Western Europe, up from 56 percent in 2014.


source: GSMA Intelligence

Sunday, December 6, 2015

Mobile Ad Blocking is a Growing Problem Because Users Want to Save Money on Data Costs

Mobile ad blocking is a business model problem, one might argue, not so much because ads are so intrusive, but because ads represent so much bandwidth overhead, especially in markets where mobile plans can cost as much as 4.4 percent of per-person gross national income, and data charges are extra.

Fixed network Internet access can cost as much as 21 percent to 29 percent of per-person GNI in developing nations, and as much as 98 percent of GNI per capita in the least developed nations.

By 2014, mobile service cost an average of 5.6 percent of GNI per capita in developing countries. In the least developed countries, mobile costs 14 percent of GNI per capita.

In the developed countries, mobile service costs about 1.4 percent of GNI per person.

Under those sorts of conditions, users have ample incentive to block ads that represent significant costs.
source: ITU

When Will ISPs Reach Same Conclusions PC Suppliers Did?

Computer suppliers long ago learned that marketing focused on “speeds and feeds” was not especially helpful. Internet service providers eventually will likely come to the same conclusion, though perhaps not soon.

The reason is simply a mismatch between a typical user’s ability to perceive or use most higher-speed connections, with one clear exception.

ISPs believe they gain marketing traction when able to advertise higher speeds than other providers. Whether the higher speeds make a difference, in terms of individual user experience, is the issue.

Beyond a fairly low level, higher speed does not improve any single user’s experience.

In general, 10 Mbps appears to be the tipping point beyond which most consumers rate their broadband experience as “good,” Ofcom says. That threshold also tends to be the ceiling for experience. Beyond 10 Mbps, app experience does not improve.

There is one exception to that rule. Multi-user households, especially those using lots of higher-definition video, do benefit to the extent that aggregate bandwidth better ensures a minimum of 10 Mbps for every user, at peak usage periods.
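That sizing logic can be sketched as a simple rule of thumb (the 10-Mbps per-user floor comes from Ofcom; the multiplication itself is an illustrative assumption, not an Ofcom formula):

```python
# Minimum household bandwidth if every concurrent user should get at
# least 10 Mbps at peak.
def household_floor_mbps(concurrent_users, per_user_mbps=10):
    return concurrent_users * per_user_mbps

for users in (1, 2, 4):
    print(f"{users} concurrent users -> at least {household_floor_mbps(users)} Mbps")
```

On that arithmetic, a household with four simultaneous video viewers benefits from a 40-Mbps connection even though no single user can perceive more than 10 Mbps.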

“A minimum of 10 Mbps is required by the typical household,” according to Ofcom, the U.K. communications regulator.

The “average” fixed network download speed is 28 Mbps, according to Ofcom, and 83 percent of U.K. households can buy service between 30 Mbps and 300 Mbps.

The other issue is that, for a growing range of apps--especially cloud-based apps--latency matters as much as speed.

It is a truism that availability and uptake are correlated. That is to say, higher speeds, and higher uptake of higher speeds, are correlated. Likewise, higher speeds are correlated with higher data consumption.

Households with connections above 40 Mbps are consuming significantly more data, Ofcom notes.

Previously, data use was relatively flat above 10 Mbps. “This change indicates that consumers who particularly value and use their superfast broadband services are now opting for higher-speed packages,” says Ofcom.


That correlation is nuanced. As Ofcom notes, people who consume more video are going to buy higher-speed packages that also come with higher usage allotments.

The proportion of video traffic delivered over fixed broadband networks in 2015 has risen to about 65 percent, up from 48 percent of total traffic in 2014.

The other issue is that, at higher access speeds, more data is consumed in any given unit of time.

Also, a more-pleasant experience will create an incentive for users to spend more time engaging with Internet apps and services.

Ofcom also notes that in-home networks now are a significant experience issue. In fact, the quality of home-network connections plays some role in over 75 percent of households with poorly performing broadband connections.

The quality of home-network connections is responsible for more than 25 percent of the connection problems in 20 percent of households with a poorly performing broadband connection, Ofcom notes.

But the “assembled” nature of ad-supported apps also plays a part in experience.

Many popular websites and services use advertising. In many cases, advertising represents as much as 99 percent of total consumed bandwidth, Ofcom says.

Saturday, December 5, 2015

LightSquared Cleared to Emerge from Bankruptcy

LightSquared has received U.S. Federal Communications Commission approval to transfer spectrum licenses to a new entity, allowing the company to plan for emergence from Chapter 11 bankruptcy.

Under new leadership, including Ivan Seidenberg, the incoming company’s new chairman of the board, the new LightSquared will be able to resume its efforts to build a national Long Term Evolution (LTE) fourth-generation network using former satellite spectrum in the “L band.”

LightSquared ran into a regulatory buzz saw when GPS interests complained about interference with GPS devices in neighboring frequencies. LightSquared has a license in the 1525 MHz to 1559 MHz band, while GPS devices operate in the 1559 MHz to 1591 MHz region.

LightSquared will have up to 40 megahertz of spectrum to support its national network.

The company originally filed for bankruptcy protection in May 2012. GPS users complained that the network would interfere with equipment that requires precise location data.

Ironically, some might note that GPS interference with LightSquared was demonstrably greater than LightSquared interference in the GPS bands.

In principle, LightSquared and GPS interests will now have to reach a deal that would satisfy both while leaving consumers much better off.

Title II Common Carrier Regulation is "Inept" Says Martin Geddes

The wrong analogies and metaphors, as consultant Martin Geddes points out, can be hazardous, even if the right metaphor can help clarify the logic of a position. Consider “paid prioritization,” one aspect of “network neutrality” that is controversial.

Geddes attended the District of Columbia court hearing on the Federal Communications Commission's "Open Internet" rules, commonly referred to as the imposition of Title II common carrier regulation.

Asks Geddes: “The existence of 'fast lanes' must mean everything else becomes a 'slow(er) lane.' Is this a good or bad thing?”

“We already have ubiquitous and uncontroversial paid peering,” Geddes notes. Apparently one judge also asked questions by way of analogy. “The railroads were at liberty to charge for refrigerated containers for goods that needed special handling, so it seems ‘utterly reasonable’ that an ISP should be able to do the same,” Geddes reports.

Indeed, "users who create a cost should bear that cost,” a line of reasoning that also bears on the matter of metered usage, one might argue.

“All other transport businesses have tiered services that align price and cost to timeliness of delivery,” Geddes notes.  “The FCC is pushing a hypothetical ‘dread’ which has absolutely no factual substance behind it, and has lost tremendous credibility as a regulator as a result.”

“Banning a market for quality is an anti-innovation policy,” says Geddes. “It creates a distortion by preventing rational resource pricing through market mechanisms. A simple general rule on equal access to paid priority is plenty enough.”

“The Title II reclassification is an attempt to constrain ISP power, but is a politically, technically and economically inept one,” Geddes argues. His reflections on the D.C. court hearing are here.  

Yes, Follow the Data. Even if it Does Not Fit Your Agenda

When people argue we need to “follow the science” that should be true in all cases, not only in cases where the data fits one’s political pr...