Thursday, June 16, 2016

"Wearable" Value Proposition Seems Unclear to Most

A new study of wearable adoption suggests the value proposition is a bit unclear. “Of all the users of wearables surveyed, around one in 10 said they no longer used their wearable devices, with one third of these owners abandoning them within a couple of weeks of purchase,” said Ericsson.

So, at least so far, it does not appear that wearables--especially watches--are the "next big thing."

Ericsson says abandonment rates are declining, but user expectations also are rising. A common cause of dissatisfaction is users feeling tethered to their smartphones.

Among those who have abandoned wearables, 14 percent did so because their devices lacked standalone connectivity.


In fact, 83 percent of all smartphone users surveyed expect wearables to have some form of standalone connectivity.

When asked what form they would expect this connectivity to take in the future, around 40 percent of existing wearables owners and 46 percent of non-wearables users preferred Wi-Fi and cellular connectivity. Existing users expressed twice the preference of non-users for built-in mobile connectivity.

On the other hand, the cost of an additional mobile data connection is a barrier to wearables use. About 33 percent of non-users indicated that the cost of keeping digital devices connected is a key reason why they haven’t invested in wearable technology.

The survey also suggests consumers believe additional use cases, such as personal safety, could drive adoption by 2020. Demand for a wide range of other potential use cases seems unclear, in the medium term.


As was the case for the smartphone, which functionally displaced use of other consumer devices (cameras, clocks, GPS devices), many consumers and industry executives believe wearables could replace a wide range of other existing devices.

source: Ericsson

Wednesday, June 15, 2016

U.S. Fixed Network Internet Access Actually is Dropping

More U.S. households now seem to be abandoning even fixed Internet access in favor of mobile access, just as it now is common for households to rely on mobile voice instead of fixed network voice, or on over-the-top video entertainment in place of traditional subscription services.

In fact, because of mobile use, fixed network Internet access rates actually are dropping in the United States, having reached an apparent peak in 2011.


Those sorts of trends pose key questions for service providers and regulators. If consumers are abandoning the fixed network, making the business case more difficult, what is the appropriate service provider investment strategy?

What is the right regulatory approach for services that are declining? Even if “parity” of regulation makes sense when technology platforms change, how much additional cost should regulators be placing on declining services--even services based on the newer platforms?

In an almost-unnoticed trend, more U.S. households in every income bracket went “mobile only” for Internet access at home in 2015 than in 2013.

In fact, says the National Telecommunications and Information Administration, between 15 percent and 29 percent of surveyed households now rely solely on mobile platforms for Internet access.

That trend has been gaining strength since at least 2011.

Trends such as the substitution of mobile for fixed access now have altered, or are poised to alter, the methods people use for voice, messaging, TV and Internet access. That has clear implications for service provider strategy and investment and for regulation of fixed services as a whole.

Some of us would argue that it makes little sense to apply more burdensome regulations on platforms that are in decline. Likewise, if the shift is to mobile and wireless communication, then it makes sense to continue allocating more bandwidth for mobile and wireless access.

There are a whole range of subsidiary issues where such choices must be made.

The U.S. Federal Communications Commission wants all ISPs offering IP-based voice services to provide battery backup, as legacy “plain old telephone service” has been line powered.

Frontier Communications says there is very little demand for battery backup for VoIP services, a finding that is consistent with prior consumer behavior.

That position is not surprising, as any universal battery backup plan would impose additional cost. Cable TV providers, for example, seem to have no objection to offering optional battery backup, but oppose mandatory backup for all carrier IP voice lines.

Frontier argues (as have cable TV operators for years) that, even when offered for sale, consumers do not generally buy battery backup services.

Frontier, in fact, argues that few of its customers actually buy an IP telephony service, in marked contrast to cable TV operators and “fiber to home” providers, which universally offer IP telephony as their only voice option.

Frontier’s experience also shows that the company continues to rely on legacy voice, instead of offering IP telephony. In large part, that is because the company does not use fiber to home platforms on a ubiquitous basis, and fiber to home virtually requires that voice be provided using IP telephony.

Frontier notes that, according to the FCC’s most recent voice telephone services report, for all ILECs nationwide there are only 26,000 residential customers that actually use an OTT interconnected voice (IP telephony) product.

That hardly seems right, but appears to be a definitional issue. The FCC reported there were 48 million interconnected voice subscriptions in use by December 31, 2013. So Frontier clearly is using some definition of “interconnected voice” that is different from that used by the FCC.

That noted, given the prevalence of mobile service, rare in the extreme is a household that actually relies solely on fixed network voice.

Roaming Rate Controversy Illustrates "Large Carrier, Small Carrier" Economics

In wholesale communication markets, it matters whether a supplier is small or large, and whether most traffic is on-network, or off-network. At any given level of rates, smaller carriers will have a larger percentage of traffic roaming off network, while larger networks will tend to have less roaming traffic, all other things being equal.

That dynamic is a source of controversy within the European Union, as regulators move to institute a complete ban on roaming tariffs for voice, text and mobile data by June 2017.

As you would guess, smaller service providers and larger service providers with high numbers of customers who roam are at odds about the remaining wholesale roaming rates. Within the EU, the big divide is between net receivers and net senders of traffic.

Operators in countries with lots of incoming roaming traffic such as Spain, Greece and France want high wholesale rates.

Operators in countries with low domestic rates and whose customers travel a lot, such as Baltic and eastern European countries, want low rates.

Wholesale prices are currently five euro cents a minute for calls, two euro cents for text messages and five euro cents per megabyte of data.

Small carriers, with small customer bases, typically “send” more traffic than they “receive,” meaning in this instance that high rates are costly, as they will “get” less inbound roaming traffic (and revenue) than they send, and buy.
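The send-versus-receive arithmetic can be sketched in a few lines of Python. The wholesale rates are the EU caps cited above; the traffic volumes are purely hypothetical, chosen only to illustrate a small carrier whose customers roam more than visitors roam onto its network.

```python
# Sketch of a carrier's net wholesale roaming position, using the EU
# caps cited above: EUR 0.05/min for calls, EUR 0.02 per SMS, and
# EUR 0.05 per MB of data. Traffic volumes are hypothetical.

RATES = {"voice_min": 0.05, "sms": 0.02, "data_mb": 0.05}

def wholesale_cost(traffic):
    """Wholesale charges owed for a bundle of roaming traffic."""
    return sum(RATES[kind] * volume for kind, volume in traffic.items())

# A small carrier's customers generate ("send") more roaming traffic
# on other networks than visiting subscribers bring in ("receive").
sent = {"voice_min": 2_000_000, "sms": 1_000_000, "data_mb": 5_000_000}
received = {"voice_min": 500_000, "sms": 250_000, "data_mb": 1_000_000}

net = wholesale_cost(received) - wholesale_cost(sent)
print(f"Net roaming position: EUR {net:,.0f}")  # negative = net payer
```

At these volumes the carrier pays out far more in wholesale charges than it collects, which is why net senders of traffic lobby for low wholesale rates.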

That is an issue for mobile virtual network operators, for example.

Some would argue that Internet interconnection involves the same traffic and cost dynamics, even when regulators ignore such differences. In other words, all other things being equal, large networks tend to accept more traffic than they send to smaller networks.

Small networks send less traffic to larger networks than they receive from large networks.

The big exception is content networks, which virtually always send more traffic than they receive. In fact, it is correct to say there are, among network types, "customer networks" operated by retail Internet service providers that mostly receive traffic, and "content" networks that mostly send traffic.

One might argue that the costs of unbalanced traffic are insignificant. That might be more true of the server and interconnection costs. It is manifestly untrue for ISPs that have to build gigabit and higher capacity access networks.

source: European Commission

Tuesday, June 14, 2016

Will 28 GHz Work for Consumer Gigabit Access?

“Will it work?” is a question more executives are going to be asking when it comes to fixed wireless platforms for Internet access. Technology issues (coverage, rain fade, interference) are important, as always.

Just as important are the payback questions: can the new networks deliver enough bandwidth, far enough, to be a functional substitute for fiber access, at prices consumers will pay?

Executives have been asking these sorts of questions for decades.

If you have been around long enough, you have seen several iterations of “fixed wireless is the answer” for access operations, aimed at a number of potential audiences and customer segments, in a number of different settings.

The rise of competition in the long distance voice communications business was pioneered by Microwave Communications Inc. (MCI), which used point-to-point microwave to compete with the AT&T system beginning in the 1970s.

Also in the 1980s, point-to-point microwave was seen as a platform for educational TV. The MMDS bands (2.5 GHz to 2.7 GHz), also originally licensed for educational TV, were repurposed for delivery of multichannel television subscriptions. Now those bands are available for mobile communications.  

Sprint is a major holder of former MMDS spectrum, for example.

Fixed wireless now is enjoying something of a renaissance, as firms including Google, Facebook, Verizon and AT&T now are testing new forms of fixed wireless, including but not limited to use of new 5G spectrum, and also millimeter wave bands.

The attractions are the same as ever: fixed wireless promises lower deployment costs and access to additional customer locations that otherwise would lack a payback model.

That is especially important for fixed network providers faced with higher levels of competition, and therefore lower potential addressable market, higher risks of stranded investment and growing demands for consumer bandwidth at gigabit speeds.

The value is highest for providers not using the cable TV hybrid fiber coax platform, which can reach gigabit speeds on already-deployed platforms.

Some tests suggest signal propagation at 28 GHz will not be as big a problem as many fear.

In fact, the LMDS band has been available for decades, originally as a platform for TV distribution, and later viewed as a platform for high-bandwidth communications for business customers in urban markets.

What remains unknown is how much propagation distances might change as 28 GHz is adapted for small cell network architectures, instead of point-to-point links. In an earlier period, reach of 1.5 miles was routine for point-to-point links, and distances of three to five miles sometimes were possible.

In a new small cell deployment, transmitting at lower power, distances of 1,000 meters (about 0.6 miles) might be reasonable.

We do not have all the answers, yet. What we do know is that reasonable people believe technology advances will offer a better solution than possible in the past, just at a point when gigabit speeds are market necessities, and deployment costs are more crucial than ever.

High Speed Access Top of Amenities List for Renters, Study Finds

High speed access is among the top-two amenities for an apartment rental, a study by RVA for the FTTH Council suggests. Both potential renters and property owners rank high speed access at the top of a list of amenities, along with in-unit washers and dryers.


The study provides “direct evidence that satisfaction with broadband correlates with overall MDU property satisfaction itself.”

As always is the case with arguments and beliefs about the role of Internet access as a driver of economic growth or social well-being, correlation is not necessarily causation, a fact the study itself briefly notes: “causation is not proven.”

Though we always act as though high speed access matters for economic activity, learning and social well-being, about all we really can say is that there is a correlation. We cannot prove causation.




Nearly 40% of All Internet Users are Members of LinkedIn

Time will tell whether Microsoft can spin value out of its LinkedIn acquisition. But LinkedIn's reach is impressive: nearly 40 percent of all Internet users appear to be members of LinkedIn, according to GlobalWebIndex. 


Monday, June 13, 2016

iPhone Prices Could Rise $60 to $80 for a "U.S. Made" Device

source: MIT Technology Review
Domestic production of iPhones with simple U.S. assembly could add $30 to $40 to the retail cost of devices built and sold in the United States, an analysis by MIT Technology Review suggests. If component parts also could be built domestically, retail prices could rise another $30 to $40.

Complete domestic sourcing is impossible, since the component rare earth elements are not found in the United States. The bottom line: if an iPhone were made with as many domestic components as possible, and fully assembled in the United States, it might cost $60 to $80 more than at present.


Thursday, June 9, 2016

Smart Cities Proposal Shows How Hard it is To Envision, Much Less Create, Any Part of a Smart City

Denver’s proposal to the  U.S. Department of Transportation as part of its Smart City Challenge might show why smart city programs are such complicated undertakings, under the best of circumstances. The DoT program will award one U.S. city about $40 million for a smart city project, with an expected award of up to $10 million also provided by Vulcan.

In fact, it probably is a bit of a misnomer to talk of “smart cities.” Instead, there are a lot of potential ways intelligence, big data and sensor networks can actually produce valuable outcomes.

But those real outcomes are going to be few and far between for some time, because the infrastructures do not yet exist.

What Denver wants to do first is “establish a robust data management and sharing platform that will connect disparate data sets from multiple agencies.”

That information will be made available in the form of mobile apps and at kiosks, integrating information about the city’s five car sharing and three ride sharing companies with the bicycle program and the city bus and rail services.

Denver also proposes to electrify taxi and City vehicle fleets and introduce wireless charging (including for transit buses). Useful, but not necessarily “smart.”

The proposed project also will test autonomous vehicles, introducing autonomous elements to fleet and transit vehicles while testing autonomous vehicle business models. Also useful, but not something that will affect consumers.

Denver proposes an integrated data system that would draw real-time information from many sources, providing a detailed picture of travel through the city.

That, in turn, is supposed to support programs that give residents and commuters access to more transportation options, initially. Useful, but essentially an “uber app” (in the sense of amalgamating and integrating other existing information sources, not in the sense of the ride-sharing app).

Eventually, the information system is seen as supporting “smart vehicles” and autonomous vehicles connected to the street grid and to each other, helping to pave the way for self-driving cars. Again, useful, but not something that will be seen or touched by citizens and users.

Some elements focus squarely on making it easier for residents of low-income neighborhoods--especially those without credit cards or smartphones--to connect with ride-sharing services such as Lyft, check out B-cycle bikes or find other ways to fill in transit gaps.

There will be some visible changes: charging stations and information kiosks. Lots of sensors likely will be deployed to gather and update the information base. But most consumers will “see and use” a new transportation app.

The project would convert a significant percentage of the city’s fleet vehicles to electric power. That leads to a greener city, but is it necessarily a “smart city” development?

More electric vehicle charging stations would be added. An additional 15 miles of new bike lanes would be added per year. Both good things, one might argue. But hardly “smart.”

Many of the other outcomes are process related, such as creating a policy and regulatory environment inviting for automated vehicles, creating plans for an 80 percent reduction of greenhouse gas emissions by 2050, supporting efforts to increase bike and pedestrian commuter mode share to 15 percent by 2020, or reducing single-occupant vehicle mode share to 60 percent by 2020.

The project also implements a safety program to reduce and ultimately eliminate vehicle-related crashes, injuries and fatalities.

The point is that the “smart” parts of the projects involve collecting and making information available.

Most of the other activities are not necessarily “smart,” but green. There is nothing wrong with that.  But it does suggest how hard it will be to create even one element of a smart city (in this case, smart transportation).

AT&T Will Likely Have to Boost FTTH Investments to Meet FCC Requirements

At some point soon, AT&T likely will have to hike spending on fiber to home passings if it is to satisfy a Federal Communications Commission condition that is part of the FCC's approval of AT&T’s purchase of DirecTV.

The approval includes a requirement for AT&T to add 12.5 million fiber to premise locations and 13 million fixed wireless connections.

The fiber to customer deployments are supposed to happen over a four-year period, including about one million by the end of 2016.

Some question whether that is going to happen, and how soon, given fixed network capital investment of about $2 billion a year at the moment.

In addition to boosting capital spending, AT&T might be banking on the ability to build more affordably than was the case when Verizon Communications installed most of its fiber to home networks.

Indeed, Verizon’s unexpected decision to deploy FiOS in at least some neighborhoods in Boston suggests something has changed in the perceived business model.

Perhaps one key element is the difference between cost to “pass a location” and “cost to connect a customer.” AT&T could pass many more homes than it “connects,” as perhaps half the total cost of activating a customer is related to installing drops and network interface units for each paying customer.

The amount of deployed capital therefore includes a fixed element (the cost to pass a location with fiber) and a variable component (the cost to activate a location).

Forced to predict, some of us would argue that many more passings will be covered than “connected,” as initial take rates for consumer optical fiber connections can be as low as 20 percent in the first year.

So, of 100 locations passed, about 20 will require additional capital to activate. Assuming new deployments are targeted neighborhood by neighborhood, where propensity to buy is the highest, the amount of stranded capital is reduced.

Back in 2008, it might have cost $3,800 just to pass a location. Now it might cost $600 or less per passing. But you can see the reason for skepticism in some quarters.

It might cost $600 million to pass a million homes, at $600 per passing. At $500 per passing, it still costs $500 million to pass a million homes.

Using the $500 per passing figure, AT&T would have to invest at least $1 billion in capital to pass two million homes. On an annual fixed network capital budget of $2 billion, that suggests AT&T might be able to add about two million passings a year.

That works out to a total of about eight million over four years, short of the required 12.5 million over four years.
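The passings arithmetic above can be checked with a quick sketch. The $500 per-passing figure and the 12.5 million target come from the text; the assumption that roughly half of the $2 billion annual fixed network capital budget could go to new passings is implied by the “two million passings a year” estimate, not stated explicitly.

```python
# Back-of-the-envelope check of the FTTH passings arithmetic above.
cost_per_passing = 500            # dollars, the lower per-passing estimate
passings_budget = 1_000_000_000   # assume about half of $2B annual capex
years = 4
fcc_target = 12_500_000           # FCC condition: 12.5 million locations

passings_per_year = passings_budget // cost_per_passing
total_passings = passings_per_year * years
shortfall = fcc_target - total_passings

print(f"{passings_per_year:,} passings per year")
print(f"{total_passings:,} passings over {years} years")
print(f"Shortfall against the FCC condition: {shortfall:,}")
```

Two million passings a year yields eight million over four years, leaving a 4.5 million shortfall, which is the basis for the argument that capital spending must rise.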

So, yes, one might argue, capital investment would have to climb beyond $2 billion a year to satisfy the FCC requirements.

App Store, Google Play to Boost App Provider Revenue Share to 85%


Driven by marketing concerns, both the App Store and Google Play are poised to raise the share of revenue going to their app partners. Apple plans to boost the app provider share of revenue from 70 percent to 85 percent, if an app can maintain a subscription from a customer for at least a year.

Google Play also is said to be planning an increase in app provider share up to 85 percent.
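A minimal sketch of what the new split would mean for a developer, assuming a hypothetical $9.99 monthly subscription and reading the “at least a year” condition as the split stepping up after month 12:

```python
# Illustration of the subscription splits described above: 70 percent
# to the developer for a customer's first year, 85 percent once that
# customer has been subscribed for at least a year.
monthly_price = 9.99  # hypothetical subscription price

def developer_revenue(months):
    """Developer's cumulative take over one customer's subscription."""
    total = 0.0
    for m in range(months):
        share = 0.70 if m < 12 else 0.85  # split steps up after year one
        total += monthly_price * share
    return round(total, 2)

print(developer_revenue(12))  # first year, entirely at the 70/30 split
print(developer_revenue(24))  # two years, with the step-up in year two
```

The step-up rewards retention: the second year of a kept subscriber is worth roughly 21 percent more to the developer than the first, which is consistent with Apple’s stated focus on growing subscription revenue.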

Apple’s move seems driven, in part, by a new focus on growing subscription revenues. Google likely is moving more to keep pace with the App Store. Both, in doing so, will strengthen their dominance of the app store markets.

source: Wall Street Journal
Growing competition from Amazon likely also is a factor. But the shift of app revenue to developers will likely increase developer commitment to both platforms.

The new revenue splits might be especially interesting for content providers, especially streaming video providers. Sales of streaming content subscriptions are becoming more important for content owners and distributors.




Value-Added Services--Free or For Fee--Boost Retention, Data Consumption, Perceived Value of Mobile Services

Value-added services--provided at no incremental cost as part of a mobile subscription--possibly can increase retention by about 11 percent, boost customer perception of network quality by about 55 percent, and boost data consumption by about a gigabyte each month.

In mature markets, offers are focused around video and music streaming services, primarily as a way of affecting customer perceptions of network quality or value.

That is one example of the way access providers partner with app providers to provide services that are seen to boost perceived value.


Another important strategy involves creating and selling for-fee services, which might range from branded video or audio streaming to connected car services, home security and eventually, other Internet of Things or machine-to-machine (industrial Internet of Things) services.

40% of Mobile Operator Churn Driven by "Cost"

Not all customer churn is controllable by actions of the service provider. But about 40 percent of controllable churn still hinges on “cost and billing” issues, while 26 percent of churn is driven by network quality issues.

About 24 percent of churn is created by “customer care” issues, while “service and device portfolios” seem to drive about 10 percent of churn, a study sponsored by Nokia suggests.

In other words, prices deemed to be too high remain the biggest single driver of customer desertion, globally.

Perhaps also not surprisingly, prices are the top driver of customer decisions to choose a carrier. Asked why consumers chose a particular service provider, 45 percent indicated that “best prices” was the top reason.

Some 26 percent indicated that “network quality” was the chief reason for choosing a particular new service provider, while 25 percent said “geographical coverage” was the main reason for selecting a particular service provider.


source: Nokia

Wednesday, June 8, 2016

IoT Eventual Winners Cannot Yet be Predicted

It always is hard to say which companies or industry segments will do best--or which will lead the disruptive attack--whenever an existing market starts to be disrupted by new technology and business models.

In the mobile payments business, various participants, from different segments of the value chain, and some intending to create space for themselves in the value chain, have made a run at mobile payments. Mobile service providers were the first to admit at least temporary defeat in the U.S. market, as the Softcard business was sold to Google.

Google, in turn, has struggled to make mass market inroads with its own Google Wallet and then Android Pay service. Apple Pay and Samsung Pay also are among the device or operating system providers trying to create a position within the ecosystem.

The retailer consortium, CurrentC, is the latest to fail. In some ways, CurrentC had a potent argument: it represented major retailers who are the “buyers” of payments systems and services.

Banks and card processing services, plus PayPal and other app-based payment systems, therefore remain in the race to win share in the new business.

Some believe the device or operating system suppliers will win.

At the moment, the same sort of uncertainty exists in every part of the Internet of Things ecosystem.

There is just no way, for example, to tell how the “access” or “connectivity” market ultimately will develop, or who the leaders will be. Similar uncertainty exists in terms of operating systems, chipset suppliers and most importantly, in terms of the applications and services to emerge first.

As with the mobile payments business, value must be proven before consumers or businesses will adopt any particular service or approach. And there is just no way to know for sure which services will prove to have the clearest business model, early on.

source: ABI Research

67 Million Connected Car Subscriptions by 2025

By 2025, 67 million automotive 5G vehicle subscriptions will be active, according to ABI Research. That largely explains Verizon Wireless and AT&T Mobility interest in the connected car market.

About three million of those accounts will be low latency connections, mainly deployed in autonomous and driverless cars.

So the connected car might be an early “killer app” for 5G networks, enabling broadband multimedia streaming, cloud services for vehicle lifecycle management, the capturing and uploading of huge volumes of sensor data, and vehicle-to-vehicle and vehicle-to-infrastructure communication.

ABI Research suggests that 5G’s most promising capability for automotive will be its low latency, which could be as low as one millisecond.

"Organized Religion" Arguably is the Cure, Not the Disease

Whether the “ Disunited States of America ” can be cured remains a question with no immediate answer.  But it is a serious question with eno...