Wednesday, October 12, 2016

Can Connected Health Help With Widespread Pain Issues?

More than half of U.S. adults (125 million) had a musculoskeletal pain disorder in 2012, according to the U.S. Centers for Disease Control. Though it is not yet clear that connected health devices can help people manage or alleviate such pain, the wide extent of such problems suggests the potential for such innovations.

Just over 20 percent of U.S. adults had arthritic conditions (22.1 percent) or lower back pain (20.3 percent). A smaller percentage of persons reported having non-arthritic joint pains or other joint conditions (17.5 percent) and neck pain or problems (14.3 percent).

Some 9.8 percent had sciatica issues. A significant proportion of the population reported having at least one other musculoskeletal problem that was not examined independently (28.1 percent).

Pain also is correlated with another problem, namely lack of sleep. The 2015 Sleep in America Poll found that pain is a key factor in “sleep debt”: some 21 percent of people have experienced chronic pain, losing 42 minutes of sleep a night because of it, while 36 percent have experienced acute pain, resulting in 14 minutes of lost sleep each night.

source: Sleep Foundation

Colt Creates Multi-Carrier SDN Capability

Colt says it now can enable its customers to configure and activate Ethernet networks across service provider boundaries very rapidly, using software defined networking capabilities. Colt says collaboration with an unnamed North American partner allows Colt customers to locate a service on the Colt On Demand portal, set up a data service, run video over the connection, flex the available bandwidth and then shut the service down, all within 10 minutes.

That is important when enterprises need a temporary change in bandwidth at any specific location, or when connections to new partners must be quickly adjusted. According to a recent survey, the average European enterprise uses 987 cloud services. Furthermore, the number of cloud providers grew 61 percent, year over year, according to Colt.

Tuesday, October 11, 2016

"Average" U.S. Internet Access Speeds are Highly Dynamic

In the second quarter of 2016, average connection speeds in U.S. cities ranged from 17 Mbps to 24 Mbps. Since Internet service providers are selling gigabit access, as well as services in the 100 Mbps to 300 Mbps range, it is obvious that most consumers are buying neither gigabit nor hundreds-of-megabits services.

On the other hand, speeds are increasing rapidly. In fact, according to Ookla, fixed network customers in the first half of 2016 got an average of over 50 Mbps for the first time ever.

That represents a speed improvement of more than 40 percent since July 2015.

Similarly, mobile Internet access customers saw speeds improve by more than 30 percent since last year, with an average download speed of 19.27 Mbps in the first six months of 2016.
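
As a rough sanity check, the year-earlier baselines implied by those gains can be back-computed from the reported averages. A minimal Python sketch, using the figures above; the actual Ookla baselines are not cited here, so the results are illustrative only:

```python
# Back-compute the implied year-earlier averages from the 2016 figures and
# the reported percentage gains. Illustrative only: Ookla's actual 2015
# baselines are not cited above.

def implied_baseline(current_mbps: float, improvement_pct: float) -> float:
    """Speed a year earlier, given the current speed and the percent gain."""
    return current_mbps / (1 + improvement_pct / 100)

fixed_2016 = 50.0    # Mbps, "over 50 Mbps" fixed-network average, first half 2016
mobile_2016 = 19.27  # Mbps, average mobile download, first half 2016

print(f"Implied fixed baseline (mid-2015):  {implied_baseline(fixed_2016, 40):.1f} Mbps")
print(f"Implied mobile baseline (mid-2015): {implied_baseline(mobile_2016, 30):.1f} Mbps")
# Roughly 35.7 Mbps fixed and 14.8 Mbps mobile, consistent with gains of
# "more than 40 percent" and "more than 30 percent," respectively.
```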

In other words, by some measures, mobile Internet access download speeds are as fast, or nearly as fast, as the “average” U.S. fixed network connection in at least some cities.

Just as important are speed differences between various ISPs. Where Google Fiber might offer an average speed in excess of 300 Mbps, Comcast and other cable TV ISPs might average about 45 Mbps to 50 Mbps. Telcos tend to deliver slower speeds than that, with the exception of Verizon FiOS, which has an average speed of about 50 Mbps.



There's a Reason You Don't Hear Specifics About Gigabit Take Rates

Back in the days when cable TV operators first were rolling out consumer Internet access at speeds of 100 Mbps, it was virtually impossible to get subscriber numbers from any of the providers, largely because take rates were low.

In the United Kingdom, then planning to upgrade consumer Internet access speeds to “superfast” 30 Mbps, officials complained about low demand. In fact, demand for 40 Mbps service was less than expected.

In 2010, for example, about 40 percent of U.S. consumers were buying Internet access at about 6 Mbps.   

It is possible the same remains true for gigabit access services. No Internet service provider of any size actually releases the number of accounts, though most are happy to cite cities and neighborhoods served, or homes able to buy the service (passings).

Those are significant indicators, but still do not address the question of how many customers actually buy.  

Early in 2016, Paul de Sa, Bernstein Research equity analyst, predicted Google Fiber would reach roughly 2.4 million homes by the end of 2017.
MoffettNathanson, at roughly the same time, predicted that AT&T would reach 5 million "customer locations" by the end of 2017. CenturyLink estimated in late 2015 that it would have gigabit access networks in operation passing 700,000 households by the end of 2015.

Comcast, for its part, plans to upgrade 100 percent of its consumer base to gigabit access over the next few years.

The issue will remain take rates.

CenturyLink executives, for example, have said that gigabit marketing primarily drives new sales of accounts buying 20 Mbps or 40 Mbps service.

It is clear that price matters. When Internet service providers drop the price enough to create a really compelling price-value offer, consumers respond.

If ISPs do not readily announce the number of gigabit accounts they have in service, it likely is because relatively few consumers are buying those services.

Municipal gigabit access provider NextLight expects a take rate of about 37 percent after five years, selling gigabit service at a charter rate of $50 a month ($100 a month is the standard rate).

Based on experience from other markets, NextLight will have the best chance to reach those adoption goals if it sells at the $50 price, not the $100 price.

Monday, October 10, 2016

New Special Access Rules Could Lower Frontier Revenue Modestly

Lower price caps for special access services (“business data services”) will be the result of proposed Federal Communications Commission rules. The moves will mean lower prices for enterprises and service providers who do not own fixed network assets, and lower revenues for firms with such assets.

AT&T will suffer the most, with Verizon, CenturyLink and Frontier Communications also losing revenue.

The proposed rules call for a one-time downward adjustment of 11 percent, phased in over three years, beginning in July 2017 (three percent in year one, four percent in year two, and four percent in year three).
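
Whether those annual steps compound or simply sum to the headline 11 percent is not spelled out here; a minimal sketch of the arithmetic, assuming each year's cut applies to the prior year's capped rate:

```python
# Sketch of the phased price-cap cut (3% in 2017, 4% in 2018, 4% in 2019),
# assuming each step applies to the prior year's capped rate. The actual FCC
# mechanics may differ; this is illustrative arithmetic only.

cap = 100.0  # index a pre-reform capped price at 100
for year, cut in ((2017, 0.03), (2018, 0.04), (2019, 0.04)):
    cap *= (1 - cut)
    print(f"After the {year} step: {cap:.1f}")

print(f"Cumulative reduction: {100 - cap:.1f} percent")
# Compounding yields about a 10.6 percent cumulative cut; simply summing the
# steps gives the headline 11 percent.
```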

As always, there are trade-offs. Enterprises will get rate relief, but lower investment in new facilities will follow, says Zacks Equity Research.

Though Frontier Communications once argued the FCC rules would not affect its revenue, Frontier now estimates the rules, if implemented on July 1, 2017, will have a revenue impact of approximately $10 million in 2017, $20 million in 2018 and 2019, with subsequent annual impacts declining after that.

Some estimate the rules will reduce overall industry revenue by $1.6 billion or so, per year.

The Communications Workers of America also anticipates lower investment and lower revenue will mean lost jobs as well.

Perhaps the biggest long-term impact will be felt by cable TV operators who supply such services, as the new rules appear to bring the cable TV industry into the framework for the first time.

The FCC has received confidential information on service provider revenues, without releasing that information, so it is difficult to predict precisely how much revenue might be affected by the proposed rules, beyond what Frontier estimates.




Workplace Launches Facebook into Enterprise Collaboration

Facebook has launched Workplace, its enterprise tool for companies that allows workers to chat and collaborate with each other. Aside from the move into the enterprise collaboration space, Workplace is a subscription service, not an ad-supported service, representing a new business model for the product.

Available at no cost to educational or non-profit entities, Workplace is priced per user, based on the number of users at an organization, with fees ranging from $3 a month for entities with up to 1,000 active users, to $1 a month for entities with 10,001 users or more.
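
As a rough illustration of that pricing, the sketch below encodes only the two tiers cited above and assumes the fee applies as a flat per-active-user rate set by organization size; the rate for organizations between 1,001 and 10,000 active users is not stated here, so it is left undefined rather than guessed.

```python
# Illustrative monthly-cost estimate for Workplace's per-active-user pricing,
# encoding only the two tiers cited above and assuming a flat per-user rate
# set by organization size. The rate for 1,001-10,000 active users is not
# stated in this post, so it is left undefined rather than guessed.

from typing import Optional

def monthly_fee_per_user(active_users: int) -> Optional[float]:
    if active_users <= 1_000:
        return 3.0
    if active_users >= 10_001:
        return 1.0
    return None  # middle tier not specified here

def monthly_cost(active_users: int) -> Optional[float]:
    fee = monthly_fee_per_user(active_users)
    return None if fee is None else fee * active_users

print(monthly_cost(800))     # 2400.0, at $3 per active user
print(monthly_cost(25_000))  # 25000.0, at $1 per active user
```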

Workplace by Facebook started life as the internal system used by Facebook employees to share information relevant to their projects.

One of Workplace by Facebook’s core differentiators is the fact that it works well on smartphones and other mobile devices.

249 Billion Euros ($277 Billion) Needed to Deliver Fixed Network Gigabit in EU, Study Suggests

As you would guess, a study of gigabit Internet access costs, conducted by Analysys Mason for the European Commission, suggests targeted enterprise connections cost the least of the fixed access alternatives, while ubiquitous fixed network gigabit networks cost the most.

Ubiquitous 50-Mbps mobile access costs less than any fixed method, though it does not provide the same amount of bandwidth. That assessment could change over time.

Analysys Mason expects that by 2025, it will be possible for 1-Gbps peak speed to be provided from the macro cell network, with average speed around 180 Mbps.

The study authors note that other alternatives, including fixed wireless, cable TV technology and satellite will be capable of delivering gigabit speeds by 2025.

Still, Analysys Mason was asked to model only the “enterprise” deployment, mass market gigabit access (functionally limited to fiber to the home or fiber to the node) and mobile connectivity.

Fixed wireless, satellite and hybrid fiber coax were not modeled. Some of us might argue that is reasonable for many European Union nations, if not necessarily the model that will develop in other regions.


source: European Commission

Webscale Telcos?

Moving “up the stack” will be necessary and possible for the webscale global giants. Beyond some limited scenarios, smaller providers will lack the scale to create viable new applications or services.

That implies rather significant consolidation. It might also mean significant service provider failures.

Few tier-one service providers recover their cost of capital, studies have found. That is one way of saying that capital borrowed to provide telecom services does not earn enough to repay the loans.

That has clear strategy implications. Only a handful of firms will credibly have a shot at remaining among the 10 or so global providers. For as many as 100 other firms, strategy will consist of being the best-possible local partner.

Industry or firm strategy in a new or growing market is fundamentally different from strategy in a declining market. You can draw your own conclusions about which fundamental paradigm is most relevant.

But some conclusions are simple enough. In a young, growing industry, a firm or industry wants to grab new customers as fast as possible. In a declining industry, a firm or industry wants to limit the rate of decline.

Firms in young industries need to focus on growth within the new business. Firms in declining industries must harvest revenue while they search for new businesses to create.

Beyond those key frameworks, the range of potential strategies has increased, compared to options 100 years ago, when telecom was universally a regulated monopoly.

One clear outcome of a massive global wave of asset privatization, deregulation and the shift to Internet as the framework for applications is that service providers are becoming more different from each other, as firms are free to pursue a nearly unlimited number of paths.

So there now likely is no universal “best” strategy for any telco, tier-one, regional or local. Nor, it might appear, will most service providers emerge as major suppliers of new apps and services.

How Many Service Providers Can Escape "Dumb Pipe" Status?

One universally hears service provider executives arguing they want to avoid becoming “dumb pipe” connectivity providers. What is not so clear is how many service provider entities will actually be able to do so in a significant way.

For many--perhaps most--suppliers, being an efficient dumb pipe provider is possibly the only viable path forward. The problem is that most proposed new services and applications require scale.

Whether it is entertainment video, mobile banking and payments, connected car, connected health or other Internet of Things apps, viable suppliers must achieve scale. Almost by definition, most smaller providers will be unable to do so.

That will mean an industry dominated by 10 global service providers, some predict. That handful of firms can become branded suppliers of applications. Smaller providers will struggle to reduce costs enough to remain viable primarily as suppliers of access services.

In other words, the advice to “move up the stack” will be viable for a relative handful of firms. Most service providers will focus primarily on access. In the Internet era, that means being suppliers of “dumb pipe” Internet access.

Moving “up the stack” will be necessary and possible for the webscale global giants. Beyond some limited scenarios, smaller providers will lack the scale to create viable new applications or services.

Reliance Jio Gets 16 Million Mobile Accounts in a Month

Reliance Jio Infocomm signed up 16 million customers (net new accounts) in its first month of full commercial operations, which began in September 2016.

Reliance Jio has been offering its services for free to anyone signing up before the end of the year, including four gigabytes (4 GB) of free 4G data each day, as well as unlimited voice calls until December 31, 2016.

Customers will begin paying data charges in 2017, but domestic voice calls will continue to be free.

Also, subscriber identification modules (SIMs) are being given away for free to customers of the 4G-only new service. So the issue for Jio is how many of its customers will stay with Jio once the free data period ends.

There already are more than one billion mobile subscribers in India, so Jio potentially has gotten something like 1.6 percent market share, assuming it attracted no “mobile for the first time” accounts, and only shifted demand from the other existing carriers.

Observers are watching to see how much market share Reliance Jio will take in the first year and first few years after that. Some believe Reliance Jio will get about 10 percent to 12 percent mobile subscriber share over three to four years.  
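
The arithmetic behind those share figures is straightforward; a quick sketch, assuming roughly one billion existing Indian mobile subscriptions and no first-time mobile users among Jio's additions:

```python
# Rough market-share arithmetic for Reliance Jio's first month, assuming
# roughly 1 billion existing Indian mobile subscriptions and that all of
# Jio's additions came from users of other carriers.

jio_adds = 16_000_000
india_mobile_subs = 1_000_000_000

print(f"Implied first-month share: {jio_adds / india_mobile_subs:.1%}")  # about 1.6%

# The 10-12 percent share some observers project over three to four years
# would, at that base, be on the order of 100-120 million accounts.
for target in (0.10, 0.12):
    accounts_millions = target * india_mobile_subs / 1e6
    print(f"{target:.0%} share is roughly {accounts_millions:.0f} million accounts")
```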

Sunday, October 9, 2016

More Data Will be Created by Machines than People

It sometimes is hard to tell the difference between devices used by people (smartphones and other consumer devices) and those used by machines (Internet of Things). Health monitors and smart watches come to mind. They are worn by people, but “used” by servers.

Some refer to health monitors as “consumer” IoT, in contrast to “industrial” IoT. The point is that there will be use cases that are hard to classify. “Smart clothing” might pose the same definitional issues.

Several years ago, for example, it would not have been unusual to find tablets classified in a “connected devices” category that in some ways was the forerunner of today’s machine-to-machine or Internet of Things devices.

Perhaps few these days would count tablet connections in the IoT category. But there still are many consumer-focused appliances and machines that will be somewhat difficult to classify. Smart kitchen appliances might fall into that category.

Still, some would say connected PCs were the forerunners of today’s developing IoT markets. About the only widely used device never really considered “IoT” is the smartphone.

The point is that there is a difference between data created by people and data created by machines.


Perhaps nothing is clearer than the expected benefits from deploying IoT in any setting. For product and service providers, revenue upside is the expected driver of behavior, often in indirect ways such as creating better user and customer experiences.


source: Business Insider

Saturday, October 8, 2016

AT&T to Launch LTE-M IoT Trial

source: Qualcomm
AT&T plans to pilot an LTE-M network in the San Francisco market starting in November 2016, followed by a full commercial launch in 2017.

LTE-M is a subset of the Long Term Evolution 4G network standard optimized for Internet of Things sensors requiring transmission speeds no greater than about 1 Mbps, as well as battery life of up to 10 years and the ability to work underground.

source: Qualcomm
LTE-M technology is expected to connect a wide variety of IoT systems supporting smart utility meters, asset monitoring, vending machines, alarm systems, fleet, heavy equipment, mobile health and wearables.

Participants in the pilot include:
  • Badger Meter – analyze how the LTE-M network, which is dedicated to supporting the IoT, may be used to enhance communications for smart water devices.
  • CalAmp – explore how the LTE-M network can help companies more efficiently manage their connected vehicles and assets.
  • Capstone Metering – demonstrate how LTE-M can improve Smart Cities sensor technologies. It will look to increase battery life and improve connectivity and sensor monitoring for underground smart water meters.
  • PepsiCo – examine and test ways that sensors can improve the in-store experience with smart vending solutions for the thousands of PepsiCo products consumers love and enjoy.
  • Samsung – evaluate an LTE-M-based solution to enhance performance for consumer solutions. This may include wearables or other consumer devices.






Friday, October 7, 2016

When is a Terabyte Household Data Consumption Limit a Problem?

When is a terabyte of usage on a single consumer Internet access account a problem? When a consumer is part of the “one percent”: the top one percent in terms of data consumption on a Comcast network in a month’s time.

Roughly, that corresponds (by Comcast’s estimates) to a household consuming about 21.7 hours of high-definition video entertainment every day of the month, based on a terabyte supporting between 600 and 700 hours of HD video, and using 650 hours as the median case.

Netflix estimates its HD video consumes about 3 GB per hour, though. In 2014, according to Sandvine, a cord-cutter household consumed about 212 GB a month (video and all other uses).
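
Those two estimates imply rather different pictures of how far a terabyte stretches. A back-of-the-envelope comparison, treating a terabyte as 1,000 GB and a month as 30 days (both simplifying assumptions):

```python
# Back-of-the-envelope comparison of the two HD-video estimates above,
# assuming 1 TB = 1,000 GB and a 30-day billing month.

TERABYTE_GB = 1_000
DAYS = 30

# Comcast's framing: roughly 600-700 hours of HD video per terabyte (650 median).
comcast_hours = 650
print(f"Comcast estimate: {comcast_hours / DAYS:.1f} HD hours per day")   # ~21.7

# Netflix's framing: about 3 GB per HD hour.
netflix_hours = TERABYTE_GB / 3
print(f"Netflix estimate: {netflix_hours:.0f} HD hours per TB, "
      f"or {netflix_hours / DAYS:.1f} hours per day")                     # ~333 hours, ~11.1/day

# Sandvine's 2014 cord-cutter household, for scale:
print(f"212 GB a month is about {212 / TERABYTE_GB:.0%} of a 1 TB cap")   # ~21%
```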


As a rule of thumb, a typical household using Netflix and other streaming video should not experience any data consumption limit issues if a 500 GB cap is in place, according to WhistleOut.

India Spectrum Auction Nets about 11% of Government-Forecast Revenue

India’s big spectrum auction of 2,300 MHz worth of spectrum has ended, with spectrum sold at about 11 percent of what the government projected would be the case, or roughly US$9.8 billion (if I have converted the crore properly). The government had projected sales in the $83 billion range.

As mobile executives had warned, the 700-MHz spectrum was simply wildly overpriced. They behaved as they spoke: nobody made a bid for any of the 700-MHz assets. Mobile executives had suggested the government lower the prices and wait before auctioning the 700-MHz assets.

Of the total of 2,300 MHz of assets, the government sold 964.8 MHz of spectrum. Mobile operators purchased about 34 percent of spectrum in the 800-MHz band, about 75 percent in the 1800-MHz band, all of the spectrum available in the 2300 MHz band and about 60 percent of spectrum in the 2500 MHz band. About 20 percent of spectrum in the 2100 MHz band was bought.

Vodafone India and Bharti Airtel were the biggest buyers of 4G spectrum, followed by newcomer Reliance Jio Infocomm and Idea Cellular.

Vodafone spent over Rs 20,000 crore, Airtel Rs 14,244 crore, Jio Rs 13,672 crore and Idea Rs 12,798 crore.
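
For readers unfamiliar with the unit, one crore is 10 million rupees. A rough conversion of those bids to U.S. dollars, at an assumed exchange rate of about Rs 67 per dollar (an approximate late-2016 figure, used here only for illustration):

```python
# Rough conversion of the reported bids from crore rupees to U.S. dollars.
# One crore = 10 million rupees; Rs 67 per USD is an assumed, approximate
# late-2016 rate used only for illustration.

CRORE_RUPEES = 10_000_000
RUPEES_PER_USD = 67.0

bids_crore = {
    "Vodafone India": 20_000,
    "Bharti Airtel": 14_244,
    "Reliance Jio": 13_672,
    "Idea Cellular": 12_798,
}

for operator, crore in bids_crore.items():
    usd_billion = crore * CRORE_RUPEES / RUPEES_PER_USD / 1e9
    print(f"{operator}: roughly ${usd_billion:.1f} billion")
# Vodafone's Rs 20,000 crore works out to roughly $3.0 billion at that rate.
```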

The auction results, and the squabbling leading up to the auction, illustrate several important facts about the Internet ecosystem. From a mobile operator’s perspective, though spectrum access is a necessary precondition for being in business, operators cannot pay “any amount” for that access.

And mobile operators demonstrated with their wallets that spectrum prices set by the government were too high. There is experience behind that thinking. In the past, mobile operators have overpaid for 3G spectrum, for example, in India and elsewhere.

Operators have learned, from experience, that the cost of spectrum has to be weighed in view of expected revenues that can be generated by those assets.

There also are a few larger points.

Since, in the end, consumers or advertisers are the ultimate sources of all ecosystem revenue, all costs--anywhere in the ecosystem--must be matched by revenues from those sources.

The Indian auction shows that government officials and mobile operators have vastly-different expectations about the revenues that can be generated by using mobile spectrum.

There are reasons mobile operators and others might rationally expect spectrum prices to begin dropping. For starters, much more spectrum will be made available as 5G standards are set and regulators start to release brand new spectrum in the millimeter wave bands.

The role of unlicensed spectrum also is growing, reducing, to a real extent, the need to buy licensed spectrum.

In some markets, spectrum sharing also will add even more resources. Finally, small cell architectures are allowing service providers to make better use of any amount of finite spectrum.


Is Private Equity "Good" for the Housing Market?

Even many who support allowing market forces to work might question whether private equity involvement in the U.S. housing market “has bee...