Tuesday, June 6, 2017

Pervasive Computing Drives Narrowband Shift in a Broadband Market

Oddly enough, in an industry where the direction of technology development has been toward more and more capacity (“broadband”), the next wave of development includes a key focus on “narrowband” capacity: below 1.5 Mbps, and often measured in hundreds of kilobits per second, not megabits or gigabits per second.

But there are other differences. For the first time, device battery life and end user device cost are among the platform design goals.

Also, though the mobile industry has been based on use of licensed spectrum, there now is a move towards greater use of unlicensed spectrum, in whole or in part.

Also, 5G networks are being designed with the business model for pervasive computing in mind.

Long battery life of more than 10 years is a universal design goal for all the proposed IoT networks. The reason is that the labor cost to replace batteries in the field is too high to support the expected business model.

Also, in a pervasive computing environment, low device cost (below US$5 per module) is important, as deployment volumes are expected to reach billions of devices, and many will add value only when deployment costs per unit are quite small.

At the same time, low deployment cost to reduce operating expense is necessary, again to support a business model that often requires very low capital investment and operating cost.

Coverage requirements also are different. Mobile networks always have been designed for operation “above ground.” That is not always the case for IoT deployments, which will happen in reception-challenged areas such as basements, parking garages or tunnels.

IoT transmitter locations also will be expected to support a massive number of devices, perhaps up to 40 devices per household and 50,000 connections per cell, or roughly 1,250 homes per IoT cell location, assuming mostly stationary devices are supported.

That is a transmitter density about 10 times greater than that of a “typical” fixed network central office serving area.
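The homes-per-cell figure above follows directly from the connection and household numbers. A minimal back-of-envelope check, using only the values quoted in the text:

```python
# Back-of-envelope check of the IoT cell density figures cited above.
# Inputs come from the text: 40 devices per household,
# 50,000 connections per cell site.
devices_per_household = 40
connections_per_cell = 50_000

# Homes served per IoT cell, assuming mostly stationary devices.
homes_per_cell = connections_per_cell / devices_per_household
print(homes_per_cell)  # 1250.0
```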

But some matters do not change: the crucial unknown is the ability of new platforms for internet of things (based on use of 5G networks or other low-power, wide-area networks) to support and enable huge new businesses based on pervasive computing and communications.

The 3GPP specifies maximum coupling loss (MCL), a measure of coverage, in the 160 dB range, including signal loss from all sources in the link.
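Maximum coupling loss is simply the largest end-to-end link loss at which the link still closes: transmit power minus receiver sensitivity. A sketch of that calculation follows; the 23 dBm and -141 dBm figures are illustrative example values, not numbers mandated by the text or by 3GPP.

```python
# Sketch of a maximum coupling loss (MCL) calculation.
# MCL = transmit power (dBm) - receiver sensitivity (dBm),
# i.e. the total antenna-to-antenna loss the link can tolerate.
def max_coupling_loss(tx_power_dbm: float, rx_sensitivity_dbm: float) -> float:
    """Return MCL in dB for a given transmit power and receiver sensitivity."""
    return tx_power_dbm - rx_sensitivity_dbm

# Example: a 23 dBm uplink transmitter and a -141 dBm base station
# receiver sensitivity give an MCL of 164 dB, in the range cited above.
print(max_coupling_loss(23.0, -141.0))  # 164.0
```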

Note the difference in platform availability: the low-power, wide-area platforms are commercially available now, while the mobile-based platforms will be commercialized this year (2017), and some tier-one carriers already have activated them.

As often is the case, challengers enter markets before the legacy mobile or telco suppliers can respond. In the past, scale has mattered, however, and the legacy providers eventually have taken leadership of those new markets, even when the telcos were not “first to market.”

Monday, June 5, 2017

Use AI to Move Up the Stack?

It is fairly easy to see how artificial intelligence (AI) is a benefit for app and device suppliers. To use the obvious examples, voice interfaces and customization of content are applied examples of AI. And though AI enables features, not necessarily full business models, the issue is whether, as mobile operators attempt to move “up the stack,” AI can help, and if so, how?

According to Gartner analysts, there will be many practical applications for AI, in the near future, though most do not immediately and obviously have a “mobile” underpinning.

By 2018, for example, up to 20 percent of business content will be authored by machines. Structured content such as shareholder reports, legal documents, market reports, press releases, articles and white papers are all candidates for automated writing tools.

Likewise, financial services will undoubtedly move early to use AI to support investing, trading and forecast operations. Banking and insurance likewise will likely be early adopters.

Still, there are a few areas noted by Gartner that seem to have significant and more direct implications for mobile scenarios, and possibly, therefore, for opportunities to move “up the stack.”

Sensors and other devices themselves will begin generating huge numbers of “customer service” requests. According to Gartner, by 2018, six billion connected things will be requesting support. It is not clear how well horizontal services to support such requests can be created, but many of those requesting devices will use mobile and wireless connections.

Even as artificial intelligence is used to handle a growing number of human-initiated customer service requests, we will have to develop ways of efficiently handling “machine” requests as well.

Also, by 2018, two million employees will be required to wear health and fitness tracking devices as a condition of employment, including first responders.

Employee safety is the issue. In addition to emergency responders, professional athletes, political leaders, airline pilots, industrial workers and remote field workers could also be required to use fitness trackers, and those devices will rely primarily on mobile connections.

By 2020, smart agents will support 40 percent of mobile interactions, Gartner also says. To be sure, it often will be the app providers and device suppliers that directly provide those capabilities. The point is that virtual assistants routinely will monitor user content and behavior in conjunction with AI-based inference engines that will draw inferences about people, content and contexts.

The goal will be prediction. If the agents can learn what users want and need, they also can act autonomously to fulfill those needs.


So it is easier to see how mobile networks and service providers could use AI to support their own operations than to see how they could create horizontal platforms or vertical applications, beyond the autonomous vehicle, connected vehicle spaces or perhaps consumer health technology.

Artificial Intelligence Will be Democratized

Source: Google  
If artificial intelligence becomes a big part of the next big wave of growth for cloud computing, firms of all sizes should be able to use advanced machine-learning algorithms just as they buy computing or storage today.

In other words, cloud workloads of the future likely will include AI capabilities. “We believe AI will revolutionize almost all aspects of technology, making it easier to do things that take considerable time and effort today like product fulfillment, logistics, personalization, language understanding, and computer vision, to big forward-looking ideas like self-driving cars,” said Swami Sivasubramanian, Amazon AI VP.

“Today, building these machine learning models for products requires specialized skills with deep Ph.D. level expertise in machine learning,” he said. “However, this is changing.”




Increasingly, AI will be part of cloud services and open source software as well, he argues.

Amazon Web Services has added predictive analytics for data mining and forecasting to its cloud services, opening up machine-learning algorithms first developed for internal use to customers of AWS.

Google application program interfaces are being made available to its cloud services customers to support translation, speech recognition and computer vision.

Microsoft likewise talks about “conversation as a platform,” where voice-responsive systems use artificial intelligence to handle simple customer requests.

Over time, though, that capability will extend, allowing the AI-enhanced interfaces to integrate information from different sources, allowing more complicated transactions to be supported.

AI will be democratized, some would say.


Will Edge Computing Allow Mobile Operators to Move Up the Stack?

It is hard right now to know whether internet of things apps and services, enabled largely--but not exclusively--by 5G, are going to be as important as expected. But it is reasonable to argue that 5G is a platform that could enable mobile service providers “moving up the stack” in enterprise and some consumer services.

Edge computing, in other words, is required by many proposed new apps, the most frequently mentioned being autonomous vehicles, which will require such low latency that cloud computing must be done at the edge of the network. The issue, perhaps, is how many other new apps could then benefit from an edge computing network.

"Software gives us this capability to actually play in a different space than the connectivity space for the consumer and the enterprise," said Ed Chan, Verizon SVP.

The assumption is that many new apps will require those interactions to be nearly real-time, requiring mobile edge computing. MEC is about packing the edge with computing power, like "making the cloud as if it's in your back pocket," Chan said.

Many of the apps benefitting from edge computing might be a bit prosaic. Real-time video at stadiums provides one example. Even high-end metropolitan-area networks often have capacity of about 100 Gbps, enough to support 1080p uploads from only about 12,000 users at YouTube’s recommended upload rate of 8.5 Mbps. A million concurrent uploads would require 8.5 terabits per second.
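The stadium arithmetic above is easy to reproduce. A short sketch, using only the 100 Gbps metro capacity and the 8.5 Mbps 1080p upload rate quoted in the text:

```python
# Reproducing the stadium-upload arithmetic in the paragraph above.
# Inputs from the text: a 100 Gbps metro network and YouTube's
# recommended 8.5 Mbps upload rate for 1080p video.
metro_capacity_bps = 100e9
upload_rate_bps = 8.5e6

# How many simultaneous 1080p uploads the metro network can carry.
concurrent_uploads = metro_capacity_bps / upload_rate_bps
print(round(concurrent_uploads))  # 11765, i.e. roughly 12,000 users

# Aggregate bandwidth needed for a million concurrent uploads, in Tbps.
million_uploads_tbps = 1_000_000 * upload_rate_bps / 1e12
print(million_uploads_tbps)  # 8.5
```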

IDC has predicted that, by 2018, some 40 percent of IoT-created data will be stored, processed, analyzed and acted upon close to, or at, the edge of a network.

Some even argue that analyzing data from offshore oil rigs, or managing automated parking systems or smart lighting, could require edge computing.


Friday, June 2, 2017

Linear Video Business is "Failing," Says ACA

Small U.S. telcos and cable TV companies have noted for a couple of decades that it is hard to make profits in the linear video subscription market. The reason is simply that scale is necessary, and, by definition, very small telcos and cable TV companies do not have scale.

Still, it is almost shocking to hear American Cable Association president Matt Polka say that the cable TV portion of the access business is "failing."

That is analogous to a major telco industry executive saying the voice business is failing.

And, of course, the same process has happened for telcos: voice, the traditional revenue driver, has ceased to support growth for quite some time. In 2013, for example, global revenues were dominated by mobility services, and voice services on fixed networks contributed less than 20 percent of the total.

Already, internet access drives U.S. cable operator gross profit, while video contribution continues to shrink, even for the tier-one cable operators.


source: Insight Research

What Big Revenue Source Will a Technology Firm Discover Next?

In the past, technology firms were known either for making computers and devices or software widely used by computers. That still is largely true. But what is dramatically different are the new revenue models.

Alphabet (Google) and Facebook make nearly all their revenues from advertising. Amazon makes most of its revenue from retailing. Uber’s revenue comes from ride sharing. That explains the adage that “every company is a tech company” these days. That goes too far, but you get the point.

For a number of very-large firms, technology drives a revenue model based on sales of some product other than computing devices or software, on a scale far beyond the simple fact that every enterprise uses computers, software, mobile phones and other devices.

That is why Airbnb, Hubspot, Expedia, Zillow, LinkedIn also are tech companies, whatever the revenue model.

source: Business Insider

90% of All Data Generated in the Last 2 Years

You are going to hear, quite often, that “90 percent of world data has been created in the past two years.” It is an estimate made by IBM, and it illustrates the dramatic, exponential growth of largely unstructured data generated by transactions, logs, records, social media and audio, visual and video consumption.

About 80 percent of all of that data is unstructured. Which is why big data and artificial intelligence now have emerged as strategic assets. AI is just about the only way to wring insight out of unstructured datasets so large.

Estimating a retailer’s sales by examining photos of cars parked in lots is one example of past efforts to correlate data. These days, it likely will make more sense to estimate sales by using location data from smartphones.



source: Kleiner Perkins

Thursday, June 1, 2017

Why IoT Requires a Cloud-Based, Virtualized Core Network

Nobody yet knows how many internet of things devices will need to be connected by 2020, using mobile and other local networks. Mobile connections, compared to the 2014 level, could be 22 times to 41 times larger. The total number of IoT connections, including devices using other local connections such as Wi-Fi, could be 12 times the 2014 number, or up to 28 times larger.


The other clear observation is that use cases will span a rather wide range of network resource requirements, requirements for mobility, latency, signaling and throughput. That is one reason why cloud-native and virtualized packet core networks are deemed essential for IoT supported by 5G networks. There are simply clear use cases that use different combinations of network-provided resources.


40 Years of Differences

In January 1978, when the first Pacific Telecommunications Council conference was held, the world was quite different.

  • Fewer than 7% of the world’s people had telephone service
  • Telecom was a monopoly and most firms were government owned
  • Nobody used a mobile phone
  • There was no Internet, no Ethernet, no browsers
  • 82 analog voice circuits connected Hawaii and Australia/New Zealand
  • Modems were acoustic and operated at 300 bps
  • Global telecom revenue and profit was driven by voice, especially long distance
  • “Billions” of people had never made a phone call
  • The business model was simple: build networks, earn a guaranteed return

Now, as PTC celebrates its 40th anniversary, we all live in a world where:

  • Usage has migrated from voice to data to video
  • Bandwidth routinely is measured in terabits per second
  • There are 7.9 billion mobile phone accounts, used by 4.8 billion people
  • Telecom is part of the internet and computing ecosystems
  • Most telecom markets are fiercely competitive
  • All legacy revenue streams are under pressure, and new revenue models must be created
  • Cloud computing, OTT, 5G, smart cities and internet of things are top of mind
  • The business model is anything but certain, and every legacy service is mature or soon to be mature

U.S. Ranks 10th for Mobile Internet Speed: Why That is Not a Problem

Less often than in the past, one hears it said that the United States has a broadband problem. Costs are said to be too high, speeds too low, choice inadequate. That is true, in some locations, to some extent.

At times over the past decade, it has been argued that, where it comes to fixed network internet access, the United States was “behind” in either coverage, usage or speed.

The digital divide these days continues to be an issue in rural areas, but arguably is more complicated an issue since some users prefer mobile-only access and some people say they do not use the internet because they do not wish to do so.

Some might also argue that the way people and nations use the internet also matters, not simply availability, price or speed.

International comparisons can be instructive, though sometimes not for the reasons one suspects. Consider voice adoption, where the best the United States ever ranked was about 15th, among nations of the world, for teledensity.

For the most part, nobody really seemed to think that ranking, rather than higher on the list, was a big problem, for several reasons. Coverage always is tougher for continents than for city states or small countries. Also, coverage always is easier for dense urban areas than rural areas. The United States, like some other countries (Canada, Australia, Russia), has vast areas of low population density where infrastructure is very costly.

On virtually any measure of service adoption (voice or fixed network broadband, for example), it will be difficult for a continent-sized market, with huge rural areas and lower density, to reach the very-highest ranks of coverage.

That remains the case for mobile internet coverage or mobile internet average speeds, where, according to Akamai, the United States ranks about 10th.

source: Akamai

Will Edge Computing and Low-Latency Services Allow ISPs to Move Up the Stack?

Most moves made by most tier-one telcos “up the stack” have not worked well, if at all, and that includes early moves into computing, data center operations, app stores, appliances and devices, over-the-top voice and messaging apps, and even OTT video services.

The jury still is out on moves into banking services, mobile advertising and content, but many telcos have fared rather well in the linear video subscription areas.

And though it is a statement of vision more than a practical reality at the moment, AT&T believes that, with a move to pervasive computing (which is one way to describe what “internet of things” is about), there is an inherent ability to embed higher-value operations into the network.

“The network itself moves from a connection to an experience that can include the compute,” said John Donovan, AT&T chief strategy officer.

In other words, even if data warehouses generally have proven to have modest strategic value for access providers (telcos and other access providers), that might well change as services and apps are created that rely on edge computing support.

As a horizontal business model, edge computing support could emerge as an area where telcos and other access providers might actually have some advantages, such as dense networks, access to power, other real estate and network elements that could play a role in supplying edge compute services to third parties.

Consider other potential advantages. AT&T’s new AirGig platform, for example, offers the promise of affordable trunking anywhere above ground where there are power utility poles and transmission lines.

Even if that is not so crucial for urban areas where access providers already have easements, pole attachment rights and access to power, AirGig might well play an important role in rural areas, where the cost of networking and bandwidth has always been tougher.

“For us it's a game changer on a cost basis, because the components are small, simple and plastic,” said Donovan.

In other words, in addition to the “connections” function, there is a logical role either at the applications layer or in the computing layer.

That is not to say the task will be easy. It will be hard. But it is possible, and could prove to be among the more-successful ways telcos can move up the value chain.


CBRS Needs Certainty, Firms Say

Telecom firms, like all other firms, generally hate uncertainty. So a call to keep in place the rules for Citizens Broadband Radio Service, and its approach to spectrum sharing, is important, industry suppliers say.

How to Move Up the Stack, How Not To

Telcos have been trying to “move up the stack” into application layer businesses for quite some time, with very mixed success. Computing firm NCR was acquired by AT&T in 1991, for example, in an effort to create a vertically-integrated computing capability. That effort failed, and NCR was eventually spun off.

That might be one key to how at least some tier-one telcos might look at their moves up the stack. Consider the different way Comcast has used its NBC Universal assets, and how AT&T must use its Time Warner assets.

Comcast did not try to make NBCUniversal (could not, for legal reasons) an “exclusive” or “vertically-integrated” asset available only to its owned cable TV systems. In other words, NBCUniversal was not about vertically integrating the content and making it proprietary to Comcast.

Instead, the value of NBCUniversal content is that it is sold to all other U.S. linear video subscription providers, even if some of that content is used in a proprietary way at the theme parks.

Likewise, AT&T will find its Time Warner content being sold (by law) to all other linear subscription providers, and eventually, in other ways, to over the top services. Likewise, AT&T would have little to no interest in restricting distribution of its studio content (movies) only through AT&T distribution assets. Instead, it would want continued distribution as widely as possible.

The point is that what has worked in the linear video space is not vertical integration, but rather broad sales to direct competitors, who serve customers that want the content.

In other words, instead of vertical integration that seeks uniqueness, content assets are broadly attractive to all suppliers in the linear video business, though also used in a vertical way--as an input--to support AT&T’s own linear video and OTT video operations.

In the internet of things area, a similar approach might be the right way to operate as well. Instead of acquiring or growing assets that are “captive” to AT&T, a better approach might be to create or acquire assets of broad value to customers and competitors.

The other approach--capturing the benefits internally and uniquely within AT&T--might not prove as successful, ultimately.

Most telco VoIP or OTT messaging efforts have failed. One commonality: those efforts were “branded” alternatives to other OTT or VoIP services. In other words, those were attempts to vertically integrate and restrict use of the services only to customers of telco access services.

The opposite approach is taken by wildly-successful consumer apps and appliances such as Google, Facebook, Amazon, Netflix or Apple. Those apps and devices work on all networks, and are not captive to any single access provider.

All that suggests the fruitfulness of seeking assets in IoT that are valuable using any access network, not specific features of a single provider’s access service.

source: Telco 2.0

Wednesday, May 31, 2017

Internet Trends Report: It's Now All About the Apps, Content, Games, Advertising

At a high level, the biggest takeaway from this year's Internet Trends presentation by Mary Meeker is how much she focuses on apps, games, content and advertising, not access. 

Access is not so much the issue anymore. And "access" is mostly a matter of smartphones. Meeker also spends quite a lot of time on two markets: India and China. 


Yes, Follow the Data. Even if it Does Not Fit Your Agenda

When people argue we need to “follow the science” that should be true in all cases, not only in cases where the data fits one’s political pr...