Friday, April 12, 2019

What HDTV Could Teach Us About Mobile, OTT Video

“People prefer HDTV even when the TV is off,” one executive quipped, in the days before high-definition TV was launched in the U.S. market. What he meant was that the different aspect ratio of the screen (16:9 compared to 4:3) was preferred over the analog TV screen. It is easy to say that people wanted the higher-definition picture, but there were other elements of the experience that also changed at the same time.

The higher resolution is part of the long trend towards more realism in video, to be sure. But higher resolution also meant that pictures looked better on larger screens. So part of the attraction of HDTV was larger screens.

At the same time, the shift to flat screens also had begun, adding a further stylistic change of form factor, and something consumers clearly preferred.

The point is that sometimes consumers desire a product for all kinds of reasons beyond the stated purpose of an innovation.

That is probably good advice when considering more-recent changes, such as the shift to on-demand, non-linear viewing and streaming delivery. People might behave in ways that ultimately surprise, rather than as expected.

For example, most people likely believe the story that streaming has value because it provides sufficient choice at lower prices than linear TV. But a new Harris Poll suggests most consumers eventually will spend as much on their streaming subscriptions as they do on linear TV.


Regardless of whether it is linear subscription TV or OTT, consumers are consistent in how much they are willing to pay and the amount of content they view. Consumers want about 15 cable channels or OTT services, and are willing to spend $100 per month total, the survey suggests.

The typical U.S. home spent $107 a month on linear subscription TV service in 2018, according to Leichtman Research. And prices for streaming services also are growing. The linear TV replacement services, for example, cost between $40 and $70 a month, with Sling at the low end and DirecTV Now at the high end.

A couple of observations therefore are apt. The AT&T move into linear TV has been criticized as a failure. And some also did not favor its later move into content ownership, either. Some supporters of both moves might say the Harris Poll results tend to confirm that linear video is a springboard to OTT video, and will ultimately be of similar revenue magnitude, even if less of the total revenue might flow to any single former linear video provider.

But the poll results also suggest the shift to skinny linear bundles makes sense, since that approach is best suited to a new market in which overall non-streaming demand falls. But linear streaming formats and on-demand formats will coexist.

Mobile TV is viewed as a coming evolution of the business as well. Consumers who use at least one OTT service are heavy mobile users, with many saying they are on their smartphone for more than six hours every single day.

Streamers also consume more than 2.5 hours of video content every day on their smartphones, according to the Harris Poll commissioned by OpenX.


What is less clear is how the video subscription business could change as mobile delivery becomes easier, or more popular. Screen size does not seem to be the limitation it once was, as mobility now seems to be valued at least as much as screen size. The potential changes in features remain unclear.

Some might argue that the big change coming with mobile streaming is simply the screen the video is consumed on, namely, the mobile phone instead of the television. So video consumption becomes less place-based (not a fixed TV location).

At least in principle, that creates new opportunities for temporary venue-based video, in some instances. But all that is yet to be developed. Still, it is possible that mobile TV might eventually result in new features for video consumption, as HDTV actually represented several concurrent changes beyond image quality.

Thursday, April 11, 2019

Half of All U.S. Households are Not Using 25 Mbps Speeds? Impossible

Many complain that Federal Communications Commission data on broadband speeds is incomplete, misleading or wrong. Fair enough. The data likely never will be as good as that gathered by Speedtest and other organizations that measure actual user sessions. On the other hand, those tests are not “scientific” in the sense of using controlled, weighted samples.

But just a bit of logic suggests many of the complaints about U.S. broadband speeds cannot be correct, either.

Roughly 10 percent of U.S. households are in rural areas, the places where it is most expensive to install fast fixed network internet access facilities, and where the greatest speed gaps--compared to urban areas--almost certainly continue to exist.

In its own work with TV white spaces, Microsoft has targeted perhaps two million people, or roughly a million households, that have no fixed network internet access. That assumes two people per household, which is below the U.S. average of roughly 2.3 to 2.5 people per household.

Recall that the definition of broadband is 25 Mbps downstream. Microsoft has argued that 20 million people (about 10 million homes) or perhaps eight percent of the population (perhaps four percent of homes) cannot get such speeds from any fixed network service provider.

Ignoring for the moment access at such speeds by satellite, fixed wireless or mobile, that is about the dimensions of the potential rural broadband problem. Perhaps nobody would dispute a potential speed gap for those 10 million or so homes.
But Microsoft claims about half of U.S. residents cannot get 25 Mbps service. That is hard to believe. There simply are not enough rural households to create a gap that large when urban and suburban areas, where roughly 90 percent of people live, have access to speeds far higher than 25 Mbps.

In 2018, U.S. fixed internet service provider speeds averaged 96 Mbps downstream, according to Speedtest.


We can freely admit that the FCC data is based on sampling and inferences from reported data. But we should at least be skeptical, and apply some sanity tests, when claims of that magnitude are made about speed gaps. Ookla’s Speedtest data, by contrast, is likely quite representative of internet users.

Consider the incentive to test a connection speed. What sort of user is most likely to do so, and when? In my own experience, people test when they suspect a problem, not because they simply want to enjoy how fast their connections are. If anything, the Ookla tests should overrepresent people who suspect they have a problem (a slow router, for example).

Personally, I only test when I think I have a problem.

It simply is not mathematically possible for half of U.S. homes to be using less than 25 Mbps if the homes in that range are assumed to be rural households, which represent less than 10 percent of total homes.

Even if 100 percent of rural homes could not buy 25 Mbps service, such locations only represent eight percent or so of all locations.

When average speeds are above 100 Mbps nationwide, it is not reasonable to conclude that those eight percent of locations--even if all were using connections slower than 25 Mbps--could represent half of all households.
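The sanity check can be made explicit with a quick sketch. The shares below are the rough estimates used in this post (roughly 10 percent of homes rural), not official figures:

```python
# Sanity check on the claim that half of U.S. homes cannot get 25 Mbps.
# Figures are the rough estimates cited in this post, not official data.

rural_homes_share = 0.10        # rural homes as a share of all U.S. homes (rough)
rural_without_25mbps = 1.0      # worst case: assume no rural home can get 25 Mbps
urban_without_25mbps = 0.0      # assume all urban/suburban homes can get 25 Mbps

# Upper bound on the share of homes that cannot get 25 Mbps service
upper_bound = (rural_homes_share * rural_without_25mbps
               + (1.0 - rural_homes_share) * urban_without_25mbps)

claimed_share = 0.50            # the Microsoft claim, as characterized here
print(f"Worst-case share without 25 Mbps: {upper_bound:.0%}")   # 10%
print(f"Claimed share: {claimed_share:.0%}")                    # 50%
```

Even granting the most extreme assumption, the claimed figure is five times the worst-case bound.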

Microsoft’s claims seem impossible to believe, even if we all agree the FCC data is incomplete or even wrong.

Wednesday, April 10, 2019

TVision Home by T-Mobile



T-Mobile US is getting into the linear TV subscription business. 

T-Mobile US has introduced “TVision Home,” its over-the-top linear video service for fixed network customers, in eight U.S. markets, apparently where its Layer3 TV service already had been available.

T-Mobile US also announced it will launch nationwide streaming services later in 2019, presumably based on use of T-Mobile’s own 5G network as well as an OTT app.

TVision Home offers 275-plus channels and more than 35,000 on-demand movies and shows. On-screen social content, a personalized home screen, a DVR for each user, smart speaker voice control with Amazon Alexa or Google Assistant, and access to security cameras are part of the service.

TVision Home launches with apps for Pandora, iHeartRadio, XUMO, CuriosityStream, Toon Goggles and HSN.

There also is a bit of regulatory arbitrage. Fixed network service providers are required to pay local fees and taxes for which TVision Home is not liable. That allows T-Mobile US to sell a service costing less than cable or telco suppliers charge.

The implications are clear enough. T-Mobile US is getting into the linear video business, to compete with cable TV operators and telcos. At the same time, it is getting into the live TV streaming business (linear, rather than on-demand), with on-demand video services such as Netflix also accessible as part of the app.

The bigger question is whether 5G mobile TV, in this form, also will begin to show consumer appetite for linear video consumed on mobile devices. Mobile service providers have offered access to broadcast radio and TV for decades, but with no real traction. The shift to the full palette of managed video channels is new, though.

Among the questions is how some consumers might start to use the service, casting to TVs, for example.

Tuesday, April 9, 2019

ISPs are Not the Big Internet Problem

Among the disingenuous arguments raised about network neutrality is that it somehow “saves internet freedom.” It does not. The heart of the network neutrality argument is that internet service providers should be barred from offering any quality of service features for consumer internet access services, on the theory that any other policy would allow ISPs to degrade service on their networks, while creating “value added” services that run faster.


Ignore for the moment that fixed network speeds increased about 38 percent in 2018 alone; that 5G is delivering an order of magnitude faster speeds; that new low earth orbit satellite constellations are launching; that there are new ways to deploy fast fixed wireless access networks; that the number of ISPs actually is growing; and that you would be extremely hard pressed to find any actual instance of an ISP deliberately building a “crappy” network that runs slow, just to upsell customers to faster tiers.


Ignore the coming era of edge computing that will make network responsiveness even higher. Ignore Amazon’s entry into the ISP business; Google’s existing operations and Facebook’s satellite and other ISP operations.


Every public policy has private interest implications. Net neutrality essentially protects Google, Facebook and others from potential underlying cost pressures, even as all major app providers themselves pay to “speed up” their own services, using content delivery networks.


All those firms win when “everybody” has internet access, of good quality and low price. Net neutrality rules are viewed as ways to promote such outcomes. U.S. regulators never have allowed ISPs to block or degrade consumer internet access to all lawful applications, period.


So net neutrality cannot be about preserving consumer access to lawful applications, despite the breathless rhetoric. A fair assessment would be that the danger to consumer welfare comes from the content and application sphere these days: privacy violations; excessive sales of user data; biased filtering policies; selective censorship and so forth. None of those ailments are caused by ISPs.

Internet access keeps getting better, and rapidly. The 5G era is going to provide extraordinarily low latency and quality bandwidth at prices that are low, and dropping, in terms of price per bit, and often even in real terms (adjusted both for inflation and different consumer choices).

Monday, April 8, 2019

IoT Markets are Hard to Quantify with Any Precision

Quantifying the size of the edge computing or internet of things markets is difficult, since both have horizontal (general purpose use cases) as well as vertical (industry-specific) implications.


Most firms--even those focusing on IoT--will not likely derive most of their revenue from specific edge or IoT sales. When “everybody” can claim to be, and in fact might be, a participant in the broad ecosystem, it also is nearly impossible to specify the specific value or revenue upside. Virtually all the major cloud suppliers, mobile and fixed connectivity providers, end user appliance and general-purpose computing firms can claim to have some role.


That means virtually every large enterprise software or solution provider, every large ISP, every large computing solutions supplier, many device and component suppliers and systems integrators can claim to have a role.


Security, robotics, connectivity services, chipsets, databases, devices, enterprise software, consulting and implementation services, vertical industry software suppliers, server suppliers, analytics, embedded computing suppliers all would have plausible roles.


Consider just industrial IoT, which might include firms such as DHL, Hitachi, Huawei, SAP, GE, Rolls Royce, Dell, ARM, Bosch, Cisco, AWS, AT&T, Fujitsu, Deutsche Telekom, Telefonica, Google, HPE, IBM, Intel, Microsoft, Oracle, Qualcomm, Salesforce, Samsung, Sierra Wireless and PC-Tel, as well as any firm that competes with these firms.


Siemens, ABB, Honeywell, Schneider Electric, ON Semiconductors, Micron, Broadcom, Analog Devices, NXP, Texas Instruments, Gemalto, RSA, Symantec, Kaspersky, Palo Alto Networks, ThingWorx, Sigfox, Ingenu, Accenture, Fujitsu, Tata, Infosys, Tech Mahindra, Fanuc, Kuka, Boston Dynamics, Zebra and ProGlove also are among firms with possible roles to play.

Verizon, Fortinet, Samsara, Eaton, SonicWall, Johnson Controls, Vertiv and Honeywell are some of the other names any industrial IoT potential participants list would tend to include.

PTC, IGEL, Red Hat, Senet, Cambium, Lenovo, National Instruments, Particle, Rigado, Roambee, Armis, Claroty, Cradlepoint, ForeScout, Trend Micro, VDOO, Xage Security, Zingbox, Falkonry, Foghorn Systems, IoTium, KMC Controls and Seeq might also be names with industrial IoT credibility.


But few of those firms will be able to generate specific industrial IoT revenue that moves the needle much on overall firm revenues.


And much the same can be said for potential suppliers in any other vertical market IoT realm. As a general theme, IoT is going to be significant. But specific forecasts will be difficult.

Can 5G Providers Sell QoS?

Can 5G service providers charge a premium for low-latency performance guarantees, when the stated latency parameters--best effort--are already so low? That is a question that also might be asked in other ways.

Will 5G best-effort service be good enough, in latency and bandwidth terms, to obviate the need for any additional quality of service features to preserve low latency and bandwidth?

Is there a market for quality of service when delivered bandwidth rates are so high, and latency performance so much better than 4G? In a broader sense, as network performance keeps getting better on both latency and bandwidth dimensions, can connectivity providers actually sell customers on QoS-assured services?

Also, some would argue, it becomes problematic to try to maintain QoS when packets are encrypted at the edge. A service provider cannot prioritize what it cannot see. And encryption of most traffic is the growing trend.

By about 2020, Openwave Mobility estimates, fully 80 percent of all internet traffic will be encrypted. In 2017, more than 70 percent of all traffic was encrypted.

The other change is the emergence of edge computing for latency-sensitive applications. We can assume that the whole point of edge computing is to provide a level of quality assurance that cannot otherwise be obtained.

As content delivery networks provide such assurances to enterprises and content suppliers for a fee, it is likely that edge computing networks, or other networks relying on network slicing to maintain low-latency performance, will be sold as a service to enterprises that want that latency protection.

Such deals do not violate network neutrality rules, which do not apply to business services such as content delivery networks. So, ultimately, between encryption, network slicing, edge computing and CDNs, there might actually not be much of a market for consumer services featuring QoS.

Best-effort-only never has been part of the vision for next-generation networks, whatever might have been proposed for the public internet. According to the International Telecommunication Union, “a Next Generation Network (NGN) is a packet-based network able to provide services including Telecommunication Services and able to make use of multiple broadband, QoS-enabled transport technologies and in which service-related functions are independent from underlying transport-related technologies.”

Telecom Industry About to be Amazoned

Question: What happens to any market Amazon enters? Answer: public market valuations drop; market share shifts (or at least people expect that to happen with time).

Rhetorical question: What happens when Amazon enters the telecom business? And it will. Amazon is moving to commercialize its own fleet of nearly 4,000 low earth orbit satellites, to provide internet access to literally every square inch of the earth’s surface.

Amazon wants to be an ISP for the same reason Google and Facebook do, with one major difference: the revenue model. As any ad-supported app provider’s success hinges on the total number of people able to use the apps, so any commerce supplier’s fortunes rest on the number of consumers it can establish direct or indirect relationships with.

And Amazon believes it will do better when more connections can be made directly, without relying on the goodwill of governments or other private firms.

“Four billion new customers” is a big enough carrot to justify launching a constellation of nearly 4,000 low earth orbit satellites, making Amazon one of the world’s potentially biggest internet service provider firms.

It is easy to predict what the implications are for others in the connectivity services ecosystem.

The "Amazon effect" refers to the impact created by the online, e-commerce or digital marketplace on the traditional brick and mortar business model due to the change in shopping patterns, customer expectations and a new competitive landscape. https://www.investopedia.com/terms/a/amazon-effect.asp

Some note that Amazon’s business activities prevent inflation. That’s another way of saying prices cannot rise much.  


Surveys tend to show that contestants in any business believe entry by Amazon into their own markets affects gross revenue, profit margins and distribution channels.


To be sure, margin pressure already is a huge issue in the global telecom business for other reasons (competition, changes in end user demand). Amazon’s entry into one or more parts of the connectivity value chain will only worsen those pressures.

The big observation is that the telecom industry is about to be “Amazoned.”

Saturday, April 6, 2019

Will ITU Refarm All C-Band Spectrum for 5G?

At the World Radiocommunication Conference later in 2019 (WRC-19), it is perhaps likely that the entire C-band spectrum presently used by satellite operators will be reallocated for IMT-2020 (5G) purposes.

That is some of the backdrop to current discussions among satellite, cable TV and other interests, including the Federal Communications Commission, about reallocating up to 500 MHz of spectrum presently allocated to C-band satellite for 5G, itself a component of the overall 5G FAST plan.

“I believe the best option would be to pursue a proposal put forth by a large, ad hoc coalition of equipment manufacturers, wireless providers, and unlicensed users,” said FCC Commissioner Michael O'Rielly. “They recommend that the FCC allocate spectrum now used for satellite C-Band downlinks (3.7 to 4.2 GHz) for licensed mobile communications and designate 6 GHz spectrum (5.925 to 7.125), which includes the C-Band uplink, for unlicensed use.”

If approved, this approach would free up 1,700 megahertz of spectrum: 500 megahertz for licensed and up to 1,200 megahertz for unlicensed purposes.
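The arithmetic behind that total can be checked with a quick sketch, using the band edges cited in the proposal:

```python
# Check the band-width arithmetic behind the coalition proposal described above.
# Band edges (in GHz) are those cited in the text.

c_band_downlink = (3.7, 4.2)    # proposed for licensed mobile use
six_ghz_band = (5.925, 7.125)   # proposed for unlicensed use (includes C-band uplink)

def width_mhz(band):
    """Width of a (low, high) band, in megahertz."""
    low, high = band
    return round((high - low) * 1000)

licensed = width_mhz(c_band_downlink)   # 500 MHz
unlicensed = width_mhz(six_ghz_band)    # 1200 MHz
print(licensed, unlicensed, licensed + unlicensed)  # 500 1200 1700
```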

Satellite and mobile interests seem always to be at odds about spectrum allocation, so the positions taken on the latest C-band efforts will not be surprising.

A report by an advisory committee to the U.S. Secretary of Defense, co-written by Milo Medin, vice president of wireless at Google, and tech venture capitalist Gilman Louie, makes the point that such a development hinges on use of spectrum sharing, as pioneered by the Citizens Broadband Radio Service.

The report recommends the “NTIA, FCC and Department of State should advocate the reallocation of the C-band satellite spectrum to IMT-2000 5G use at the World Radio Conference later this year (WRC-19), and take measures to adopt sharing in all 500 MHz of the band in the United States on an accelerated basis for fixed operations.”

A shift of former C-band satellite spectrum in the 4-GHz region might also be more important than some believe, if global 5G supply chains and service providers build product volume in the 3-GHz to 4-GHz frequency ranges.

“In the near term, 3 and 4 GHz spectrum will likely serve as the dominant global bands that drive volume in infrastructure and device deployments,” the authors argue.

And that provides some idea of the importance of how the Federal Communications Commission sets policy for refarming as much as 500 MHz of C-band spectrum in the United States, which is in the crucial band the authors say will be an area of robust supply chain focus.

Friday, April 5, 2019

Will 6G be Based on Frequencies at and Above 95 GHz?





Some already speculate, based at least in part on actions by the U.S. Federal Communications Commission, that future mobile platforms, including 6G, will use almost-impossibly-high frequencies in the bands above 95 GHz. What applications could develop based on such frequencies is yet unknown. Something beyond virtual reality, augmented reality and artificially-intelligent apps is where we would be looking.

What Digital Transformation Really Means

Digital transformation is one of those somewhat-nebulous terms one hears all the time when it comes to what enterprises need to do to survive and thrive in their future markets. One hears all sorts of near-platitudes about how companies must now continuously reinvent their business processes.

The not-as-often mentioned reason for all this “digital reinvention” is that firms and industries must adapt to an era of direct-to-consumer business models that disrupt or destroy traditional distribution channels.

“Digital connections will cut out the middleman” while “manufacturers will sell directly to customers,” Forrester Research researchers say. All of that means “changing the economics of selling, service, and fulfillment.”

In other words, the carrot is better performance in a direct-to-consumer world. The stick is business disruption and loss of markets.

The sort of typical way this is stated is that firms must create the ability to deliver easy, effective and emotional customer experiences that customers value. Many would say the winners are working from the customer’s perspective, not the organization’s view. That is almost too lyrical.

Digital transformation is much more raw; a response to more-difficult markets characterized by growing transparency of supply and prices that combine to attack profit margins.

In other words, though we often think of digital transformation as something firms need to do--and though that is an apt characterization--digital transformation speaks to the “Amazoning” or “Alibaba-ing” of virtually all markets.

“Using hardware, software, algorithms, and the internet, it's 10 times cheaper and faster to engage customers, create offerings, harness partners, and operate your business,” say researchers at Forrester Research.

The ability to create and support digital marketplaces is one angle. But it is more than widespread “online commerce.” Nor is it just the ability to create digital products and services.

“You want to be customer obsessed, not competitor obsessed,” Forrester researchers say.

All true. But what is really happening is a drastic altering of the balance of power between customers and companies.

And that means lower revenue and lower profit margins as transparency destroys pricing premiums created by consumer lack of knowledge or accessibility.

Non-efficient pricing becomes nearly impossible.

So, ignoring all the fancy phrases, digital transformation aims to prepare firms for a direct-to-consumer world, with all that implies for the ways services and products are created, marketed, sold and supported.

Thursday, April 4, 2019

Will AR and VR Finally Make the Net Neutrality Debate Superfluous?

“Network neutrality” rules never have been designed to prevent business services from providing different levels of service; prioritized delivery or quality of service. That is precisely why content delivery networks add value.

The issue is whether rules mandating nothing other than “best effort” internet access for consumers actually are good policy going forward, if one assumes that any number of new apps and services based on augmented reality or virtual reality are going to be important to consumers some day.

With the caveat that there is much nonsense in the arguments made in favor of network neutrality rules (“save the internet” being among the most obvious examples), it seems clear that if VR and AR require stringent control of latency, then forbidding anything other than best-effort internet access is going to be an obstacle to AR and VR apps and services.

For gaming apps, a human requires 13 milliseconds or more just to detect an event. A motor response by a gamer might add 100 ms of latency, just to react. But then consider virtual reality or augmented reality use cases.

To be nearly indistinguishable from reality, one expert says, a VR system ideally should have a delay of seven to 15 milliseconds between the time a player moves their head and the time the player sees a new, corrected view of the scene.

The Oculus Rift can achieve latency of about 30 ms or 40 ms under perfectly optimized conditions, according to Palmer Luckey.

There also are other latency issues, such as display latency. A mobile phone, for example, might add 40 ms to 50 ms to render content on the screen. Any display device is going to add about that much latency, in all likelihood.

The point is that end-to-end latency is an issue for VR apps, and edge computing helps address a potentially-important part of that latency.  
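The budget arithmetic can be sketched directly. The numbers below are the rough figures cited above (a seven-to-15 ms ideal window and a 40 ms to 50 ms display contribution), not measurements:

```python
# Back-of-the-envelope motion-to-photon budget for VR, using figures cited above.
ideal_window_ms = (7, 15)   # ideal head-motion-to-updated-view delay for VR
display_ms = (40, 50)       # latency a phone-class display adds just to render

# Even before any network hop, the display's best case exceeds the ideal window:
slack_ms = ideal_window_ms[1] - display_ms[0]
print(f"Slack left for everything else: {slack_ms} ms")  # -25 ms

# So every other stage (encode, transport, decode) must shave milliseconds,
# which is the argument for edge caching plus low-latency 5G transport.
```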

To have any hope of reducing latency to tolerances imposed by the displays themselves, VR and AR content will have to have extensive forms of quality of service guarantees, almost certainly by caching content at the very edges of the network, and using networks such as 5G with very low latency.

To be sure, it is not clear that something other than 5G best effort latency is a problem if the edge data centers are close enough to the radio sites. On the other hand, neither is it obvious that an edge services provider can be legally barred from charging for the use of what is a next-generation content delivery network.

And that might ultimately be the practical resolution of the “best effort only” conundrum. Perhaps standard best-effort delivery on a 5G network is good enough to support VR and AR, so long as content is edge cached. So there are no fast lanes or slow lanes: all lanes are fast.

On the other hand, edge computing services can charge a market rate for use of their edge computing networks.

Indirect Monetization of Language Models is Likely

Monetization of most language models might ultimately come down to the ability to earn revenues indirectly, as AI is used to add useful fe...