Friday, May 25, 2018

Technology Predictions are Often Spectacularly Wrong

Ernest Rutherford, the physicist who won a Nobel Prize in chemistry for his pioneering work in nuclear physics, once experimented with radio waves, but gave it up when told radio had no future.

The point is that even the best and brightest minds in technology often are very wrong about how a particular innovation will develop. So humility is not a bad attitude for any market researcher to adopt.

In 1943, IBM CEO Thomas Watson said, “I think there is a world market for maybe five computers.”

Ken Olsen, Digital Equipment Corp. president, said in 1977 that “there is no reason for any individual to have a computer in his home.”

Western Union execs once argued the telephone “is inherently of no value to us.”

Thomas Edison said “fooling around with alternating current is just a waste of time.”

Others argued the automobile was a novelty or fad. Studio executive Darryl Zanuck once said “television won’t be able to hold on to any market it captures after the first six months.”

Even Marty Cooper, a mobile phone pioneer, once argued that “cellular phones will absolutely not replace local wire systems.”

Robert Metcalfe, a father of Ethernet, said “the internet will soon go spectacularly supernova and in 1996 catastrophically collapse.”

Steve Ballmer, Microsoft CEO, once said “there’s no chance that the iPhone is going to get any significant market share.”

The point is that the best and brightest among us, and certainly even competent market researchers, can be spectacularly wrong. That might be especially the case when subjects we do not routinely cover start to reshape industries, markets and possibilities, but such dangers always exist, even in markets we do claim to cover.

“You don’t know what you don’t know” is one explanation for forecasting error. No forecaster can incorporate unforeseen events such as major recessions, huge changes in underlying core technologies, wars, revolutions and other exogenous shocks that directly affect markets of any sort.

Also, we tend to work with two-dimensional spreadsheets. Reality necessarily is more complex than that. So humility always is a good attitude to maintain when making predictions.

Why Software-Defined Networks are Destined to be More Important

There is a good reason why software-defined networks are getting more attention. Simply put, all business and consumer networking use cases are becoming “cloud-based” interactions that are themselves largely virtualized.

For enterprises and businesses, that tends to mean a shift towards virtual private networks supporting mobile, remote and distributed users who require access to cloud-based computing resources, no matter where they are working. That also implies a need for security and quality of service support provided by VPNs.


So a reasonable person might conclude that the shift in enterprise networking will be in the direction of software-defined networking (SD-WANs, VPNs, virtualized core networks, network slicing).

Call Center Experience the Weakest Part of Service Provider Performance

Unhappiness with call center interactions seems to be the key reason for low customer satisfaction scores for linear video subscription services (the survey asks no direct question about price or the price-value relationship).

And though fixed-line voice service ranks higher, the same pattern of unhappiness with call center experience occurs there. Call center experience also is the lowest-ranked feature of mobile satisfaction tracked by ACSI.

I cannot remember a time in three decades when cable TV services got high satisfaction ratings. Nor, since the American Customer Satisfaction Index (ACSI) began tracking internet service providers, do I recall ever seeing high satisfaction ratings for internet access service, either.

In fact,  “subscription television and internet service providers rank last among all industries tracked by the ACSI,” a placement that has been consistent for several years.

It appears much of the problem lies with customer interaction and customer support, especially the use of call centers.

Mobile phone services, on the other hand, score much higher, though not as high as the devices themselves. Fixed network phone service, perhaps paradoxically, now scores higher on satisfaction than subscription TV or ISP service. The likely explanation is that all the unhappy customers have left.

The 2018 annual ACSI report on such services only confirms the trend.


Customer satisfaction with subscription television service fell 3.1 percent to an ACSI score of 62, an 11-year low, ACSI says.

New indices for video on demand and video streaming initially show higher scores, with video streaming services on par with mobile phones.  



Thursday, May 24, 2018

Moore's Law Really Does Matter

Moore’s Law and optical fiber matter when it comes to fixed network internet access speeds.

Back in the early 1980s, when I first got into the cable TV business, many rural systems were operating at less than 200 MHz of total analog bandwidth, the first big-city franchises were about to be awarded, and state-of-the-art systems promised to operate at 400 MHz to 450 MHz.

All that was before optical fiber and the hybrid fiber coax architecture, and before the need for reliable two-way communications or data services.

Because of Moore’s Law advances and optical fiber, HFC physical bandwidth now pushes between 750 MHz and a gigahertz, and internet services now deliver between hundreds of megabits per second and a gigabit per second, using DOCSIS 3.1.
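
As a rough illustration of why raw spectrum matters so much, the Shannon limit relates usable bandwidth to a theoretical throughput ceiling. The sketch below is not a DOCSIS-specific calculation; the 35 dB signal-to-noise figure is simply an assumption chosen to make the arithmetic concrete.

# Rough Shannon-capacity sketch: how raw coax spectrum translates into a
# theoretical throughput ceiling. The 35 dB SNR is an illustrative assumption,
# not a DOCSIS specification.
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon limit: C = B * log2(1 + SNR), with SNR supplied in dB."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

for label, bw_hz in [("1980s rural cable system (~200 MHz)", 200e6),
                     ("Modern HFC plant (~1 GHz)", 1e9)]:
    capacity = shannon_capacity_bps(bw_hz, snr_db=35)
    print(f"{label}: ~{capacity / 1e9:.1f} Gbps theoretical ceiling")

Actual DOCSIS throughput sits well below that ceiling, since much of the spectrum still carries video and the upstream split is narrow, but the direction of the arithmetic is the point.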


It is possible, perhaps likely, that bandwidth will grow further beyond planned improvements to DOCSIS 3.1. Indeed, there already is speculation about what might be possible with next-generation DOCSIS that harnesses new spectrum ranges. Other proposed ways of increasing symmetrical bandwidth rely on all-fiber networks or full-duplex networking, where the same spectrum is used for both upstream and downstream communications.

The point is that advances in computing power, with lower prices, plus optical fiber, make possible amounts of commercial bandwidth that would have been unthinkable back in the early 1980s.

Nobody Knows "How Many" Facilities-Based Telcos Can Exist in a Mature Market

In most fixed network telecom markets, the reality is that only a single facilities supplier is financially sustainable on a national basis, so competition usually takes the form of wholesale obligations. Mobile markets historically have featured at least two to four facilities-based competitors.


But as in the fixed network market, there are questions about sustainable numbers of contestants as the market matures. Over time, fewer competitors are generally expected.


The big issue for regulators is how few competitors are required to provide the benefits of competition, but on a sustainable basis. And that answer is not yet known.


“The idea that the U.S. mobile market has an equilibrium of four firms (nationally, at least) is an emotional and not a scientific conclusion,” said George Ford, Phoenix Center for Advanced Legal and Economic Policy Studies chief economist.


In other words, four national providers might not be sustainable. That view, Ford argues, is entirely consistent with the financial struggles of Sprint and T-Mobile US. Even Arcep, the French regulator, now hints that it might allow consolidation in the French mobile market, something it long has resisted.


Still, the U.S. Department of Justice said in 2011 that a transition from four to three mobile providers in the U.S. would constitute an unacceptable reduction in the number of competitors.


The proposed combination of AT&T and T-Mobile US would have raised market concentration scores on the Herfindahl-Hirschman Index (HHI) by more than 400 points, a level guaranteed to raise antitrust scrutiny.


Such an increase is not an absolute barrier to any particular merger, but it places a strong burden on the proponents to show why the merger is not anticompetitive.
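
For readers unfamiliar with the HHI arithmetic, a small sketch (using made-up market shares, not the actual 2011 figures) shows how merging two of the players moves the index:

# Herfindahl-Hirschman Index sketch with hypothetical market shares
# (illustrative only; these are not the actual 2011 U.S. mobile shares).
def hhi(shares_percent):
    """HHI is the sum of squared market shares, expressed in percentage points."""
    return sum(s ** 2 for s in shares_percent)

before = [34, 25, 18, 15, 8]   # five hypothetical carriers, shares in percent
after = [34, 25 + 8, 18, 15]   # second-largest carrier absorbs the smallest

print(f"HHI before merger: {hhi(before):.0f}")  # 2394
print(f"HHI after merger:  {hhi(after):.0f}")   # 2794
print(f"Increase:          {hhi(after) - hhi(before):.0f} points")  # 400

Under the 2010 Horizontal Merger Guidelines, markets above an HHI of 2,500 are treated as highly concentrated, and mergers that raise the index by more than 200 points in such markets are presumed likely to enhance market power; a jump of more than 400 points therefore all but guarantees a close look.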


Though “what is the market?” is not a big question for regulators who will look at the Sprint merger with T-Mobile US, there are going to be bigger questions for some of the other possible mergers, starting with AT&T and Time Warner.

Such foundational questions about the relevant market also are likely to be an issue for regulators if, in the future, they look at market concentration in application markets led by the likes of Google, Facebook and Amazon.

Talk Talk Sales Show Value of Indirect Channel

There is a reason why many products are sold using indirect (partner) channels. This chart from Talk Talk in the United Kingdom illustrates the basic economics: Talk Talk sells 83 percent of its products wholesale, using indirect channels (channel partners).

The traditional reason for using indirect channels is that a supplier cannot afford to sell direct to customer segments targeted by the channel partners.

In fact, Talk Talk reports that earnings (cash flow) from the indirect channels used to sell to consumers are about the same as cash flow from the business customer segment, which is sold using a direct sales force.

source: Talk Talk

Wednesday, May 23, 2018

Why 4K/8K TV is a Waste for Most People

For most consumers, 4K and 8K TVs are unlikely to provide an actual experience boost, despite the denser pixel count. The reason is that the human eye cannot distinguish 4K or 8K from 1080p unless a person sits uncomfortably close to the screen, or unless the screen is very large. Simply put, 4K is a waste of money, as 8K will be, for most people.

Most people simply do not sit close enough to the screen to perceive the difference 4K or 8K can provide.  
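
A back-of-the-envelope check makes the point. Assuming roughly one arcminute of visual acuity for 20/20 vision (a standard rule of thumb, and an assumption here), you can estimate the distance beyond which extra pixels stop being visible for a given screen size:

# Rough viewing-distance sketch: beyond what distance do adjacent pixels blur
# together for a viewer with ~1 arcminute of acuity (a 20/20-vision assumption)?
import math

ARCMIN_RAD = math.radians(1 / 60)  # about 0.00029 radians

def max_useful_distance_m(diagonal_inches, horizontal_pixels, aspect=16 / 9):
    """Distance beyond which one pixel subtends less than one arcminute."""
    width_m = diagonal_inches * 0.0254 * aspect / math.hypot(aspect, 1)
    pixel_pitch_m = width_m / horizontal_pixels
    return pixel_pitch_m / ARCMIN_RAD

for label, pixels in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    d = max_useful_distance_m(65, pixels)
    print(f"65-inch {label}: extra detail visible only within ~{d:.1f} m ({d * 3.28:.1f} ft)")

For a 65-inch screen that works out to roughly 2.6 m for 1080p, 1.3 m for 4K and 0.6 m for 8K; since most couches sit 2.5 m to 3 m from the screen, the extra pixels go unseen unless the screen gets much larger or the viewer moves much closer.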

It is obvious why consumer electronics companies want to sell you new TVs. TVs no longer break, and manufacturers need new reasons for you to buy a new screen and move the existing screen to a bedroom or elsewhere in a house.

Content developers have their own reasons for wanting higher resolution: it is part of the decades-long effort to create greater realism and experiential immersion.


The trend to bigger screens therefore makes sense. Either people have to move closer to their screens, or screens have to get much bigger. Bigger screens probably are the only realistic option.

But 4K and 8K really do make sense for business, medical, industrial and other applications where a human operator actually is very close to a screen with very rich detail.
