Tuesday, December 10, 2019

Powerline Electromagnetic Effect is No More Demonstrable Now than 40 Years Ago

In addition to concern about any possible causal link between mobile phones and cancer, some see dangers elsewhere, including electromagnetic fields around power lines. At least one older study suggested there is a connection between childhood leukemia and living near power lines.  

A 2018 paper in Nature reviewed dozens of studies over decades and determined that kids who lived within 50 meters (165 feet) of a 200 kilovolt or higher power line had a slightly elevated risk of contracting leukemia. 

But the researchers concluded that the magnetic fields couldn’t be the culprit, because their measured intensity wasn’t high enough to explain the findings.

Power line magnetic fields top out at about 2.5 microteslas when you’re directly underneath, whereas the earth’s magnetic field, to which we’re all exposed all the time, varies from 25 to 65 microteslas, 10 to 26 times higher. In other words, it is impossible to separate the effect of living near power lines from generalized background effects. 
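As a sanity check, the arithmetic behind that comparison is trivial; a minimal sketch using only the figures cited above:

```python
# Ratio of Earth's background magnetic field to the field under a power line,
# using the figures cited above (all values in microteslas).
POWER_LINE_FIELD_UT = 2.5            # directly beneath a high-voltage line
EARTH_FIELD_RANGE_UT = (25, 65)      # geomagnetic field at the surface

for earth_field in EARTH_FIELD_RANGE_UT:
    ratio = earth_field / POWER_LINE_FIELD_UT
    print(f"{earth_field} uT background is {ratio:.0f}x the power line field")
# Prints 10x and 26x: background exposure dwarfs the power line contribution.
```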

“Even those researchers who have found a correlation between high-voltage power lines and childhood leukemia are dubious that what they’re measuring has to do with power lines at all.” In 2005, one researcher suggested that living near power lines is correlated with something else that really does increase a child’s leukemia risk.

 “Reasons for the increased risk, found in this and many other studies, remain to be elucidated,” wrote the researchers.

Some researchers have speculated that the “something else” is the set of hygiene, nutrition, general quality of life and chemical exposure conditions that exist in the communities through which high-voltage power lines are allowed to cross.

How Important is Net Promoter Score in Telecom?

The net promoter score is considered useful as a predictor of potential revenue growth, the theory being that customers willing to recommend a firm are loyal, and therefore, repeat buyers. So the higher the net promoter score, the better positioned a firm is supposed to be, in terms of ability to generate a profit.
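For readers unfamiliar with the mechanics: respondents rate, on a zero-to-10 scale, how likely they are to recommend the company; nines and 10s count as “promoters,” zero through six as “detractors,” and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch of that standard arithmetic (the sample ratings are invented for illustration):

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 'likelihood to recommend' ratings.

    Promoters rate 9-10, detractors rate 0-6; passives (7-8) count
    toward the total but neither add nor subtract. The result ranges
    from -100 (all detractors) to +100 (all promoters).
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Invented sample: mostly passives and detractors, as in cable TV/ISP surveys.
print(net_promoter_score([10, 9, 8, 7, 7, 6, 5, 3, 2, 0]))  # -30.0
```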

Bain and Company fellow Fred Reichheld, inventor of the net promoter score, an index of customer willingness to refer a product to others, once famously argued that loyal customers were more profitable. 

The argument is that loyal customers generate increasing profits each year they stay with a company, in part because they buy more, and because they impose fewer operating costs. They know how to use a company’s products, have figured out why they use a product and therefore are less likely to have questions about billing and other elements of the product experience. 

They also arguably make more referrals to others, which is what the NPS attempts to measure. In many cases, loyal customers might also be willing to pay a premium rather than switch. 

“In financial services, for example, a five percent increase in customer retention produces more than a 25 percent increase in profit,” Reichheld argued. 
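A stylized illustration of why retention compounds (the margin and retention figures below are invented for illustration, not Reichheld’s data): if a customer generates a fixed margin each year and is retained with probability r, expected lifetime value is the geometric series margin/(1-r), which is highly sensitive to small changes in r.

```python
def lifetime_value(annual_margin, retention_rate):
    """Expected customer lifetime value as a geometric series:
    margin * (1 + r + r^2 + ...) = margin / (1 - retention_rate)."""
    return annual_margin / (1 - retention_rate)

base = lifetime_value(100, 0.80)    # 500.0
better = lifetime_value(100, 0.84)  # 625.0, after a 4-point retention gain
print(f"LTV lift: {100 * (better / base - 1):.0f}%")  # 25% more lifetime profit
```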

But some question its relevance and predictive power, as popular as NPS is in many firms. “Two 2007 studies analyzing thousands of customer interviews said NPS doesn’t correlate with revenue or predict customer behavior any better than other survey-based metrics,” two reporters for the Wall Street Journal report. “A 2015 study examining data on 80,000 customers from hundreds of brands said the score doesn’t explain the way people allocate their money.”

Of all the criticisms, lack of predictive capability might be the most significant, since that is what the NPS purports to do: predict repeat buying behavior. 

“The science behind NPS is bad,” says Timothy Keiningham, a marketing professor at St. John’s University in New York, and one of the co-authors of the three studies. “When people change their net promoter score, that has almost no relationship to how they divide their spending,” he said. 

Others might argue that social media has changed the way consumers “refer” others to companies and products. Some question the methodology itself.

As valuable as the “loyalty drives profits” argument might be, it is reasonable to question how well the NPS, or any other metric purporting to demonstrate the causal effect of loyalty or satisfaction on repeat buying, actually can predict such behavior. 

For similar reasons, it might be fair to question relevance in industries that habitually score at the very bottom of U.S. industries on NPS, such as the internet service provider business or the cable TV business. While NPS scores can range from -100 to 100, cable TV and ISP service ranks in negative numbers, 2019 U.S. NPS scores show. 


Those rankings are congruent with satisfaction surveys published by the ACSI, which show internet access and cable TV at the bottom of all industries ranked, virtually year after year. 

One issue with the NPS is that some argue customer satisfaction is what is measured, not loyalty. The difference is subtle, but possibly important. 

Surveys have shown that even satisfied customers will switch brands. The point of loyalty is that customers show resistance to switching. And some point out that only “complete satisfaction” is highly correlated with loyalty (repeat buying behavior). Merely “satisfied” customers arguably are as fickle as unhappy customers. 



Some might argue that the NPS or other measures of satisfaction are more important in highly competitive industries, while of little use in monopolized businesses. This 1995 chart shows how little customer satisfaction mattered in the telephone business, then a monopoly. Whether customers were very satisfied or completely dissatisfied, buying behavior was not affected. There were no choices. 

These days, as the telecom business is significantly competitive, customer satisfaction arguably matters more. Possibly nobody would claim customer satisfaction does not matter as a contributor to customer loyalty (repeat buying). But neither is it completely clear how important satisfaction actually is. 

Nor is it possible to divorce the importance of customer targets from the broader satisfaction measures. Any firm has to match its offers with the right audience, not just the right features and value proposition.

Targeting the wrong customers will generally fail, with high rates of churn and customer dissatisfaction. The oft-cited example is chasing price-sensitive customers who will quickly churn off once the discounts end. 

Customer satisfaction is not the same as customer loyalty, in other words. But it might still be argued that net promoter scores do matter within an industry, as a way of measuring performance against a firm’s competitors. In other words, it might well matter if Verizon’s service gets a higher NPS than Comcast’s. 

Still, little research seems to have been done on circumstances where NPS actually is misleading or irrelevant. Declining industries might be an instance where NPS scores, higher or lower, do not matter much, as revenues are shrinking inexorably. At the margin, slower rates of decline are better than faster rates, so higher NPS might have some value. 

Still, if demand is declining, ultimately even a high NPS does not matter. The market is shrinking, so high recommendation scores will not fundamentally change revenue prospects.

Monday, December 9, 2019

FCC Says 5G Just as Safe as Other Gs

Are 5G or other mobile phones safe to use? The scientific evidence so far suggests the clear answer is “yes.” And after a new review in light of 5G network launches, the U.S. Federal Communications Commission has found no reason to modify its existing guidelines. 

“As noted by the FDA, there is no evidence to support that adverse health effects in humans are caused by exposures at, under, or even in some cases above, the current RF limits,” says the FCC. “Indeed, no scientific evidence establishes a causal link between wireless device use and cancer or other illnesses.”

The FDA also maintains that “[t]he scientific evidence does not show a danger to any users of cell phones from RF exposure, including children and teenagers,” the FCC says. 

The World Health Organization (WHO) states that “[f]rom all evidence accumulated so far, no adverse short- or long-term health effects have been shown to occur from the RF signals produced by base stations.”

The FDA maintains that “[t]he weight of scientific evidence has not linked cell phones with any health problems” and that “the current safety limits for cell phones are acceptable for protecting the public health.”

Upon review of the record, the FCC says, “we find no appropriate basis for and thus decline to initiate a rulemaking to reevaluate the existing RF exposure limits.”

In a recent proceeding taking a long look at radio frequency emissions standards, the FCC found a “lack of data in the record to support modifying our existing exposure limits.”

“Specifically, no expert health agency expressed concern about the Commission’s RF exposure limits. Rather, agencies’ public statements continue to support the current limits,” the FCC says. 

“Our existing exposure limits were adopted following recommendations from the U.S. Environmental Protection Agency (EPA), the Food and Drug Administration (FDA), and other federal health and safety agencies,” the FCC says.

“While research on the health effects of RF energy continues, no evidence has moved our sister health and safety agencies to issue substantive policy recommendations for strengthening RF exposure regulation,” the FCC says. 

Indeed, says the FCC, the standards some argue the FCC should adopt are millions to billions of times more restrictive than current FCC limits. The practical result is that “no device could reliably transmit any usable level of energy by today’s technological standards while meeting those limits.” In other words, no cell phone use whatsoever, by anybody. 

“There is no scientific evidence in the record that such restrictive limits would produce any tangible benefit to human health, or provide any improvement over current protections against established risks,” the FCC says. 

AWS Wavelength Does Not Create a Platform Opportunity for Telcos

One of the most-common suggestions for connectivity service providers selling to consumers is the notion that the business model has to evolve from “connectivity” (dumb pipe) to something else, up to and including “becoming platforms.”

So look at what telcos have been doing in the edge computing business so far. You might argue the approach is not “becoming a platform” but supplying dumb pipe (hosting, in this case). Amazon Web Services, for example, is partnering with several tier-one telcos to create edge computing-as-a-service nodes. 

Wavelength is a physical deployment of AWS services in data centers operated by telecommunication providers to provide low-latency services over 5G networks. Operators signed up so far include Verizon, Vodafone Business, KDDI and SK Telecom.

Keep in mind, in this arrangement, it is AWS that becomes the platform. The telco participates as a supplier of rack space and related services, and benefits indirectly to the extent that its connectivity service adds value. 

Taxonomically, the telco acts as a “pipeline” business, creating a capability (server hosting) and selling it direct to a customer (AWS). Most businesses historically have been pipelines, creating products and selling to customers, so that is not unusual. 

The important fact to note is that, for this particular opportunity, telcos are not seeking to create a platform. AWS is the platform. Telcos sell a pipeline service, which, by definition, is sold to a single type of customer. 

A platform, also by definition, involves becoming a marketplace where services or products are sold to at least two different groups of constituents, and where the platform enables transactions. 

Ridesharing services, for example, are platforms, linking drivers and riders, but not owning or creating the resources used for fulfillment. 

The optimistic view on creating a platform is that any product can become a platform if information or community can create new value. The unstated corollary is that the “platform” activities must generate incremental revenue. 

And there is no shortage of recommendations that telcos become platforms. “Operators will have to shift from traffic monetization (relying mainly on connections) to traffic value monetization (inclusive of rate, latency, and slicing),” say HKT, GSA and Huawei. “In other words, operator business models must provide both intelligent platforms and services instead of merely traffic pipes.”

If you have been in the communications business long enough, you have heard that suggestion almost all the while you have been in business. “Value, not price” is the way forward, one hears. 

That is correct, up to a point. Any pipeline business can add, augment or replace the actual products it creates and sells to customers. It matters not what the product is that is created and sold to customers. Generally speaking, this is the meaning of the advice to “move up the stack,” supplying value beyond connections, bandwidth or minutes of use. 

The more-challenging notion is “become a platform.” The Wavelength deal is not the only way telcos can participate in the edge computing business. But it is unlikely to create a platform for telcos because, by definition, the sale of hosting to AWS is not a platform business model. 

Quite the contrary: Wavelength is both traditional “hosting” and a direct outgrowth of the way AWS has sourced computing infrastructure in the past, including a mix of owned and leased facilities. Along the way, AWS has had to create and get comfortable with the idea of its servers operating in somebody else’s facilities.

Up to this point, the “somebody else” has been third-party data centers. But edge computing requires even more decentralized facilities. Hence Outposts, a rack of servers managed by AWS but physically located on a customer’s premises. 

The customer provides the power and network connection, but everything else is done for them. If there is a fault, such as a server failure, AWS will supply a replacement that is configured automatically. Outposts runs a subset of AWS services, including EC2 (VMs), EBS (block storage), container services, relational databases and analytics. S3 storage is promised for some time in 2020. 

Local Zone, currently only available in Los Angeles, is an extension of an AWS Region, running in close proximity to the customers that require it for low latency. Unlike Outposts, Local Zone is multi-tenant. AWS deploys only when there is a critical mass of customers unable to take advantage of an established AWS region. 

All three services essentially are built on Outposts hardware and local server facilities housed at third-party sites. As AWS had to get comfortable with third-party data centers hosting its servers, so Outposts extends that hosting model to enterprises. 

A Local Zone is effectively a large group of Outposts. 

Wavelength involves Outposts located inside a telco facility of some kind, likely often a central office. AWS is early to move, but the other hyperscale computing-as-a-service providers are expected to make big moves toward edge computing facilities as well. 
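From a developer’s point of view, a Wavelength Zone is expected to look like just another zone to deploy into. A minimal sketch using boto3, assuming the account is already opted in to the relevant zone group; the VPC, CIDR and AMI identifiers are placeholders:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Wavelength Zones appear alongside ordinary Availability Zones once the
# account is opted in to the zone group.
zones = ec2.describe_availability_zones(AllAvailabilityZones=True)
wavelength_zones = [z["ZoneName"] for z in zones["AvailabilityZones"]
                    if z.get("ZoneType") == "wavelength-zone"]

# Placeholder IDs: a real VPC, CIDR block and AMI would go here.
subnet = ec2.create_subnet(
    VpcId="vpc-0123456789abcdef0",
    CidrBlock="10.0.8.0/24",
    AvailabilityZone=wavelength_zones[0],   # e.g., a carrier 5G metro zone
)

# Launching at the 5G edge uses the same API as any other EC2 launch.
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="t3.medium",
    MinCount=1,
    MaxCount=1,
    SubnetId=subnet["Subnet"]["SubnetId"],
)
```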

By 2023, by some accounts, the hyperscale cloud computing firms will be spending $23 billion in a single year to create edge computing facilities, about half of total capex in that year. 

All interesting. And telcos are likely to experiment with other initiatives in edge computing. But Wavelength does not achieve the objective of creating a platform.

Telcos May Not Even Try to Enter the General-Purpose Computing as a Service Business at the Edge

It has to be said: in choosing to supply AWS with edge hosting facilities, a few tier-one telcos, likely to be followed by others, are making a considered bet that edge computing as a service is likely to be led, if not dominated, by the same providers that dominate computing as a service generally. 

As they have found in other areas, winning a fight with Google, Facebook, Amazon and others in the application space is unlikely. That rational belief appears to condition strategies in the emerging edge computing space as well.

The AWS deal seems to signal a belief that the general role of edge computing supplier will be led by the hyperscalers. Hosting (the real estate role), on the other hand, might work. There are trade-offs: the highest-margin role likely will remain with the hyperscalers.

But a significant hosting role would be a win for most telcos, which have generally not been able to carve out similar roles in the existing data center business.

It likely is too late for telcos to replicate the hyperscaler role in “as a service” computing, at the edge or elsewhere.

But edge data center hosting gives them another chance to carve out a role in the ecosystem. And the AWS partnership balances risk and reward, even if it signals belief in a smaller potential role in actual edge computing.

Telcos have not been hugely successful, outside of mobility or video entertainment, in creating big new businesses and revenue streams. Edge computing seems a promising area.

But many could recall that the data center business also was seen as a logical area of new revenue generation that meshed with the existing core competency of connections and data transport.

So recent deals between Amazon Web Services and a handful of tier-one service providers are instructive.

Basically, AWS Wavelength creates AWS edge computing nodes at the edge of the telco network. So telcos act as providers of hosting (racks, power, security, cooling). 

In seeking that role in edge computing, the telco partners avoid the heavy capex required to emulate what the hyperscalers can provide their customers, instead choosing the simpler hosting role.

What they may be hoping is that the AWS moves lead to similar deals with many of the other hyperscale computing-as-a-service providers, recreating a data center hosting role some telcos tried, and abandoned, earlier. 

While other roles are not foreclosed, the AWS partnerships suggest that executives do not believe they are in a position to invest in--or win--the battle for computing as a service. As many discovered earlier, the data center business has generally not been an area where telcos brought significant advantages. 

On the other hand, perhaps many are betting that an early lead can be gained in the “edge facilities” part of the data center business, before potential rivals can scale their efforts. Of course, the hyperscale computing as a service suppliers are at the top of the list of potential leaders of the coming edge computing business. 

So the optimistic view might be that although not in position to lead edge-based computing as a service, telcos might secure a meaningful role in the edge data center hosting business, which requires distributed smallish data center locations.

Telcos of course have long considered former central offices or switching centers to be ideal real estate, in that regard. In metro areas where most of the edge computing demand will develop, central offices sit at the center of access networks running a few miles or so from end user locations. 

At least some mobile switching offices are viable candidates for edge facilities as well.

Other roles are not foreclosed by the telco deals with AWS. It is conceivable that some vertical market services might develop where a few telcos are significant providers of the applications or capabilities. Vehicle communications and computing are logical candidates, for example. 

Still, the AWS deals are stark reminders that the edge computing ecosystem is, at the moment, most favorable for telcos as suppliers of rack space and communications. Most observers would probably agree with that assessment. 

That role also is arguably a capital-efficient and low-risk way to enter the market. Other roles are not foreclosed. But perhaps few observers really believe the long-term telco opportunity is greatest anywhere but in the “pipeline” and “real estate” areas.

Saturday, December 7, 2019

AWS Transit Gateway Gets AWS into the WAN Business

Amazon Web Services has launched Transit Gateway, a private network anchored by the nationwide AWS network. Among the advantages for enterprises is a reduction in the number of private lines required to connect branch offices and other remote locations. 

In essence, Transit Gateway puts remote locations on the AWS network, creating the equivalent of peering relationships. 
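The hub-and-spoke mechanics are visible in the API: create the gateway once, then attach each VPC, and terminate branch office VPNs on the same hub instead of running pairwise private lines. A minimal boto3 sketch, with placeholder resource IDs:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# The transit gateway is the regional hub; attachments are the spokes.
tgw = ec2.create_transit_gateway(
    Description="Hub replacing meshed private lines between sites"
)
tgw_id = tgw["TransitGateway"]["TransitGatewayId"]

# Attach a VPC (placeholder IDs); repeat per VPC instead of pairwise peering.
ec2.create_transit_gateway_vpc_attachment(
    TransitGatewayId=tgw_id,
    VpcId="vpc-0123456789abcdef0",
    SubnetIds=["subnet-0123456789abcdef0"],
)

# A branch office arrives as a site-to-site VPN terminated on the same hub.
ec2.create_vpn_connection(
    CustomerGatewayId="cgw-0123456789abcdef0",
    Type="ipsec.1",
    TransitGatewayId=tgw_id,
)
```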


What if Bandwidth were Free?

The "original insight" for Microsoft was the question: "What if computing computing were free?" It might have seemed a ludicrous question. Young Bill Gates reportedly asked himself what his business would look like if hardware were free, an astounding assumption at the time, when owning a computer was an impossibility, as they cost several millions of dollars. 

In 1970, when the only computers were mainframes, a computer cost perhaps $4.6 million.

“The mainframe we played tic-tac-toe on in 1968, like most computers of that time, was a temperamental monster that lived in a climate-controlled cocoon,” Gates wrote in his book The Road Ahead. “When I was in high school, it cost about $40 an hour to access a time-shared computer using a teletype.”

When Micro-Soft was founded, Gates concluded that the cost of computers would drop so far that hardware would no longer be a barrier to using them. In turn, that meant a huge business could be built supplying software for those computers. But nobody should minimize the near-crazy assumption made at the time: that million-dollar computers would become so cheap that the cost of computing was nearly free. 

Prices did drop by orders of magnitude in the personal computer era, confirming the original Gates insight. In 1972, an HP personal computer cost more than $500,000. In inflation-adjusted terms, an Apple II computer in 1977 cost $5,174, for example. 
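Using the figures cited above as rough endpoints, the implied annual rate of decline is easy to compute. A back-of-envelope sketch; note it mixes a 1970 mainframe price with an inflation-adjusted 1977 Apple II price, so treat the result as illustrative only:

```python
# Back-of-envelope: implied annual price decline between the cited endpoints.
mainframe_1970 = 4_600_000   # dollars, per the 1970 figure above
apple_ii_1977 = 5_174        # dollars, inflation-adjusted figure above

years = 1977 - 1970
cagr = (apple_ii_1977 / mainframe_1970) ** (1 / years) - 1
print(f"Implied price change: {cagr:.0%} per year")  # roughly -62% per year
```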

In 2004, Gates argued that “10 years out, in terms of actual hardware costs you can almost think of hardware as being free. I’m not saying it will be absolutely free, but in terms of the power of the servers, the power of the network will not be a limiting factor.”

You might argue that is a position Gates adopted only recently. Others would argue it has been foundational in his thinking since Micro-Soft (the original spelling of what became Microsoft) was a tiny company based in Albuquerque, New Mexico in 1975. But prices did, in fact, tumble. 

Microsoft's newer "insight question" was: "What if digital communication were free?" It's the same scenario, only this time it applies to the capacity to move data--audio and video as well as text--from one point to another. 

Communications industry executives hate the idea, but facts tend to support the notion that the cost of using bandwidth keeps dropping to the point where there is almost no barrier to using it. As Intel CEO Andy Grove once famously said, "If you think PC prices have plummeted, wait till you see what happens to bandwidth."

As much as telecom executives might rue the observation, bandwidth is approaching the point where its use does not impede creation and use of applications, no matter how bandwidth-intensive they might be. 

Among the biggest problems telecom service providers face is that connectivity prices in the digital era have shown a “disturbing” tendency to drop relentlessly, in some cases even trending toward zero. Consider the price of mobile service, which has fallen by about half since 1997, while prices for other products have increased 100 percent to 200 percent. 


Also, some products sold by internet service providers, including entertainment video subscriptions, virtually require very low bandwidth costs. According to Cisco, 80 percent of global traffic now consists of video. And that means virtually all networks must be designed to carry entertainment video at very low cost per bit. 
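Some rough arithmetic shows why (the bitrate and delivery cost below are illustrative assumptions, not measured figures): a two-hour stream at 5 Mbps moves about 4.5 gigabytes, so delivery cost per gigabyte has to be tiny for a video business to work.

```python
# Illustrative arithmetic: data moved by a single HD video stream.
bitrate_mbps = 5                                      # assumed HD bitrate
hours = 2                                             # one movie
gigabytes = bitrate_mbps * 3600 * hours / 8 / 1000    # megabits -> gigabytes
cost_per_gb = 0.01                                    # assumed delivery cost, $/GB
print(f"{gigabytes:.1f} GB moved, ~${gigabytes * cost_per_gb:.2f} to deliver")
# ~4.5 GB per movie: at entertainment-video volumes, cost per bit must be tiny.
```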

Bandwidth is not “free.” But it is affordable, and constantly getting more affordable. The key business implication is that internet bandwidth costs are, and will be, low enough so that use of internet apps is not impeded. 

Computing hardware, though not free, is no longer a barrier to widespread use. Neither is bandwidth. Though the trend has not yet reached ubiquity, that is the direction. A good analogy is electricity: it is affordable, not free, but costs are low enough that devices and applications based on its consumption are plentiful. 

So it will be with bandwidth.

Will AI Fuel a Huge "Services into Products" Shift?

As content streaming has disrupted music, is disrupting video and television, so might AI potentially disrupt industry leaders ranging from ...