Thursday, July 23, 2020

"New Normal" Might Well Look Pretty Much Like the Old Normal

This ABI Research forecast, showing how company analysts have revised projections of future activity because of the Covid-19 pandemic, also suggests why many predictions of a “new normal” might turn out to be mostly ephemeral. That is not the conventional wisdom, but there are precedents for “less change than might have been supposed.”


source: ABI Research


To be sure, the pandemic might accelerate any pre-existing trends. What already was in motion might “go faster” because of pandemic responses. That is why some observers say internet usage experienced a year’s worth of change in a couple of months when people were forced to work from home and stay home from school. 


It is common to hear arguments that many or most workers will continue to work from home, post pandemic, and not return to in-office work. Many argue business travel will never return to past levels. Some might argue that will last only until firms start losing market share to competitors who are getting face to face with prospects. 


Beyond that, there are possibilities that few who make the “nothing will be the same” arguments seem to consider. They assume there is no vaccine, or that most people will refuse to vaccinate. They tacitly seem to assume some new version of the virus keeps recurring, so that no country is ever fully “over” the pandemic. 


But people and firms will make different decisions if they no longer must worry about social distancing, masks and other protective measures. A non-scientific poll on Blind, for example, shows 66 percent of respondents reporting work from home is harming their mental health. Though impressionistic, that suggests people will want to return to normal work settings, once safety is no longer an issue. 


Productivity might also be an issue, longer term, if work from home spreads widely. 


Also, most businesses probably cannot sustain themselves with permanent social distancing. Profit margins are too thin to allow restaurants to operate at 25-percent capacity; airlines to block middle seats forever; elevators to restrict riders to only a few at a time. 


Expanding into Adjacencies is Risky, But Risks Must be Taken

When firms look for growth in slow-growth businesses, one obvious option is to seek adjacent roles in the ecosystem. The other common routes to “diversify” are to add new customer segments, new roles for existing products, new products, distribution channels or geographies. 


It never is especially easy, as analysts note that the odds of success decrease as firms move further away from their present “core” competencies and roles. 


source: Harvard Business Review 


The same holds for internal reform of organizations. Consider a study by McKinsey on successful organizational change. That study suggests that only about 26 percent of all attempted organizational transformations succeed, even though it identifies some 24 discrete actions change agents can take in support of the change. In that study, the suggested actions are not necessarily the same as approval hurdles. But the principle is likely at work.


source: McKinsey


This should not come as a surprise. All proposed internal changes encounter resistance. Management experts sometimes note that the chances of any successful organizational change are somewhat slim, and more difficult as the number of approvals grows. 


source: Purdue University


If you have ever spent time and effort trying to create something new in the communications business, you know it rarely is easy, simple or uncomplicated to do so, and the larger the organization you work for, the harder it seems to be. That is because all organizational change involves power and politics, and changes will be resisted.  


You might be familiar with the rule of thumb that 70 percent of organizational change programs fail, in part or completely. 


There is a reason for that experience. Assume you propose some change that requires just two approvals to proceed, with the odds of approval at 50 percent for each step. The odds of getting “yes” decisions in a two-step process are about 25 percent (.5x.5=.25). 


source: John Troller 


The odds of success get longer for any change process that actually requires multiple approvals. Assume there are five sets of approvals. Assume your odds of success are high--about 66 percent--at each stage. In that case, your odds of success are about one in eight for any change that requires five key approvals (.66 x .66 x .66 x .66 x .66 ≈ .125, or 32/243 assuming two-thirds at each stage). 
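

A quick back-of-the-envelope sketch of that arithmetic, assuming each approval stage is independent and has a fixed probability of a “yes” (the stage probabilities below are illustrative assumptions, not data):

```python
# Probability that a proposal survives a chain of independent approvals.
# Stage probabilities are illustrative assumptions.

def survival_probability(stage_odds):
    """Multiply the per-stage approval probabilities together."""
    p = 1.0
    for odds in stage_odds:
        p *= odds
    return p

# Two stages at 50 percent each: 0.25
print(survival_probability([0.5, 0.5]))

# Five stages at roughly two-thirds each: about 0.13, or one in eight
print(survival_probability([2 / 3] * 5))
```

The product shrinks quickly as stages are added, even when each stage individually looks favorable.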


You might argue the difficulty of change means firms should not try to change. That might work in stable industries, for stable firms, when demand is constant and profit margins are reasonable.


But connectivity firms do not work in that environment. There essentially is no option to “do nothing.” All legacy product lines are shrinking and must be replaced. So the risk of change must be taken.


How Much Value from Telco Data Stores?

One often hears it said that connectivity providers serving enterprises and consumers have a trove of data that can be mined using analytics to predict or help prevent some issues such as churn. Call center detail can be used to identify service issues and outages. There might also be other ways to mine data stores to reduce truck rolls and service calls, especially related to network issues. 
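

As an illustration of the kind of mining proponents have in mind, here is a minimal churn-scoring sketch. Everything in it is synthetic: the feature names, the assumed relationships and the labels are invented to show the mechanics, not drawn from any operator’s data.

```python
# Minimal churn-scoring sketch on synthetic data; all relationships assumed.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Hypothetical per-account features a connectivity provider might hold.
calls_to_support = rng.poisson(1.5, n)        # call-center contacts, last 90 days
outage_minutes = rng.exponential(30, n)       # network trouble experienced
months_on_plan = rng.integers(1, 60, n)       # tenure
on_multi_user_plan = rng.integers(0, 2, n)    # family/multi-line plan flag

# Synthetic churn label: support calls and outages raise churn odds,
# tenure and multi-user plans lower them (assumed relationships).
logit = (0.4 * calls_to_support + 0.02 * outage_minutes
         - 0.02 * months_on_plan - 1.0 * on_multi_user_plan - 1.0)
churned = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([calls_to_support, outage_minutes,
                     months_on_plan, on_multi_user_plan])
X_train, X_test, y_train, y_test = train_test_split(X, churned, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("holdout accuracy:", round(model.score(X_test, y_test), 3))
```

Note that the strongest signals in a sketch like this are network trouble and call-center contacts, which is consistent with the argument that follows: most of the value sits close to network operations.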


Beyond that, one might argue, there is not actually all that much data useful at the application layer. Mobile network operators have location data, but it never is clear that such data can be used in a personally identifiable way that allows new revenue streams to be created. 


Google, on the other hand, seems to know with great precision not only where users are, but also can build histories of movement in space. Google also has lots of other data to collate with movement details, which is how it builds its advertising business.


The point is the value of service provider analytics never seems as high as proponents tout, beyond network operations details. 


That is not to downplay the value of network-related analytics. But one might be skeptical about the value of much other customer account data as a way of predicting much of anything about demand for new services. Telco data stores do not inherently include much detail that is useful for psychographics or even much in the way of demographics. And then there are privacy laws which might restrict the use of such information, even if available. 


One often hears it said that there are “hundreds” of indicators an account is about to churn, for example. And, to be sure, U.S. mobile operator churn levels are quite low. 


What is not so clear is that it is the application of analytics that primarily explains the lower churn. Some might argue that competition has been so robust that switching does not yield the benefits it once did, and frictions, such as having to buy a pricey new smartphone, also act as barriers to switching behavior. 


At least in the U.S. mobile market, multi-user plans also seem to have worked to reduce churn levels. Multi-service bundles have provided the same benefit in fixed network operations. 


The point is that some perspective on how much can be gleaned from telco data stores is prudent. Beyond network health and status, which does contribute to customer service call and chat volume, there is arguably little useful detail, apart from location data (with limited ability to use such personally identifiable information), that might serve as the foundation for new applications and revenue streams.


Tuesday, July 21, 2020

Is U.S. Internet Access Actually Expensive or Slow?

Minimum, median and maximum values all are valuable indices in life, business and nature, including measures of internet access adoption or “quality.”


Benchmarks are valuable when trying to measure “progress” toward some stated goal. A minimum speed definition for broadband access is an example. But that does not obviate the value of knowing maximum and median values, either, especially when the typical U.S. internet access buyer routinely buys services significantly higher than the minimum. 


In the first quarter of 2020, for example, only about 18 percent of U.S. consumers actually bought services running at 40 Mbps or less. All the rest bought faster services. 


source: Openvault


An analysis by the Open Technology Institute concludes that “consumers in the United States pay more on average for monthly internet service than consumers abroad—especially for higher speed tiers.” 


As always, methodology matters. The OTI study examines standalone internet access plans, even if those are not the plans most consumers actually buy. The figures do not appear to be adjusted for purchasing power differences between countries. Were that done, it might be clearer that average internet access prices are about $50 a month, globally.


Global prices are remarkably consistent, in fact, when adjusting for purchasing power conditions in each country.  
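

A simple sketch of why purchasing-power adjustment matters when comparing posted prices. The prices and price-level factors below are placeholders, not figures from the OTI study.

```python
# Purchasing-power adjustment sketch; all figures are hypothetical.
# ppp_factor: the country's price level relative to the benchmark country,
# so dividing a nominal USD price by it expresses the price in
# purchasing-power-comparable terms.

plans = {
    # country: (advertised monthly price in USD, assumed price-level factor)
    "country_a": (65.0, 1.00),
    "country_b": (35.0, 0.60),
    "country_c": (45.0, 0.85),
}

for country, (nominal, ppp_factor) in plans.items():
    adjusted = nominal / ppp_factor
    print(f"{country}: nominal ${nominal:.0f}, PPP-adjusted ${adjusted:.0f}")
```

Prices that look far apart in nominal terms can converge toward a similar figure once local purchasing power is taken into account.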


Nor does any snapshot show longer term trends, such as lower internet access prices globally since at least 2008. A look at U.S. prices shows a “lower price” trend since the last century. U.S. internet access prices have fallen since 1997, for example. 


source: New America Foundation


The OTI study claims that comparing average prices between markets with and without a municipal system shows lower prices in markets with government-run networks. Not all agree with that conclusion. 


“The OTI Report’s data, once corrected for errors, do not support the hypothesis that government-run networks charge lower prices,” says Dr. George Ford, Phoenix Center for Advanced Legal and Economic Public Policy Studies chief economist. 


“Using OTI’s data, I find that average prices are about 13 percent higher in cities with a municipal provider than in cities without a government-run network,” says Ford. 


Our definitions of “broadband” keep moving higher. Once upon a time, broadband was anything faster than 1.5 Mbps. Ethernet once topped out at 10 Mbps. 


Today’s minimum definition of 25 Mbps will change as well. The point is that having a minimum says nothing about typical or maximum performance.


About 91 percent to 92 percent of U.S. residents already have access to fixed network internet access at speeds of at least 100 Mbps, according to Broadband Now. And most buy speeds in that range. 


source: Broadband Now


It is useful to have minimum goals. It also is important to recognize when actual consumers buy products that are much more advanced than set minimums.


Sunday, July 19, 2020

How We Go Back to the Office Might Matter, in Terms of Productivity Benefits

How workers go back to offices might matter if the objective is to reap the benefits of physical interactions at work, including the unplanned interactions that are touted as a benefit of office work. 


Many firms now talk about hybrid work arrangements, partly in the office, partly at home. How that is accomplished could make all the difference. Right now, almost everyone is working from home. What happens when reopening happens? 


If hybrid work environments create two tiers of employees (those who are in the office and those who are not, or those who have the ability to informally interact with senior leaders and those who do not), virtual employees risk becoming a “lower class.” And that will create incentives for people to prefer in-office working, rather than staying at home. 


Nor is it clear how much benefit might accrue from the unplanned interactions that can happen at a workplace. 


Even in-person interactions might suffer if “mask wearing” in the office and social distancing are required. Extended wearing of masks likely means conversations and meetings will be shorter. 


That and social distancing will inhibit informal face-to-face communication, which is the main reason for sending employees back to the office. 


About 70 percent or more of workers consistently say they would rather continue to work from home than go into reconfigured offices and be required to wear masks, the authors say. 


That noted, widespread work-from-home policies arguably have not led to the drop in productivity many would logically have expected. 


A survey of 600 U.S. white collar employees, 40 percent of whom say they are “in management,” suggests the enforced work-from-home experience has been more successful, in terms of perceived productivity, than expected. The authors believe the “everybody has to do it” context made a big difference, as some early work-from-home studies suggested a drop in productivity could be expected. 


That has many speculating about whether many or most such employees might “never” return to the older office-based patterns. The authors of the study say there are some issues that will likely have to be addressed for that to happen on a widespread scale. 


Unplanned interactions that lead to important outcomes are one advantage of physical settings. “Physical offices cause people who don’t normally work with each other to connect accidentally — bumping into each other in the hallway or the cafeteria — and that interaction sparks new ideas,” they say. 


“In our analysis of the amount of digital interaction at a different technology company, we found that, after the lockdown, employees increased their communication with close collaborators by 40 percent but at a cost of 10 percent less communication with other colleagues,” the authors say.


“There also tends to be less schmoozing and small talk among virtual workers, which Michael Morris of Stanford and Columbia and Janice Nadler, Terri Kurtzberg, and Leigh Thompson of Northwestern have shown leads to lower levels of trust,” they note. “The decline in such spontaneous communications and trust can have a big negative impact on innovation and collaboration.”


Virtual work could undermine three other activities that are critical to long-term organizational health: 

  • Onboarding new employees

  • “Weak” relationships

  • “Strong” relationships


Onboarding new employees, at least in terms of inculcating culture, seems fairly easy to do in a virtual context. It seems harder to assess and develop people’s unique strengths. 


Virtual work also means it is harder to develop “weak ties,” shallow or peripheral relationships among members of an organization who don’t work closely with each other but have nonetheless connected over time.


Weak ties have been shown to play an important role in organizational performance, including innovation, raising or maintaining product and service quality, and attaining project milestones, they argue. That is difficult to create, on a virtual basis.  


Strong ties also are harder to develop. “People are still getting the work done, but the long-term relationships that once sprang from such shared experiences are undoubtedly at risk,” they note. 


Beyond that, the way that workers come back to work might matter. Hybrid work environments--a combination of virtual and office-based work--sound like the best of both worlds. 


It might also become the worst of both worlds. Many of the benefits of having everyone work virtually may be lost if companies send just some employees back to the office. 


Some research has found that teams with isolated members (one person per location) or an equivalent number of members in each location (two in one office and two in another) reported better scores on coordination and identification within the team. 


“But if some team members were collocated and others were not (as would likely be true in hybrid environments), team dynamics suffered, which presumably hurt performance,” the authors note. 


Friday, July 17, 2020

Why Innovation is So Hard

If you have ever spent time and effort trying to create something new in the communications business, you know it rarely is easy, simple or uncomplicated to do so, and the larger the organization you work for, the harder it seems to be. That is because all organizational change involves power and politics, and changes will be resisted.  


You might be familiar with the rule of thumb that 70 percent of organizational change programs fail, in part or completely. 


There is a reason for that experience. Assume you propose some change that requires just two approvals to proceed, with the odds of approval at 50 percent for each step. The odds of getting “yes” decisions in a two-step process are about 25 percent (.5x.5=.25). 


source: John Troller 


The odds get longer for any change process that actually requires multiple approvals. Assume there are five sets of approvals. Assume your odds of success are high--about 66 percent--at each stage. In that case, your odds of success are about one in eight (.66 x .66 x .66 x .66 x .66 ≈ .125, or 32/243 assuming two-thirds at each stage). 


Consider a study by McKinsey on successful organizational change. That study suggests that only about 26 percent of all attempted organizational transformations succeed, even though it identifies some 24 discrete actions change agents can take in support of the change. In that study, the suggested actions are not necessarily the same as approval hurdles. But the principle is likely at work.


source: McKinsey


The more hurdles (approvals) required for a change to happen, the less likely the change will happen. Even when the odds of approval at any stage are 66 percent, the necessity of just five approvals will lead to seven of eight change efforts failing. 


Are Network Slices and Edge Computing Competitive Solutions?

As a rule, there always are many ways to solve a particular computing or communications problem. For connectivity providers, that often means supporting different ways of solving business problems. Network slicing--the ability to create end to end virtual private networks across a 5G core network--is one way to create customized end user networks with specific blends of network performance.


Many potential use cases will revolve around ultra-low latency performance, and network slices are one new way to fulfill such requirements. But edge computing might also be a way to solve the same ultra-low latency requirements.


Connectivity providers offering both edge computing support and network slices will in essence be offering two different ways of solving some problems. 


source: TM Forum


Network slicing, the ability to create virtual private networks that run end to end on 5G networks, provides another opportunity to find out where demand might lie for private networks whose characteristics and performance are better matched to some use cases. That is, provided the same functionality is not supplied by edge computing, which obviates the need for ultra-low latency across the wide area network.


Of course, to the extent network slicing offers business value, potential buyers will have incentives to explore “do it yourself” alternatives when it saves them money. In that sense, edge computing networks are an alternative to network slices. 

source: TM Forum


If ultra-low-latency applications are those which could benefit from network slices, one alternative is to do computing at the edge, rather than sending data across wide area networks optimized for low latency. In many use cases, the value of ultra-low-latency computing can be supplied by edge computing services, with non-real-time backup across wide area networks. 
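

A back-of-the-envelope latency budget shows why. The distances and processing delay below are assumptions for illustration only; propagation in fiber runs at roughly two-thirds the speed of light, or about 200 km per millisecond.

```python
# Rough round-trip latency budget: edge site versus distant cloud region.
# Distances and processing delay are illustrative assumptions.

FIBER_KM_PER_MS = 200  # roughly 200 km of fiber per millisecond, one way

def round_trip_ms(distance_km, processing_ms=2.0):
    propagation = 2 * distance_km / FIBER_KM_PER_MS  # out and back
    return propagation + processing_ms

print("edge site, 20 km away:  ", round(round_trip_ms(20), 1), "ms")
print("regional cloud, 800 km: ", round(round_trip_ms(800), 1), "ms")
print("distant cloud, 4,000 km:", round(round_trip_ms(4000), 1), "ms")
```

If an application needs a budget of a few milliseconds, only computing near the edge fits; at that point the wide area network no longer has to be engineered for ultra-low latency.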


Perhaps ironically, consumer customers who have few other alternatives might be good candidates for internet access with quality-of-service features, delivered as a network slice offered by a connectivity provider. But regulations often prevent such offers. Gaming services, work-from-home conferencing and ultra-high-definition video are among the potential use cases. 


Verizon Business, IBM Collaborate for Edge Computing

A new collaboration between Verizon Business and IBM illustrates the way 5G, multi-cloud support, edge computing, artificial intelligence, internet of things, “Industry 4.0” and private networking are intrinsically related. 


The companies plan to combine Verizon’s 5G and Multi-access Edge Compute (MEC) capabilities, IoT devices and sensors at the edge, and IBM’s expertise in AI, hybrid multi cloud, edge computing, asset management and connected operations. 


source: IBM


The collaboration uses Verizon’s ThingSpace IoT Platform and Critical Asset Sensor solution (CAS) plus IBM’s Maximo Monitor with IBM Watson and advanced analytics. This effort has IBM supplying the needed analytics and multi-cloud computing support; Verizon the edge devices, access network and collocation facilities. 


source: IBM


IBM and Verizon are also working on potential combined solutions for 5G and MEC-enabled use cases such as near real-time cognitive automation for the industrial environment. 


Separately, Verizon says the 5G Future Forum will release its first technical specifications in the third quarter of 2020. The 5G Future Forum is a group of 5G service providers and suppliers working to  accelerate the delivery of Multi-access Edge Computing-enabled solutions around the world.


The 5G Future Forum was established in January 2020 by América Móvil, KT Corp., Rogers, Telstra, Verizon, and Vodafone.



Thursday, July 16, 2020

S Curve, Bass Model, Gompertz Function

The concept of the S curve has proven to be among the most-significant analytical concepts I have encountered over the years. It describes product life cycles, suggests how business strategy changes depending on where on any single S curve a product happens to be, and has implications for innovation and start-up strategy as well. 


source: Semantic Scholar 


Some say S curves explain overall market development, customer adoption, product usage by individual customers, sales productivity, developer productivity and sometimes investor interest. It often is used to describe adoption rates of new services and technologies, including the notion of non-linear change rates and inflection points in the adoption of consumer products and technologies.


In mathematics, the S curve is a sigmoid function. The Gompertz function, one such sigmoid, can be used to predict new technology adoption and is related to the Bass Model.


 I’ve seen Gompertz used to describe the adoption of internet access, fiber to the home or mobile phone usage. It is often used in economic modeling and management consulting as well.
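

For reference, here is a small sketch of the three curves mentioned, using illustrative parameter values rather than anything fitted to real adoption data.

```python
# Logistic (sigmoid), Gompertz and Bass-model adoption curves with
# illustrative parameters; none of these values are fitted to real data.
import numpy as np

t = np.linspace(0, 20, 21)  # years since launch
M = 1.0                     # market potential, normalized to 1

# Logistic S curve: M / (1 + exp(-k * (t - t0)))
logistic = M / (1 + np.exp(-0.6 * (t - 10)))

# Gompertz: M * exp(-b * exp(-c * t)); slower takeoff, long tail
gompertz = M * np.exp(-5 * np.exp(-0.3 * t))

# Bass model cumulative adoption, with innovation p and imitation q
p, q = 0.03, 0.4
bass = M * (1 - np.exp(-(p + q) * t)) / (1 + (q / p) * np.exp(-(p + q) * t))

for year in (1, 5, 10, 15, 20):
    print(year, round(logistic[year], 2), round(gompertz[year], 2), round(bass[year], 2))
```

All three produce the familiar slow start, inflection point and saturation, which is why they keep turning up in adoption forecasting.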

Wednesday, July 15, 2020

We Don't yet Agree on What the "Fourth Industrial Revolution" Entails

One hears quite a lot about how 5G will help power the “fourth industrial revolution.” It never is completely clear how people are using the term, but one way of looking at matters is to recall three earlier revolutions based on mechanized looms, steam power and railroads, oil energy and mass production. Using that typology, computing will power the fourth industrial revolution, which we have been in for some decades. 

source: Carnegie Investment Counsel


Others might describe the revolutions slightly differently, basing the first revolution on mechanization, steam and water power. The second industrial revolution then is mass production based on the use of electricity. The third revolution then was based on use of electronic and information systems plus automation. The fourth revolution then is based on cyber-physical systems. 


source: Britannica


There are yet other ways to describe the fourth industrial revolution, where the first was based on the steam engine, the second on mass production, the third on information technology and the coming fourth on smart finance. Some descriptions of the revolutions focus on steam power, electricity and information technology as the drivers of the first three industrial phases. 

source: Digital Republic


Others think the fourth industrial revolution is about applying artificial intelligence to create autonomous decision making. 


With that amount of disagreement, it seems obvious that we do not yet all agree on what is coming.


Tuesday, July 14, 2020

Since 1997, U.S. Internet Access Prices Have Dropped 25%, "All Other Prices" Up 50%

Since 1997, U.S. internet access prices have dropped about 25 percent while the prices of “all items” have grown 50 percent, according to the Bureau of Labor Statistics. 


source: Statista


Moving Up the Stack Still a Requirement for Some Tier-One Telcos

As hard as it typically is, many tier-one service providers will have to consider ways to continue “moving up the stack” into applications, or across the ecosystem into new roles, to jump on a higher-growth revenue curve. Those options might not be available for smaller specialists in the access or transport parts of the business, simply because scale is not possible.


The basic connectivity business is growing at less than one percent a year, while most other parts of the information industry (apps, hardware, devices) are growing at perhaps 12 percent annually. 


source: Ericsson


source: Ericsson
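

To see why that growth-rate gap matters, consider a simple compounding comparison. The growth rates come from the estimates above; the starting value of 100 is just an index.

```python
# Compound growth comparison: roughly 1 percent versus 12 percent per year.
# Starting value of 100 is an arbitrary index.

connectivity = 100.0  # grows at roughly 1 percent per year
rest_of_it = 100.0    # apps, hardware, devices at roughly 12 percent per year

for year in range(10):
    connectivity *= 1.01
    rest_of_it *= 1.12

print("after 10 years, connectivity index:    ", round(connectivity, 1))  # ~110
print("after 10 years, rest-of-industry index:", round(rest_of_it, 1))    # ~311
```

After a decade the rest of the ecosystem is roughly three times its starting size while connectivity has barely moved, which is the pressure behind “moving up the stack.”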


Though the surest revenue opportunity remains connectivity, the big potential roles include service enablement or possibly apps, in some cases. Dial-up internet access, for example, did not produce much incremental revenue for telcos. Broadband access has driven incremental revenue, as has mobility services. 


There is a difference. Broadband is mostly a “dumb pipe” service, while mobility includes voice and messaging, which are apps, plus mobile internet access, which is a dumb pipe product. 

Service enablement might be an opportunity, if service providers can create valuable digital platforms for third parties, or provide system integration and content management. 


Service enablement provides value by freeing developers from handling their own low-level data structure or connecting with 5G and other internet of things platforms. Little of that has developed so far in the internet era, as far as telco platforms are concerned.
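

As a purely hypothetical sketch of what service enablement might look like to a developer, consider a thin wrapper around an imagined device-connectivity endpoint. The URL, fields and response shape are invented for illustration; no actual operator API is implied.

```python
# Hypothetical service-enablement call; the endpoint, fields and auth scheme
# are invented for illustration and do not describe any real operator API.
import requests

API_BASE = "https://api.example-telco.invalid/enablement/v1"  # placeholder

def get_device_connectivity_profile(device_id: str, token: str) -> dict:
    """Ask the (imaginary) platform for a device's connectivity profile,
    sparing the developer from touching network internals directly."""
    resp = requests.get(
        f"{API_BASE}/devices/{device_id}/connectivity",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"latencyClass": "low", "slice": "iot-basic"}
```

The value proposition is the abstraction itself: the developer asks for an outcome and never deals with the underlying network plumbing.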


Monday, July 13, 2020

Some Competitive Markets Make FTTH a Tough Business Model

Competitive markets make market share a key issue. Consider areas where firms such as Verizon, AT&T or CenturyLink have fiber to home networks. You might consider that a no-brainer, in terms of share. Not so. 


Even with years of marketing, Verizon’s FiOS fiber to home network seems to get sustained share of only about 30 percent. Across a base of 16 million homes, some note that Verizon seems essentially stuck at about that level of adoption.


AT&T has about 14 million to 15 million homes able to buy FTTH service. But AT&T seems relatively stable at about 30 percent share. 


CenturyLink fares worse, with FTTH take rates at about 11 percent to 17 percent. Of course, the U.S. market is different in that cable TV operators have about two-thirds market share in consumer markets, using hybrid fiber coax networks that routinely make gigabit-per-second service available. 


That does not mean most cable TV internet access customers buy service at gigabit speeds, only that they generally can. In such a market the business case for additional FTTH is very difficult, since any service provider has to expect stranded assets of perhaps 70 percent to 80 percent of locations passed.
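

A rough sketch shows why a 30 percent take rate strands most of the investment. The cost and revenue figures below are placeholder assumptions, not carrier data.

```python
# FTTH per-home-passed economics sketch; all dollar figures are assumptions.

homes_passed = 10_000
cost_per_passing = 800.0   # assumed build cost per home passed
cost_per_connect = 600.0   # assumed drop/install cost per subscriber
monthly_arpu = 60.0        # assumed revenue per subscriber per month

for take_rate in (0.10, 0.30, 0.50):
    subscribers = homes_passed * take_rate
    capex = homes_passed * cost_per_passing + subscribers * cost_per_connect
    annual_revenue = subscribers * monthly_arpu * 12
    print(f"take rate {take_rate:.0%}: "
          f"capex per subscriber ${capex / subscribers:,.0f}, "
          f"{capex / annual_revenue:.1f} revenue-years to recover capex")
```

At low take rates, the cost of every unconnected (stranded) home passed lands on the few paying subscribers, which is what makes overbuilding a strong cable competitor so hard to justify.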


Yes, Follow the Data. Even if it Does Not Fit Your Agenda

When people argue we need to “follow the science” that should be true in all cases, not only in cases where the data fits one’s political pr...