Tuesday, July 21, 2020

Is U.S. Internet Access Actually Expensive or Slow?

Minimum, median and maximum values all are valuable indices in life, business and nature, including as measures of internet access adoption or “quality.”


Benchmarks are valuable when trying to measure “progress” toward some stated goal. A minimum speed definition for broadband access is an example. But that does not obviate the value of knowing maximum and median values as well, especially when the typical U.S. internet access buyer routinely buys services significantly faster than the minimum. 


In the first quarter of 2020, for example, only about 18 percent of U.S. consumers actually bought services running at 40 Mbps or less; all the rest bought services running at 50 Mbps or faster. 


source: Openvault


An analysis by the Open Technology Institute concludes that “consumers in the United States pay more on average for monthly internet service than consumers abroad—especially for higher speed tiers.” 


As always, methodology matters. The OTI study examines standalone internet access plans, even though those are not the plans most consumers actually buy. The figures also do not appear to be adjusted for purchasing power differences between countries. Were that done, it might be clearer that average internet access prices are about $50 a month, globally.


Global prices are remarkably consistent, in fact, when adjusting for purchasing power conditions in each country.  
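The purchasing-power adjustment described above can be sketched in a few lines of Python. The plan prices and PPP factors below are illustrative placeholders, not figures from the OTI study:

```python
# Sketch of a purchasing-power-parity (PPP) price adjustment.
# ppp_factor stands in for a country's price level relative to a
# baseline (e.g., the World Bank PPP conversion factor divided by
# the market exchange rate). All numbers here are hypothetical.
plans = {
    "Country A": {"price_usd": 68.0, "ppp_factor": 1.00},
    "Country B": {"price_usd": 31.0, "ppp_factor": 0.72},
}

for country, plan in plans.items():
    # Dividing the nominal price by the PPP factor restates it in
    # purchasing-power terms, making cross-country comparison fairer.
    adjusted = plan["price_usd"] / plan["ppp_factor"]
    print(f"{country}: nominal ${plan['price_usd']:.0f}, "
          f"PPP-adjusted ${adjusted:.0f}")
```

The point of the exercise: a plan that looks cheap at market exchange rates can look quite ordinary once local purchasing power is taken into account.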


Nor does any snapshot show longer term trends, such as lower internet access prices globally since at least 2008. A look at U.S. prices shows a “lower price” trend since the last century. U.S. internet access prices have fallen since 1997, for example. 


source: New America Foundation


The OTI study claims that comparing average prices between markets with and without a municipal system shows lower prices in markets with government-run networks. Not all agree with that conclusion. 


“The OTI Report’s data, once corrected for errors, do not support the hypothesis that government-run networks charge lower prices,” says Dr. George Ford, Phoenix Center for Advanced Legal and Economic Public Policy Studies chief economist. 


“Using OTI’s data, I find that average prices are about 13 percent higher in cities with a municipal provider than in cities without a government-run network,” says Ford. 


Our definitions of “broadband” keep moving higher. Once upon a time, broadband was anything faster than 1.5 Mbps. Ethernet once topped out at 10 Mbps. 


Today’s minimum definition of 25 Mbps will change as well. The point is that having a minimum says nothing about typical or maximum performance.


About 91 percent to 92 percent of U.S. residents already have access to fixed network internet service at speeds of at least 100 Mbps, according to Broadband Now. And most buy speeds in that range. 


source: Broadband Now


It is useful to have minimum goals. It also is important to recognize when actual consumers buy products that are much more advanced than set minimums.


Sunday, July 19, 2020

How We Go Back to the Office Might Matter, in Terms of Productivity Benefits

How workers go back to offices might matter if the objective is to reap the benefits of physical interactions at work, including the unplanned interactions that are touted as a benefit of office work. 


Many firms now talk about hybrid work arrangements, partly in the office, partly at home. How that is accomplished could make all the difference. Right now, almost everyone is working from home. What happens when reopening happens? 


If hybrid work environments create two tiers of employees (those who are in the office and those who are not, or those who have the ability to informally interact with senior leaders and those who do not), virtual employees risk becoming a “lower class.” And that will create incentives for people to prefer in-office working, rather than staying at home. 


Nor is it clear how much benefit might accrue from the unplanned interactions that can happen at a workplace. 


Even in-person interactions might suffer if “mask wearing” in the office and social distancing are required. Extended wearing of masks likely means conversations and meetings will be shorter. 


That and social distancing will inhibit informal face-to-face communication, which is the main reason for sending employees back to the office. 


About 70 percent or more of workers consistently say they would rather continue to work from home than go into reconfigured offices and be required to wear masks, the authors say. 


That noted, widespread work-from-home policies arguably have not led to the drop in productivity many would logically have expected. 


A survey of 600 U.S. white collar employees, 40 percent of whom say they are “in management,” suggests the enforced work-from-home experience has been more successful, in terms of perceived productivity, than expected. The authors believe the “everybody has to do it” context made a big difference, as some early work-from-home studies suggested a drop in productivity could be expected. 


That has many speculating about whether many or most such employees might “never” return to the older office-based patterns. The authors of the study say there are some issues that will likely have to be addressed for that to happen on a widespread scale. 


Unplanned interactions that lead to important outcomes are one advantage of physical settings. “Physical offices cause people who don’t normally work with each other to connect accidentally — bumping into each other in the hallway or the cafeteria — and that interaction sparks new ideas,” they say. 


“In our analysis of the amount of digital interaction at a different technology company, we found that, after the lockdown, employees increased their communication with close collaborators by 40 percent but at a cost of 10 percent less communication with other colleagues,” the authors say.


“There also tends to be less schmoozing and small talk among virtual workers, which Michael Morris of Stanford and Columbia and Janice Nadler, Terri Kurtzberg, and Leigh Thompson of Northwestern have shown leads to lower levels of trust,” they note. “The decline in such spontaneous communications and trust can have a big negative impact on innovation and collaboration.”


Virtual work could undermine three other activities that are critical to long-term organizational health: 

  • Onboarding new employees

  • “Weak” relationships

  • “Strong” relationships


Onboarding new employees, in terms of inculcating culture, seems fairly easy to do in a virtual context. It seems harder to assess and develop people’s unique strengths. 


Virtual work also means it is harder to develop “weak ties,” shallow or peripheral relationships among members of an organization who don’t work closely with each other but have nonetheless connected over time.


Weak ties have been shown to play an important role in organizational performance, including innovation, raising or maintaining product and service quality, and attaining project milestones, they argue. That is difficult to create on a virtual basis.  


Strong ties also are harder to develop. “People are still getting the work done, but the long-term relationships that once sprang from such shared experiences are undoubtedly at risk,” they note. 


Beyond that, the way that workers come back to work might matter. Hybrid work environments--a combination of virtual and office-based work--sound like the best of both worlds. 


It might also become the worst of both worlds. Many of the benefits of having everyone work virtually may be lost if companies send just some employees back to the office. 


Some research has found that teams with isolated members (one person per location) or an equivalent number of members in each location (two in one office and two in another) reported better scores on coordination and identification within the team. 


“But if some team members were collocated and others were not (as would likely be true in hybrid environments), team dynamics suffered, which presumably hurt performance,” the authors note. 


Friday, July 17, 2020

Why Innovation is So Hard

If you have ever spent time and effort trying to create something new in the communications business, you know it rarely is easy or simple, and the larger the organization you work for, the harder it seems to be. That is because all organizational change involves power and politics, and changes will be resisted.  


You might be familiar with the rule of thumb that 70 percent of organizational change programs fail, in part or completely. 


There is a reason for that experience. Assume you propose some change that requires just two approvals to proceed, with the odds of approval at 50 percent for each step. The odds of getting both “yes” decisions in a two-step process are about 25 percent (0.5 × 0.5 = 0.25). 


source: John Troller 


The odds get longer for any change process that actually requires multiple approvals. Assume there are five sets of approvals. Assume your odds of success are fairly high--about two-thirds--at each stage. In that case, your odds of success are about one in eight ((2/3)^5 = 32/243, or about 0.13). 
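The compounding argument in the paragraphs above can be checked with a short Python sketch. The probabilities are the article’s illustrative figures, not data from any study:

```python
def success_odds(p_each: float, n_approvals: int) -> float:
    """Probability that every one of n independent approvals says yes."""
    return p_each ** n_approvals

# Two approvals at 50 percent each: 0.5 * 0.5 = 0.25
print(success_odds(0.5, 2))
# Five approvals at two-thirds each: 32/243, roughly one in eight
print(round(success_odds(2 / 3, 5), 3))
```

Because the approvals multiply, even generous per-stage odds decay quickly as the number of required sign-offs grows.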


Consider a McKinsey study of successful organizational change. That study suggests that about 26 percent of all attempted organizational transformations succeed, even though McKinsey identifies at least 24 discrete actions change agents can take in support of change. In that study, the suggested actions are not necessarily the same as approval hurdles. But the principle is likely at work.


source: McKinsey


The more hurdles (approvals) required for a change to happen, the less likely the change will happen. Even when the odds of approval at any single stage are about two-thirds, the necessity of just five approvals leads to roughly seven of eight change efforts failing. 


Are Network Slices and Edge Computing Competitive Solutions?

As a rule, there always are many ways to solve a particular computing or communications problem. For connectivity providers, that often means supporting different ways of solving business problems. Network slicing--the ability to create end to end virtual private networks across a 5G core network--is one way to create customized end user networks with specific blends of network performance.


Many potential use cases will revolve around ultra-low latency performance, and network slices are one new way to fulfill such requirements. But edge computing might also be a way to solve the same ultra-low latency requirements.


Connectivity providers offering both edge computing support and network slices will in essence be offering two different ways of solving some problems. 


source: TM Forum


Network slicing, the ability to create virtual private networks that run end to end on 5G networks, provides another opportunity to find out where demand might lie for private networks whose characteristics and performance are better matched to some use cases--provided the same functionality is not supplied by edge computing, which obviates the need for ultra-low latency across the wide area network.


Of course, to the extent network slicing offers business value, potential buyers will have incentives to explore “do it yourself” alternatives when it saves them money. In that sense, edge computing networks are an alternative to network slices. 

source: TM Forum


If ultra-low-latency applications are those which could benefit from network slices, one alternative is doing the computing at the edge, rather than sending data across wide area networks optimized for low latency. In many use cases, the value of ultra-low-latency computing is supplied by edge computing services, with non-real-time backup across wide area networks. 


Perhaps ironically, consumer customers who have few other alternatives might be good candidates for internet access with quality-of-service features, in the form of a network slice offered by a connectivity provider. But regulations often prevent such offers. Gaming services, work-from-home conferencing and ultra-high-definition video are among the potential use cases. 


Verizon Business, IBM Collaborate for Edge Computing

A new collaboration between Verizon Business and IBM illustrates the way 5G, multi-cloud support, edge computing, artificial intelligence, internet of things, “Industry 4.0” and private networking are intrinsically related. 


The companies plan to combine Verizon’s 5G and Multi-access Edge Compute (MEC) capabilities, IoT devices and sensors at the edge, and IBM’s expertise in AI, hybrid multi cloud, edge computing, asset management and connected operations. 


source: IBM


The collaboration uses Verizon’s ThingSpace IoT Platform and Critical Asset Sensor solution (CAS) plus IBM’s Maximo Monitor with IBM Watson and advanced analytics. This effort has IBM supplying the needed analytics and multi-cloud computing support; Verizon the edge devices, access network and collocation facilities. 


source: IBM


IBM and Verizon are also working on potential combined solutions for 5G and MEC-enabled use cases such as near real-time cognitive automation for the industrial environment. 


Separately, Verizon says the 5G Future Forum will release its first technical specifications in the third quarter of 2020. The 5G Future Forum is a group of 5G service providers and suppliers working to accelerate the delivery of Multi-access Edge Computing-enabled solutions around the world.


The 5G Future Forum was established in January 2020 by América Móvil, KT Corp., Rogers, Telstra, Verizon, and Vodafone.



Thursday, July 16, 2020

S Curve, Bass Model, Gompertz Function

The concept of the S curve has proven to be among the most-significant analytical concepts I have encountered over the years. It describes product life cycles, suggests how business strategy changes depending on where on any single S curve a product happens to be, and has implications for innovation and start-up strategy as well. 


source: Semantic Scholar 


Some say S curves explain overall market development, customer adoption, product usage by individual customers, sales productivity, developer productivity and sometimes investor interest. The S curve often is used to describe adoption rates of new services and technologies, including the notion of non-linear change rates and inflection points in the adoption of consumer products and technologies.


In mathematics, the S curve is a sigmoid function. The Gompertz function, one such sigmoid, can be used to predict new technology adoption and is related to the Bass diffusion model.


I’ve seen the Gompertz function used to describe the adoption of internet access, fiber to the home and mobile phone usage. It often is used in economic modeling and management consulting as well.
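As a minimal sketch of the idea, a Gompertz adoption curve can be evaluated directly. The parameters below are chosen for illustration, not fitted to any real adoption data:

```python
import math

def gompertz(t: float, a: float, b: float, c: float) -> float:
    """Gompertz function a * exp(-b * exp(-c * t)).
    a is the saturation (ceiling) level, b shifts the curve along
    the time axis and c controls how fast adoption ramps."""
    return a * math.exp(-b * math.exp(-c * t))

# Hypothetical adoption curve: 90 percent eventual adoption,
# with the inflection point near year ln(b) / c, about year 4.
a, b, c = 0.90, 5.0, 0.4
for year in (0, 5, 10, 20):
    print(f"year {year:2d}: adoption {gompertz(year, a, b, c):.0%}")
```

The printed values trace the three phases the post describes: a slow start, a rapid middle around the inflection point, and a saturating tail as adoption approaches the ceiling.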

Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...