Tuesday, August 4, 2020

Frictionless Business is Partly about Productivity

Business friction is anything that prevents a potential customer from buying your product or service. In a broad sense, friction applies to every part of a business: strategy, product development, technology, distribution channels, marketing, customer service, governance, human resources, capital resources, information technology, customer segmentation and supply chains. 


The immediate thought is that frictionless business involves only “efficiency,” with the least resource input for any given level of output. Frictionless business actually also applies to “effectiveness,” the ability of a business or organization to achieve results that matter. 


According to the U.S. Bureau of Labor Statistics, for example, industries including fixed networks, computers and peripherals, communications equipment, semiconductors and non-farm business overall actually saw increased friction (lower productivity growth) between 2000 and 2017, compared to the 1997 to 2000 period.


source: U.S. Bureau of Labor Statistics


Only the mobile service provider business saw higher productivity over the same time frames. Keep in mind that “productivity” is a combination of output and input--goods and services volume produced compared to the hours required to create those products--along with end user demand. 
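As a rough illustration of that definition (the figures below are invented, not BLS data), productivity can be expressed as output per hour of labor input, with friction showing up as a decline:

```python
# Hypothetical illustration of labor productivity as output per hour worked.
# The numbers are invented for the example, not BLS data.

def productivity(output_units: float, hours_worked: float) -> float:
    """Labor productivity: units of output produced per hour of labor input."""
    return output_units / hours_worked

# A business that trims inputs, but loses demand even faster, sees productivity fall.
early = productivity(output_units=1_000_000, hours_worked=10_000)  # 100.0 units per hour
later = productivity(output_units=700_000, hours_worked=8_000)     # 87.5 units per hour

change = (later - early) / early
print(f"Productivity change: {change:.1%}")  # -12.5%, i.e. more friction, not less
```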


Friction can result from changes in supply, demand or both. In the case of fixed network services, much of the fall in productivity comes from reduced demand for the products, and hence lower sales volumes. Even as inputs have been trimmed over time, capital investment and operating costs have not fallen as fast as demand.


There is a greater amount of stranded assets, for example, as the percentage of homes or businesses buying legacy services drops. That means the overhead cost of the network has to be borne by fewer paying customers. 


Adoption (the percentage of potential customers who actually buy) also matters. These days, “everybody” uses a mobile service. Fewer than half of households buy even a single fixed network service from any supplier.


Frictionless business is the sum total of all actions a business can take to overcome friction: creating and keeping customers, increasing the volume of products sold to those customers at acceptable profit margins, and maintaining or increasing market share with superior return on investment.


Frictionless business reduces every barrier to business success, allowing firms to operate more effectively--doing the right things--as well as efficiently, with minimal waste and maximum productivity. 


Companies that operate with less friction are able to achieve superior results with less resource intensity. To the extent that cloud computing represents a more effective way to deliver internet-based apps and services, as well as providing cost savings and flexibility, it represents a move in the direction of frictionless business. 


To the extent that hyperscale and other data centers are required to support cloud-based apps, and to the extent that those apps create higher value for customers and users, plus higher revenues and profits for suppliers, with new products available at lower costs and under different business models, data centers likewise represent a move in the direction of frictionless business.


source: Wall Street Journal, Synergy Research


Friction matters for employees and workers as much as it does for companies. One sometimes hears it said that income inequality or wealth inequality is the result of “greedy” people. But worker compensation is directly related to productivity, itself an indicator of friction.


Where friction is least, compensation is highest; where friction is greatest, compensation arguably is lowest. In food services and accommodation, for example, changes in compensation track changes in productivity one for one. Mining has seen negative productivity growth; in the short run, compensation there has outstripped gains in output.


Information technology nearly always shows the highest improvements in productivity, with comparatively lower changes in compensation. That is partly because production is “asset light.” Digital goods are easy to create and reproduce, compared to physical goods.


Higher usage (demand) is not related in a linear way to the costs of producing the next incremental units. 

source: Bureau of Labor Statistics


Sunday, August 2, 2020

Pareto Principle and Telecom Revenues and Profit

The Pareto Principle, often colloquially known as the 80/20 rule, explains many phenomena in nature, science and business, including the connectivity business. 


In the United Kingdom, for example, 70 percent of people live in areas using 20 percent of cell sites. Ericsson estimates that 20 percent of cell sites carry 70 percent of 3G traffic. We also should expect deployment of about 80 percent of small cells in hyper-dense or very-dense areas. 


CenturyLink earns 75 percent of its revenue from enterprise services. Some 80 percent of telco profits come from 20 percent of products or customers. AT&T has earned the bulk of its actual profits from business services.


Typically, 80 percent of any company’s profit is generated by 20 percent of its customers; 80 percent of complaints come from 20 percent of customers; 80 percent of profits come from 20 percent of the company’s effort; 80 percent of sales come from 20 percent of products or services; 80 percent of sales are made by 20 percent of sellers; and 80 percent of clients come from 20 percent of marketing activities.
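As a minimal sketch (the customer revenue figures are hypothetical, not from any company cited here), this is how one might check what share of revenue the top 20 percent of accounts actually contribute:

```python
# Hypothetical sketch: what share of total revenue comes from the top 20% of customers?
# The revenue figures below are invented for illustration only.

revenues = [52_000, 31_000, 18_000, 9_000, 4_500, 3_200, 2_100, 1_500, 900, 400]

revenues.sort(reverse=True)
top_n = max(1, round(0.2 * len(revenues)))         # the top 20% of customers
top_share = sum(revenues[:top_n]) / sum(revenues)

print(f"Top {top_n} of {len(revenues)} customers produce {top_share:.0%} of revenue")
# Top 2 of 10 customers produce 68% of revenue
```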


There are many other common Pareto examples:


  • 80 percent of car accidents are caused by 20 percent of young people

  • 80 percent of lottery tickets are bought by 20 percent of society

  • 80 percent of air pollution is caused by 20 percent of the population

  • 80 percent of all firearms are used by 20 percent of the population

  • 80 percent of all Internet traffic belongs to 20 percent of websites

  • 80 percent of car crashes happen within the first 20 percent of the distance covered

  • 80 percent of mobile phone calls come from 20 percent of the population

  • 80 percent of the time people use 20 percent of the tools at their disposal


It is estimated that 20 percent of Covid-19 cases were responsible for 80 percent of local transmission. Some 80 percent of users will only use 20 percent of any piece of software's features. Microsoft also observed that 20 percent of software bugs will cause 80 percent of system errors and crashes.


It is estimated that the top 20 percent of players are responsible for 80 percent of a basketball team’s success. 


The issue is how to apply Pareto in the connectivity business, as telecom revenue growth rates are quite low, cash flow is shrinking, returns on invested capital are dropping and consequently equity valuations are under pressure. 


The now-obvious observation is that connectivity provider revenue growth is a fraction of economic growth. To change that situation, something other than “keep doing what you have been doing” is required; doing the same things will not produce different results.

source: IDATE


Freedom to maneuver often hinges on the regulatory regime. Tier-one service providers with an obligation to “serve everyone” cannot make the same choices as non-regulated or lightly-regulated firms able to choose their geographies, customers and products. 


Carriers of last resort cannot simply choose not to serve consumer customer segments, or focus only on urban areas. Specialist providers can do so. 


Tier-one service providers have learned to rely on mobility services, though. 


In Western Europe, perhaps 80 percent of revenue growth is driven by mobile services, though mobility revenues overall are about 46 percent of total revenues. 

source: A.D. Little


That is even more true in other regions, where mobility revenue is as much as 82 percent of all connectivity provider revenues, and where mobile infrastructure accounts for most of the new facilities-based competition between service providers. 


source: IDATE


The traditional difference in profit margins between enterprise and consumer accounts also explains why many believe 5G profits will disproportionately be created by enterprise 5G services, not consumer 5G. 


Friday, July 31, 2020

As 5G Focuses on Enterprise Use Cases, 6G Might Focus on Virtualized and Self-Learning Networks

Mobile and fixed network operators constantly are challenged to reduce capital investment and operating costs as a way of compensating for low revenue growth, challenged profit margins and ever-increasing bandwidth consumption by customers whose propensity to pay is sharply limited. 

The very design of future 6G networks might work to help reduce capex and opex, while incorporating much more spectrum at very high frequencies and basing core operations on machine learning (a form of artificial intelligence that allows machines to learn autonomously). 

New 6G networks might rely even more extensively on virtualization than do 5G networks, featuring now-exotic ways of supporting internet of things sensors that require no batteries, a capability that would dramatically reduce IoT network operating costs. 

It is possible 6G networks will be fundamentally different from 5G in ways beyond use of spectrum, faster speeds and even lower latency. 6G networks might essentially be “cell-less,” able to harness ambient energy for devices that require no batteries and feature a virtualized radio access network. 


The “cell-less” architecture will allow end user devices to connect automatically to any available radio, on any authorized network. Harvesting of ambient energy will be especially important for internet of things devices and sensors that might not require any batteries at all to operate, reducing operating cost. 


source: IEEE


The virtualized radio access network will provide better connectivity, at possibly lower cost, as user devices can use the “best” resource presently available, on any participating network, including non-terrestrial platforms (balloons, unmanned aerial vehicles or satellites). 


Backhaul might be built into every terrestrial radio, using millimeter wave spectrum for both user-facing and backhaul connections, automatically configured. That will reduce the cost of network design, planning and backhaul. 


Researchers now also say such federated networks will be based on machine learning (artificial intelligence), which will be fundamental to the way 6G networks operate. Devices will not only use AI to select a particular radio connection, but will modify behavior based on experience. 


The network architecture might be quite different from today’s “cellular” plan, in that access is “fully user centric,” allowing terminals to make autonomous network decisions about how to connect to any authorized and compatible network, without supervision from centralized controllers.


Though machine learning arguably already is used in some ways to classify and predict, in the 6G era devices might also use artificial intelligence to choose “the best” network connection “right now,” using any available resource, in an autonomous way, not dictated by centralized controllers.  


To be sure, in some ways those changes are simply extrapolations from today’s network, which increasingly is heterogeneous, able to use spectrum sharing or Wi-Fi access, using radio signal strength to determine which transmitter to connect with. 


Architecturally, the idea is that any user device connects to the radio access network, not to any specific radio using any specific base station, say IEEE researchers Marco Giordani, Michele Polese, Marco Mezzavilla, Sundeep Rangan and Michele Zorzi. 

source: IEEE


Overall, many 6G features will be designed to reduce the cost and improve the efficiency of the radio access network, especially to create “pervasive” connectivity, not just to add more bandwidth and lower latency for end users and devices.


Thursday, July 30, 2020

How Much More Can Tier-One Connectivity Suppliers Become Asset Light?

Occasionally over the last few decades, it has been proposed that telcos consider ways to become asset light operators. That advice--to monetize assets--continues to be offered. The issue is what portions of the infrastructure can be spun off or sold. 


In the U.S. market, asset light was recommended for competitive local exchange carriers, at one time able to buy “unbundled network element-provisioned” wholesale services at as much as a 40-percent discount to retail prices. 


In many international markets, mobile virtual network operators are a less-risky way to enter a new market. 


In Europe and other markets, bitstream and other forms of unbundled local loop access have been created to allow asset-light wholesale entry into the telecom market. 


From time to time, observers have speculated on the degree to which it might be possible for new competitors to use unlicensed spectrum assets such as Wi-Fi to create competition for mobile or fixed internet access. At the very least, cable operators and outfits such as Fon argue that a shared Wi-Fi network allows offloading of local mobile phone traffic, thus reducing purchases of wholesale mobile connectivity. 


In specialized areas, such as cell tower facilities, many mobile operators have concluded that sharing the cost of base stations with competitors or selling such assets (with leaseback) is a way to unlock value while becoming a bit more asset light. 


The new issue is whether it is possible to unbundle even more elements of a connectivity provider’s asset base, such as optical fiber facilities serving business customers. Altice, for example, recently sold 49.99 percent of its Lightpath fiber enterprise business to Morgan Stanley Infrastructure Partners. 


Others have suggested that CenturyLink sell its optical network assets, or at least separate the consumer from the enterprise business. Right now, the enterprise part of CenturyLink accounts for 75 percent of revenue, the consumer business just 25 percent. 


source: S&P Global


Some assets are easier to separate than others. Cell towers and data centers are discrete assets many telcos have divested. In principle, wide area networks could be divested as well, though owner’s economics would still argue for retaining that portion of the network. As always, volume improves the economics of owning assets. 


In principle, other new assets, such as small cell installations or backhaul facilities, might be candidates for infrastructure sharing, especially when it is possible to separate the value of facilities from the use of those capabilities to support the core customer experience. 


The issue is whether some operators might become so good at creating and monetizing intangible assets that they can risk shifting in the direction of asset-light or non-facilities-based operations on a wider scale. Few tier-one telcos have felt it was wise to divest access networks.


Access network assets remain quite scarce and therefore valuable in most markets and arguably are the hardest parts of the infrastructure to consider divesting. 


“If telcos do not reconfigure their value chains, other parties may step in, as disaggregated telco assets are being valued differently,” consultants at Arthur D. Little have argued. The problem is that creating more value remains a huge challenge, as the ability to enter new parts of the value chain, though risky for any participant, is asymmetrical. 


Connectivity represents about 17 percent of the revenue earned annually by firms in the internet value chain. The bad news is that connectivity share is dropping.

Has WFH Productivity Actually Held Up?

Nobody knows for certain how productivity might be affected, for different companies, industries and countries, as the enforced work from home policies stay in place. In the short term, as “everybody has to do it,” many studies have suggested an unexpected ability to maintain former output levels. 


What is not clear is how and what might change as the WFH period lengthens, and as firms make different strategic choices once the mandatory WFH period eventually ends. 


The point is that, with time, worker fatigue, Zoom fatigue, work from home burnout and lower productivity now are starting to show up, raising questions about whether permanent work from home policies will be as widespread--or useful--as many predict. 


And despite many claims that WFH productivity has been remarkably high, workers’ perceptions of their own productivity are not so clear. To be sure, near-universal WFH among office workers means no firm inherently benefits or loses. So far, since “everyone” has to do it, there appears to be no systematic advantage gained or lost.


All that will change when the Covid-19 pandemic winds down (because vaccines arrive and most people take them, or herd immunity is reached). Then WFH will be an option firms can choose, and the advantages and disadvantages might accrue unevenly to different firms. 


Also, WFH productivity in some pre-pandemic settings suggests it is markedly lower than productivity for in-office work. Some early studies of pandemic WFH productivity likewise suggest productivity has dropped. 


That argument might puzzle some. The issue is the amount of useful work getting done, compared to the time spent to achieve those results. By definition, if the same results are obtained, but the time to create those results has increased, productivity is lower. 


Most studies of “productivity” during the pandemic WFH period essentially argue that firms are able to produce the same results, even when most people are working remotely. What those studies sometimes neglect is the fact that many--if not most--of those at-home employees are putting in longer hours. By definition, then, we have “same results, more hours worked.” So productivity is lower, in that sense. 


Wednesday, July 29, 2020

Communications is Generally Good, Unless It Is Overhead

There is a good reason why work teams often are intentionally kept small: to get any work done at all, the amount of communications overhead has to be reduced. 


If you ask people who work for large enterprises how much time they spend in meetings, many would say “almost all of my time.” Studies often find that senior managers spend at least half their time in meetings. Some estimate there are 56 million U.S. meetings each day.


As the number of people you work with increases, communication overhead increases geometrically until the total percentage of time each individual must devote to group communication approaches 100 percent. 


After a certain threshold, each additional team member diminishes the capacity of the group to do anything other than communicate.
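One way to see why the overhead grows geometrically is to count pairwise communication channels, which rise roughly with the square of team size. The sketch below uses the standard n(n-1)/2 count; the team sizes chosen are illustrative, not figures from the studies cited:

```python
# Pairwise communication channels in a team of n people: n * (n - 1) / 2.
# Shows why coordination overhead grows geometrically as teams get bigger.

def channels(team_size: int) -> int:
    return team_size * (team_size - 1) // 2

for size in (3, 5, 8, 12, 20, 50):
    print(f"{size:>3} people -> {channels(size):>4} channels")
# 3 -> 3, 5 -> 10, 8 -> 28, 12 -> 66, 20 -> 190, 50 -> 1225
```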


Large companies are slow because they suffer from communication overhead. “If you’re responsible for working with a group of more than five to eight people, at least 80 percent of your job will inevitably be communicating effectively with the people you work with,” argues Personal MBA.


That is one reason why some advocate meetings with no more than seven people. That is literally a rule of seven.


Some might argue that is related to Miller's Law, which states that humans can only hold about seven items in short-term working memory. That is why Bell Labs designed U.S. phone numbers with seven digits. 


Likewise, span of control research suggests one person can only effectively and directly manage six people. That is why small teams are the building block of every military; the same logic applies to business “direct reports.”


That noted, work from home could have some impact on meeting length and frequency. Perhaps the good news is that meetings are shorter, if there are even more meetings to attend. 


Some research from Microsoft suggests the pandemic and the dramatic increase in work from home have led to more, but shorter, meetings, suggesting many of those new shorter meetings are replacing informal communications that would have occurred in the office, but which are not possible when everyone is working remotely. 


The important point is that, as important as meetings are for communications and aligning group effort, they are, in effect, substitutes for actually accomplishing the organizational mission. “We can do meetings, or we can do work” might be a crude way of putting matters.


Tuesday, July 28, 2020

Definitions Matter

Minimum, median and maximum all are valuable indices in life, business and nature, including measures of internet access adoption or “quality.” It also has to be noted that constantly moving the goalposts--changing our definitions--is a way of creating permanent problems.


That is not to deny the usefulness of revising our definitions over time. It is a truism that yesterday's power user is today's typical user.


The percentage of U.S. customers buying internet access at the minimum speeds keeps dropping, as customers migrate to tiers of service that offer higher speeds at the same or only slightly-higher cost. 


But such definitions matter for both consumers and suppliers. Customers might sometimes buy services that actually are overkill. Most internet access customers buy what they believe is good enough to support their actual use cases, and rarely what is the “best” available level of service. 


Suppliers might imperil their business models by forcing investment in facilities that customers will not use, overprovisioning service in ways that raise sunk costs of doing business without providing capabilities customers actually buy. 


Those buying patterns also suggest why some ISP offers that are not state of the art can still be commercially viable. The reason is that, beyond a certain point, additional speed provides no tangible user experience benefit.  


And permanent problems are essential for those who claim to be in the business of “solving those problems.” That matters for education, health, disease, economies, social and economic equality, sports and just about anything else you can think of. In other words, one cannot marshal public policy support to solve a problem that does not exist.  


To be sure, our definitions of “broadband” have evolved, and will continue to evolve. 50 years ago, broadband was defined as any speed at 1.5 Mbps or faster. Once upon a time, Ethernet ran at 10 Mbps, while fiber to the home offered 10 Mbps. Today’s systems all run much faster than that. 


But it also makes a difference to “problem solvers” that definitions are revised upwards. Doing so always creates a “bigger problem.” 


Changing the minimum definition of broadband would shift the size of the “underserved” population or locations, for example. Today, perhaps 20 percent of U.S. buyers of fixed network internet access purchase services at the minimum speed of 25 Mbps. 


Changing the definition to 100 Mbps would increase the underserved share to nearly half of all buyers. Again, minimum definitions will keep rising, customers will keep changing the speed tiers they purchase and internet service providers will keep supplying faster speeds. 
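A simple sketch of that reclassification effect, using a hypothetical mix of purchased speed tiers rather than actual FCC or Openvault data:

```python
# Hypothetical mix of purchased speed tiers (Mbps) and the share of households on each.
# Shows how raising the minimum definition reclassifies buyers as "underserved."
tiers = {25: 0.20, 50: 0.25, 100: 0.30, 200: 0.12, 400: 0.08, 1000: 0.05}

def underserved_share(minimum_mbps: int) -> float:
    """Share of buyers whose purchased tier falls below the defined minimum."""
    return sum(share for speed, share in tiers.items() if speed < minimum_mbps)

print(f"Underserved at a 25 Mbps minimum:  {underserved_share(25):.0%}")   # 0%
print(f"Underserved at a 100 Mbps minimum: {underserved_share(100):.0%}")  # 45%
```

Nothing about the actual networks or purchases changes in this sketch; only the definition moves.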


The point, however, is that changing minimum definitions does not change the tiers of service customers actually purchase or that ISPs supply. Already, the percentage of customers buying the fastest-possible speeds (at least 1 Gbps) is in the mid single digits. 


More to the point, the typical buyer prefers a service offering 100 Mbps to 400 Mbps. Changing “minimum” to “average” has consequences, arguably distorting our understanding of “good enough” levels of broadband speed. 




Benchmarks are valuable when trying to measure “progress” toward some stated goal. A minimum speed definition for broadband access is an example. But that does not obviate the value of knowing maximum and median values, either, especially when the typical U.S. internet access buyer routinely buys services significantly higher than the minimum. 


In the first quarter of 2020, for example, only about 18 percent of U.S. consumers actually bought services running at 40 Mbps or less. All the rest bought services running faster than 50 Mbps. 


source: Openvault


An analysis by the Open Technology Institute concludes that “consumers in the United States pay more on average for monthly internet service than consumers abroad—especially for higher speed tiers.” 


As always, methodology matters. The OTI study examines standalone internet access plans, even though those are not the plans most consumers actually buy. The figures also do not appear to be adjusted for purchasing power differences between countries. Were that done, it might be clearer that average internet access prices are about $50 a month, globally.
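A minimal sketch of what a purchasing power adjustment looks like; the country names, prices and conversion factors below are placeholders, not actual OTI figures or World Bank PPP rates:

```python
# Placeholder sketch of a purchasing power adjustment for monthly internet prices.
# ppp_factor = local currency units per international dollar (illustrative values only).
plans = {
    "Country A": {"local_price": 45.0,   "ppp_factor": 1.0},
    "Country B": {"local_price": 2800.0, "ppp_factor": 58.0},
    "Country C": {"local_price": 38.0,   "ppp_factor": 0.72},
}

for country, plan in plans.items():
    ppp_price = plan["local_price"] / plan["ppp_factor"]
    print(f"{country}: ${ppp_price:.2f} per month at purchasing power parity")
# Nominal prices that look very different converge toward roughly $45 to $55 once adjusted.
```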


Global prices are remarkably consistent, in fact, when adjusting for purchasing power conditions in each country.  


Nor does any snapshot show longer term trends, such as lower internet access prices globally since at least 2008. A look at U.S. prices shows a “lower price” trend since the last century. U.S. internet access prices have fallen since 1997, for example. 


source: New America Foundation


The OTI study claims that comparing average prices between markets with and without a municipal system shows lower prices in markets with government-run networks. Not all agree with that conclusion. 


“The OTI Report’s data, once corrected for errors, do not support the hypothesis that government-run networks charge lower prices,” says Dr. George Ford, Phoenix Center for Advanced Legal and Economic Public Policy Studies chief economist. 


“Using OTI’s data, I find that average prices are about 13 percent higher in cities with a municipal provider than in cities without a government-run network,” says Ford. 


Our definitions of “broadband” keep moving higher. Once upon a time broadband was anything faster than 1.5 Mbps. Ethernet once topped out at 10 Mbps. 


Today’s minimum definition of 25 Mbps will change as well. The point is that having a minimum says nothing about typical or maximum performance.


About 91 percent to 92 percent of U.S. residents already have access to fixed network internet access at speeds of at least 100 Mbps, according to Broadband Now. And most buy speeds in that range. 


source: Broadband Now


It is useful to have minimum goals. It also is important to recognize when actual consumers buy products that are much more advanced than set minimums. 

