Monday, September 20, 2021

S&P 500 Longevity Mirrors Connectivity Provider Revenue Pattern

What is true of connectivity provider revenue also is true of firms. My general rule of thumb is that service providers must replace half of all present revenue every 10 years. As it turns out, about half of all Standard & Poor's 500 companies are replaced every decade.


The typical S&P 500 firm now remains on the index for less than 20 years, a tenure Innosight predicts will drop to about 14 years by 2026.


Shrinking lifespans are in part driven by a complex combination of technology shifts and economic shocks. “But frequently, companies miss opportunities to adapt or take advantage of change,” Innosight says. 


“For example, they continue to apply existing business models to new markets, fail to respond to disruptive competitors in low-profit segments, or fail to adequately envision and invest in new growth areas, which in some cases can take a decade to pay off,” Innosight notes. 


The point is that “no business survives over the long-term without reinventing itself,” Innosight says. That seems also true of connectivity firms. 


The next era of telecommunications might be a stretch for most firms, in the sense that revenue growth might have to come from application creation and development, which have never been core competencies.


But some seem to have unrealistically high hopes for 5G.  


To be sure, edge computing, internet of things use cases and a few consumer use cases involving augmented reality or virtual reality seem promising. The larger point is that revenues in any of those new growth areas will not likely be enough to offset stagnating revenues in the core connectivity business. They will help, but the magnitude of new revenue required will be staggering.


If we assume that past patterns hold, and that most telcos will have to replace half of current revenue each decade, then any new revenue sources have to be big. And that is the issue. 


Edge computing, internet of things or private networks will help. But are they big enough new revenue sources to replace literally half of current revenue? Some might argue that is unlikely. 


Incremental gains are not going to be enough. Telcos are looking at generating new revenues to the tune of $400 billion in the next 10 years. If IoT or edge computing generate $10 billion to $20 billion in incremental new revenues, that helps. But it does not come close to solving the bigger revenue problem.
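A back-of-the-envelope sketch makes the gap concrete. The figures below are the illustrative numbers cited above (roughly $400 billion to replace over a decade, $10 billion to $20 billion from any single new source); the private networks figure is an assumption added purely for illustration:

```python
# Back-of-the-envelope sketch of the revenue replacement gap, using the
# illustrative figures cited above (all values in billions of US dollars).
revenue_to_replace = 400  # roughly half of current revenue, over a decade

# Hypothetical incremental contributions from candidate new revenue sources;
# IoT and edge use the midpoint of the $10B-$20B range mentioned above, and
# the private networks figure is assumed purely for illustration.
new_sources = {
    "IoT": 15,
    "edge computing": 15,
    "private networks": 10,
}

total_new = sum(new_sources.values())
gap = revenue_to_replace - total_new

print(f"New revenue from candidate sources: ${total_new}B")
print(f"Remaining gap over the decade: ${gap}B")
print(f"Share of the replacement problem solved: {total_new / revenue_to_replace:.0%}")
```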


Whether at the firm level or the product level, change is a constant. Half of the revenue that will drive a connectivity business a decade from now does not yet exist, or is not yet tangible. Up to half of the product drivers of that revenue do not exist or have not yet been mass deployed.


How Much of What We See in Remote Work is "Hawthorne Effect?"

 We still actually know very little for certain about how productivity changed because of enforced remote work for knowledge and office workers. The problem is that we cannot measure the productivity of such workers easily, if at all.

Also, there are measurement effects, to the extent that enforced remote work is a bit of an experiment. The Hawthorne Effect is the tendency of subjects in an experiment to perform better simply because they know they are being observed.

There also are demand characteristics. In experiments, researchers sometimes display subtle cues that let participants know what they are hoping to find. As a result, subjects will alter their behavior to help confirm the experimenter’s hypothesis.


Then there are novelty effects: the novelty of a new way of working can lead to an initial increase in performance and productivity that may level off as the experiment continues.


Performance feedback is similar to the Hawthorne Effect. Increased attention from experimenters tends to boost performance. In the short term, that could lead to an improvement in productivity.


That assumes we can measure knowledge worker or office worker productivity, however. 


The problem with all studies of office worker or knowledge worker productivity is measurement. What can be counted so we know whether inputs have changed? And how do we measure the output of knowledge work?


Presumably a call center operation has quantifiable metrics, but most office or knowledge work does not have any obvious and convenient measurement criteria. We commonly measure “time working” with the assumption that additional time worked is better. Maybe. But hours worked is an input metric, not an output metric. It is the denominator, not the numerator. 


Logically, increasing input (the denominator) reduces measured productivity unless output (the numerator) increases at least as fast as inputs do.
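Stated as a simple ratio with hypothetical numbers, the point looks like this:

```python
# Productivity as a simple ratio: output (numerator) divided by hours worked
# (denominator). All figures are hypothetical and for illustration only.
def productivity(output_units: float, hours_worked: float) -> float:
    """Units of output produced per hour worked."""
    return output_units / hours_worked

baseline = productivity(output_units=100, hours_worked=40)    # 2.50 units/hour
# Scenario: 10 percent more hours worked, but output unchanged
more_hours = productivity(output_units=100, hours_worked=44)  # ~2.27 units/hour

print(f"Baseline: {baseline:.2f} units/hour")
print(f"More hours, same output: {more_hours:.2f} units/hour -- productivity falls")
```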


The other common issue is that we equate worker attitudes with outcomes. Happier workers might, or might not, be more productive. All we can measure is a subjective attitude. More happy or less happy does not necessarily correlate with outcomes. 


In principle, one could have happier but less productive workers; less happy but more productive workers. One would need a way to correlate output and outcomes with feelings in ways that outlive simple Hawthorne effects (people work better when they know they are part of an experiment). 


Work team collaboration might have fared better under full remote work conditions, but there is some evidence that firm-wide collaboration has decreased, though the amount of time spent collaborating (meetings, emails, messaging) has grown.  


Actual output is different from input or collaboration time and effort. It might be difficult to measure “creativity,” but there is some belief that creativity has not fared better under conditions of remote work.


Meetings are inputs, not outputs. Having more meetings, or spending more time in meetings, does not make firms or organizations more productive. A Microsoft survey of 182 senior managers in a range of industries found support for that thesis. 


We might say the same for collaboration during the enforced remote work period. It is common to hear technology business or policy leaders argue that remote work has not harmed productivity. 


Leaving aside the issue of whether remote work productivity changes can be measured, collaboration--deemed by most to be vital for knowledge workers--might have gotten far worse because of Covid. 


People like the freedom to work from home, no question.  


That might have happened despite reports that suggest information, knowledge and office workers now are spending more time with electronic forms of communication. But “communication” is not necessarily “collaboration.”


If collaboration is defined as “people working in teams or with others,” then collaboration seemingly has suffered. 


According to Gensler, “high-performing people at top companies tend to do individual work and collaborative work in equal measures--45 percent each, according to our research--with the remaining 10 percent made up of learning and social time.”


For better or worse, those balances were changed during the period of enforced work from home policies. “While at home during the pandemic, people reported working in individual focus mode 62 percent of the time and 27 percent in collaboration, a disparity that negatively impacts company creativity and productivity,” Gensler argues. 


Before the pandemic, U.S. workers spent an average of 43 percent of their work weeks collaborating either virtually or in person. That number fell to 27 percent for workers who worked from home in 2020, for example. 


“At the onset of the pandemic, our analysis shows that interactions with our close networks at work increased, while interactions with our distant networks diminished,” say Microsoft researchers. “This suggests that, as we shifted into lockdowns, we clung to our immediate teams for support and let our broader network fall to the wayside.”


There is a downside: companies almost certainly became more siloed than they were before the pandemic.


“And while interactions with our close networks are still more frequent than they were before the pandemic, the trend shows even these close team interactions have started to diminish over time,” Microsoft researchers say. 


Younger workers (25 or younger) also reported more difficulty feeling engaged or excited about work, getting a word in during meetings, and bringing new ideas to the table when compared to other generations.


“Bumping into people in the office and grabbing lunch together may seem unrelated to the success of the organization, but they’re actually important moments where people get to know one another and build social capital,” says Dr. Nancy Baym, Microsoft senior principal researcher. “They build trust, they discover common interests they didn’t know they had, and they spark ideas and conversations.”


Microsoft researchers noticed that instant messages sent by workers at the office slowed 25 percent during lunchtime, while remote workers at home reduced IMs by only 10 percent. IMs also grew by 52 percent between 6 p.m. and midnight, suggesting that at-home remote workers might have been working more total hours than employees in the office.


At-home workers also spent about 10 percent more time in meetings. Those results might be interpreted as either good or bad effects on collaboration.


Microsoft research also suggests that while collaboration within work teams increased, collaboration outside of the teams, with the rest of Microsoft personnel, decreased.  


The point is that we actually know quite little about potential changes in productivity, especially the longer-term impact. In the short term, there is a Hawthorne Effect at work, which would “boost productivity” temporarily.


Saturday, September 18, 2021

There are No KPIs for Knowledge Workers

Key performance indicators often are a recommended practice for improving organizational output. But knowledge work and office work in general do not allow us to create meaningful KPIs related to productivity.


The problem with all studies of office worker or knowledge worker productivity is measurement. What can be counted so we know whether inputs have changed? And how do we measure the output of knowledge work?


Presumably a call center operation has quantifiable metrics, but most office or knowledge work does not have any obvious and convenient measurement criteria. We commonly measure “time working” with the assumption that additional time worked is better. Maybe. But hours worked is an input metric, not an output metric. It is the denominator, not the numerator. 


Logically, increasing input (the denominator) reduces measured productivity unless output (the numerator) increases at least as fast as inputs do.


The other common issue is that we equate worker attitudes with outcomes. Happier workers might, or might not, be more productive. All we can measure is a subjective attitude. More happy or less happy does not necessarily correlate with outcomes. 


In principle, one could have happier but less productive workers; less happy but more productive workers. One would need a way to correlate output and outcomes with feelings in ways that outlive simple Hawthorne effects (people work better when they know they are part of an experiment). 


Work team collaboration might have fared better under full remote work conditions, but there is some evidence that firm-wide collaboration has decreased, though the amount of time spent collaborating (meetings, emails, messaging) has grown.  


Actual output is different from input or collaboration time and effort. It might be difficult to measure “creativity,” but there is some belief that creativity has not fared better under conditions of remote work.


Meetings are inputs, not outputs. Having more meetings, or spending more time in meetings, does not make firms or organizations more productive. A Microsoft survey of 182 senior managers in a range of industries found support for that thesis. 


“65 percent said meetings keep them from completing their own work,” according to Microsoft. “71 percent said meetings are unproductive and inefficient.”


Fully  64 percent said meetings come at the expense of deep thinking while 62 percent said meetings miss opportunities to bring the team closer together (which is sort of a paradox). 


Unless directly correlated with output, meetings actually can reduce productivity. Some seem to believe emails are outcomes, when in fact they take away time that might otherwise have been spent actually producing an output. 


source: Lucidspark 


The point is that we actually can say very little about whether productivity has grown, stayed the same or decreased because of enforced remote work. We could not measure productivity before, so we have no baseline against which to compare, even if we thought we could measure it.


Friday, September 17, 2021

"Business Hub" or "Ecosystem" is Easier than "Platform"

Quite often, data centers and connectivity service providers are urged to “become platforms,” “become ecosystems” or “business hubs.” It is good advice, for the simple reason that value is increased. And since value tends to correlate with customer growth, hence revenue, higher value always is desirable. 


source: Medium 


That is an example of network effects: the observation that the value of some products or services increases as more nodes are added. A voice network that connects people and organizations in one local area has some value. A voice network that connects all or most people and organizations in a region is more useful. Better still is a network connecting everyone in a nation. A network that connects all users globally has the highest value. 
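One rough but common way to illustrate the idea is Metcalfe's law, which values a network in proportion to the number of possible pairwise connections among its nodes. A minimal sketch, with purely illustrative node counts:

```python
# Rough illustration of network effects: the number of possible pairwise
# connections grows much faster than the number of nodes (Metcalfe's law).
# Node counts are purely illustrative.
def possible_connections(nodes: int) -> int:
    """Distinct pairs that can connect: n * (n - 1) / 2."""
    return nodes * (nodes - 1) // 2

for label, nodes in [
    ("local", 1_000),
    ("regional", 100_000),
    ("national", 10_000_000),
    ("global", 1_000_000_000),
]:
    print(f"{label:>8}: {nodes:>13,} nodes -> {possible_connections(nodes):,} possible connections")
```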


That also is true of many internet applications and services. “Ecosystems” can be built around products that increase value, as Apple demonstrates with its iPhone. The big issue with ecosystems is the additional value partners bring.


The concept is easier to visualize for a data center provider than a connectivity provider. A colocation provider gains revenue and customers as more partners conclude it makes sense to colocate at a physical location. To the extent a colocation provider makes money from tenants and interconnection, “ecosystem” makes perfect sense. 


source: Equinix 


The concept is harder to commercialize for a connectivity provider as TCP/IP allows any and all partners to connect with each other without formalized business relationships, and therefore without necessarily increasing connectivity provider revenue in a direct sense. Such “permissionless” access means business partners do not need a connectivity provider’s agreement to conduct business with each other, even when using networks. 


source: STL Partners 


Ecosystems can create additional value, for the same reason that geographic proximity creates value: by increasing the number and range of business partners with whom transactions can happen, and therefore the value and features that can be sold to customers and made available to end users.


But that does not make such ecosystems “platforms.” 


source: Goodhum 


“Companies must think of the edge as more than just a collection point for data from intelligent devices,” Iron Mountain argues. “They should broaden their vision to see the edge as a new business hub.”


Regardless of the preferred language, the key business idea is that interconnection and colocation have value. That makes it easier for a data center or colocation provider to envision becoming a business hub by creating an ecosystem.


It is far harder for a connectivity provider to do so. Becoming a true business platform is not easy for either type of provider.


What is less clear is the best description of the business model.


U.S. Broadband Growth Slowing Again?

With broadband net additions apparently slowing, the issue is “why?” Is the Covid push over? Is the market therefore slowing? Are rivals beginning to take share? Comcast reported slower net account growth in the third quarter. But all major internet service providers (telcos and cable) saw slower net additions in the second quarter of 2021, compared to the second quarter of 2020. 


That is the most likely explanation: a simple slowing of growth rates, industry wide. There still is not much evidence that telcos have actually begun to take back market share from the cable companies.


Still, as telcos continue to deploy new fiber-to-home lines, many observers believe they are poised to gain share. If so, upstream capacity upgrades of cable hybrid fiber coax networks will become more important, as telcos tout the superiority of their return-path bandwidth.


Though telcos and cable operators have not yet massively embraced symmetrical bandwidth, that is the expected future. 

Thursday, September 16, 2021

Facilities-Based Competition Often Matters Quite a Lot

Though BT’s Openreach wholesale network has been designed to support an evolution to higher-speed internet access in the United Kingdom, to this point most of the speed gains have been supplied by rival facilities-based providers, which might be a good argument for allowing facilities-based competition where it makes sense.


In 2020, some 18 percent of U.K. homes could buy FTTH-based gigabit services. That equates to about five million lines. Somewhat more than two million lines were supplied by BT’s Openreach network as of May 2021, though the pace of installation is increasing fast.


That means about three million lines--roughly 60 percent of U.K. FTTH coverage--were supplied by facilities-based competitors to BT.


However, in 2020, a total of eight million homes actually could buy gigabit service, the difference being Virgin Media’s gigabit service available to another three million homes. 
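The rough arithmetic behind those counts can be laid out explicitly. A minimal sketch, using the approximate figures cited above (in millions of homes):

```python
# Approximate 2020/2021 U.K. figures cited above, in millions of homes.
ftth_gigabit_homes = 5.0   # ~18 percent of U.K. homes able to buy FTTH gigabit service
openreach_ftth = 2.0       # lines supplied over BT's Openreach network (May 2021)
altnet_ftth = ftth_gigabit_homes - openreach_ftth

virgin_gigabit = 3.0       # additional Virgin Media gigabit homes (non-FTTH)
total_gigabit_homes = ftth_gigabit_homes + virgin_gigabit  # ~8 million

print(f"Altnet share of FTTH coverage: {altnet_ftth / ftth_gigabit_homes:.0%}")
print(f"FTTH share of all gigabit-capable homes: {ftth_gigabit_homes / total_gigabit_homes:.0%}")
```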


According to Ofcom there is about a two-percent overlap between the two different types of networks, in terms of supply.


 

source: Ofcom 


Supply is one thing, take rates another. In the U.S. market, for example, gigabit connections are purchased by about 10.5 percent of households, though available to more than 80 percent of homes passed by networks that can supply it. 


Over the last half decade, Virgin Media has had more “higher speed” customers than have all the competitors using BT’s wholesale network. Today, “altnets” have emerged as additional suppliers of gigabit speeds. 


Many might assume FTTH means gigabit speeds. It does not. Historically, FTTH might have meant speeds in the hundreds of megabits. Some U.S. FTTH networks installed in the mid-1990s to late 1990s offered speeds only up to 10 Mbps.    


Also common are price comparisons or tracking of “average” or “typical” speeds experienced by consumer customers. Less common are measurements of provisioned speeds. In other words, instead of looking at access technologies, what is the expected bandwidth a customer might obtain, on any network?


That matters for a simple reason. FTTH is not the only available access technology, and not the only possible fixed network platform. Looking only at the numbers of deployed lines, or take rates on those lines, tells us much. It does not tell us the whole story. 


In Germany, for example, Vodafone expects gigabit-per-second connections to be driven by rival hybrid fiber coax networks, not FTTH/B. By 2022, Vodafone expects 72 percent of all gigabit lines to be supplied by cable operators, not FTTH/B providers. 


source: Vodafone 


The point is that we get different pictures of where advanced fixed network internet access stands when we measure by access technology instead of available speed. 


FTTH available lines or provisioned lines alone do not necessarily tell us all we would like to know about user experience. What is the designed-for speed, upstream as well as downstream? A “mere” statistic on FTTH homes passed does not shed light on that question.


If one asks a different question, such as “what percentage of home passings offer downstream speeds of 1 Gbps,” we get a different answer. Or perhaps we cannot get a very good answer. Very few connections are capable of offering such speeds, even using FTTH. 


If we ask other questions, such as “what percentage of lines are symmetrical?” we would get yet another set of answers.  


Even when deploying FTTH, an internet service provider must yet decide what optoelectronics to use, and that of course affects network capabilities. So FTTH does not necessarily tell us much about available speeds.


Nor does the deployment of FTTH by one legacy provider necessarily tell us much about the actual state of gigabit per second or even “hundreds of megabits per second” service. Cable hybrid fiber coax is important in many markets. Rival overbuilders or altnets are important in some markets. 


Eventually, mobile networks will emerge as challengers in some instances.


Methodology always matters when evaluating the quality of consumer broadband. FTTH is one measure of potential progress. But it is not the only important metric. We always need to know the designed-for speeds. And other platforms also compete. 


So, in many cases, the issue is not “FTTH.” The issue is “gigabit per second speeds.” FTTH is a matter of media, not commercially-available gigabit speeds.


Rural FTTH: Media Does Not Always Tell Us Much About Speed

It is common when measuring internet access progress to look at coverage, take rates, speeds and prices. Fiber to the home adoption almost always is measured in terms of coverage. That is only analytically helpful up to a point.


In some markets FTTH might be the only practical way to supply gigabit per second and eventually multi-gigabit per second speeds. Not in all markets, however. And though FTTH might eventually be the only way to supply terabit-per-second speeds by mid-century, that is a ways off. 


In the meantime, media choices are one thing, but commercially-available speeds, competition and facilities-based differentiation extend beyond FTTH itself. 


Consider FTTH in rural areas. 


For example, fiber to the home (or basement, for multiple dwelling units) covers about 22 percent of rural households, compared to 45 percent for all territories in the European Union and United Kingdom, according to a new study by the Fiber to the Home Council Europe.


Spain had 60.5 percent rural FTTH/B coverage in 2020, while Germany had 9.8 percent coverage of rural dwellings.


source: IDATE 


What is not clear is how much additional bandwidth, and therefore speed, is available on those lines. According to the NTCA, a trade group representing more than 650 rural telcos in the United States, nearly 70 percent of their customers are connected by FTTH, up from 58 percent in 2018.  


That 2020 survey also reported that 68 percent of NTCA member customers can receive downstream speeds greater than 100 Mbps. Some 80 percent can receive downstream speeds greater than 25 Mbps, up from 57 percent and 70 percent in 2018, respectively. 


Just under half (45 percent) of customers have access to 1 Gbps or higher downstream broadband speed, a metric that has nearly doubled in just two years (23.4 percent in 2018), NTCA says. 


So FTTH and gigabit per second speeds are not identical, it appears. Some 45 percent of customer locations can buy service at such speeds, while FTTH is deployed to 70 percent of locations. 
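A quick calculation using the NTCA figures above, and assuming for simplicity that gigabit service is sold only over FTTH lines, shows the size of that gap:

```python
# NTCA 2020 survey figures cited above.
ftth_coverage = 0.70         # share of member customers connected by FTTH
gigabit_availability = 0.45  # share able to buy 1 Gbps or faster downstream

# Assuming gigabit service is delivered only over FTTH lines, at most this
# fraction of FTTH lines is actually provisioned for gigabit speeds.
max_gigabit_share_of_ftth = gigabit_availability / ftth_coverage
print(f"At most {max_gigabit_share_of_ftth:.0%} of FTTH lines offer gigabit speeds")
```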


Fiber and gigabit, in other words, are not identical. Nor does the existence of FTTH tell us anything for certain about available speeds.


AI Will Improve Productivity, But That is Not the Biggest Possible Change

Many would note that the internet impact on content media has been profound, boosting social and online media at the expense of linear form...