Thursday, November 10, 2022

Remote Workers Like It, C Suite Tends to Have Doubts

The general end of remote work at Twitter will be controversial in some quarters, for reasons related to differences of role between the C suite and employees generally. By now, most everyone who is an “employee” knows why they prefer remote work. Anecdotal evidence suggests that many workers can get “their work” done in far less time than seemingly was required in the past, allowing them more free time for “non-work” activities.


As welcome as “work-life balance” is for employees, C suite executives instinctively question the productivity implications. Leaving aside “control freak bosses who do not trust their employees,” there still is a growing body of evidence suggesting that, in fact, happier at-home workers are not “more productive.” They are just happier. 


Though it is counterintuitive, “happy workers” are not always “more productive.” Unhappy workers can be more productive; happy workers can be less productive; or both can be equally productive. The nature of the work often dictates outcomes, not perceived happiness. 


Looking only at engineering output, for example, some studies suggest remote workers are less productive than similar workers “in the office.” 


“A study conducted in 2012 shows those office workers who were assigned boring tasks performed better and faster in the regular office setting. Home-life distractions are more likely to prevent productive work when you don’t enjoy the work,” notes Apollo Technical. 


On the other hand, the same study also found the reverse: more-creative work often was completed faster at home than “in the office.” 


But what many workers do with the extra time is the issue. Some will argue that if workers complete the minimum-required quantitative outputs, there is no harm. The same amount of work gets done and employees can simply use the extra time as they choose. 


But the same “group norms” hold for remote workers as for workers “in the office.” In other words (those of you who have worked union jobs know this well), there are disincentives for workers to outperform others, as it “makes the others look bad” and also tends to raise output expectations for the entire class of workers.


In other words, expected output levels will rise generally if enough workers in the class start producing at higher levels. That sort of behavior will be discouraged by the peer group, in both in-office and remote settings. 


Another, more recent study states that the more hours an individual works from home, the less productive they become, for the perhaps-obvious reason that there are more distractions at home: pets, children, household chores and entertainment options.


“Those who worked full time (eight hours per day) at home are 70 percent less productive than those who don’t work from home,” says Apollo Technical. 


Leaving aside other issues, including the ability to structure self-supervised work, productivity could in principle be higher for knowledge work that can be conducted individually. 


As a long-time journalist, analyst and researcher who has worked remotely for 30 years, I can attest that monitoring work is unnecessary when output can be quantified: so many stories per day, week or month; a major report delivered in three months’ time; a white paper produced by deadline. 


But not all jobs can be evaluated purely quantitatively. How does one measure the “quality” of computer instructions; a painting; a report; a story? But those intangibles exist no matter where the output is produced. 


Since it is nearly impossible--if not completely impossible--to measure knowledge worker and office worker productivity, much of what we think we know consists of opinion. So we really cannot say whether remote work “always” leads to higher productivity. It might just as often lead to lower productivity or equivalent output. 


What we might observe anecdotally is that lots of remote workers like remote work because it leads to “more free time.” They can use that free time to do other things than work. Whether employers have a “right” to expect more is a matter of debate. 


But it already seems clear enough that C-level executives question the productivity impact of widespread remote work. If that were not the case, I suspect we would not be having discussions about quiet quitting, nor seeing so much anecdotal evidence that remote workers get their work done faster, but then use their free time for non-work pursuits.


Is T-Mobile About to Begin its Move into FTTH?

It was virtually inevitable that T-Mobile, in the U.S. market, would eventually move beyond its “mobile-only” approach to services. Virtually everywhere globally, dominant service providers have both mobile and fixed network retail operations. In the U.S. market, where T-Mobile primarily competes with AT&T and Verizon, those competitors have large-scale fixed network businesses that T-Mobile cannot presently confront head to head.


So it is not surprising that T-Mobile is looking to create a joint venture, funded at perhaps $4 billion in total, with T-Mobile investing $2 billion initially, to enter the home broadband market using a fixed network platform. 


For those of you who have watched other internet service providers get into the fiber-to-home business, the initial foray would not have huge scale. At network costs of perhaps $900 per location, that level of investment, in suitable markets, might create a network passing about 400,000 home locations. Connecting actual customers would then require additional incremental capital investment of perhaps $600 each. 
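For readers who want to check the arithmetic, a minimal sketch of the footprint math, using only the per-location figures and the 20 percent initial take rate cited in this post (actual costs vary widely by market):

```python
# Sketch of the FTTH joint-venture arithmetic using the figures cited
# in this post; real-world costs per passing and per connection vary.

def ftth_footprint(build_budget, cost_per_passing, cost_per_connect, take_rate):
    """Return (homes passed, connected homes, connection capex) for a budget."""
    homes_passed = build_budget / cost_per_passing
    connected = homes_passed * take_rate
    connect_capex = connected * cost_per_connect
    return homes_passed, connected, connect_capex

# 400,000 passings at $900 each (about $360 million of passing capex),
# 20 percent initial take rate, $600 per connected customer
passed, connected, connect_capex = ftth_footprint(
    build_budget=400_000 * 900,
    cost_per_passing=900,
    cost_per_connect=600,
    take_rate=0.20,
)
print(passed, connected, connect_capex)  # 400000.0 80000.0 48000000.0
```

At a 40 percent eventual take rate, the connection capex roughly doubles, which is why connect costs are usually treated as success-based spending, incurred only as customers sign up.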


So the initial footprint would not change national installed base or market share figures. At first, the joint venture would reasonably expect to connect 20 percent of locations passed, ramping over perhaps five years to as much as 40 percent. 


Still, the move is not unexpected. As well as T-Mobile is doing in gaining mobile customer market share, it could not forever ignore the fixed networks business as a growth driver. In fact, the foray into fixed wireless was a half step in that direction, allowing T-Mobile to compete for some portion of the existing home broadband market where it has not been present before. 


But most observers would agree that fixed wireless competes best in the value segment of the market. 


Fixed wireless has been the go-to platform for wireless internet service providers operating in U.S. rural areas. The issue now seems to be how important fixed wireless could be for some internet service providers such as Verizon and T-Mobile, which do not have the financial resources to overbuild 80 percent of the U.S. home market (Verizon) or all of that market (T-Mobile).


And that would not change were T-Mobile to invest a few billion dollars in FTTH. 


Perhaps the fixed network equivalent of mobile virtual network operators will eventually emerge at scale, allowing T-Mobile and Verizon to partner in some way with other entities to create or use FTTH facilities. In the meantime, joint ventures are likely the fastest way to start scaling into the business. 


“Scale” is largely a “tomorrow” issue for T-Mobile. The immediate issue is whether fixed wireless can shift a few points of home broadband market share.


By some estimates, U.S. home broadband generates $60 billion to more than $130 billion in annual revenues.


If 5G fixed wireless accounts and revenue grow as fast as some envision, $14 billion to $24 billion in fixed wireless home broadband revenue would be created in 2025. 


5G Fixed Wireless Forecast

                                  2019   2020   2021   2022   2023    2024    2025
Revenue $M @ 99% growth rate       389    774  1,540  3,066  6,100  12,140  24,158
Revenue $M @ 16% growth rate      1.16    451    898  1,787  3,556   7,077  14,082

source: IP Carrier estimate


If the market is valued at $60 billion in 2021 and grows at four percent annually, then home broadband revenue could reach $73 billion by 2026.




                               2021   2022   2023   2024   2025   2026
Home Broadband Revenue $B        60     62     65     67     70     73
Higher Revenue $B               110    114    119    124    129    134
Growth Rate: 4%

source: IP Carrier estimate
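Both forecasts above are simple compound-growth series, so they are easy to reproduce. A minimal sketch, using the base years and growth rates cited in this post:

```python
# Compound-growth projection: each year's revenue is the prior
# year's revenue times (1 + growth rate), rounded for display.

def project(base, growth_rate, years):
    """Return `years` values of a compound-growth series from `base`."""
    series = [base]
    for _ in range(years - 1):
        series.append(series[-1] * (1 + growth_rate))
    return [round(v) for v in series]

# 5G fixed wireless revenue, $M: $389M in 2019 at 99% annual growth
print(project(389, 0.99, 7))  # [389, 774, 1540, 3066, 6100, 12140, 24158]

# Home broadband revenue, $B: $60B in 2021 at 4% annual growth
print(project(60, 0.04, 6))   # [60, 62, 65, 67, 70, 73]
```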


If we use the higher revenue base and the lower growth rate, then 5G fixed wireless might represent about 10 percent of the installed base, which will seem more reasonable to many observers. 


Assuming $50 per month in revenue, with no price increases at all by 2026, 5G fixed wireless still would amount to about $10.6 billion in annual revenue by 2026 or so. That would have 5G fixed wireless representing about 14 percent of home broadband revenue, assuming a total 2026 market of $73 billion.
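As a rough check on that arithmetic, working backward from the revenue claim (the $50 monthly ARPU and $73 billion 2026 market size are the figures cited above):

```python
# How many accounts does $10.6 billion a year imply at $50 a month,
# and what share of a $73 billion home broadband market is that?

monthly_arpu = 50
annual_revenue = 10.6e9   # $10.6 billion in 5G fixed wireless revenue
market_2026 = 73e9        # $73 billion total home broadband market

accounts = annual_revenue / (monthly_arpu * 12)
share = annual_revenue / market_2026

print(round(accounts / 1e6, 1))  # 17.7 (million accounts)
print(round(share * 100, 1))     # 14.5 (percent of market)
```

In other words, the 14-percent share figure implies something close to 18 million fixed wireless accounts by 2026.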


If the home broadband market were $134 billion in 2026, then 5G fixed wireless would represent about eight percent of home broadband revenue. 


That is a serious incremental share gain for the likes of T-Mobile and Verizon, even if it leaves the long-term strategy undeveloped. To be sure, 6G will come, and will increase capacity at least 10 times over 5G. Using other tools, it might still be possible to boost fixed wireless capacity further, or to create mechanisms for offloading much mobile traffic to the fixed networks.


Comcast and Charter continue to claim that fixed wireless is not damaging their home broadband businesses, and that might well be partly correct. For any internet service provider, a customer move is an opportunity to gain or add an account, so lower rates of dwelling change should logically reduce the chances of adding new accounts. 


But that is akin to retailers blaming “the weather” when they have a revenue miss. Weather does play a role, but most often is not the only driver of results. 


In the second quarter of 2022, Comcast reported a net loss of customer relationships and “flat” home broadband accounts. 


Fixed wireless might not be a “long term” solution for every customer. But it might remain an option for a significant percentage of customers, especially if the long-term solution for T-Mobile and Verizon is yet to be created. But it appears T-Mobile is about to move on that part of the strategy.


Does Bandwidth "Want to be Free?"

About 25 years ago there was significant discussion in industry circles about the implications of essentially free bandwidth, computing and storage. Bandwidth providers were outraged by the suggestion, as you might guess. 


Around the turn of the century, Bill Gates irritated executives in the communications ecosystem by arguing that “bandwidth wants to be free.” Others at the time quipped about whether “computing wants to be free.” Others might argue that data wants to be free, and some have been arguing that content wants to be free.


Twenty years later, we are tempted to argue that Gates was more right than wrong, both about computing and bandwidth. 


To be sure, Gates did not mean computing or bandwidth would literally “cost nothing.” He only meant that neither computation nor bandwidth would be a constraint to creating new services and apps. 


In 2004, Gates argued that “10 years out, in terms of actual hardware costs you can almost think of hardware as being free — I’m not saying it will be absolutely free — but in terms of the power of the servers, the power of the network will not be a limiting factor.”


You might argue that is a position Gates adopted only recently. Others would argue it has been foundational in his thinking since Micro-Soft was a tiny company based in Albuquerque, New Mexico in 1975.


Young Bill Gates reportedly asked himself what his business would look like if hardware were free, an audacious assumption at the time: in 1970, a computer cost about $4.6 million. 


The original insight for Microsoft was the answer to the question "What if computing were free?". Recall that Micro-Soft (later changed to Microsoft) was founded in 1975, not long after Gates apparently began to ponder the question. 

source: AEI 


In 1982 Gates did not seem to go out of his way to argue that hardware would be free, but he did argue it would be cheaper and far less interesting than software. 


Gates made the argument again in 1994, and was still making it in 2004.  


The point is that the assumption by Gates that computing operations would be so cheap was an astounding leap. But my guess is that Gates understood Moore’s Law in a way that the rest of us did not.


Reed Hastings, Netflix founder, apparently made a similar decision. For Bill Gates, the insight that free computing would be a reality meant he should build his business on software used by computers.


Reed Hastings came to the same conclusion as he looked at bandwidth trends in terms both of capacity and prices. At a time when dial-up modems were running at 56 kbps, Hastings extrapolated from Moore's Law to understand where bandwidth would be in the future, not where it was “right now.”


“We took out our spreadsheets and we figured we’d get 14 megabits per second to the home by 2012, which turns out is about what we will get,” says Reed Hastings, Netflix CEO. “If you drag it out to 2021, we will all have a gigabit to the home." So far, internet access speeds have increased at just about those rates.
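The Hastings extrapolation is just a compound annual growth rate calculation. A minimal sketch, taking the quote's 14 Mbps in 2012 and one gigabit in 2021 as the endpoints:

```python
# Compound annual growth rate (CAGR) implied by the Hastings quote:
# 14 Mbps to the home in 2012, 1 Gbps (1,000 Mbps) in 2021.

def cagr(start, end, years):
    """Compound annual growth rate between two values over `years` years."""
    return (end / start) ** (1 / years) - 1

rate = cagr(14, 1000, 2021 - 2012)
print(round(rate * 100))  # 61 (percent per year)
```

A sustained rate of roughly 50 to 60 percent a year is in line with long-observed trends in access bandwidth growth, which is why extrapolating from the trend, rather than from current speeds, proved the better bet.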


How many business models, products and services now are routine and feasible because Moore’s Law keeps driving higher performance and lower cost? How many applications are possible because bandwidth keeps growing in a similar manner? 


Video streaming, early virtual reality and augmented reality, ridesharing, advanced smartphone features, use of millimeter wave spectrum for 5G and all forms of applied artificial intelligence for search, e-commerce and customer service are feasible because computing and bandwidth performance increase while costs are contained. 


Think about the application of computing over time, in situations where business models formerly unthinkable can become quite practical because the cost of computation and storage have become so cheap. 


The key insight is to ask “what would my business look like?” if communications, bandwidth, computing, storage or information or any other scarce or costly input were so available and low cost that those ceased to be constraints to a revenue model. 


The question might also be asked the other way: what does your business look like if a key input becomes too expensive? The key inputs could be labor, knowledge, a raw material, a logistics or supply chain change. 


A related question is “what does my business look like if demand changes in a major way?”


Google, Netflix, Amazon, Apple, Facebook, Square and many other examples illustrate what is possible when computing, communications, devices, transactions and information suddenly cease to be barriers.


But Gates was substantially correct. How many these days would argue against the notion that most public Wi-Fi access is substantially free?


“You can’t use today’s technology constraints to predict tomorrow’s developments,” says Amadeus Consulting CTO John Basso. That fundamental insight, based in large part on Moore’s Law, might once again be more important than often is believed.


You could argue whole businesses now are built on the assumption that technology (especially hardware) constraints disappear over time. All cloud-based apps are built on such assumptions.


Anything we see in consumer internet applications--where capabilities are supplied at no cost to users--provides an excellent illustration. The classic question is what does your business look like if a key cost constraint is removed. 


Though we might have mischaracterized key elements of the argument, ride sharing did raise questions about what it would mean if “cars were free.” They obviously are not “free,” but personal transportation based in part on ride sharing does in some cases affect the case for car ownership. 


The important part of the question is imagining whether a business or product can exist, and what it looks like, if a key cost constraint is removed. 


The physical world almost never allows the Moore’s Law rates of change that are possible in the computing world. But there are going to be many other opportunities in the spaces where computing can alter cost profiles. Think e-commerce in general, ridesharing, lodging apps, video and audio content streaming, videoconferencing, or the use of millimeter wave spectrum that, in an analog technology world, is not commercially usable for home broadband. 


But it is hard and unusual to ask the right question: what does my business look like if a key cost input is removed?


Wednesday, November 9, 2022

Metaverse, Web3, Blockchain, VR, AR, 5G: Less Change Than You Expect, Early On; Far More Change Than You Expect in a Decade or Two

I learned early in my career making forecasts that it is better to be conservative in the early going. Consider that an application of the maxim that humans tend to overestimate the near-term impact of any technology and underestimate the long-term impact. 


“We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run” is one way of stating the principle. So is “We always overestimate the change that will occur in the short term and underestimate the change that will occur in the long term.”


Or, “People overestimate what can be done in one year, and underestimate what can be done in ten.” All three statements capture the wisdom of how significant new technologies create change. 


There is a bit of business wisdom that argues we overestimate what can be done near term, but underestimate the long-term impact of important technologies or trends. The reason is that so many trends follow an S curve, or sigmoid function.


Complex system learning curves are especially likely to be characterized by the sigmoid function, since complex systems require that many different processes, actions, habits, infrastructure and incentives be aligned before an innovation can provide clear benefit. 

source: Rocrastination 
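For readers unfamiliar with the shape, a minimal sketch of the logistic (sigmoid) adoption curve: change is slow at first, fast in the middle years, then flattens as adoption saturates.

```python
# Logistic (sigmoid) adoption curve: f(t) = L / (1 + e^(-k(t - t0)))
# where L is the saturation ceiling, k the steepness, t0 the midpoint.
import math

def adoption(t, ceiling=1.0, steepness=1.0, midpoint=0.0):
    """Adoption share at time t on a logistic curve."""
    return ceiling / (1 + math.exp(-steepness * (t - midpoint)))

# Early periods change little; the middle periods change fast.
for t in (-4, -2, 0, 2, 4):
    print(t, round(adoption(t), 2))  # 0.02, 0.12, 0.5, 0.88, 0.98
```

The first and last steps barely move, while the middle steps jump, which is the quantitative version of "less change than you expect early, more than you expect later."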


Also, keep in mind that perhaps 70 percent of change efforts fail, the Journal of Change Management has estimated. We might then modify our rules of thumb further, along the lines of “even as 70 percent of innovations fail, we will see less change than we expect in one year and more change than we expect in 10 years.” 


At least in part, technological impact increases over time for reasons of diffusion (what percentage of people use the technology regularly) as well as enculturation (it takes time for people and organizations to figure out how to best use a new technology). 


Impact arguably also increases as the ecosystem grows more powerful, allowing many more things to be done with the core technology. 


So, applied to 5G, the metaverse, Web3, augmented or virtual reality, blockchain or just about anything else, we will see less early impact than expected, but far more long-term change than we presently imagine.


Tuesday, November 8, 2022

Would Home Broadband "Utility" Regulation Lead to Lower Prices?

It never is entirely clear to me what people mean when they argue internet access or home broadband “should be a utility,” or that such services already are a utility similar to electricity, gas, water, sewers, phone lines, roads, seaports or airports. 


Some might mean home broadband should be, or is, a public utility in the sense of “common carrier,” with obligations to serve the general public. Though most of us would undoubtedly agree with that notion, telecom policy already has such goals. That is why we have universal service support funds and subsidies for operators in high-cost areas. 


Others might mean “essential,” and therefore regulated in terms of price and conditions of service. 


Others might fix on the everyday sense of the term, which is that internet access is fundamental for inclusion in normal life, as are electricity, fresh water, wastewater services and garbage collection. It might mean that home broadband is essential in the same way that roads, schools, medical care, food supply, airports and seaports are necessary to support life. 


None of that seems to capture the implied meaning that home broadband should be a utility. More likely, there is some expectation that things would be better if prices, coverage, terms and conditions of service were regulated in ways that led to lower prices, less competition or some combination of the two. 


And that should raise serious questions. There was a time when all “telecom services” were regulated as monopoly public utilities. But prices were high and innovation low, under that framework. Ironically, if what people mean is that internet access should be a regulated monopoly, the outcome would almost certainly be higher prices and less innovation; lower rates of quality improvement and other forms of customer value. 


Were home broadband regulated, we would see less innovation and investment as well, as potential suppliers would find they cannot make a positive business case. 


   

source: Market Business News 


As it pertains to “home broadband,” generally the term refers to fixed network supply of home broadband, not mobile network supply. 


The expectation that utility regulation would lead to lower prices is almost certainly wrong.


Most of us are too young ever to have experienced “connectivity services” as a public utility. But prices were not uniformly low. 


In 1984, before the breakup of the U.S. AT&T monopoly, calling between states cost about 90 cents a minute. In 1955, a phone call between Los Angeles and San Francisco (not even interstate) cost about 70 cents a minute, not adjusted for inflation.


In 2022 currency that would be about $7.75 per minute. So, no, prices were not uniformly lower under monopoly or public utility regulation. 


Of course, that was by policy design. High long distance charges and high business service prices were intended to subsidize consumer local calling. 


Were home broadband to become a regulated service, something similar would happen. While prices for some features and plans might be price controlled, other elements of value would increase sharply in price. 


And price is only one element of value. Service innovation was sharply limited in the monopoly era. In the U.S. market, consumers could not own their own phones, or attach third party devices to the network. All consumer premises gear had to be purchased from the phone company, for example. 


To be sure, AT&T Bell Labs produced many innovations. But they were not directly applied to the “telephone service” experience. Those included Unix, satellite communications, the laser, the solar cell, the transistor, the cellular phone network, television and television with sound. 


Though ultimately quite important, none of those innovations arguably applied directly to the consumer experience of the “phone network” or its services. 


The point is that monopoly regulation tends to produce varied prices for different products (some subsidized products, some high-cost products), but also low rates of innovation in the core services. 


Utility regulation would likely not wind up being as beneficial as some seem to believe. Be careful what you wish for.


Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...