Saturday, March 19, 2022

We Actually Know Very Little About What "Causes" Rapid Economic Development


Even if we accept the validity of the Kondratieff Wave hypothesis, and many do not, the potential long waves of innovation, each with a duration of about 50 years, seem to have disparate impact. 


It is not true that any particular wave has universally-applicable and substantially transformative effects for all people, all regions, all countries, all firms or all segments of the economy. There always are leaders and laggards. 

source: CFI 


One big criticism of the theory is that there is significant disagreement about when such waves have started and ended in the past. Being off by 20 years might not matter to a historian. It is life and death for firms that are too early or too late. 


Some argue the last wave ended in 2008, for example. Others believe that wave has not ended yet. Keep in mind there is a pause of a couple of decades between waves, characterized by recessions or slow growth. 


If the last wave has ended, it might be decades before the next wave is identified. 


source: Seeburger 


Just as important, nobody knows when the next wave will break, though the theory suggests a new wave might be building. It all depends on whether one believes the last cycle will run for a couple more decades or has already ended. 


source: Wikipedia 


Still more contentious is the driver of the next wave. Lots of candidates will be offered. Perhaps none seem especially credible at this point. 


source: Insead 


It might seem obvious that the long wave theory is not precise enough to be useful for guiding firm strategy and investment. The larger point might be that even when we correctly identify which era we are in, such knowledge does not mean we have a magic bullet that causes economic growth everywhere, at the same rates. 


It is similar to the argument that quality broadband causes economic growth. We might have made the same arguments about railroads, steam power, electricity, internal combustion, mass production or information technology. 


Growth happens, but the assumption that the underlying technologies can be harnessed everywhere, by everyone, at high levels, has proven untrue. Areas of low population density, for example, rarely benefit as much as areas of high population density. 


Societies without a firm rule of law that protects property rights rarely, if ever, develop as much as regions with such protections. In fact, we probably continue to know very little about why development actually happens in particular circumstances.


Historically, productivity and population increases are associated with economic growth. Obviously, population growth happens in lots of places without much economic growth. Productivity growth likewise is uneven.


"Keeping Whales Safe" Using IoT, Edge Computing and Artificial Intelligence

Whale Safe, a project of the nonprofit Benioff Ocean Initiative, is deploying buoys with acoustical sensors using on-board artificial intelligence and edge computing in key whale habitat off the coast of southern California. 


The analysis of actual whale locations, correlated with movements of ships in the channel, helps ships avoid whale strikes in one of the most-trafficked sea lanes globally. 


The use case illustrates why it often is hard to clearly delineate "edge computing" value from that of "artificial intelligence" from "internet of things."
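The division of labor between buoy and shore can be sketched in a few lines. This is a minimal illustration only, not Whale Safe's actual software: the function names, the scoring logic and the threshold are all hypothetical stand-ins for the real on-buoy acoustic model.

```python
# Illustrative sketch only: function names, scoring and threshold are
# hypothetical stand-ins, not Whale Safe's actual detection software.

def detect_whale_call(frame):
    """Stand-in for the on-buoy AI classifier: return a confidence score.

    The real system runs a trained acoustic model on the buoy itself
    (edge computing); here we fake a score with mean absolute amplitude."""
    return sum(abs(sample) for sample in frame) / len(frame)

def edge_filter(frames, threshold=0.5):
    """Run inference on-device and transmit only detection events.

    Sending small event records instead of raw audio is the core
    edge-computing payoff for a remote, bandwidth-constrained buoy."""
    events = []
    for i, frame in enumerate(frames):
        score = detect_whale_call(frame)
        if score >= threshold:
            events.append({"frame": i, "confidence": round(score, 2)})
    return events

# Three audio frames (IoT sensor readings); only the loud one is reported.
frames = [[0.1, -0.1, 0.2], [0.9, -0.8, 0.7], [0.05, 0.0, -0.05]]
print(edge_filter(frames))  # -> [{'frame': 1, 'confidence': 0.8}]
```

The correlation with ship positions would then happen shore-side, once the small detection events arrive: the sensors are the "IoT," the classifier is the "AI," and running it on the buoy is the "edge computing."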


source: Whalesafe 


Ships are asked to slow down when whales are located in their path, since collisions between ships and whales are often fatal. 


Since deployment of the system in 2020, whale strikes declined significantly in the Santa Barbara Channel, which is one of the most active feeding grounds for humpback and endangered blue whales on the West Coast.

AI Applications Forecasts Get Revised Upwards at IDC

Spending on artificial intelligence in the United States will grow to $120 billion by 2025, representing a compound annual growth rate of 26 percent between 2021 and 2025, says IDC, which projects that the U.S. market will constitute half of the global market. 


So that implies a global AI applications market of about $240 billion by 2025.
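The arithmetic behind that inference can be checked directly. Note the 2021 base figure below is implied by IDC's numbers rather than stated by IDC, so treat it as a back-of-the-envelope estimate:

```python
# Back-of-the-envelope check of the IDC figures. The 2021 base is
# implied by the forecast, not stated by IDC.

def implied_base(future_value, cagr, years):
    """Work backward from an end-year value and a CAGR to the start value."""
    return future_value / (1 + cagr) ** years

us_2025 = 120       # $ billions, IDC's 2025 U.S. forecast
cagr = 0.26         # IDC's 2021-2025 compound annual growth rate
us_2021 = implied_base(us_2025, cagr, 4)  # four compounding years

# IDC says the U.S. will be half the global market.
global_2025 = us_2025 * 2

print(round(us_2021, 1))  # ~47.6: the implied 2021 U.S. base, in $ billions
print(global_2025)        # 240
```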


That represents a “robust” expectation, as it lies at the top end of Omdia forecast ranges of revenue earned from AI software sales. 

source: Omdia 


Forecasts will vary significantly based on what is included in the definition: app software only; hardware; system integration services; consulting; AI feature upgrades to existing products. 


If you have ever worked doing market research, you know everything rests on the assumptions. 

The IDC and Omdia forecasts seem to include only AI apps. Almost arbitrarily larger figures would result if we looked at all AI-related revenues, from chipsets and programming tools to hardware and professional services. 


Still, the IDC methodology suggests a significant new market, even if some of the reported revenue is likely cannibalized from other existing product segments. And IDC has ramped up its forecasts. In 2020 it predicted global revenue by 2024 of only $110 billion. Now IDC expects U.S. AI spending alone (on applications) to reach $120 billion. 


source: IDC 


Retail will remain the largest U.S. industry for AI spending, while banking will be the second largest, says IDC. Augmented customer service agents and expert shopping advisors making product recommendations will drive about 40 percent of AI spending in retail, IDC predicts. 


These two industries will represent nearly 28 percent of all AI spending in the United States in 2025.


But professional services, media, and securities and investment services industries will have CAGRs greater than 30 percent, IDC notes. 


AI spending in the banking industry will be spread across customer service (program advisors and recommendation systems), fraud analysis and investigation, and security.


Customer service and sales will represent more than 20 percent of all AI spending in the U.S. market in 2025, across most industries, IDC projects.


Friday, March 18, 2022

Home Broadband Prices Can be "Higher" Without Being "Too High"

We often hear, frequently without supporting data, that U.S. home broadband prices are high and speeds slow. When data is used, it most often is a comparison with other countries. That can be done with or without adjusting for purchasing power across countries. 


As has been the case for other U.S. connectivity services, the United States does not typically rank “first” in such global comparisons. Any rank between nine and 15 would be expected. There are valid reasons for that, in substantial part the large percentage of the U.S. land mass that is lightly inhabited or uninhabited. 


But there are other ways to compare prices. Consider general price levels and inflation, for example. 


In one sense, we can note that U.S. price levels are “higher” for almost every category since 1950, for example. 


Prices in 2022 are 11.77 times higher than prices in 1950, according to the Bureau of Labor Statistics consumer price index. A dollar in early 2022 buys only about 8.5 percent of what it could buy in 1950. By the end of 2022 the dollar will buy still less, as the inflation rate has accelerated. 

source: U.S. Bureau of Labor Statistics 


So are prices in 2022 higher for virtually everything than in 1950? And, if so, would we really expect prices for home broadband to be “lower” in an absolute sense?
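The two figures above are related by a simple reciprocal, and the same multiple implies an average annual inflation rate. The 72-year horizon and the derived annual rate below are my own arithmetic from the 11.77x multiple, not BLS statements:

```python
# The two figures above are reciprocals: if the price level is k times
# higher, a dollar buys 1/k of what it did. The implied annual inflation
# rate is derived from the 11.77x multiple, not quoted from the BLS.

price_multiple = 11.77                 # 2022 price level vs. 1950 (BLS CPI)
purchasing_power = 1 / price_multiple  # what a 2022 dollar buys, in 1950 terms
annual_inflation = price_multiple ** (1 / 72) - 1  # 72 years, 1950-2022

print(round(purchasing_power, 3))        # 0.085, i.e. about 8.5 cents
print(round(annual_inflation * 100, 1))  # ~3.5 percent per year, on average
```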


But you might object that internet access did not exist in 1950. So consider general U.S. price changes since 1996, when people were buying internet access. Since 1996, the dollar's purchasing power has fallen by almost half. 


source: OfficialData.org 


In other words, according to the U.S. Bureau of Labor Statistics, a unit of U.S. currency in 2022 buys about 52 percent of what it bought in 1996. Stated another way, price levels in 2022 are roughly 90 percent higher than they were in 1996. 


So virtually any product can be accused of “costing more” in 2022 than it cost in 1996. 


Some may intuitively feel this cannot be the full story when it comes to digital products. That intuition points to hedonic change.


Hedonic quality adjustment is a method economists use to adjust prices whenever the characteristics of the products included in the consumer price index change because of innovation. Hedonic quality adjustment also is used when older products are improved and become new products. 


That often has been the case for computing products, televisions, consumer electronics and--dare we note--broadband internet access services. 


Hedonically adjusted price indices for broadband internet access in the U.S. market then look like this:

[Chart: BLS price index series PCU5173115173116]


source: Bureau of Labor Statistics 


In other words, dial-up internet access and gigabit broadband are not the same product. 64 kbps internet access is not the same product as 10 Mbps broadband. And 10 Mbps broadband is not the same product as gigabit or multi-gigabit home broadband. 


In comparing digital prices over time, one must adjust for inflation and hedonic quality changes to really understand real prices. 
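A toy calculation shows why both adjustments matter. Every price, speed and deflator below is a hypothetical round number, chosen only to illustrate the method, not actual BLS or market data:

```python
# Toy example of combining an inflation adjustment with a crude hedonic
# adjustment (price per Mbps). Every number here is hypothetical, chosen
# only to illustrate the method, not actual BLS or market data.

def real_price_per_mbps(nominal_price, cpi_deflator, mbps):
    """Deflate the nominal price to base-year dollars, then divide by speed."""
    return (nominal_price / cpi_deflator) / mbps

# Hypothetical 1996 dial-up: $20/month at 0.064 Mbps (1996 = base year)
old = real_price_per_mbps(20, 1.0, 0.064)

# Hypothetical 2022 gigabit: $70/month, with 2022 prices ~1.9x 1996 levels
new = real_price_per_mbps(70, 1.9, 1000)

print(round(old, 1))  # ~312.5 dollars per Mbps, in 1996 terms
print(round(new, 4))  # ~0.0368 dollars per Mbps, in 1996 terms
print(old > new)      # True: the quality-adjusted real price collapsed
```

The nominal monthly bill went up, yet the inflation-adjusted price per unit of capacity fell by orders of magnitude, which is exactly the pattern the hedonically adjusted indices capture.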


This is an applied instance of Moore's Law at work. The cost of computing power, for example, has continually dropped since the 1940s. 



source: Hamilton Project


The cost of bandwidth likewise has seen hedonic changes and falling prices. Many will note the revenue-per-unit and cost-per-unit trends that are part of the capacity business. 


Compared to 2008, fixed network broadband costs have fallen globally, though there is a slight rise in developed nations, driven by consumer preferences for higher-priced, higher-speed services, according to International Telecommunication Union data. 

source: ITU 


To be sure, most of the improvement since 2008 has happened in developing countries. Prices in developed nations have been relatively low, and stable, since 2008. 


But while prices have stayed essentially flat, speeds and bandwidth consumption allowances have risen steadily. In real terms, and adjusting for hedonic changes, U.S. home broadband prices have dropped dramatically since 2017, according to Bureau of Labor Statistics data. 


The point is that if all prices in the U.S. market have gone up since 1950, since 1996 or over any other period, then we would expect home broadband prices to rise with the general price level as well. 


It is possible to argue that even if home broadband prices have risen, the reasons are inflation--all prices are higher--or product quality changes (hedonic change) or consumer preference for different products (gigabit speeds rather than 100 Mbps to 300 Mbps). 


A Tesla is not a Honda Civic. People pay more for the former than for the latter. But does that mean “car prices” have risen? Yes and no. Inflation drives prices higher over time. But when product differentiation is possible, consumers make different choices about what to buy. 


A Civic owner who then buys a Tesla is arguably not buying the same product. When customers can buy a 100-Mbps service at the low end or 5 Gbps on the high end, “average” price is misleading. 


Beyond that, which prices do we choose to compare? Do we analyze the services “most frequently bought?” Do we use posted retail prices or do we also include buying patterns that feature price discounts, such as product bundles? 


Do we measure price per household, per user, per megabit per second, per consumption or something else? 


If consumer demand shifts, how do we incorporate such shifts into the analysis? It is permissible to argue that home broadband prices “have risen.” It also is intellectually honest to admit that all prices have risen over time. 


One may argue that U.S. prices are “too high.” But it is honest to ask: in relation to what? Are we comparing a continent-sized market to a small city-state? Or a substantially-rural market to a highly-urbanized market? 


In Canada, 14 percent of the people live in areas of density between five and 50 people per square kilometer. In Australia, 18 percent of people live in such rural areas.


In the United States, 37 percent of the population lives in rural areas with less than 50 people per square kilometer.


Put another way, less than two percent of Canadians and four percent of Australians live in the most sparsely populated areas. In the United States, fully 48 percent of people do.


Coverage is an issue in such rural areas. About six percent of the U.S. land mass is “developed” and relatively highly populated. Those are the areas where it is easiest to build networks. 


But about 94 percent of the U.S. land surface is unsettled or lightly populated, including mountains, rangeland, cropland and forests. And that is where networks are hardest to build and sustain.


That does not directly shape retail prices. But density does affect when and where sustainable networks can be built, even including government subsidies. 


Are home broadband prices “higher” in 2022 than in 1996? A reasonable person could answer “yes” without also arguing prices are “too high.”


Thursday, March 17, 2022

"Why Go To the Office?" is a New Question for Many Business Leaders

A key takeaway from a Microsoft-sponsored study of global work patterns suggests a new challenge: “Leaders must establish the why, when, and how of the office,” says Microsoft. “This means defining the purpose of in-person collaboration, creating team agreements on when to come together in person, defining hybrid meeting etiquette, and rethinking how space can play a supporting role.”

source: Microsoft 


That’s different from pre-pandemic work modes where “going to work” was simply accepted as the way work gets done. 


Social capital also is an issue. Social capital refers to the networks of personal relationships, within an organization or external to it, that help achieve organizational objectives. Social capital matters because it includes the shared sense of goals and belonging, and the culture and values that participants share. 


The point is that social capital helps organizations achieve their goals by increasing trust and willingness to help others, and by reducing organizational friction. As with friendships, social capital has to be built and cultivated. 


One aspect of enforced remote work such as we have experienced during the Covid pandemic is the inability to create new social capital as easily. We are, in a real sense, working off of social capital already created. 


And the supposition most of us likely share is that it is much harder to create new social capital when people are not in proximity, interacting face to face and building trust. 


Rebuilding social capital looks different in a hybrid world. With 51 percent of hybrid workers considering a shift to full remote work in the year ahead, companies cannot rely solely on the office to recoup the social capital we’ve lost over the past two years, says Microsoft.


Forty-three percent of leaders say relationship-building is the greatest challenge of having employees work in a hybrid or remote environment.


“We are Not the Same People Who Went Home to Work in Early 2020,” Microsoft Says

“We are not the same people who went home to work in early 2020,” a new report from Microsoft says. So there could be implications for suppliers in the connectivity and computing industries. 


Most obviously, if Microsoft is right, collaboration “at a distance” will be a permanent reality. That has implications for application usage, bandwidth requirements (volume and location) and the value of “realism” for remote interactions. Business tools able to use augmented or virtual reality (and therefore artificial intelligence) to create more realistic interactions should arrive. 


But much hinges on worker choices about work venues. More than half of presently “remote” workers are considering spending some time in their offices. But more than half of those who presently work partly in the office also are considering shifting to remote work. 


If one assumes there are far more employees presently remote, then the possibility of work venues shifting back to offices is substantial. In that case the “remote computing, apps and connectivity” trends will moderate somewhat. 

source: Microsoft 


Business processes will have to be recrafted on the assumption that virtually 100 percent of the workforce will be remote at least some of the time. So “zero trust” security will become common and necessary. 


Facing widespread resistance to “returning to the office full time,” organizations are likely to continue to pursue “hybrid work arrangements” on a permanent basis, which implies a more-distributed communications, computing and applications environment. 


source: Microsoft 


The study of 31,000 people in 31 countries, which includes data from Microsoft 365 and LinkedIn, finds strong employee resistance to returning to the office full time. If managers are unable to overcome that resistance, all manner of computing and communications patterns will remain more distributed. 


Mobile service providers might find that capacity upgrade requirements in urban areas and along commuting corridors are lessened. Less foot traffic “downtown” will create financial hazards and closures for many small businesses, reducing small business communications and computing demand. 


Bandwidth demand could shift outward to suburban areas, as well, with the potential for a permanent shift of demand from mobile to fixed networks, as workers at home offload to Wi-Fi. 


Overall, cloud-based applications would seem to be in permanently-higher demand as well, if higher amounts of remote work now are a permanent reality. 


How Few Competitors Still Provide Competitive Benefits?

As a practical matter, policy debates about how to sustain competition in the mobility business, while also sustaining the supply of services, often focus on whether the number of suppliers should be consolidated from four to three.


Most of us forget how complicated the early mobile “phone” business was, and how much asset rearrangement produced the current pattern. Consider the changes between 2005 and 2015, alone. 


source: Fierce Wireless 


Over a longer period of time, the asset reshuffling was even more complex. 


source: Deadzones 


In the fixed networks business the dilemma is whether two viable facilities-based contestants can support themselves over the long term, or whether the only choice is a monopoly wholesale provider with retail competition. 


It’s complicated. Some 25 years ago, U.S. policymakers believed that two national mobile operators were providing too little in the way of competitive benefits, leading to the granting of additional spectrum to enable a third national competitor. 


Then market dynamics changed and a four-leader market developed, though a duopoly remained at the top of the market. With the merger of Sprint and T-Mobile, a three-supplier pattern now holds, though support for a fourth national provider (Dish Network) also was part of the deal-making around the merger. 


In the fixed network market, however, just two competitors have provided genuine innovation and competitive benefits for consumers, perhaps assisted in part by the use of different infrastructure solutions. U.S. cable operators found ways to boost internet access speeds faster, and cheaper, than telcos could, leading to an installed base share as high as 70 percent. 


Like the U.S. mobile industry, the fixed networks business was once more fragmented than at present. The AT&T breakup in 1984 resulted in eight large suppliers: AT&T, in long distance and manufacturing, and seven regional access companies. 


source: Hoot and Hollah 



These days, cable operators have emerged as key competitors to the remaining telcos in the fixed networks business, and now are emerging as contestants in the mobility segment as well. 

source: The Wall Street Journal 


Where the wholesale, single-network framework has been used, competition has indeed developed as well, though one might argue that facilities-based competition tends to result in higher rates of innovation. 


In the wholesale framework every retailer has access to the same products, purchased wholesale at the same prices. There is some room for differentiation of offers, but not much based on infrastructure features. 


One certainty remains: a capital-intensive business such as “network access” tends to feature just a few providers. Periodic efforts to increase the number of suppliers always seem to result in reconsolidation. 


It remains to be seen how much consolidation can happen while still providing competitive benefits.


Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...