Tuesday, December 15, 2020

Net-Zero Carbon at Costs Comparable to Present Spending?


This is the sort of thing a prudent economist or advocate looks for: ways to solve a pressing problem without sacrificing economic growth, firm profitability or consumer welfare. If you have ever modeled your own personal carbon footprint, you realize how difficult a task that is, and the sacrifices that--with present technology--must be made to reduce a footprint by just 30 percent. 

Long story short, I found I could only hope to achieve a 30-percent carbon footprint reduction by completely giving up automobile use and air travel. The former would significantly affect daily life, as I live out in the American West, with lowish density and longish distances.

The latter would severely limit my business functions. 

What we need are ways to reduce carbon output without crashing the economy, whole industries and individual firms. 

We also cannot tell people to be cold in the winter, hot in the summer and to avoid many conveniences of modern life. Sure, there are spiritual values to be reaped by reducing much of our consumption. But that has to be voluntary: cheerfully undertaken and not experienced as an imposition.

This is helpful in all those respects, it seems. 

Tier-One Telco Non-Core Revenue Averages 20%

Unless a tier-one telco serving consumers and businesses believes it can grow its business on the basis of connectivity services alone, new services and products beyond core communications must be found. 


GSMA Intelligence suggests services beyond the communications core account for between 10 percent and 40 percent of total retail service provider revenues, and just over 20 percent, on average, for many tier-one providers. That is up from 17 percent in 2017, GSMA Intelligence says. 

source: GSMA Intelligence

Artificial Intelligence Helps Researchers Sort Through Photos, Identify Wildlife in Australia


Over the next six months, more than 600 sensor cameras will be deployed in bushfire-affected areas across Australia, monitoring and evaluating the surviving wildlife populations. 

This nationwide effort is part of An Eye on Recovery, a camera sensor project run by the World Wide Fund for Nature and Conservation International.

Using Wildlife Insights, a platform powered by Google’s Artificial Intelligence technology, researchers across the country will upload and share sensor camera photos to give a clearer picture of how Australian wildlife is coping after the devastating bushfires in the past year.

The Wildlife Insights platform can now identify over 700 species of wildlife in seconds and quickly discard empty images. 

Monday, December 14, 2020

Gig Economy is Part of a Long-Term Trend

The gig economy actually is not new: in recent years we have called such workers contingent workers. Nor is the use of freelancers or independent contractors new. What arguably is new is the percentage and number of workers freelancing full time by choice.


source: Brookings 


In 1989 some 17 percent of the U.S. workforce worked as independent contractors. By 2020 some 43 percent of U.S. workers were contingent. 


Since the official end of the 2008 Great Recession, the number of temporary or contingent workers has risen by more than 50 percent, to 2.7 million, according to the U.S. Federal Reserve. That is the biggest increase since the government began to record these figures in 1990. But the trend has been in place since before 1990.


source: Intuit 


In 2020 about 40 percent of U.K. workers were contingent rather than employees. 


Most Big IT Projects--Including "Digital Transformation"--Fail

Of the $1.3 trillion that was spent on digital transformation--using digital technologies to create new or modify existing business processes--in 2018, it is estimated that $900 billion went to waste, say Ed Lam, CFO of Li & Fung; Kirk Girard, former Director of Planning and Development for Santa Clara County; and Vernon Irvin, president of Government, Education, and Mid & Small Business at Lumen Technologies.


That should not come as a surprise, as historically, most big information technology projects fail. BCG research suggests that 70 percent of digital transformations fall short of their objectives. 


From 2003 to 2012, only 6.4 percent of federal IT projects with $10 million or more in labor costs were successful, according to a study by Standish, noted by Brookings.

source: BCG 


IT project success rates range between 28 percent and 30 percent, Standish also notes. The World Bank has estimated that large-scale information and communication projects (each worth over U.S. $6 million) fail or partially fail at a rate of 71 percent. 


McKinsey says that big IT projects also often run over budget. Roughly half of all large IT projects—defined as those with initial price tags exceeding $15 million—run over budget. On average, large IT projects run 45 percent over budget and seven percent over time, while delivering 56 percent less value than predicted, McKinsey says. 


Significantly, 17 percent of IT projects go so badly that they can threaten the very existence of the company, according to McKinsey.


Beyond IT, most efforts at organizational change arguably fail as well. The rule of thumb is that 70 percent of organizational change programs fail, in part or completely.


There is a reason for that experience. Assume you propose some change that requires just two approvals to proceed, with the odds of approval at 50 percent for each step. The odds of getting “yes” decisions in a two-step process are 25 percent (0.5 x 0.5 = 0.25). In other words, if only two approvals are required to make any change, and the odds of success are 50-50 at each stage, the odds of success are one in four.


source: John Troller 


The odds of success get longer for any change process that actually requires multiple approvals. Assume there are five sets of approvals. Assume your odds of success are high--about 66 percent--at each stage. In that case, your odds of success are about one in eight for any change that requires five key approvals ((2/3)^5 = 32/243, or about 13 percent).
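A minimal sketch of that compounding logic, assuming each approval is independent and equally likely:

```python
# Probability that a proposal survives a chain of independent approvals,
# each granted with the same probability.
def survival_odds(p_yes: float, n_approvals: int) -> float:
    return p_yes ** n_approvals

print(survival_odds(0.5, 2))    # 0.25 -- two 50-50 approvals: one in four
print(survival_odds(2 / 3, 5))  # ~0.132 -- five approvals at 2-in-3 odds: about one in eight
```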


So it is not digital transformation specifically that tends to fail: most big IT projects fail.

Saturday, December 12, 2020

Work from Home and the Solow Productivity Paradox

It is easy, but perhaps wrong, to attribute many types of change to “Covid-19” or the responses made to the pandemic. To be sure, the prevalence of work-from-home, learn-from-home modes required by governments to slow the spread was a precipitating event. It arguably sped up trends already in place and convinced larger numbers of people and firms to join those trends, such as substituting Zoom video conferences for older meeting formats.


With good reason, increased amounts of work from home are viewed as a permanent shift in venues where many types of work are done on a routine basis. The conventional wisdom is that hybrid models will dominate, with more workers spending parts of the week working from home, rather than “in the office.”


source: Researchgate  


But it is worth noting that this “remote work” trend has been in place and growing for more than 50 years, though we used to call it “telecommuting.” 


source: Federal Reserve Bank of St. Louis 


The point is that forecasters have expected a huge increase in remote work patterns for quite some time. 



So it might be safe to say a permanent shift to more remote work will happen. But the change might be more gradual than some believe.


There might be unexpected barriers in the form of cost issues, as has proven true in the past, for at least some firms. 


More importantly, it is hard enough to measure office worker productivity at all. It will be devilishly difficult to determine what impact on productivity remote work in large doses might produce. 


Obviously, at some level of productivity (higher, same, lower), many types of work can be performed remotely, at home. 


source: McKinsey


But productivity is an issue. To be sure, most of us assume that higher investment and use of technology improves productivity. That might not be true, or true only under some circumstances. 


Investing in more information technology has routinely failed to boost measured productivity. Others would argue the gains are there, just hard to measure. There is evidence to support either conclusion.


Most of us likely assume quality broadband “must” boost productivity. Except when it does not. The consensus view on broadband access for business is that it leads to higher productivity. 


But a study by Ireland’s Economic and Social Research Institute finds only “small positive associations between broadband and firms’ productivity levels,” and “none of these effects are statistically significant.”


“We also find no significant effect looking across all service sector firms taken together,” ESRI notes. “These results are consistent with those of other recent research that suggests the benefits of broadband for productivity depend heavily upon sectoral and firm characteristics rather than representing a generalised effect.”


“Overall, it seems that the benefits of broadband to particular local areas may vary substantially depending upon the sectoral mix of local firms and the availability of related inputs such as highly educated labour and appropriate management,” says ESRI.


Most of us are hopeful about the value of the internet of things. But productivity always is hard to measure, and it is harder still when many inputs change simultaneously. Consider the impact of electricity on agricultural productivity.


“While initial adoption offered direct benefits from 1915 to 1930, productivity grew at a faster rate beginning in 1935, as electricity, along with other inputs in the economy such as the personal automobile, enabled new, more efficient and effective ways of working,” the National Bureau of Economic Research says.  


A big problem with the “electricity caused productivity to rise” argument is that other inputs also changed, so we cannot isolate any specific driver. Note that the automobile, also generally considered a general-purpose technology, was introduced at the same time.


Since 1970, global productivity growth has slowed, despite increasing application of technology in the economy overall, starting especially in the 1980s.

 

A corollary question: has information technology boosted living standards? Not so much, some say. The absence of huge productivity gains has created what economists call the “productivity paradox.”


Basically, the paradox is that the official statistics have not borne out the productivity improvements expected from new technology.

 

Still, the productivity paradox seems to exist. Before investment in IT became widespread, the expected return on investment in terms of productivity was three percent to four percent, in line with what was seen in mechanization and automation of the farm and factory sectors.


When IT was applied over two decades from 1970 to 1990, the normal return on investment was only one percent.


This productivity paradox is not new. Information technology investments did not measurably help improve white collar job productivity for decades. In fact, it can be argued that researchers have failed to measure any improvement in productivity. So some might argue nearly all the investment has been wasted.


Some now argue there is a lag between the massive introduction of new information technology and measurable productivity results, a lag that might conceivably last a decade or two.


Work from home trends were catalyzed by the pandemic, to be sure. Many underlying rates of change were accelerated. But the underlying remote work trends were there for decades, and always have been expected to grow sharply. 


Whether that is good, bad or indifferent for productivity remains to be seen. The Solow productivity paradox suggests that applied technology can boost--or lower--productivity. Though perhaps shocking, the productivity impact of technology adoption can be negative.

Friday, December 11, 2020

Is Gigabit Speed Really Available to More Than 80% of U.S. Housing Units?

Some question statistics indicating that gigabit internet access now is available to (can be purchased by) about 84 percent of U.S. residents, especially when those statistics are based on data reported to the Federal Communications Commission.


Others might find the claim that gigabit access is not that widely available a bit incongruous, but not for reasons of FCC data reporting. The NCTA says 80 percent of U.S. homes now can buy gigabit speed internet access, up from about 63 percent in 2018.  


And since cable TV operators in the U.S. market hold at least 70 percent of the internet access installed base, looking only at cable TV data provides a non-duplicated view of access speeds. Assume for the moment zero supply of gigabit services by other internet service providers.


According to the U.S. Census Bureau, there are about 137.9 million U.S. housing units. Not all those units are occupied at any particular time, but ignore that for the moment.


Vacant year-round units represented 8.8 percent of total housing units, while another 2.6 percent were vacant for seasonal use.


Approximately 2.2 percent of the total units were vacant for rent, 0.7 percent were vacant for sale only and 0.6 percent were rented or sold but not yet occupied. Vacant units that were held off market comprised 5.3 percent of the total housing stock – 1.5 percent were for occasional use, 1.0 percent were temporarily occupied by persons with usual residence elsewhere (URE) and 2.9 percent were vacant for a variety of other reasons.


Add it all up and 88.6 percent of the housing units in the United States in the first quarter of 2020 were occupied and 11.4 percent were vacant, according to the U.S. Census Bureau. 
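As a quick sanity check, the Census components quoted above do add up (a sketch; the figures are as quoted, so small rounding differences are possible):

```python
# U.S. Census Bureau vacancy components, Q1 2020, as shares (percent)
# of total housing stock.
for_rent          = 2.2
for_sale_only     = 0.7
rented_sold_empty = 0.6
held_off_market   = 5.3   # 1.5 occasional use + 1.0 URE + 2.9 other (rounded)
seasonal          = 2.6

year_round_vacant = for_rent + for_sale_only + rented_sold_empty + held_off_market
total_vacant      = year_round_vacant + seasonal

print(round(year_round_vacant, 1))   # 8.8
print(round(total_vacant, 1))        # 11.4
print(round(100 - total_vacant, 1))  # 88.6 percent occupied
```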


For the moment, ignore vacancy as well: retail consumer networks are not built to pass only “occupied” dwellings, but all dwellings in an area. If there are 137.9 million dwelling units, coverage of 80 percent of U.S. homes equates to about 110.3 million locations. At an average of 2.6 persons per household, that suggests some 287 million people are in living units able to buy gigabit internet access from cable operators alone.


If the U.S. population is about 331 million, then some 87 percent of the U.S. population can buy gigabit internet access from cable operators alone, assuming no coverage provided by telcos or independent internet service providers.
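A back-of-the-envelope check of that arithmetic, using the housing, household-size and population figures cited above:

```python
# Rough coverage math: homes passed at gigabit speeds, and the implied
# share of the population, from cable operators alone.
housing_units_m  = 137.9  # total U.S. housing units, millions (Census)
cable_coverage   = 0.80   # share of homes passed at gigabit speeds (NCTA)
persons_per_home = 2.6    # average household size
population_m     = 331.0  # approximate 2020 U.S. population, millions

homes_passed_m  = housing_units_m * cable_coverage
people_passed_m = homes_passed_m * persons_per_home

print(round(homes_passed_m, 1))                     # 110.3 million homes
print(round(people_passed_m, 1))                    # ~286.8 million people
print(round(100 * people_passed_m / population_m))  # ~87 percent of the population
```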


Those figures track closely with the FCC figures for “people” able to buy gigabit internet access. If you know anything about the way hybrid fiber coax networks are built, you also know that internet access speeds are designed to be the same at every end user node on the network. 


The architecture uses a fiber-to-node design, with very short electrical segments (generally a few amplifiers) between the optical node and any location. Compared to archaic all-coaxial designs, that means top speeds do not decline with distance to any appreciable extent.


The point is that if an HFC network is designed and built to support gigabit speeds, it will provide speeds close to that at all locations reached by the network, much as a fiber-to-home network would do. 
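A toy model of that design point, using invented amplifier-spacing and node-serving-distance numbers purely for illustration: in an HFC network the coax amplifier cascade stays short no matter how far a home is from the headend, while an all-coaxial design accumulates amplifiers (and noise) with distance.

```python
# Toy model (made-up figures): fiber covers the distance to the neighborhood
# node, so only a short coax run -- and a handful of amplifiers -- remains.
def cascade_amplifiers(distance_km: float, fiber_to_node: bool) -> int:
    coax_km = 0.5 if fiber_to_node else distance_km  # assume a node serves ~0.5 km of coax
    return max(1, round(coax_km / 0.4))              # assume one amplifier per ~0.4 km

for km in (1, 5, 15):
    print(f"{km} km from headend: HFC amps = {cascade_amplifiers(km, True)}, "
          f"all-coax amps = {cascade_amplifiers(km, False)}")
```

Since noise and distortion accumulate with each amplifier in a cascade, a short, constant-length cascade is why speeds stay near the design target across the footprint.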


In short, I cannot think of a good reason why the cable claim of passing 80 percent of U.S. home locations with gigabit service available is not believable.


And that is assuming zero non-overlapping coverage by all other ISPs. After all, all ISPs build gigabit facilities where they believe the demand is greatest. Those also are the places where competition arguably is greatest, such as high-income suburban areas. 


That noted, surveys of rural telcos conducted by the NTCA have found that 25 percent of respondents offer gigabit internet access, while gigabit speeds are offered by a growing number of U.S. ISPs.  

 


Governments Likely Won't be Very Good at AI Regulation

Artificial intelligence regulations are at an early stage, and some typical areas of enforcement, such as copyright or antitrust, will take...