Thursday, September 2, 2021

Why IT Failure is So Common

Beyond IT, most efforts at organizational change arguably fail as well. The rule of thumb is that 70 percent of organizational change programs fail, in part or completely. 


There is a reason for that experience. Assume you propose some change that requires just two approvals to proceed, with the odds of approval at 50 percent for each step. The odds of getting “yes” decisions in a two-step process are about 25 percent (0.5 × 0.5 = 0.25). 


In other words, if only two approvals are required to make any change, and the odds of success are 50-50 for each stage, the odds of success are one in four. 


Consider new drug approval rates in a four-phase process, where the real mortality (58 percent to 87 percent failure) comes at the gate from phase one to phase two. With three development phases followed by a final approval stage (four gates in all), overall success rates range from 14 percent to 21 percent. 


source: American Council on Science and Health 


Other examinations of drug approval success rates suggest the odds of success across a four-stage set of hurdles are on the order of 10 percent, perhaps up to 11 percent.  


source: Seeking Alpha 


In the venture capital business, the odds of getting funding are less than one percent. 


source: Corporate Finance Institute


In other words, change is hard because any complex process with multiple stakeholders, each with the ability to stop a proposal, is mathematically challenging.


74% of Digital Transformation Efforts Fail

“74 percent of cloud-related transformations fail to capture expected savings or business value,” say McKinsey consultants  Matthias Kässer, Wolf Richter, Gundbert Scherf, and Christoph Schrey. 


Similarly, almost half of all respondents experienced cloud technology as more, or much more,  complex than they initially expected, while 40 percent overran their cloud budgets, some to a significant degree, they note. 


source: McKinsey 


Those results would not be unfamiliar to anyone who follows success rates of information technology initiatives, where the rule of thumb is that 70 percent of projects fail in some way.


Of the $1.3 trillion spent in 2018 on digital transformation (using digital technologies to create new, or modify existing, business processes), an estimated $900 billion went to waste, say Ed Lam, Li & Fung CFO; Kirk Girard, former Director of Planning and Development in Santa Clara County; and Vernon Irvin, Lumen Technologies president of Government, Education, and Mid & Small Business. 


That should not come as a surprise, as historically, most big information technology projects fail. BCG research suggests that 70 percent of digital transformations fall short of their objectives. 


From 2003 to 2012, only 6.4 percent of federal IT projects with $10 million or more in labor costs were successful, according to a study by Standish, noted by Brookings.

source: BCG 


IT project success rates range between 28 percent and 30 percent, Standish also notes. The World Bank has estimated that large-scale information and communication projects (each worth over U.S. $6 million) fail or partially fail at a rate of 71 percent. 


McKinsey says that big IT projects also often run over budget. Roughly half of all large IT projects—defined as those with initial price tags exceeding $15 million—run over budget. On average, large IT projects run 45 percent over budget and seven percent over time, while delivering 56 percent less value than predicted, McKinsey says. 


Beyond IT, most efforts at organizational change arguably fail as well; the rule of thumb is that 70 percent of organizational change programs fail, in part or completely. 


There is a reason for that experience: if a proposed change requires just two approvals to proceed, each with 50-50 odds, the odds of getting two “yes” decisions are about one in four (0.5 × 0.5 = 0.25). 


The odds of success get longer for any change process that actually requires multiple approvals. 


Assume there are five sets of approvals, and assume your odds of success are high--about 66 percent--at each stage. In that case, your odds of success are about one in eight for any change that requires five key approvals (0.66 × 0.66 × 0.66 × 0.66 × 0.66 ≈ 0.125, or 32/243 at exactly two-thirds per stage). 


In a more realistic scenario, where odds of approval at any key chokepoint are 50 percent and there are 15 such approval gates, the odds of success are about 0.00003 (0.5^15, or one chance in 32,768). 


source: John Troller 
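The compounding is easy to reproduce. Here is a minimal sketch, assuming each gate is an independent yes-or-no decision:

```python
# Minimal sketch: odds that a proposal survives a chain of independent
# approval gates, each with the same probability of a "yes."

def survival_odds(p_yes: float, gates: int) -> float:
    """Probability of clearing every gate, assuming independent decisions."""
    return p_yes ** gates

# The scenarios discussed above.
print(f"2 gates at 50%:  {survival_odds(0.50, 2):.4f}")   # 0.2500, one in four
print(f"5 gates at 66%:  {survival_odds(0.66, 5):.4f}")   # ~0.1252, about one in eight
print(f"15 gates at 50%: {survival_odds(0.50, 15):.7f}")  # ~0.0000305, one in 32,768
```

Real approval gates are rarely independent, of course; a strong sponsor can raise the odds at every stage. The point is only that serial gates compound.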


So it is not digital transformation specifically that tends to fail; most big IT projects fail.

"Covid" Does Not Drive as Many Big Changes as Argued

Public company executives often have convenient explanations for why revenue targets were missed. “Bad weather” is a common reference for retailers, for example: bad weather kept consumers from shopping, or depressed demand for spring clothing. 


But “Covid” provides a rationale for revenue misses, as well as for expectations of revenue growth. That is not misplaced. Still, all Covid effects are “at the margin.”


Covid is argued to have pushed forward some information technology spending that otherwise would have taken longer. That is expressed in the phrase “a year’s change in a few months.” 


“Covid-19 has proved a catalyst for investment in technologies that will help them navigate the post-pandemic world, with a ramp in spending evident on cloud computing, DaaS (Device-as-a-Service) and IoT, as well as investment in 5G and Wi-Fi 6,” say researchers at Strategy Analytics. 


But many of the changes do not necessarily seem related to long-term transformation. About 33 percent of U.S. survey respondents said they would increase communications spending by between one percent and five percent over the next couple of years. 


But that might be true in most years: though most firms spend roughly the same amount year over year, sizable percentages spend less or more. Also, lower spending in 2020 might be expected to rebound in “more normal” years to follow. Mobile roaming charges, for example, were far lower in 2020 than in a “normal” year, as fewer people were traveling. A change of between one percent and five percent would, in many cases, simply reflect a reversion to the mean. 


With or without Covid-driven conditions, that is a reasonable belief. In the U.S. market, mobility spending alone grows about three percent a year. 


About 20 percent of respondents guessed that their IT spending would grow by more than 10 percent over the next two years. Again, that might be typical in any “normal” year. Gartner, for example, predicts global IT spending will grow nine percent in 2021 alone. Spending also increased about the same amount in 2020.


It is hopes for “digital transformation” that drive that investment, however, not Covid response. Global IT spending has grown since 2005, for example, and Gartner has at times forecast 2021 growth as low as four percent.  


Yes, some changes in spending were driven by Covid. But the fundamental longer-term changes, as shown by the Strategy Analytics survey, do not require Covid as an explanation. 


Covid had an effect, to be sure. With more remote work happening, demand for home broadband connections appears to have increased. But other changes, such as a shift to cloud computing, hardware as a service and internet of things adoption, are harder to analyze.


source: Strategy Analytics


Such transformations take time, and the sudden work-from-home demand would not have allowed executives enough time to make many fundamental changes. The same goes for other investments in technology such as blockchain, artificial intelligence, virtual or augmented reality or edge computing. 


Those changes require many other shifts of architecture and business processes that simply cannot be accomplished quickly, in a couple of months. 


In fact, many IT teams arguably found themselves shifting spending toward purchases of personal computers and remote-work software licenses rather than toward anything else related to computing or communications architecture.


Wednesday, September 1, 2021

When Collaboration Becomes a Bug, Not a Feature

We all know the phrase “that’s not a bug, it’s a feature.” The reverse might also be true: “a feature can be a bug.” 


Consider collaboration, which virtually everyone considers a desired feature. During the Covid-19 enforced remote work period, that feature has become a bug. 


So much so that a major supplier of collaboration tools and platforms (Microsoft) notes that actual collaboration between people is improved when electronic collaboration drops. That is a byproduct of people being able to congregate again in workplaces. 


One byproduct of enforced work from home rules is that work team isolation increased, creating more silos within organizations. As the possibility of face-to-face interactions returns, use of electronic collaboration tools drops. But Microsoft researchers say that is a good thing, as it reduces the “silo” impact enforced remote work has caused. 


source: Microsoft 


“At the onset of the pandemic, our analysis shows that interactions with our close networks at work increased, while interactions with our distant networks diminished,” say Microsoft researchers. “This suggests that, as we shifted into lockdowns, we clung to our immediate teams for support and let our broader network fall to the wayside.”


There is a downside: companies almost certainly became more siloed than they were before the pandemic. 


“And while interactions with our close networks are still more frequent than they were before the pandemic, the trend shows even these close team interactions have started to diminish over time,” Microsoft researchers say. 


Younger workers (25 or younger) also reported more difficulty feeling engaged or excited about work, getting a word in during meetings, and bringing new ideas to the table when compared to other generations.


“Bumping into people in the office and grabbing lunch together may seem unrelated to the success of the organization, but they’re actually important moments where people get to know one another and build social capital,” says Dr. Nancy Baym, Microsoft senior principal researcher. “They build trust, they discover common interests they didn’t know they had, and they spark ideas and conversations.”


I never thought I’d see the day when a major supplier of collaboration tools actually argues that less use of those tools would improve actual collaboration. If bugs can be features, features can be bugs.


Tuesday, August 31, 2021

Mobile or Fixed Operating Capex Seems Consistent: 10% Per Year

Some connectivity network capital investment assumptions seem remarkably stable after 50 years.


An example is the adage that a fixed network operator “has to do something” for about 10 percent of the physical plant every year. In other words, each year, existing plant has to be replaced, or new plant added, representing about 10 percent of the installed base.


The best example is replacement of in-service cabling and associated electronics or optics.


That has proven a useful rule of thumb for cable TV and telco access networks, and now also seems to be useful for mobile networks. According to Allot, mobile operators need to add capacity to about 10 percent of cell sites every year.


source: Allot


Essentially, that means every fixed or mobile network is potentially 100-percent renewed every decade, piecemeal.
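Whether “100 percent” is literally reached depends on whether each year’s 10 percent is a fresh slice of plant. A minimal back-of-the-envelope sketch, using hypothetical selection assumptions rather than any operator’s actual practice:

```python
# Sketch: share of network plant touched over a decade at 10% per year.
# The selection assumptions are hypothetical, not any operator's practice.

YEARS = 10
ANNUAL_SHARE = 0.10  # "do something" to about 10% of plant each year

# If each year's work hits a different slice, the whole network is renewed.
disjoint_slices = min(1.0, YEARS * ANNUAL_SHARE)

# If the 10% is effectively random, some plant is touched repeatedly
# while other plant is never touched at all.
random_slices = 1 - (1 - ANNUAL_SHARE) ** YEARS

print(f"Disjoint slices: {disjoint_slices:.0%} of plant touched")  # 100%
print(f"Random slices:   {random_slices:.0%} of plant touched")   # ~65%
```

Hence “potentially”: full renewal in a decade assumes the work does not keep revisiting the same plant.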


The big exceptions are the once-a-decade upgrades of mobile networks to the new next-generation platform, such as 4G to 5G; telco upgrades from copper access to fiber-to-the-home; or a cable operator upgrade of DOCSIS or major change in serving area size (accompanied by a shift to deeper fiber deployment).


Where replacement, upgrade or repair of about 10 percent of the existing plant is "operating capex," the once-a-decade architecture upgrades are "strategic capex." The former keeps the existing network operating; the latter upgrades capabilities.


Monday, August 30, 2021

Will 5G "Cost per Gigabyte" Rival Fixed Networks?

One key issue as mobile operators contemplate the use of their networks to compete in the home broadband market--formerly the “fixed network internet access” market--is whether such services can compete with incumbents. 


Two key metrics matter: the retail price of a service plan and, shaping that price in turn, the cost of supplying the capacity, usually denoted as “cost per gigabyte of usage.” 


Determining price on a country-by-country basis also requires adjusting prices to account for differences in currency values and purchasing power (typically using a purchasing power parity method). All those adjustments done, the price of mobile data ranges from nine cents per gigabyte up to $110 per gigabyte, according to Speedcheck. 
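A minimal sketch of that kind of normalization, using made-up prices and PPP conversion factors rather than Speedcheck’s actual data:

```python
# Sketch: normalize local mobile-data prices into PPP-adjusted dollars per GB.
# All figures are invented for illustration; they are not Speedcheck's data.

# market -> (local price per GB, PPP conversion factor: local units per dollar)
markets = {
    "Market A": (15.0, 25.0),
    "Market B": (2.0, 0.8),
}

for name, (local_price_per_gb, ppp_factor) in markets.items():
    ppp_price = local_price_per_gb / ppp_factor  # PPP-adjusted dollars per GB
    print(f"{name}: ${ppp_price:.2f} per gigabyte, PPP-adjusted")
```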

 

There are all sorts of other complications, including the speed of a connection; whether we include fees and taxes as part of the calculation; and whether other charges, such as equipment rental, also are included. All of that will differ from country to country and provider to provider. 


But a reasonable rule of thumb has been that mobile data costs an order of magnitude more than fixed network data, on a cost-per-gigabyte basis. So in the U.S. market fixed network gigabytes might cost 30 cents while mobile gigabytes cost $3. 


If mobile bandwidth traditionally has been an order of magnitude more expensive than fixed network bandwidth, then it is obvious that, to compete, mobile bandwidth has to be as capacious and affordable as fixed network bandwidth. 


What is clear is that, compared to past capabilities, 5G networks will have a cost-per-gigabyte profile that allows mobile operators to radically close the cost gap with fixed networks that prevailed with 4G and prior mobile generations. 


source: Mobile Experts


Up to this point, mobile cost per gigabyte has been as much as an order of magnitude more costly than fixed network cost per gigabyte. As always, it matters how we count. 


The posted retail prices are not necessarily the “actual prices” consumers pay, as many are on promotional deals at any particular time. The other issue is prices for actually-used capacity versus plan allowance price. They are usually different. 


The nominal (designed-for) cost per gigabyte is total recurring cost divided by total usage allowance. But not many users actually consume all the data their plans provide. Also, customers on unlimited-usage plans will have highly variable “cost per consumption” ratios, as the price is fixed while usage varies. 
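A minimal sketch of the gap between the two, with hypothetical plan figures rather than any carrier’s actual pricing:

```python
# Sketch: nominal versus effective cost per gigabyte.
# Plan figures are hypothetical, not any carrier's actual pricing.

monthly_cost = 50.0   # recurring plan price, in dollars
allowance_gb = 50.0   # plan usage allowance
consumed_gb = 20.0    # what the customer actually used

nominal_cost_per_gb = monthly_cost / allowance_gb   # $1.00 per allowed GB
effective_cost_per_gb = monthly_cost / consumed_gb  # $2.50 per consumed GB

print(f"Nominal:   ${nominal_cost_per_gb:.2f}/GB")
print(f"Effective: ${effective_cost_per_gb:.2f}/GB")
```

On an unlimited plan the denominator is open-ended, which is why “cost per consumption” varies so widely from customer to customer.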


Fixed network data costs, on a cost-per-gigabyte basis, routinely have been 20 to 60 times lower than mobile data costs. Where fixed network data might cost cents per gigabyte, mobile data costs dollars per gigabyte, counting either plan costs or actual usage costs. 


source: Mobile Experts


To compete in the home broadband market with fixed network providers means mobile operators have to match prices per gigabyte more closely. Cost per gigabyte has been steadily declining, for both fixed and mobile networks. 


The importance of 5G is that, for several reasons, the cost of supplying a gigabyte of usage will drop compared to 4G, as it has with each successive mobile generation. 


Using mobile networks to compete in the home broadband market never gets the headlines when we talk about 5G. The buzz is all about edge computing or internet of things or virtual reality. 


We might be surprised by the near term revenue upside. Mobile operators might make more new revenue from home broadband services--however unheralded--than from edge computing, IoT or AI-based apps.


Sunday, August 29, 2021

Risk Assessment and Availability Bias

Risk assessment is part of every person’s routine, just as much as it is an enterprise or organizational task. In that regard, human beings are biased toward judging an event’s likelihood or frequency (and hence, risk) based on how easily their minds can conjure up examples of the event occurring in the past, according to behavioral economics.  


In other words, we make bad decisions, or are biased to do so, because our recall is skewed toward vivid memories.  


If a similar event has occurred recently, or past instances induced strong emotions, people are much more likely to predict that the event is likely to occur. For business leaders no less than consumers and citizens, judgment is affected by the fact that the event was recent or strongly emotional. 


These mental shortcuts happen because the human brain cannot process all the data it encounters on a routine basis. So brains create rules that simplify the search for meaning. That act of forcing raw data into a simplifying framework produces availability bias. 


Availability bias is a mental shortcut that relies on immediate examples that come to a given person's mind. The practical impact is that people tend to focus on information that’s easiest to access, most recent or most memorable. 


That can lead to wrong or bad decisions. According to Farnam Street, we end up remembering based on: 


  • Our foundational beliefs about the world

  • Our expectations

  • The emotions a piece of information inspires in us

  • How many times we’re exposed to a piece of information

  • The source of a piece of information.


Combatting or overcoming the availability heuristic--a form of irrationality--is not easy, as it requires deliberate effort to evaluate information on its merits, not on its vividness, recency, prevalence or expected importance. 


One suggestion is to rely on statistics rather than emotion or belief. If something in business or nature tends to happen once in 1,000 occurrences, it should be deemed unlikely, for example. Not impossible; only rare. 


Focus on trends and patterns. Regression to the mean teaches us that extreme events tend to be followed by more moderate ones. Outlier events are often the result of luck and randomness and are unlikely to reoccur soon. 


Whenever possible, base your judgments on trends and patterns—the longer term, the better. Track record is everything, even if outlier events are more memorable, says Farnam Street. 


Avoid hasty judgments that have big consequences, it goes almost without saying. The whole point of heuristics is that they save the time and effort needed to parse a ton of information and make a judgment. 


When making an important decision, the only way to get around the availability heuristic is to slow down and parse the relevant information, to reduce the impact of recent, emotional, memorable or oft-repeated information.


Do not rely on the soundness of your memory. It is hard to remember what happened in the past, and the more distant, the less-powerful the mental weighting. “What have you done for me lately?” is a commonplace expression of the rule. Humans give priority to what happened more recently, not what happened years ago. 


The unusual--a big mistake, a huge win, an outlandish occurrence--tends to dominate our thinking, rather than the statistical odds of occurrence. 

 

For a student of history, a last bit of advice is second nature: go back and revisit old information. “Even if you think you can recall everything important, it’s a good idea to go back and refresh your memory of relevant information before making a decision,” says Farnam Street. 


Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...