Wednesday, September 1, 2021

When Collaboration Becomes a Bug, Not a Feature

We all know the phrase “that’s not a bug, it’s a feature.” The reverse might also be true: “a feature can be a bug.” 


Consider collaboration, which virtually everyone regards as a desirable feature. During the period of Covid-19-enforced remote work, that feature has become a bug. 


So much so that a major supplier of collaboration tools and platforms (Microsoft) notes that actual collaboration between people is improved when electronic collaboration drops. That is a byproduct of people being able to congregate again in workplaces. 


One byproduct of enforced work-from-home rules is that work team isolation increased, creating more silos within organizations. As the possibility of face-to-face interaction returns, use of electronic collaboration tools drops. But Microsoft researchers say that is a good thing, as it reduces the “silo” impact enforced remote work has caused. 


source: Microsoft 


“At the onset of the pandemic, our analysis shows that interactions with our close networks at work increased, while interactions with our distant networks diminished,” say Microsoft researchers. “This suggests that, as we shifted into lockdowns, we clung to our immediate teams for support and let our broader network fall to the wayside.”


There is a downside: most companies almost certainly became more siloed than they were before the pandemic. 


“And while interactions with our close networks are still more frequent than they were before the pandemic, the trend shows even these close team interactions have started to diminish over time,” Microsoft researchers say. 


Younger workers (25 or younger) also reported more difficulty feeling engaged or excited about work, getting a word in during meetings, and bringing new ideas to the table when compared to other generations.


“Bumping into people in the office and grabbing lunch together may seem unrelated to the success of the organization, but they’re actually important moments where people get to know one another and build social capital,” says Dr. Nancy Baym, Microsoft senior principal researcher. “They build trust, they discover common interests they didn’t know they had, and they spark ideas and conversations.”


I never thought I’d see the day when a major supplier of collaboration tools actually argues that less use of those tools would improve actual collaboration. If bugs can be features, features can be bugs.


Tuesday, August 31, 2021

Mobile or Fixed Operating Capex Seems Consistent: 10% Per Year

Some connectivity network capital investment assumptions seem remarkably stable after 50 years.


An example is the adage that a fixed network operator “has to do something” for about 10 percent of the physical plant every year. In other words, each year, existing plant has to be replaced, or new plant added, representing about 10 percent of the installed base.


The best example is replacement of in-service cabling and associated electronics or optics.


That has proven a useful rule of thumb for cable TV and telco access networks, and now also seems to be useful for mobile networks. According to Allot, mobile operators need to add capacity to about 10 percent of cell sites every year.


source: Allot


Essentially, that means every fixed or mobile network is potentially 100-percent renewed every decade, piecemeal.
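
A quick back-of-the-envelope sketch illustrates the arithmetic. The figures below are purely illustrative--the 10 percent annual share is the rule of thumb itself, not operator or Allot data--and the two cases simply contrast methodical, non-overlapping replacement with replacement that revisits the same plant.

```python
# Illustrative sketch of the "10 percent per year" rule of thumb.
# All figures are assumptions for illustration, not operator data.

ANNUAL_SHARE = 0.10   # assumed fraction of plant replaced or upgraded each year
YEARS = 10

# Case 1: each year touches a different 10 percent slice (no overlap),
# so the whole plant is renewed within a decade.
disjoint = min(1.0, ANNUAL_SHARE * YEARS)

# Case 2: each year's 10 percent is drawn at random, so some plant is
# touched twice while other plant is never touched at all.
overlapping = 1 - (1 - ANNUAL_SHARE) ** YEARS

print(f"Non-overlapping slices: {disjoint:.0%} of plant renewed in {YEARS} years")
print(f"Overlapping slices:     {overlapping:.0%} of plant renewed in {YEARS} years")
```

The “100 percent per decade” framing implicitly assumes the first case: operators work through the plant methodically rather than repeatedly touching the same facilities.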


The big exceptions are the once-a-decade upgrades of mobile networks to the new next-generation platform, such as 4G to 5G; telco upgrades from copper access to fiber-to-the-home; or a cable operator upgrade of DOCSIS or major change in serving area size (accompanied by a shift to deeper fiber deployment).


Where replacement, upgrade or repair of about 10 percent of the existing plant is "operating capex," the once-a-decade architecture upgrades are "strategic capex." The former keeps the existing network operating; the latter upgrades capabilities.


Monday, August 30, 2021

Will 5G "Cost per Gigabyte" Rival Fixed Networks?

One key issue as mobile operators contemplate the use of their networks to compete in the home broadband market--formerly the “fixed network internet access” market--is whether such services can compete with incumbents. 


Two key matters are the retail price of a service plan and the cost of supplying the capacity, which shapes that price and usually is denoted as “cost per gigabyte of usage.” 


Determining price on a country-by-country basis also requires adjusting prices to account for differences in currency values and purchasing power (typically using a purchasing power parity method). All those things done, the price per gigabyte of mobile data usage ranges from nine cents to $110 per gigabyte, according to Speedcheck. 
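
To make that adjustment concrete, here is a minimal sketch of the purchasing power parity step; the plan price, allowance and PPP conversion factor are invented for illustration and are not Speedcheck figures.

```python
# Hypothetical purchasing-power-parity (PPP) adjustment for a mobile data plan.
# All numbers are invented for illustration only.

plan_price_local = 300.0    # monthly plan price, local currency units
plan_allowance_gb = 10.0    # monthly data allowance, gigabytes
ppp_factor = 20.0           # local currency units per international (PPP) dollar

# Convert the plan price into PPP-adjusted dollars, then express it per gigabyte.
price_ppp_dollars = plan_price_local / ppp_factor
price_per_gb_ppp = price_ppp_dollars / plan_allowance_gb

print(f"PPP-adjusted plan price:   ${price_ppp_dollars:.2f}")
print(f"PPP-adjusted price per GB: ${price_per_gb_ppp:.2f}")
```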


There are all sorts of other complications, including the speed of a connection; whether we include fees and taxes as part of the calculation; and whether other charges, such as equipment rental, also are included. All of that will differ from country to country and provider to provider. 


But a reasonable rule of thumb has been that mobile data costs an order of magnitude more than fixed network data, on a cost-per-gigabyte basis. So in the U.S. market fixed network gigabytes might cost 30 cents while mobile gigabytes cost $3. 


If mobile bandwidth traditionally has been an order of magnitude more expensive than fixed network bandwidth, then it is obvious that, to compete, mobile bandwidth has to be as capacious and affordable as fixed network bandwidth. 


What is clear is that, compared to past capabilities, 5G networks will have a cost-per-gigabyte profile that allows mobile operators to radically close the cost gap with fixed networks that prevailed with 4G and prior mobile generations. 


source: Mobile Experts


Up to this point, mobile cost per gigabyte has been as much as an order of magnitude more costly than fixed network cost per gigabyte. As always, it matters how we count. 


The posted retail prices are not necessarily the “actual prices” consumers pay, as many are on promotional deals at any particular time. The other issue is prices for actually-used capacity versus plan allowance price. They are usually different. 


The nominal cost per gigabyte is the total recurring cost divided by the total usage allowance. But not many users actually consume all the data their plans provide. Also, customers on unlimited-usage plans will have highly variable “cost per consumption” ratios, as price is fixed while usage is not. 
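
A minimal sketch of that distinction follows, using invented plan figures rather than Speedcheck or Mobile Experts data.

```python
# Nominal vs. effective cost per gigabyte for a single subscriber.
# All plan figures are invented for illustration only.

monthly_price = 50.0        # recurring plan price, dollars
plan_allowance_gb = 50.0    # usage allowance included in the plan
actual_usage_gb = 12.0      # what the subscriber actually consumed

# Nominal cost per GB: recurring cost divided by the full allowance.
nominal_cost_per_gb = monthly_price / plan_allowance_gb

# Effective cost per GB: recurring cost divided by actual consumption.
effective_cost_per_gb = monthly_price / actual_usage_gb

print(f"Nominal cost:   ${nominal_cost_per_gb:.2f} per GB")
print(f"Effective cost: ${effective_cost_per_gb:.2f} per GB")
```

On an unlimited plan the denominator is simply whatever the subscriber happens to use, so the effective figure varies widely from one customer to the next.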


Fixed network data costs, on a cost-per-gigabyte basis, routinely have been 20 to 60 times lower than mobile data costs. Where fixed network data might cost cents per gigabyte, mobile data costs dollars per gigabyte, counting either plan costs or actual usage costs. 


source: Mobile Experts


Competing in the home broadband market with fixed network providers means mobile operators have to match prices per gigabyte more closely. Cost per gigabyte has been steadily declining for both fixed and mobile networks. 


The importance of 5G is that, for several reasons, the cost of supplying a gigabyte of usage will drop, compared to 4G, as the cost of each successive mobile generation has done. 


Using mobile networks to compete in the home broadband market never gets the headlines when we talk about 5G. The buzz is all about edge computing or internet of things or virtual reality. 


We might be surprised by the near term revenue upside. Mobile operators might make more new revenue from home broadband services--however unheralded--than from edge computing, IoT or AI-based apps.


Sunday, August 29, 2021

Risk Assessment and Availability Bias

Risk assessment is part of every person’s routine, just as much as it is an enterprise or organizational task. In that regard, human beings are biased toward judging an event’s likelihood or frequency (and hence, risk) based on how easily their minds can conjure up examples of the event occurring in the past, according to behavioral economics.  


In other words, we make bad decisions, or are biased to do so, because our memories are skewed toward what is vivid.  


If a similar event has occurred recently, or past instances induced strong emotions, people are much more likely to predict that the event is likely to occur. For business leaders no less than consumers and citizens, judgment is affected by the fact that the event was recent or strongly emotional. 


These mental shortcuts happen because the human brain cannot process all the data it encounters on a routine basis. So brains create rules that simplify the search for meaning. That forcing of raw data into a framework results in availability bias. 


Availability bias is a mental shortcut that relies on immediate examples that come to a given person's mind. The practical impact is that people tend to focus on information that’s easiest to access, most recent or most memorable. 


That can lead to wrong or bad decisions. According to Farnam Street, we end up remembering based on: 


  • Our foundational beliefs about the world

  • Our expectations

  • The emotions a piece of information inspires in us

  • How many times we’re exposed to a piece of information

  • The source of a piece of information.


Combatting or overcoming the availability heuristic--a form of irrationality--is not easy, as it requires deliberate effort to evaluate data on something other than its vividness, recency, prevalence or expected importance. 


One suggestion is to rely on statistics rather than emotion or belief. If something in business or nature tends to happen once in 1,000 occurrences, it should be deemed unlikely, for example. Not impossible; only rare. 


Focus on trends and patterns. Regression to the mean teaches us that extreme events tend to be followed by more moderate ones. Outlier events are often the result of luck and randomness and are unlikely to reoccur soon. 


Whenever possible, base your judgments on trends and patterns—the longer term, the better. Track record is everything, even if outlier events are more memorable, says Farnam Street. 


Avoid hasty judgments that have big consequences, it goes almost without saying. The whole point of heuristics is that they save the time and effort needed to parse a ton of information and make a judgment. 


When making an important decision, the only way to get around the availability heuristic is to slow down and parse the relevant information, to reduce the impact of recent, emotional, memorable or oft-repeated information.


Do not rely on the soundness of your memory. It is hard to remember what happened in the past, and the more distant, the less-powerful the mental weighting. “What have you done for me lately?” is a commonplace expression of the rule. Humans give priority to what happened more recently, not what happened years ago. 


The unusual--a big mistake, a huge win, an outlandish occurrence--tends to dominate our thinking, rather than the statistical odds of occurrence. 


As a student of history, I find the last bit of advice second nature: “Go back and revisit old information.” “Even if you think you can recall everything important, it’s a good idea to go back and refresh your memory of relevant information before making a decision,” says Farnam Street. 


Friday, August 27, 2021

What Causes Difficulty for Digital Transformation?

In a study of banking “digital transformation,” two researchers illustrate why the way humans are involved in actual business processes shapes the effort. Even when using a single new tool--the SAP loan management system--adaptation was easier for some parts of the organization than others. 


Complexity is a key issue. Also, it matters how much people need to understand the business logic of their firms. For example, one group of clerks used the new SAP-based loan management system to enter new contracts. For them, learning how to do their work with the new system was easy, the researchers say. 


In stark contrast, clerks who needed to make edits to loans in stock had a much harder time learning how to work with it, they note. 


Clerks in the former group achieved effective use within six to eight weeks, but those in the latter group needed over six months to do their work effectively again. The complexity of the task shapes the ease or difficulty of adapting. 


source: Harvard Business Review 


The researchers note the role of “system dependency,” which is a measure of how much of a user’s task is represented in the system. When more of the output or outcomes hinge on the innovation, adoption takes longer. 


That makes intuitive sense. An innovation that reshapes or affects 80 percent of a worker’s output or outcomes is going to be more complicated than when an innovation is actually peripheral to a worker’s job. 


Semantic dependency--the degree to which users need to understand how the business logic of their task is implemented in the system--seems just as important. 


Digitalized tasks that have a high degree of both dimensions are the most complex, they say. Of course.


Non-Profit Digital Transformation is REALLY Challenging

Digital transformation or digitalization in a non-profit setting might be more qualitative--and less quantifiable in terms of outcomes--than for private firms. Also, subjective assessment of “better outcomes” is one thing; objectively measurable outcomes are harder. 


One survey by Business and Decision found non-profit practitioners’ perceptions of value exceeded expectations across the board. Keep in mind these are perceptions, not measurements. 


Practitioners ranked transformation efforts high for “raising awareness,” for example. Gaining new members, raising funds or generating donations were generally expected to yield less improvement. That likely reflects a genuine understanding that these “tangible goals” were going to be more difficult to reach than intangible outcomes. 


source: Business and Decision 


It is not easy. Many nonprofits struggle to get by, Microsoft notes. They are revenue-stretched, and paper-bound. Also, if digitalization normally presumes the ability to harvest insights from data, non-profits often have limited capability to generate meaningful data.


Also, non-profits often lack a firm understanding of how they are performing or what their costs really are, Microsoft notes. “They aren’t sure what programs are doing well and what could be done better.”


“Arcane and laborious administrative tasks, as well as the pressure of constant fundraising, can tie up skilled specialists and volunteers, keeping them from focusing on their real mission: helping others,” Microsoft says. 


Some basic requirements, such as understanding actual process flows, can be challenging because those flows are non-linear, non-standardized or porous, often relying heavily on volunteers and subject to high rates of employee and volunteer turnover. 


Also, “the nonprofit sector is not known for being particularly innovative or open to change,” notes Suzanne Laporte, Compass president. 


So it might not be surprising that as much as 84 percent of non-profit digital transformation projects fail. 

DirecTV is Now a Standalone Company

Back in 2015 (though it seems much longer ago than that), a colleague working on U-verse worried that AT&T’s acquisition of DirecTV meant DirecTV would become the “go-to” platform for video entertainment. Some six years later, it is hard to disagree. 


AT&T’s deal to move its DirecTV assets into a separate company has closed. The new entity, owned jointly by AT&T and TPG Capital, now owns and operates the DIRECTV, AT&T TV and U-verse video services previously owned and operated by AT&T. 


DIRECTV had approximately 15.4 million premium video subscribers at the end of the second quarter of 2021.


Many describe the transaction as “AT&T getting out of video entertainment.” That is far from correct. AT&T contributed its U.S. video business unit to the new entity in exchange for cash compensation and debt assumption, but it also retains a 70 percent interest in DirecTV. 


TPG contributed approximately $1.8 billion in cash to DIRECTV in exchange for preferred units and a 30% interest in common units of the new company.


At close, AT&T received $7.1 billion in cash and transferred approximately $195 million of video business debt to the new entity. 


Aside from the asset ownership dilution that frees up cash to pay down debt, AT&T managerial attention can shift back to the core mobility business. But AT&T still owns 70 percent of DirecTV. That is hardly “getting out of the subscription video business.”


Net AI Sustainability Footprint Might be Lower, Even if Data Center Footprint is Higher

Nobody knows yet whether higher energy consumption to support artificial intelligence compute operations will ultimately be offset by lower ...