Saturday, September 18, 2021

There are No KPIs for Knowledge Workers

Key performance indicators often are recommended as a way to improve organizational output. But knowledge work, and office work in general, does not allow us to create meaningful KPIs related to productivity.


The problem with all studies of office worker or knowledge worker productivity is measurement. What can be counted, so we know whether inputs have changed? And how do we measure the output of knowledge work?


Presumably a call center operation has quantifiable metrics, but most office or knowledge work does not have any obvious and convenient measurement criteria. We commonly measure “time working” with the assumption that additional time worked is better. Maybe. But hours worked is an input metric, not an output metric. It is the denominator, not the numerator. 


Logically, increasing input (the denominator) reduces productivity unless output (the numerator) increases at least as fast as input does.
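
To make the ratio concrete, here is a minimal sketch using invented numbers: even when output rises, measured productivity falls whenever hours rise faster.

```python
def productivity(output_units: float, hours_worked: float) -> float:
    """Productivity as a simple output/input ratio."""
    return output_units / hours_worked

# Invented numbers, purely for illustration:
before = productivity(output_units=100, hours_worked=40)  # 2.50 units/hour
after = productivity(output_units=110, hours_worked=50)   # 2.20 units/hour
print(f"before: {before:.2f}, after: {after:.2f}")
# Output grew 10 percent, but hours grew 25 percent, so productivity fell.
```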


The other common issue is that we equate worker attitudes with outcomes. Happier workers might, or might not, be more productive. All we can measure is a subjective attitude. More happy or less happy does not necessarily correlate with outcomes. 


In principle, one could have happier but less productive workers, or less happy but more productive workers. One would need a way to correlate output and outcomes with feelings in ways that outlive simple Hawthorne effects (people work better when they know they are part of an experiment).
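
To make "correlate" concrete, here is a minimal sketch of what such an analysis would look like, using entirely invented survey and output numbers. The output column is precisely what most knowledge work lacks.

```python
from statistics import correlation  # Python 3.10+

# Invented data: self-reported happiness (1-10) and some agreed-upon
# output measure per worker. Neither column is real; in practice, no
# credible output measure exists for most knowledge work.
happiness = [6, 8, 5, 9, 7, 4, 8, 6]
output = [12, 9, 11, 10, 14, 8, 13, 10]

print(f"Pearson r = {correlation(happiness, output):.2f}")
```

Even a strong correlation here would say nothing about the Hawthorne problem: the act of measuring can itself change the behavior being measured.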


Work team collaboration might have fared better under full remote work conditions, but there is some evidence that firm-wide collaboration has decreased, though the amount of time spent collaborating (meetings, emails, messaging) has grown.  


Actual output is different from input or collaboration time and effort. It might be difficult to measure “creativity,” but there is some belief that it has not fared better under conditions of remote work.


Meetings are inputs, not outputs. Having more meetings, or spending more time in meetings, does not make firms or organizations more productive. A Microsoft survey of 182 senior managers in a range of industries found support for that thesis. 


“65 percent said meetings keep them from completing their own work,” according to Microsoft. “71 percent said meetings are unproductive and inefficient.”


Fully 64 percent said meetings come at the expense of deep thinking, while 62 percent said meetings miss opportunities to bring the team closer together (which is sort of a paradox).


Unless directly correlated with output, meetings actually can reduce productivity. Some seem to believe emails are outcomes, when in fact they take away time that might otherwise have been spent actually producing an output. 


source: Lucidspark 


The point is that we actually can say very little about whether productivity has grown, stayed the same or decreased because of enforced remote work. We could not measure productivity before, so we have no baseline against which to compare, even if we thought we could measure it.


Friday, September 17, 2021

"Business Hub" or "Ecosystem" is Easier than "Platform"

Quite often, data centers and connectivity service providers are urged to “become platforms,” “become ecosystems” or “business hubs.” It is good advice, for the simple reason that value is increased. And since value tends to correlate with customer growth, hence revenue, higher value always is desirable. 


source: Medium 


That is an example of network effects: the observation that the value of some products or services increases as more nodes are added. A voice network that connects people and organizations in one local area has some value. A voice network that connects all or most people and organizations in a region is more useful. Better still is a network connecting everyone in a nation. A network that connects all users globally has the highest value. 
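
Metcalfe's law is one common, if contested, formalization of that observation: potential value scales with the number of possible connections among n nodes, which grows roughly as n squared. A quick sketch:

```python
def potential_connections(n: int) -> int:
    """Distinct pairs among n nodes: n * (n - 1) / 2."""
    return n * (n - 1) // 2

for n in (100, 10_000, 1_000_000):
    print(f"{n:>9,} nodes -> {potential_connections(n):>15,} possible connections")
```

Whether value really scales quadratically is debated; the direction of the effect is the point.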


That is true of many internet applications and services as well. “Ecosystems” can be built around products that increase value, as Apple demonstrates with its iPhone. The big issue with ecosystems is the additional value partners bring.


The concept is easier to visualize for a data center provider than a connectivity provider. A colocation provider gains revenue and customers as more partners conclude it makes sense to colocate at a physical location. To the extent a colocation provider makes money from tenants and interconnection, “ecosystem” makes perfect sense. 


source: Equinix 


The concept is harder to commercialize for a connectivity provider as TCP/IP allows any and all partners to connect with each other without formalized business relationships, and therefore without necessarily increasing connectivity provider revenue in a direct sense. Such “permissionless” access means business partners do not need a connectivity provider’s agreement to conduct business with each other, even when using networks. 


source: STL Partners 


Ecosystems can create additional value, for the same reason that geographic proximity creates value, by increasing the number and range of business partners with whom transactions can happen, and therefore value and features sold to customers and available to end users. 


But that does not make such ecosystems “platforms.” 


source: Goodhum 


“Companies must think of the edge as more than just a collection point for data from intelligent devices,” Iron Mountain argues. “They should broaden their vision to see the edge as a new business hub.”


Regardless of the preferred language, the key business idea is that interconnection or colocation have value. That makes it easier for a data center or colocation provider to envision becoming a business hub, creating an ecosystem.


It is far harder for a connectivity provider to do so. Becoming a true business platform is not easy for either type of provider.


What is less clear is the best description of the business model.


U.S. Broadband Growth Slowing Again?

With broadband net additions apparently slowing, the issue is “why?” Is the Covid push over? Is the market therefore slowing? Are rivals beginning to take share? Comcast reported slower net account growth in the third quarter. But all major internet service providers (telcos and cable) saw slower net additions in the second quarter of 2021, compared to the second quarter of 2020. 


That is the most-likely explanation: simple slowing of growth rates, industry wide. There still is not much evidence that telcos have actually begun to take back market share from the cable companies. 


Still, as telcos continue to deploy new fiber-to-home lines, many observers believe telcos are poised to gain share. If so, upstream capacity upgrades of cable hybrid fiber coax networks will be more important, as telcos will tout the superiority of their return-path bandwidth.


Though telcos and cable operators have not yet massively embraced symmetrical bandwidth, that is the expected future. 

Thursday, September 16, 2021

Facilities-Based Competition Often Matters Quite a Lot

BT’s Openreach wholesale network has been designed to support an evolution to higher-speed internet access in the United Kingdom, but to this point most of the speed gains have been supplied by rival facilities-based providers, which might be a good argument for allowing facilities-based competition where it makes sense.


In 2020, some 18 percent of U.K. homes could buy FTTH-based gigabit services. That equates to about five million lines. Just over two million lines were supplied by BT’s Openreach network in May 2021, though the pace of installation is increasing fast.


That means about three million lines, or roughly 60 percent of U.K. FTTH lines, were supplied by facilities-based competitors to BT.


However, in 2020, a total of eight million homes actually could buy gigabit service, the difference being Virgin Media’s gigabit service available to another three million homes. 
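
A back-of-the-envelope check of those figures, using the article's rounded numbers as inputs:

```python
ftth_gigabit_homes = 5_000_000   # U.K. homes able to buy FTTH gigabit, 2020
openreach_ftth = 2_000_000       # roughly, as of May 2021
altnet_ftth = ftth_gigabit_homes - openreach_ftth

print(f"Altnet share of FTTH lines: {altnet_ftth / ftth_gigabit_homes:.0%}")  # ~60%

virgin_gigabit = 3_000_000       # additional Virgin Media gigabit homes
print(f"Total gigabit-capable homes: {ftth_gigabit_homes + virgin_gigabit:,}")  # 8,000,000
```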


According to Ofcom, there is about a two-percent overlap between the two different types of networks, in terms of supply.



source: Ofcom 


Supply is one thing, take rates another. In the U.S. market, for example, gigabit connections are purchased by about 10.5 percent of households, though available to more than 80 percent of homes passed by networks that can supply them. That implies a take rate of only about 13 percent among homes that actually can buy gigabit service.


Over the last half decade, Virgin Media has had more “higher speed” customers than have all the competitors using BT’s wholesale network. Today, “altnets” have emerged as additional suppliers of gigabit speeds. 


Many might assume FTTH means gigabit speeds. It does not. Historically, FTTH might have meant speeds in the hundreds of megabits. Some U.S. FTTH networks installed in the mid-1990s to late 1990s offered speeds only up to 10 Mbps.    


Also common are price comparisons, or tracking of “average” or “typical” speeds experienced by consumer customers. Less common are measurements of provisioned speeds. In other words, instead of looking at access technologies, what is the expected bandwidth a customer might obtain, on any network?


That matters for a simple reason. FTTH is not the only available access technology, and not the only possible fixed network platform. Looking only at the numbers of deployed lines, or take rates on those lines, tells us much. It does not tell us the whole story. 


In Germany, for example, Vodafone expects gigabit-per-second connections to be driven by rival hybrid fiber coax networks, not FTTH/B. By 2022, Vodafone expects 72 percent of all gigabit lines to be supplied by cable operators, not FTTH/B providers. 


source: Vodafone 


The point is that we get different pictures of where advanced fixed network internet access stands when we measure by access technology instead of available speed. 


FTTH available lines or provisioned lines alone do not necessarily tell us all we would like to know about user experience. What is the designed-for speed, upstream as well as downstream? A “mere” statistic on FTTH homes passed does not shed light on that question.


If one asks a different question, such as “what percentage of home passings offer downstream speeds of 1 Gbps,” we get a different answer. Or perhaps we cannot get a very good answer. Very few connections are capable of offering such speeds, even using FTTH. 


If we ask other questions, such as “what percentage of lines are symmetrical?” we would get yet another set of answers.  


Even when deploying FTTH, an internet service provider still must decide what optoelectronics to use, and that of course affects network capabilities. So FTTH does not necessarily tell us much about available speeds.


Nor does the deployment of FTTH by one legacy provider necessarily tell us much about the actual state of gigabit per second or even “hundreds of megabits per second” service. Cable hybrid fiber coax is important in many markets. Rival overbuilders or altnets are important in some markets. 


Eventually, mobile networks will emerge as challengers in some instances.


Methodology always matters when evaluating the quality of consumer broadband. FTTH is one measure of potential progress. But it is not the only important metric. We always need to know the designed-for speeds. And other platforms also compete. 


So, in many cases, the issue is not “FTTH.” The issue is “gigabit per second speeds.” FTTH is a matter of media, not commercially-available gigabit speeds.


Rural FTTH: Media Does Not Always Tell Us Much About Speed

It is common when measuring internet access progress to look at coverage, take rates, speeds and prices. Fiber-to-the-home adoption almost always is measured in terms of coverage. That is only analytically helpful up to a point.


In some markets FTTH might be the only practical way to supply gigabit per second and eventually multi-gigabit per second speeds. Not in all markets, however. And though FTTH might eventually be the only way to supply terabit-per-second speeds by mid-century, that is a ways off. 


In the meantime, media choices are one thing, but commercially-available speeds, competition and facilities-based differentiation extend beyond FTTH itself. 


Consider FTTH in rural areas. 


For example, fiber to the home (or basement, for multiple-dwelling units) covers about 22 percent of rural households, compared to 45 percent for all territories in the European Union and United Kingdom, according to a new study by the Fiber to the Home Council Europe.


Spain had 60.5 percent rural FTTH/B coverage in 2020, while Germany had 9.8 percent coverage of rural dwellings.


source: IDATE 


What is not clear is how much additional bandwidth, and therefore speed, is available on those lines. According to the NTCA, a trade group representing more than 650 rural telcos in the United States, nearly 70 percent of their customers are connected by FTTH, up from 58 percent in 2018.  


That 2020 survey also reported that 68 percent of NTCA member customers can receive downstream speeds greater than 100 Mbps. Some 80 percent can receive downstream speeds greater than 25 Mbps, up from 57 percent and 70 percent in 2018, respectively. 


Just under half (45 percent) of customers have access to 1 Gbps or higher downstream broadband speed, a metric that has nearly doubled in just two years (23.4 percent in 2018), NTCA says. 


So FTTH and gigabit per second speeds are not identical, it appears. Some 45 percent of customer locations can buy service at such speeds, while FTTH is deployed to 70 percent of locations. 
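
A quick sketch of that gap, using the NTCA figures and assuming, purely for illustration, that gigabit is offered only over FTTH lines (an assumption the survey does not spell out):

```python
ftth_share = 0.70      # NTCA customers connected by FTTH (2020 survey)
gigabit_share = 0.45   # customers able to buy 1 Gbps or faster

# If gigabit were offered only over FTTH (an assumption), this is the
# share of FTTH locations actually offered gigabit service:
print(f"{gigabit_share / ftth_share:.0%} of FTTH locations")  # ~64%
```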


So fiber and gigabit are not identical. Nor does the existence of FTTH tell us anything for certain about available speeds.


Tuesday, September 14, 2021

Comcast and AT&T Both Have a Portfolio Approach to Media and Connectivity

AT&T and Verizon are not the only U.S. connectivity firms facing calls to divest content assets. Some observers want Comcast to get out of the content business to focus on connectivity. Note Comcast’s response. 


“We believe in media and technology,” Brian Roberts, Comcast CEO, has said. Those two go together, Roberts said. 


But note a subtle qualification. “It’s not really vertically integrated, it’s delivering to customers and viewers experiences and memories and things like our theme parks.”


“We see the two working together,” he added.


Though virtually all observers describe AT&T’s spinning out its DirecTV and Warner Media assets as getting out of the business, that actually is incorrect. AT&T retains a 70-percent ownership stake in both businesses. 


It removes the assets from its books, and monetizes up to 30 percent of them. But make no mistake: AT&T still owns 70 percent of both businesses; it simply no longer has to manage those assets.


Think of it this way, as Comcast does: the connectivity and content businesses are not vertically integrated, as some might have described the former arrangement with DirecTV and Warner Media. But both aspects of the ownership model provide value. 


Both the Comcast and AT&T content ownership strategies allow the firms to diversify assets, revenue streams and markets. Both derive the benefits of ownership: cash flow and profits.


Though Comcast consolidates the results, AT&T does not. 


Still, essentially, both firms have a business footprint that extends beyond connectivity services, “up the stack.” Most believe AT&T benefits by allowing its management to focus on its connectivity business. But AT&T still owns 70 percent of DirecTV and Warner Media (which will be merged with Discovery and managed by Discovery). 


Keep in mind that many expected or touted growth areas--internet of things, edge computing, private networks, control of unmanned aerial or autonomous vehicles--also, to one extent or another--move connectivity firms beyond connections, and towards platforms and apps. 


A portfolio approach arguably allows participation in a wider swath of revenue and value upside, while still allowing a focus on the core connectivity business. It is a defensible approach, if not always a popular tack for financial engineers who profit from transactions.


How Will Cable Operators Re-Architect to Add Upstream Bandwidth?

Hybrid fiber coax upgrades intended to increase upstream bandwidth can take a number of forms. The typical choices are shrinking the serving areas, switching to fiber to the home, or re-architecting the network around a different frequency plan.


For operators who want to delay the shift to FTTH, substituting a mid-split or a high-split frequency plan for the standard HFC low-split design are the two architectural choices, other than shrinking the fiber node serving areas or moving to an entirely new FTTH network.


As always, incrementalism is favored. Comcast appears to prefer the mid-split option, while Charter seems to be leaning towards a more-radical high-split approach. In terms of capital investment, the mid-split choice might be a shorter-window bridge to FTTH, while high-split might allow a longer window before FTTH is required. 


More symmetrical bandwidth is a large part of the thinking.  


DOCSIS 4.0 is going to force decisions about which path to take to support symmetrical multi-gigabit-per-second speeds of as much as 10 Gbps downstream and up to 6 Gbps upstream.

source: CommScope 



Hybrid fiber coax networks still use frequency division, separating upstream and downstream traffic by frequency. So when a cable operator contemplates adopting mid-split or high-split designs, there are implications for active and passive network elements, especially for the more-radical high-split design. 


At this point, executives also will ask themselves whether, if radical changes are required, it would not be better to simply switch to fiber to the home.


source: Broadband Library 


Our notions of mid-split and high-split frequency plans have shifted a bit over the years, as total bandwidth has grown beyond 450 MHz, up to 1.2 GHz. A designation of “mid-split” made more sense in an era when total bandwidth was capped at about 450 MHz or 550 MHz. In those days, 108 MHz to 116 MHz of return bandwidth was perhaps 42 percent of the usable bandwidth.


Hence the “mid-split” designation. 


Likewise for high-split designations: where as much as 186 MHz was designated for the return path, the return bandwidth represented as much as 67 percent of usable bandwidth on a 450-MHz coaxial cable system.


source: Broadband Library  


Definitions remain, though with some new standardization of return bandwidths. “Mid-split” now features 85 MHz of return bandwidth, while “high-split” offers 204 MHz of upstream bandwidth. 
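
A minimal sketch of what those plans imply for upstream's share of usable spectrum, assuming a 1.218-GHz system and commonly cited DOCSIS 3.1 band edges (exact guard bands and plant top vary by operator):

```python
# Assumed band edges in MHz: (upstream low, upstream high, downstream low).
# These are commonly cited DOCSIS 3.1 values; real plants vary.
plans = {
    "low-split": (5, 42, 54),
    "mid-split": (5, 85, 108),
    "high-split": (5, 204, 258),
}
top = 1218  # MHz, a common upper limit for DOCSIS 3.1 plant

for name, (us_lo, us_hi, ds_lo) in plans.items():
    upstream = us_hi - us_lo
    downstream = top - ds_lo
    share = upstream / (upstream + downstream)
    print(f"{name:>10}: {upstream:>3} MHz up, {downstream:>4} MHz down, "
          f"{share:.0%} of usable spectrum upstream")
```

Even a high-split plan leaves the upstream well short of symmetry, which is one reason full duplex designs are on the table.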


source: Broadband Library  


“Ultra-high-split” designs also are being investigated, where the upstream spectrum’s upper frequency limit can be 300 MHz, 396 MHz, 492 MHz, or 684 MHz, says Ron Hranac, consulting engineer. 


What remains true is that the ability to wring more performance out of hybrid fiber coax plant has proven more robust than many expected a decade ago. 


Also being considered are full duplex designs, which abandon the separation of upstream and downstream traffic by frequency. That is an option for DOCSIS 4.0 networks, and is a break from the frequency division multiplexing HFC always has used.




source: CableLabs 


Full duplex networks would allow the upstream and downstream traffic to use the same spectrum at the same time. That would require upgrading the HFC plant to a “node plus zero amplifiers” design that is similar to fiber to the curb. The drop to the user location still uses coaxial cable, but without any radio frequency amplifiers.

source: CableLabs 


The whole point of all these interventions is to supply more upstream or return bandwidth than HFC presently provides. 


source: Qorvo


Cable operators are a practical bunch, and will prefer gradualism when possible. So one might hypothesize that either mid- or high-split designs will be preferred. 


Net AI Sustainability Footprint Might be Lower, Even if Data Center Footprint is Higher

Nobody knows yet whether higher energy consumption to support artificial intelligence compute operations will ultimately be offset by lower ...