Friday, August 14, 2020

Second Law of Motion, Second Law of Thermodynamics Provide Key Business Analogies

Newton's second law of motion explains that the acceleration of an object as produced by a net force is directly proportional to the magnitude of the net force, in the same direction as the net force, and inversely proportional to the mass of the object. 


A corollary of sorts is that “friction” is an unbalanced, opposing force. In other words, friction always acts in the direction opposing motion. 


If friction is present, it counteracts and cancels some of the force causing the motion whenever an object is being accelerated. That means a reduced net force and a smaller acceleration. 
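The reduced-acceleration effect can be sketched numerically. This is a minimal illustration of the second law with an opposing friction force; the mass and force values are arbitrary:

```python
# Net force and acceleration with and without friction (Newton's second law).
# All values are illustrative only.

mass = 10.0            # kg
applied_force = 50.0   # N
friction_force = 20.0  # N, always opposing motion

# Without friction, all of the applied force produces acceleration.
a_ideal = applied_force / mass             # 5.0 m/s^2

# With friction, only the net force accelerates the object.
net_force = applied_force - friction_force
a_actual = net_force / mass                # 3.0 m/s^2

print(a_ideal, a_actual)  # 5.0 3.0
```

The applied force is unchanged in both cases; friction simply reduces the net force, and acceleration falls in proportion.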


In this illustration Fa is the applied force, which faces opposing forces, including gravity and friction, that resist it. 




And that is a similar concept to the idea of friction in business and life. No matter what resources are mobilized in pursuit of some objective, friction will reduce yield, effectiveness and impact. All organizational effort therefore must overcome friction. That is true for capital investment, competition, operating procedures, product development and production, distribution channels and customer support and service, plus marketing and legal or regulatory tasks. 


Virtually every business activity therefore involves some element of overcoming friction, which is the resistance to desired change or outcomes. 


Just as any energy conversion is somewhat inefficient, producing unwanted heat in addition to the desired output, so friction prevents the full and complete application of resources to any process. 


Friction is akin to the Second Law of Thermodynamics, which states that the disorder (entropy) of a closed system increases over time. In other words, order proceeds to disorder. So much organizational effort must be devoted to counteracting entropy (decay). 


That is why creating a more frictionless business is an unstated objective of virtually all organizational activities.


Newtonian Technology Trends Post-Covid

Twenty years ago, futurist John Naisbitt wrote High Tech/High Touch, an examination of technology and a follow-on to his 1982 book Megatrends. It was Megatrends which predicted that people immersed in technology would be driven to seek human contact. 


High Tech/High Touch essentially concluded that the trend remained intact, shown in the prominence both of consumer technology markets and of products, services and markets that offer escape from technology.


As the Covid-19 pandemic wears on, increasing our reliance on technology and restricting human contact, Naisbitt’s observations still hold. The more we are forced to use technology, the more actual “high touch” will matter. 

source: Megatrends


It is almost Newtonian: for every action there is an equal and opposite reaction. Kept indoors, people have pushed demand for outdoor activities significantly higher. Forced not to travel, people will want to travel. Required to interact virtually, people will want face-to-face encounters. 


The conventional wisdom that “everything has changed” suggests permanent, disruptive change in work and living habits. That likely will prove to be a one-sided analysis. 


Post-pandemic behavior might be more unexpected than is commonly suspected, for several reasons. First, linear extrapolation from the present nearly always proves wrong. Non-linear change is more likely, an argument the “everything has changed” view also suggests.


But non-linearity cuts both ways. We might well see non-linear regression to the mean, as well as accelerated change of behavior. 


Many trends already underway before the pandemic will be accelerated to an extent, though not nearly so much as many seem to believe. Naisbitt’s observations suggest why: to the extent we continue to work remotely more often, we also are going to want face-to-face contact. Unable to travel freely, humans will want to do so again. 


Zoom is not a perfect, or even nearly perfect, substitute for face-to-face interactions. People will want to get away from their screens to the extent they are forced to rely on them. 


The Newtonian reaction to high tech will be high touch. 

Why the Broadband "Problem" Cannot be Permanently "Solved"

 So long as we keep changing the definition of “broadband,” we are likely “never” to see “improvement” in the number or percentage of homes or people able to buy the product, no matter how much investment is made in facilities. 

When we change definitions of minimum speed, for example, we automatically increase the number or percentage of locations or people that cannot buy the product. Colloquially, that is known as “moving the goalposts.” Put another way, our understanding of “broadband” changes over time. 


The classic definition of broadband was any service running at 1.5 Mbps or faster. In the U.S. market the official definition of “broadband” is now a minimum of 25 Mbps. But most consumers buy service at speeds an order of magnitude higher than the minimum definition. Yesterday’s power user is today’s light user. 


source: Openvault


And though new platforms might help, a continuing evolution of our definitions to support an increase in minimum speeds will continue to be a challenge for any market or country with lots of rural or thinly-populated areas. In the United States, six percent of the land mass is where most of the people live. 


How we define the market also affects our analysis of the amount of competition in the consumer broadband market. The common observation in the U.S. market, for example, is that minimum service at 25 Mbps is unavailable to “millions” of people. 


Of course, that finding requires a big assumption, namely that all satellite and mobile services are excluded from the analysis. Two U.S. satellite suppliers sell broadband access across virtually the entire continental land mass, while mobile speeds already exceeded the minimum threshold in 2019 and early 2020. 


If any and all services supplying 25 Mbps or faster speeds are considered, it might be very difficult to find any U.S. locations unserved by at least two providers. 


The point is that definitions and assumptions matter. By continually increasing the speed used as the definition of “broadband,” we will almost arbitrarily keep moving the goal line on who has it, where it is available and how many competitors can sell it. 
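The goalpost-moving effect is easy to see in a toy calculation. The location speeds below are hypothetical, chosen only to illustrate how raising the threshold shrinks the “served” count:

```python
# How raising the minimum-speed definition shrinks the "served" population.
# Speeds (in Mbps) for a hypothetical set of locations.
location_speeds = [1.5, 10, 25, 50, 100, 300, 940]

def served_count(speeds, minimum_mbps):
    """Count locations that meet or exceed the broadband definition."""
    return sum(1 for s in speeds if s >= minimum_mbps)

for threshold in (1.5, 25, 100):
    print(threshold, served_count(location_speeds, threshold))
# At 1.5 Mbps all 7 locations are "served"; at 25 Mbps only 5; at 100 Mbps only 3.
```

Nothing about the locations changes between runs; only the definition moves, and the count of “unserved” locations grows accordingly.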


Ignore for the moment consumer choice, which has shown that most consumers buy services in the middle of the range: not the most costly or least costly; not the fastest or slowest offerings. 


Because typical, average or median speeds will keep getting higher, our definitions will keep being adjusted upward as well. But at a time when satellite and mobile minimum and average speeds often already exceed the minimum definitions, and when most fixed network consumers buy services an order of magnitude above the “minimum” threshold, it is hard to “close the digital divide.”


There likely will always be some statistical gaps. Whether there is a serious “problem” actually is--or will be--more debatable.


Thursday, August 13, 2020

5G Is Not the Issue Anymore

It increasingly is impossible to clearly delineate the value, strategic potential or revenue potential of 5G separately from the companion developments that help create that value, strategic potential and incremental revenue. 

Consider edge computing and 5G for consumer mobile devices. Most observers now would agree that internet access, supplied by connectivity providers, is part of the broader internet ecosystem. Likewise, most of the growing use cases and revenue drivers depend on connectivity, but are not directly “owned” by connectivity providers. 

source: State of the Edge 2020


That is the basic reason behind the interest, in some cases, in connectivity provider ownership and participation in adjacent parts of the ecosystem (applications, real estate, platform), beyond connectivity. 


By 2028, 4G and 5G mobile consumer and residential consumer applications will dominate the edge computing footprint, for example, says the State of the Edge 2020 report. A growing number of those apps would not be consumable without edge computing plus the latency performance and bandwidth made possible by 5G. 


source: State of the Edge 2020


The edge computing power footprint for mobile and residential consumers will reach 16,938 MW and 10,843 MW, respectively, by 2028, the report says. 


Edge computing power footprints for service providers and enterprise IT are expected to increase by 7,117 MW and 5,800 MW, respectively. 
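Adding up the four segment figures quoted above gives a sense of scale. Treating all four as 2028 levels is an assumption, since the service provider and enterprise numbers are described as increases:

```python
# 2028 edge computing power footprint by segment, in megawatts,
# using the State of the Edge 2020 figures quoted above.
footprint_mw = {
    "mobile consumer": 16_938,
    "residential consumer": 10_843,
    "service provider": 7_117,   # described as an increase, treated here as a level
    "enterprise IT": 5_800,      # described as an increase, treated here as a level
}

total_mw = sum(footprint_mw.values())
print(total_mw)  # 40698 MW, roughly 41 gigawatts
```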

source: State of the Edge 2020


Mobile operators especially will be significant users of edge computing to support their own internal operations, including their virtualized 5G core and access network facilities. 

source: State of the Edge 2020


Tuesday, August 11, 2020

U.S. Gigabit Accounts Near 5%

Gigabit-speed connections reached almost five percent of U.S. fixed network internet access lines (including business lines) in the second quarter of 2020, according to Openvault, up from 2.1 percent in the second quarter of 2019 and from 3.75 percent in the first quarter of 2020. 


source: Openvault


6% of U.S. Land Mass is Where Most of the People Are

One oft-ignored facet of communication network economics is that the business model is helped when population density is high, and generally suffers when population density is low. Also, it is far easier to build a whole new network in a small area, compared to a continental land mass. 

About six percent of the U.S. land mass is “developed” and relatively highly populated. About 94 percent is unsettled or lightly populated, including mountains, rangeland, cropland and forests. 


source: USDA


Pastures are located largely east of the hundredth meridian. 


source: USDA


Croplands, at least at the moment, stretch a bit further west. 


source: USDA


Rangelands mostly are found in the West.

source: USDA


But even the higher-population areas east of the Mississippi River contain huge tracts of forest land. 


source: USDA


All those facts have implications for networks. Most of the population or potential users can be covered by a smaller amount of network infrastructure, at first. But network coverage is another matter, as 94 percent of the land surface is lightly populated, consisting of forests, mountains or range land. 


In just about any instance, in any market, though, 80 percent of the revenue might be generated by 20 percent of the accounts, locations or use cases. Ericsson studies show that just 30 percent of cell sites support 75 percent of all traffic on the network. 
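The Ericsson concentration claim can be sketched the same way: sort sites by traffic and measure how much of the total the busiest fraction carries. The per-site traffic values below are hypothetical, chosen only to illustrate the calculation on a skewed distribution:

```python
# Share of total traffic carried by the busiest fraction of cell sites.
# Hypothetical per-site traffic values (arbitrary units), heavily skewed.
site_traffic = [100, 80, 60, 10, 8, 6, 4, 2, 1, 1]

def traffic_share(traffic, top_fraction):
    """Fraction of total traffic carried by the busiest `top_fraction` of sites."""
    ordered = sorted(traffic, reverse=True)
    top_n = max(1, round(len(ordered) * top_fraction))
    return sum(ordered[:top_n]) / sum(ordered)

share = traffic_share(site_traffic, 0.30)
print(f"Busiest 30% of sites carry {share:.0%} of traffic")  # 88% here
```

With any skewed distribution, a small minority of sites ends up carrying most of the load, which is why rural coverage costs so much per subscriber.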


So it should come as no surprise that the cost of rural infrastructure can be an order of magnitude higher than for denser urban infrastructure.

Saturday, August 8, 2020

Does "Free Speech" Right Belong to the Speaker or the Public?

Section 230 of the Communications Decency Act, part of the Telecommunications Act of 1996, makes clear that platforms are not responsible for the content posted by users of the platforms. That quite arguably has allowed platforms to build on third-party expression without fear of legal action. If a user of a platform engages in “unprotected speech,” such as defamation, obscenity or speech in pursuit of a crime, the platform is not considered legally liable. 

On the other hand, potential users are not protected from decisions made by platform owners not to publish user content, either, and Section 230 prevents legal action by users who feel aggrieved by content moderation practices, whatever they happen to be. The issue in that case is a breach of acceptable community standards, platforms have argued. Others say it is censorship, a violation of free speech norms, and possibly rights. 


It’s a huge minefield, to be sure, but the protection of political free speech has--since the advent of electronic media--had to answer the question of “whose rights are protected?” In some ways it is a zero-sum game: every winner is matched by a loser.


If it is the “speaker” who has the right, the “listener, viewer or reader” (the public at large) does not have the protected right. If it is the “listener, viewer or reader” (the public) who has the right, it is the speaker whose rights are circumscribed. 


The argument might be that if and when social media sites threaten the use of the medium for political speech, courts could approve of content-neutral regulations intended to solve those problems.


As always, the issue also includes “what is political speech?” In the modern era, that is understood more broadly than it once was. The Supreme Court has extended the First Amendment’s protections to individual and collective speech “in pursuit of a wide variety of political, social, economic, educational, religious, and cultural ends.”


There also are some narrow areas where First Amendment protections have been held not to apply. Obscenity, defamation, fraud, incitement, fighting words, true threats, child pornography and speech integral to criminal conduct are not protected. The problem is that people of good will can disagree about when those conditions exist. 


Courts also have upheld “time, place and manner” restrictions, which are content-neutral limitations imposed by the government on expressive activity. 


Such restrictions come in many forms, such as imposing limits on the noise level of speech, capping the number of protesters who may occupy a given forum, barring early-morning or late-evening demonstrations, and restricting the size or placement of signs on government property.


There are at least three possible frameworks for analyzing governmental restrictions on social media sites’ ability to moderate user content. First, social media sites could be treated as state actors who are themselves bound to follow the First Amendment when they regulate protected speech (https://fas.org/sgp/crs/misc/R45650.pdf).


If social media sites were treated as state actors under the First Amendment, then the Constitution itself would constrain their conduct, even in the absence of specific legislative action. In this framework, the public essentially has the protected right. 


The second possible framework would view social media sites as analogous to special industries like common carriers or broadcast media. Likewise, in this framework the public has the protected right. 


Past applications of political content fairness have sometimes been shaped by the concept of preserving freedom “for the listener or viewer,” even when shaping or restricting the freedom of the speaker. 


That has been most clear, in the United States, in television or radio broadcasting, based on the idea that private firms are using public resources. The same idea was extended to cable TV operators, who use public rights of way. 


On the other hand, social media sites could be considered to function as do news editors. In that case, the publisher has the protected right. 


If social media sites were considered to be equivalent to newspaper editors when they make decisions about whether and how to present users’ content, then those editorial decisions would receive the broadest protections under the First Amendment. 


As a practical matter, speakers also can exercise prudence, manners, judgment or courtesy in their political speech. Speakers have the right to be rude, wrong, loud or boorish, but might choose not to do so. We call that civility. 


AI Will Improve Productivity, But That is Not the Biggest Possible Change

Many would note that the internet impact on content media has been profound, boosting social and online media at the expense of linear form...