Saturday, November 21, 2020

Did the U.S. National Broadband Plan Succeed, or Not?

Did the U.S. National Broadband Plan fail or succeed? Some argue the plan failed. Others might argue it clearly has succeeded. So what is the truth of the matter? It actually is hard to say. 


There are but two quantifiable goals stated in the plan. 


The document says a goal, at the end of 10 years, is connecting 100 million U.S. homes with “affordable” access to actual download speeds of at least 100 megabits per second and actual upload speeds of at least 50 Mbps. 


Another goal was to provide gigabit per second access to anchor institutions such as schools, hospitals and government buildings. 


All the other goals are not quantifiable, except in “yes-no” fashion: did a recommended action actually happen within the plan time frame or not? As with many plans, the issue is targets, frameworks and rule changes, rather than quantifiable outcomes. 


The plan was couched in terms of “goals” that are either hard to quantify, require the cooperation of many entities in the ecosystem or are not easy to define. Also, the plan itself says it is a “roadmap,” not a firm set of outcomes. 


The plan itself mostly deals with what the government can do, in response to a Congressional mandate to provide a “detailed strategy for achieving affordability and maximizing use of broadband” to advance “consumer welfare, civic participation, public safety and homeland security, community development, health care delivery, energy independence and efficiency, education, employee training, private sector investment, entrepreneurial activity, job creation and economic growth, and other national purposes.”


Some cite the portions of the plan described as “long term” goals when making their evaluations of plan success. Also, keep in mind that the plan itself was only designed to facilitate commercial actions by others. The government’s role was limited to spectrum allocation and other policies that create incentives for other actors to act. 


So what of the two numerical outcomes? Are 100 million U.S. homes presently buying “affordable” access at 100 Mbps downstream speeds? First off, “affordable” is not quantified and is a matter of interpretation. But are 100 million U.S. homes buying internet access at 100 Mbps?


According to measurements by Speedtest, the average U.S. consumer on a fixed network is getting access at between 124 Mbps and 166 Mbps. And Speedtest reports that 61 percent of all U.S. fixed network internet access services purchased by consumers offer speeds of 100 Mbps or higher. 


But there is a difference between supply and demand. The plan specified only demand, not supply. 


Current supply exceeds what the plan called for, but current demand is lower. Only six in 10 customers choose to buy a service operating at 100 Mbps or faster. So roughly 40 percent either choose service operating at less than 100 Mbps or, as some will note, perhaps cannot buy such service. 


Assume there are 139.4 million U.S. households. Assume fixed internet access is purchased by 80 percent of households. That implies a total of 111.5 million locations buying internet access. That seems too high. 


But assume only 126.7 million housing units  actually are occupied. If 80 percent of occupied housing units buy fixed network broadband, that suggests there should be about 101 million subscriptions. That accords with other estimates. 


If 61 percent of those locations buy internet access at 100 Mbps or faster, then 61 million U.S. customers choose to buy service at 100 Mbps or faster. That, of course, is far less than the National Broadband Plan called for. 
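
As a rough check on that arithmetic, here is a minimal sketch in Python, using only the figures cited above; it is illustrative, not a precise accounting.

```python
# Rough check on the household arithmetic above; all inputs are the figures
# quoted in the text, used purely for illustration.
total_housing_units = 139.4e6   # all U.S. housing units
occupied_units = 126.7e6        # occupied housing units
fixed_take_rate = 0.80          # share of homes buying fixed internet access
share_100mbps = 0.61            # Speedtest: share of tiers at 100 Mbps or higher

naive_subs = total_housing_units * fixed_take_rate  # ~111.5 million, which seems too high
subs = occupied_units * fixed_take_rate             # ~101 million subscriptions
fast_subs = subs * share_100mbps                    # ~61-62 million, far short of the 100 million goal

print(f"{naive_subs / 1e6:.1f}M (all units), {subs / 1e6:.1f}M (occupied only), "
      f"{fast_subs / 1e6:.1f}M at 100 Mbps or faster")
```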


Provisioned speeds--what customers actually buy--differ from available speeds, in other words. So should the plan have differentiated between available and provisioned speeds? We cannot say, at this point. So the evaluation of “did the plan achieve its goals” also is a matter of opinion, not “truth.”


Even in hard-to-serve rural areas, 60 percent of residents can buy internet access at speeds of at least 100 Mbps. That does not mean they do so. So what is the “truth” of the matter?


While it is difficult to measure speed, actual U.S. broadband speed is more than 100 Mbps, on average, according to Akamai in 2017. Upstream speeds vary by location, but are at or above plan goals in most cities, with performance varying by provider.   


But is access “affordable?” That is a matter of opinion, not fact. Still, prices have fallen significantly. 


“The most popular tier of broadband service in 2015 (BPI-Consumer Choice) is now priced 20.2 percent lower and offers 15.7 percent faster speeds in 2020 on an average subscriber-weighted basis,” says USTA. 


“The highest speed offerings in 2015 (BPI-Speed) are now priced 37.7 percent lower and offer 27.7 percent faster speeds in 2020 on an average subscriber-weighted basis,” USTA says.

 

“When inflation is considered, the real price of the most popular tier of broadband service has dropped 28.1 percent since 2015; and the real price of the highest speed broadband service has dropped 43.9 percent,” USTA notes. 


At the same time, cost per Mbps has dropped 37.9 percent for the most popular service and 56.1 percent for the highest speed service, says USTA. 
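
For those curious how the nominal and inflation-adjusted figures relate, the short sketch below backs out the cumulative inflation implied by the USTA numbers; it is an illustration of the arithmetic, not a reproduction of USTelecom’s methodology.

```python
# Relationship between nominal and real (inflation-adjusted) price changes:
#   (1 + real) = (1 + nominal) / (1 + cumulative_inflation)
# Back out the cumulative 2015-2020 inflation implied by the USTA figures.

def implied_inflation(nominal_change: float, real_change: float) -> float:
    """Cumulative inflation consistent with a given nominal and real price change."""
    return (1 + nominal_change) / (1 + real_change) - 1

popular = implied_inflation(-0.202, -0.281)   # most popular tier: -20.2% nominal, -28.1% real
fastest = implied_inflation(-0.377, -0.439)   # highest-speed tier: -37.7% nominal, -43.9% real

print(f"Implied cumulative inflation: {popular:.1%} (popular tier), {fastest:.1%} (fastest tier)")
# Both work out to roughly 11 percent, so the nominal and real figures are mutually consistent.
```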


Beyond that, an outcome not specified was “Can 100 million U.S. homes purchase--if they choose--internet access at 1 Gbps downstream rates?” The answer to that unasked question also unquestionably is “yes.” Looking solely at cable operators, that portion of the industry alone has 72 million actual accounts. Not all are consumer accounts. 


Cable operators alone pass about 90 percent of U.S. homes with high-speed networks, and the cable industry says 80 percent of U.S. homes can buy gigabit-per-second internet access, with the remainder of homes passed limited to top speeds of less than 1 Gbps. 


Keep in mind that U.S. telcos have about 33 million internet access accounts, but cable operators have about 72 million accounts, or 70 percent of the installed base. 


So how about gigabit service for anchor institutions? Consider rural areas, where 59 percent of schools, for example, have broadband at gigabit-per-second rates or higher. Beyond that, it is hard to say what percentage of schools, hospitals and other anchor institutions presently have gigabit connections. In perhaps 80 percent of communities, that is possible. 


Still, the truth of the matter--whether the plan succeeded or not--is clouded by opinions. 


“What is truth?” occupies much of traditional philosophy, and still manages to maintain its relevance, even in the communications business. 


Truth is that which accords with reality, Wikipedia notes. Another way of saying this is that truth accords with facts, there being a difference between what is factual and what is merely opinion. A related key concept is that there is a difference between a fact and a value. 


And there often is more “value” or “opinion” than “truth” in most parts of the industry. Consider almost any claims made by marketing staffs, industry lobbyists or the policy advocates who oppose them, public officials or customers. 


The adage that “you are entitled to your own opinions but not your own facts” encapsulates the issue. “Facts” are open to interpretation. Is any country, state, province, city or service provider “doing well” in terms of its deployment of advanced information or communications connectivity? 


That is consequential, since the oldest school of philosophy, dating at least to Aristotle, asserts that truth is that which corresponds with facts. 


How successful are we at value generation and social or educational benefit? And what is the basis for such evaluations? Quite often, we do not agree on the facts. If truth is that which accords with the facts, then contention is inevitable. 


There are more modern systems as well. In the mid-19th century the focus shifted away from individual propositions and towards a system of logically interrelated components, or a web of belief.


Postmodernism--especially in its radical deconstructionist variants--essentially abandons the notion that truth is absolute. “Most radical postmodernists do not distinguish acceptance as true from being true; they claim that the social negotiations among influential people ‘construct’ the truth.”


The deconstructed view essentially permits a definition of “truth” which is merely “opinion,” albeit opinion ratified by its acceptance. 


In the early 20th century a school of pragmatists used what we might call a scientific framework, suggesting that what is true is that which works. One obvious issue is that truth becomes relative, since trial, error and interpretation is required to determine what “works.” 


The point is that the difference between fact and value is not as clear as you might think, in non-mathematical endeavors, especially. By extension, it is not as easy to determine “truth” from “falsehood,” either. The social constructionists argue that is simply a matter of the imposition of power. 


As in Lewis Carroll’s book Through the Looking Glass, “truth” is subjective. 


"When I use a word," Humpty Dumpty said, in rather a scornful tone, "it means just what I choose it to mean—neither more nor less."


"The question is," said Alice, "whether you can make words mean so many different things."


"The question is," said Humpty Dumpty, "which is to be master—that's all." 


Friday, November 20, 2020

U.S. Telcos Gain Net New Broadband Accounts in 3Q 2020

It might be too early to say whether this is a trend, but U.S. telcos actually gained net broadband accounts in the third quarter of 2020, gaining about 14 percent of total net new additions in the quarter, while cable TV operators got the rest, about 86 percent. 


That might not sound like a big deal, but in most quarters over the last decade, telcos collectively have lost broadband accounts. There were a couple of exceptions, but the net positive growth is quite a change. 


Net Broadband Additions, U.S. Telcos, Third Quarter 2020

Provider              Subscribers     Net Adds
AT&T                  15,375,000      174,000
Verizon                7,069,000      110,000
CenturyLink/Lumen^     4,563,000      (75,000)
Frontier               3,119,000      (23,000)
Windstream             1,102,300       12,900
Consolidated             792,211        1,008
TDS                      487,700        8,200
Cincinnati Bell          434,500        2,500

source: Leichtman Research


Thursday, November 19, 2020

FCC Reallocates 45 MHz for Wi-Fi

The Federal Communications Commission has moved to make 45 MHz of spectrum in the 5.9 GHz band (5.850-5.925 GHz) available for unlicensed uses such as Wi-Fi, indoors and outdoors. 


The new band plan designates the lower 45 megahertz (5.850-5.895 GHz) for unlicensed uses and the upper 30 megahertz (5.895-5.925 GHz) for enhanced automobile safety using Cellular Vehicle-to-Everything (C-V2X) technology.
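
The band plan is simple enough to express as a small data structure. The sketch below just encodes the band edges described above and confirms the segment widths; the labels are mine, for illustration.

```python
# The 5.9 GHz band plan described above, expressed as (lower edge, upper edge) in GHz.
band_plan = {
    "unlicensed (Wi-Fi and similar)": (5.850, 5.895),   # lower 45 MHz
    "C-V2X automobile safety":        (5.895, 5.925),   # upper 30 MHz
}

for use, (low, high) in band_plan.items():
    width_mhz = round((high - low) * 1000)
    print(f"{use}: {low}-{high} GHz, {width_mhz} MHz")

total_mhz = round((5.925 - 5.850) * 1000)
print(f"Total 5.9 GHz band: {total_mhz} MHz")   # 75 MHz in all
```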


The 45 MHz reallocated for Wi-Fi was assigned to Dedicated Short-Range Communications (DSRC) services more than 20 years ago, but DSRC has not been meaningfully deployed and the spectrum has largely gone unused for decades, the FCC notes. 


The FCC will propose technical rules for outdoor unlicensed operations across the United States using the new unlicensed spectrum.


One reason the FCC prefers not to make spectrum allocations which mandate use for particular purposes is precisely what happened with DSRC. Sometimes new proposed uses do not develop, and the assigned spectrum lies fallow. The preference these days is for general purpose assignments that are not application specific. 


In a sense, that also parallels the movement of all communications networks, especially public networks, away from application-specific use and towards general-purpose or multi-service modes of operation.


Will 5G Prove to be Another Example of Innovation Despite our Efforts?

Sometimes big changes in communications demand happen almost despite our best efforts. Some might argue the 1996 Telecommunications Act succeeded despite itself, for example. Innovation came not so much from telecom competition but from product substitutions based on mobility and the internet, it can be argued.


If you remember the major revision of U.S. telecommunications law called the Telecommunications Act of 1996, you will remember the practical consequences of deregulating the local telecom access business. 


The Act enabled competition for local telecom services: lawful operation and ownership of Class 5 voice switches, the right to sell customers voice and other services, and wholesale access to incumbent networks. 


All that happened just prior to voice communications reaching a historic peak about 2000, followed by a rapid decline. Most incumbent telcos lost 35 percent of their customers for that service within 10 years, and as much as 65 percent to 70 percent over two decades. 


Service providers also lost half their revenue from long distance calling over that same period. 


source: CDC, Statista 


At the same time, other big changes in end user demand were happening: substitution of mobile phone service for fixed service; use of mobiles instead of cameras or music players, GPS devices or video screens. 


source: Wikimedia

There also was increasing use of the internet as a substitute for a wide range of other activities and products. In 1996, for example, it is estimated there were 36 million global users of the internet, representing less than one percent of the world population. A decade later, that had grown to 17 percent. 


About that time, some 14 percent of the U.S. population was using the internet, on dial-up connections. A decade later, that had grown to about 66 percent. 


source: Pew Research 


The point is that disruptive changes in regulatory framework can produce outcomes we did not expect, especially when disruptive enabling technologies happen at the same time, allowing massive product substitution and behavioral changes. 


The same thing might happen with 5G. It arrives in tandem with other key technologies and platforms, including commercial artificial intelligence, edge computing and internet of things. It may, in the end, be hard to separate the various threads from each other. 


In part, that is because computing architectures shift over time, oscillating between centralized and decentralized approaches and putting computing resources at different places within the network. 


In the mainframe era, computing resources were centralized at far ends of the network. That shifted in the client-server era to more local processing on devices themselves or on local servers. In the internet era computing switched back to far end hyperscale data centers. 


source: GSMA 


But most observers believe we are now in a stage of shifting more workloads back locally, to take advantage of artificial intelligence, heavy local analysis of sensor data to support the internet of things and compute-intensive applications using virtual or augmented reality. 


“These days lots of companies want to turn bandwidth problems into compute problems because it’s often hard to add more bandwidth and easier to add more compute,” said Andrew Page, NVIDIA media group director of advanced products. 


So maybe 5G will ultimately not be the big story. Maybe other simultaneous changes will provide the most-consequential effects. Put another way, 5G might not be as transformational as edge computing, applied artificial intelligence or IoT.


By the 6G era, network slicing and heterogeneous access might turn out to be more consequential than mobile platform performance. Perhaps value will have migrated further in the direction of orchestration, and away from underlying facilities.


So 5G might be part of a consequential change that we are not deliberately planning.


Wednesday, November 18, 2020

Digital Redlining or Response to Demand?

Terms such as digital redlining imply that U.S. internet service providers upgrade neighborhoods able to pay for higher-speed internet access while underinvesting in poorer neighborhoods. At some level, it is hard to argue with that point of view, at least where it comes to gigabit internet access. 


Google itself pioneered the tactic of building where there is demonstrated demand, deploying Google Fiber first in neighborhoods (typically higher-income areas) where potential customers were most interested. Other gigabit service providers have asked potential customers to place deposits for the same reason. 


And regulatory officials at the local level seem to now agree that “universal service” (building a gigabit network past every home and business) is desirable in some cases, but not absolutely mandatory in all cases. The thinking is that allowing new internet service providers or facilities to be built wherever possible is a better outcome than requiring ubiquity, and getting nothing. 


Also, higher-speed facilities often are not found everywhere in a single market or city. CenturyLink does sell gigabit internet access in Denver, for example, just not everywhere in the metro area. That is not necessarily “redlining,” but more likely reflects capital available to invest, expectations about financial return, customer density or some other combination of business issues that discourages investment in new access facilities. 


The economics of communication networks also are clear. Density and cost per location are inversely related. Mobile networks typically have 10 percent of cell sites supporting 50 percent of usage. About 30 percent of sites carry about 80 percent of traffic. That has been true since at least the 3G era.  


In fixed networks, network cost and density also are inversely related. So population density has a direct bearing on network costs. In the U.S. market, network unavailability is concentrated on the last couple of percent of locations.  
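
Those concentration figures are roughly what a heavily skewed, Zipf-like distribution of traffic across sites would produce. The sketch below is purely illustrative: it assumes a Zipf exponent of 1 across 100 hypothetical cell sites, rather than reflecting any measured network.

```python
# Illustrative only: a Zipf-like traffic distribution across 100 hypothetical cell
# sites, showing how "10% of sites carry about half the traffic" arises from skew.
n_sites = 100
traffic = [1 / rank for rank in range(1, n_sites + 1)]   # Zipf, exponent 1 (assumed)
total = sum(traffic)

def top_share(fraction: float) -> float:
    """Share of total traffic carried by the busiest `fraction` of sites."""
    k = int(n_sites * fraction)
    return sum(traffic[:k]) / total

print(f"Top 10% of sites: {top_share(0.10):.0%} of traffic")   # ~56%, near the 50% cited above
print(f"Top 30% of sites: {top_share(0.30):.0%} of traffic")   # ~77%, near the 80% cited above
```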


With cable operators already holding at least 70 percent share of the internet access installed base of customers, any new investment in faster facilities faces a tough challenge. Any new fiber-to-home network, for example, essentially is playing catch-up to a cable operator, as roughly 80 percent of U.S. households already are reached by gigabit-speed cable networks. 


And cable share has grown, up from possibly 67 percent share in 2017. 


That noted, internet speeds do vary by geography: speeds in urban areas frequently are higher than in rural areas. But the argument that large numbers of U.S. households are underserved often is correct, depending on what standard one wishes to apply and how one defines the supplier market.


Some claim 42 million U.S. residents are unable to buy broadband internet access, defined as minimum speeds of 25 Mbps in the downstream.  That actually is incorrect. 


Virtually every household in the continental United States is able to buy 25 Mbps or faster service from at least two different satellite providers. But those who claim “42 million” people cannot buy broadband simply ignore those choices, and focus only on the claimed availability of 25 Mbps service by fixed network providers. 


There are other estimates which also vary wildly. Roughly 10 percent of U.S. households are in rural areas, the places where it is most expensive to install fast fixed network internet access facilities, and where the greatest speed gaps--compared to urban areas--almost certainly continue to exist.


In its own work with TV white spaces, Microsoft has targeted perhaps two million people, or roughly a million households, that have no fixed network internet access. That assumes there are two people living in a typical household, which is below the U.S. average of roughly 2.3 to 2.5 per household.


Recall that the definition of broadband is 25 Mbps downstream. Microsoft has argued that 20 million people (about 10 million homes) or perhaps eight percent of the population (perhaps four percent of homes) cannot get such speeds from any fixed network service provider.


Microsoft also has cited figures suggesting 25 million people cannot buy broadband--presumably using the 25 Mbps minimum standard, most of those people living in rural areas. 


That conflicts with data from Openvault that suggests 95 percent of the U.S. population can buy internet access at a minimum of 25 Mbps, while 91 percent to 92 percent can buy service at a minimum of 100 Mbps. 


Using the average of 2.5 persons per U.S. household, that suggests a universe of about 10 million U.S. homes unable to purchase internet access at 25 Mbps from a fixed network supplier in 2018. What is not so clear is the percentage of households or persons who can do so using a mobile network. 


None of that explains urban areas with slow speeds, though. There the issue is more likely to be high construction costs in urban areas where underground construction is necessary, along with demand expectations that are lower than in suburban areas. That is true whether it is electrical lines or communications networks being considered.   


But at least one Microsoft analysis suggests that about half of all U.S. households are not using 25 Mbps access. The claim is that 162.8 million people are “not using the internet at broadband speeds.” That seems to clearly contradict data gathered by firms such as Ookla and Opensignal suggesting that average U.S. speeds are in triple digits.


In 2018, the average U.S. broadband speed was 94 Mbps, according to the NCTA. That same year, Ookla reported the average U.S. speed was 96 Mbps. 


It is not quite clear how the Microsoft data was generated, though one blog post suggested it was based on an analysis of “anonymized data that we collect as part of our ongoing work to improve the performance and security of our software and services.” 


The claim of 162.8 million people “not using the internet at broadband speeds” (probably using 25 Mbps as the definition) equates to about 65 million households, using the 2.5 persons per household definition. That does not seem to match other data, including the statistics Microsoft itself cites. 
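
To make the divergence concrete, the sketch below converts the Microsoft figure into households at 2.5 persons per household and compares it with the roughly 126.7 million occupied housing units cited earlier in this document; the comparison is illustrative only.

```python
# Convert the Microsoft figure into households and compare it with household
# counts and availability data cited elsewhere in this document (illustrative).
people_below_broadband = 162.8e6   # people "not using the internet at broadband speeds"
persons_per_household = 2.5
occupied_households = 126.7e6      # occupied U.S. housing units, cited earlier

households_below = people_below_broadband / persons_per_household
share = households_below / occupied_households

print(f"{households_below / 1e6:.0f} million households, or {share:.0%} of occupied homes")
# ~65 million households, roughly half of occupied homes -- a usage figure that
# sits uneasily next to Openvault's ~95 percent availability at 25 Mbps.
```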


One possibility that might explain the divergence is that the measured applications and services include apps run on smartphones as well as on PCs and other devices connected to fixed networks. That would explain the number of users, while usage on mobile networks might account for large numbers of sessions where downstream speeds of 25 Mbps were not observed; or perhaps the upstream speed definition (a minimum of 3 Mbps) was the issue.  


Even then, average 4G downstream speeds in 2018 were in excess of 40 Mbps, so even that explanation is a bit difficult. 


Perhaps there are other ways to make sense of the data. There is a difference between users (people) and households. There is a difference between usage and availability; usage by device (mobile, PC, tablet, gaming device, sensor); application bandwidth and network bandwidth. 


Perhaps the issue is application performance on a wide range of devices including mobiles and untethered devices using Wi-Fi, which would reduce average experienced speeds, compared to “delivered access speed.” 


Methodology does matter. So do the costs and benefits of broadband capital investment under competitive conditions, in areas with high construction costs or low demand for advanced services, especially when newer platforms with better economics are being commercialized. 


Telecommunications is a business like any other. Investments are made in expectation of profits. Where a sustainable business case does not exist, subsidies for high-cost areas or universal service support exist. 


The point is that every human activity has a business and revenue model: it can be product sales, advertising, memberships, subscriptions, tax support, fees, donations or inheritances. Children have a “parents support me” revenue model, supported in turn by any of the aforementioned revenue models. 


But every sustainable activity has a revenue model, direct or indirect. The whole global communications business now operates on very different principles than the pre-competitive monopoly business prior to the 1980s. We still have a “universal service” low end, but we increasingly rely on end user demand to drive the high end. 


Our notions of the low end change--and move higher--over time. We once defined “broadband” as any data rate of 1.544 Mbps or higher. These days we might use functional definitions of 25 Mbps or 30 Mbps. Recall that 30 Mbps--in 2020--was called “superfast” as a goal for U.K. fixed network broadband. 


Few of us would consider 30 Mbps “superfast” any longer. Some might say the new “superfast” is gigabit per second speeds. But that is the change in real-world communications over just a decade. What was a goal in 2010 now is far surpassed. 


What some call “redlining” is simply a response to huge changes in the internet access and communications business. “Maximum” is a moving target that responds to customer demand. “Minimums” tend to be set by government regulators in search of universal service. 


As independent internet service providers cherry pick service areas where they believe the greatest demand for gigabit per second internet access exists, so do incumbents. 


Similar choices are made by providers of metro business services; builders of subsea connectivity networks or suppliers of low earth orbit satellite constellations and fixed wireless networks. They build first--or pick customer segments--where they think the demand is greatest.


Saturday, November 14, 2020

Technology Displacement is Harder than it Seems

Technology displacement--new for older--is a tricky business. Sometimes a whole ecosystem has to be built before a key innovation can reach mass adoption.  


Not every feasible technology substitute actually displaces the other solutions with which it potentially competes, even when the argument is made that the substitute is “better” on some relevant performance metric. 


Sometimes the failures are the result of business execution, as when a promising startup runs out of money, grows too fast or too slowly. 


And customer adoption is almost always related to potential customers’ underlying habits and preferences. Changing has costs, so the innovation must deliver value significantly in excess of the costs of changing. 


Customer experience, broadly defined, always is important. “Better” in some sense is offset by “hard to use,” “inconvenient” or “not worth the extra money.”


Politics and culture sometimes also play a key role. Is an existing way of doing things beneficial to important and powerful interests? Can they resist innovations that threaten those interests? 


Sometimes it is deemed too much hassle to displace an existing solution and ecosystem with a rival. We still use QWERTY despite its inferiority to other keyboard layouts; it originally was developed to slow down typing and prevent key jamming on mechanical typewriters.


Some call that path dependence, the idea that small, random events at critical moments can determine choices in technology that are extremely difficult and expensive to change.


“Innovation is more a human process than a technological one,” says Stacy Wood, North Carolina State professor. “Persuasion, environment, culture and context always matter.” 


If the primary end-use value of a smartphone is the expected ability to remain connected “anywhere,” on the go, then it makes sense that Wi-Fi--though a key part of the connectivity ecosystem and experience--is not a direct or convenient substitute.


For perhaps similar reasons, few of us use smartphones without cellular service, though some functionality is possible. 


In the mobile communications business, the service always is bundled: text messaging, voice and internet access being the foundations. It remains possible to purchase a basic bundle including only voice and messaging, but increasingly, the foundation package includes internet access. 


Decades ago, the emergence of Wi-Fi was touted as the potential foundation for mobile phone service, and so it has become, though not in the way some expected. Periodically, it has been suggested that Wi-Fi could be the sole connectivity mechanism for mobile phone service. 


Voice and text messaging still are required features of a 5G network, whether they directly generate lots of specific revenue or not. Customers might willingly buy a 5G-based home broadband service, without voice or texting capabilities. They might buy a 5G dongle for PC internet  access. 


But it remains an open question whether smartphone service without voice and texting is viable, lawful or desirable. In principle, a smartphone can function without a “mobile” account enabling voice, using Wi-Fi and VoIP. 


There are some issues, such as the inability to use a phone number or communicate easily with other users of the public telephone network. But think of a smartphone connected to Wi-Fi, with no subscriber identification module and no mobile service, as a PC connected to the internet and using Zoom or any other messaging or VoIP service.  


It can be done, but the utility or value is not high, for most people, if the mobile service bundle of value also includes low-cost public network voice and messaging (for domestic communications, for example) as well as the ability to use Wi-Fi or the mobile network when roaming or conducting international communications.


Calling using VoIP over Wi-Fi is possible and useful. In a mobile device context the overall value of a mobile service might be high enough, and the cost low enough, that bothering with a Wi-Fi-only use of the phone is not worthwhile. 


Technology displacement often is quite a bit more complicated than it appears. 


Content Versus Distribution: Netflix Versus Disney: Where's the Value?

It has long been possible to get a reasonable debate on the respective values of content and distribution in the media business. Simply put, the issue is which part of the value chain is better positioned. Consider one illustration. The market value of Netflix is something on the order of $213 billion. Content creator Disney has a market cap of about $250 billion. 


Market cap is not everything, and both firms produce and distribute content. But it might be surprising that Netflix is so close to Disney, given its simpler business model, which includes original content production, but relies on distribution (subscriptions) for its revenue. 


Disney is far more complex, spanning movie and animation studios, broadcasting networks, content networks, theme parks, hotels, merchandise and video streaming operations.  


Think of Netflix as a content distributor, much as cable TV, satellite and telco video service providers; movie theaters; TV or radio broadcasters and increasingly, many online services and apps now act. True, Netflix invests heavily in original content, as do some other leading streaming video services. But its direct revenue comes solely from subscriptions. 


Notice something about Disney, though. Direct-to-consumer, which includes the Disney streaming service, with 73 million paid subscribers, generates significant revenue, but negligible operating income. Granted, that is partially because the Covid-19 restrictions closed the theme parks, while Disney’s streaming service is in start-up mode, so operating income might not be expected for a bit. 

source: Investopedia 


In a non-Covid environment, the contribution to operating income from the theme parks segment (which includes hotels and merchandise) would be vastly higher, between 20 percent and 33 percent of revenue. 

source: Nasdaq 


Operating income from theme parks, hotels and merchandise might range as high as 37 percent. 


source: Valuewalk


The point is that direct-to-consumer, led by the streaming networks, should ultimately produce significant cash flow and operating income for Disney. The issue for some is how that cash flow and income might affect Disney’s value, and whether a different legal status for direct-to-consumer might affect the value of that unit. 


To be sure, some are unsure Disney streaming businesses can approach the size of Netflix, which already has perhaps 201 million subscriptions. All Disney streaming properties collectively might reach about 100 million subscriptions. 


So the question is whether, someday, the value of Disney streaming is such that those assets would fetch a higher valuation if independent, as is Netflix.


DIY and Licensed GenAI Patterns Will Continue

As always with software, firms are going to opt for a mix of "do it yourself" owned technology and licensed third party offerings....