Saturday, December 2, 2023

Fixed Wireless Now Accounts for 90%-Plus of Net Home Broadband Additions

By now, it is clear that fixed wireless access does resonate with substantial portions of the home broadband market. Some of us estimate that fixed wireless appeals mostly to the roughly 20 percent to 25 percent of buyers who take slower-speed services of up to about 200 Mbps to 300 Mbps. 


“At current prices, full FWA (fixed wireless access) entry to a cable-only market, which constitutes approximately 30 percent of all cable modem subscribers in the United States, would convert 18 percent of cable-only households to FWA,” a study by EconOne estimates. 
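As a rough illustration of what those two figures imply together, the sketch below (illustrative arithmetic only, using the study's own percentages) converts the 18 percent conversion rate in cable-only markets into a share of the national cable modem base.

```python
# Illustrative arithmetic only, using the EconOne figures quoted above.
cable_only_share = 0.30   # share of U.S. cable modem subscribers in cable-only markets
conversion_rate = 0.18    # share of those households estimated to switch to FWA

national_shift = cable_only_share * conversion_rate
print(f"{national_shift:.1%} of all U.S. cable modem subscribers")  # about 5.4%
```

In other words, full FWA entry into cable-only markets alone would shift roughly five percent of the total U.S. cable modem base.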


Take rates are lower when FWA is introduced into a market with both a cable operator and at least one provider of fiber-to-home service. 


“When two FWA packages are introduced, they capture a 2.5 percent market share, with 0.9 percent coming from Cable and 1.6 percent coming from Fiber,” the study also suggests. 


Separately, Leichtman Research estimates that six percent of U.S. home broadband accounts now are supplied using fixed wireless networks, and that in 2023 fixed wireless generated in excess of 90 percent of all net broadband account additions in the U.S. market. 


Most observers would tend to agree that fixed wireless competes with other lower-speed home broadband services operating at less than 200 Mbps. Most observers also would note that FWA suppliers are careful to offer the service only in areas where they can serve both mobile customer demand and home broadband usage without degrading the mobile experience. 


For perhaps obvious reasons, the study did not examine the impact of FWA in areas served by a cable operator and a telco digital subscriber line network. In such areas, one presumes there is market share shift away from DSL as well, often from one fixed network supplier to a rival mobile supplier. 


In virtually all cases studied, fixed wireless takes more share from DSL than it does from cable operators. Of course, no study funded by a telco organization is likely to emphasize that point. 


Study | Year | Impact on Cable Market Share | Impact on DSL Market Share
Recon Analytics | 2022 | -1.1% | -2.2%
FCC | 2021 | -1.5% | -3.0%
Dell'Oro Group | 2020 | -2.0% | -4.0%
Leichtman Research Group | 2019 | -2.5% | -4.5%


We can presume that the primary form of market share shift away from DSL is from an existing incumbent telco to an attacking mobile operator.

Friday, December 1, 2023

Not All 5G Customers Use It

5G accounts continue to climb as more 5G mobile networks are built, reaching about 1.6 billion accounts by the end of 2023, according to Ericsson. That is out of a total of about 8.5 billion mobile accounts globally. 


But actual time connected to 5G varies, and a significant percentage of 5G customers may not use the network at all. A study by the GSMA (GSM Association) found that 20 percent of 5G subscribers worldwide were not using 5G services in 2022.


Market | Estimated Percentage of 5G Subscribers Who Do Not Use 5G
United States | 15% to 20%
Europe | 20% to 25%
Asia | 10% to 15%


Some consumers who buy 5G accounts cannot connect to 5G all the time because signal strength is too low. 


A 2023 survey by T-Mobile found that 65 percent of its 5G subscribers spend more than half of their time connected to 5G networks. 


Also, some customers with 5G-capable service might only be using 4G-capable phones on those networks. 


According to Ookla data, the percentage of 5G connection time can range from a high of 47 percent to a low of about five percent. 


Market | 5G Availability (%) | Time Spent on 5G Networks (%)
South Korea | 42.9 | 47.2
Puerto Rico | 48.4 | 43.5
Finland | 24.2 | 22.7
Bulgaria | 24.7 | 22.2
Taiwan | 30.0 | 21.6
Singapore | 30.0 | 21.2
Malaysia | 20.5 | 18.0
United States | 31.1 | 18.0
France | 20.6 | 13.6
Germany | 13.3 | 8.5
Italy | 17.9 | 9.2
Spain | 15.2 | 8.4
United Kingdom | 10.1 | 5.0


A recent study by Ookla found that 5G users in the United States are connected to 5G networks about 30 percent of the time, while 4G users are connected to 4G networks about 70 percent of the time. 


A study by Opensignal found that only 15 percent of 5G subscribers in the United States are actively using 5G networks. And a study by Telenor Research found that 5G users are connected to 5G networks an average of 20 percent of the time, though all such numbers are improving as 5G coverage grows and more consumers buy 5G phones.


"Sampling" is Driving Much ChatGPT Usage

Most people are likely impressed by ChatGPT's rapid usage growth statistics. But very-early “usage” of a new type of app is likely driven by curiosity, media hype, or social media trends rather than the genuine engagement and continued use that create a firmer daily active user metric. Consider one metric: total users. 


Most observers agree that ChatGPT reached 100 million users in two months. Other popular apps took longer to reach that level of “usage.” But it is hard, in the very-early days, to determine what that statistic means, as it includes many users who sample the app, decide it is not immediately useful, and do not become repeat users. 


App | Time to Reach 100 Million Users
ChatGPT | 2 months
WhatsApp | 3.5 years
Facebook | 2.5 years
TikTok | 9 months
Instagram | 2.5 years
Twitter | 5 years
Netflix | 3.5 years


So many observers focus instead on “daily active users,” which tends to better measure repeat user behavior and should provide a better idea of app adoption. By that metric, too, ChatGPT appears to have been adopted faster than many other now-popular apps. 


App | Time to Reach 1 Million DAU
ChatGPT | 5 days
TikTok | 9 months
WhatsApp | 18 months
Facebook | 4 months
Google | 8 months
Netflix | 13 months


The point is that optimism about LLMs and generative AI is reasonable: DAU stats suggest the potential. 


According to a survey by OpenAI, perhaps two percent of internet users have used a large language model at least once, which would suggest 160 million people worldwide have used ChatGPT at least once. Creative writing, learning, and entertainment are the three most-frequent uses. 


Another survey, conducted by Pew Research Center, suggests that 13 percent of Americans have heard of large language models, and three percent have used one. Another Pew study suggests 20 percent of teenagers who have heard of ChatGPT have used it for homework. 


A separate study by Pew Research suggests the percentage of respondents who have heard of ChatGPT and have tried it had grown substantially by August 2023. Repeat studies suggest sampling or usage grew rapidly in 2023. 

Here Come the NPUs

As important as central processing units and graphics processing units are for modern computing, other application-specific integrated circuits, and now neural processing units, are becoming important as artificial intelligence becomes a fundamental part of computing and computing devices. 


A neural processing unit (NPU) is a specialized microprocessor that is designed to accelerate the performance of machine learning algorithms, particularly those involving artificial neural networks (ANNs). 


Often called “AI accelerators,” neural processing units are dedicated hardware that handles specific machine learning tasks such as computer vision algorithms. You can think of them much like a GPU, but for AI rather than graphics.


Though important for virtually any AI processing task, in any setting, NPUs will be vital for onboard smartphone processing tasks, as they reduce power consumption. 


NPUs are specifically designed to handle the large matrix operations that are common in ANNs, making them much faster and more efficient than traditional CPUs or GPUs for these tasks.
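To make that concrete, nearly every layer of a neural network reduces to a dense matrix multiply. The toy Python sketch below (illustrative only; the layer sizes are arbitrary assumptions) shows the kind of workload an NPU's multiply-accumulate arrays are built to run.

```python
import numpy as np

# A single fully connected layer is just a matrix multiply plus a bias:
# exactly the large matrix operation an NPU is designed to accelerate.
batch, in_features, out_features = 32, 1024, 4096  # arbitrary example sizes
x = np.random.rand(batch, in_features).astype(np.float32)         # activations
W = np.random.rand(in_features, out_features).astype(np.float32)  # weights
b = np.zeros(out_features, dtype=np.float32)                      # bias

y = x @ W + b  # roughly 32 * 1024 * 4096 ≈ 134 million multiply-accumulates
```

An NPU runs thousands of those multiply-accumulate operations in parallel, which is where the speed and power advantages come from.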


Producers of NPUs already include a “who’s who” list of suppliers:


  • Google Tensor Processing Unit (TPU)
  • Intel Nervana
  • NVIDIA AI Tensor Cores (integrated into NVIDIA GPUs)
  • IBM TruAI
  • Graphcore Intelligence Processing Unit (IPU)
  • Wave Computing Data Processing Unit
  • Cambricon Machine Learning Unit
  • Huawei NPU
  • Qualcomm AI Engine (integrated into Qualcomm mobile processors)


Why are they used? Performance, efficiency, latency. 


NPUs can provide significant performance improvements over CPUs and GPUs for machine learning tasks. They are also more efficient, consuming less power and producing less heat, and they can reduce the latency of machine learning tasks. 


NPUs are used for natural language processing, computer vision, and recommendation systems, and to power autonomous vehicles, for example. 
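For a sense of how an application actually hands work to one of these accelerators, here is a minimal, hedged TensorFlow Lite sketch. The model file and the delegate library name are placeholder assumptions; the real delegate depends on the device and vendor (for example, NNAPI on Android or a vendor-supplied NPU delegate).

```python
import tensorflow as tf

# Load a converted .tflite model and, where a delegate is available, hand
# execution to the NPU instead of the CPU. "libvendor_npu_delegate.so" is a
# placeholder; substitute the delegate shipped for the target hardware.
delegate = tf.lite.experimental.load_delegate("libvendor_npu_delegate.so")

interpreter = tf.lite.Interpreter(
    model_path="model.tflite",              # placeholder model file
    experimental_delegates=[delegate],
)
interpreter.allocate_tensors()
# Inference then proceeds exactly as it would on the CPU; the delegate simply
# routes supported operations (mostly those matrix multiplies) to the NPU.
```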


"Back to the Future" as Extended Edge Develops

In some ways, extended edge AI processing on devices, rather than at remote cloud computing centers, is a form of “back to the future” technology, recalling the days when most data processing happened directly on devices (PCs). 


That suggests a substantial movement back to a distributed, decentralized processing environment, in large part reversing the centralized model the cloud era brought into being. 


Just as certainly, extended edge will recreate the smartphone, smartwatch and PC markets, as those devices are outfitted to handle AI directly on board. 


If one takes the current retail market value of smartphones, PCs and smartwatches, and then assumes adoption rates between 70 percent and 90 percent by 2033, markets supporting extended edge will be quite substantial, implying between a doubling and a five-fold increase across software, hardware, chips, platforms, manufacturing and connectivity. 


Market | Market Size in 2023 (USD Billion) | Estimated Market Size with Edge AI in 2033 (USD Billion) | Percentage of AI-Capable Devices in 2033
Smartphones | 430 | 1,999 | 90%
PCs | 250 | 600 | 80%
Smartwatches | 30 | 150 | 70%
Total | 710 | 2,749 | 80%
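As a quick check on the “doubling to five-fold” framing, the growth multiples implied by the table above work out as follows (simple arithmetic on the table's own figures):

```python
# Growth multiples implied by the table above (illustrative arithmetic only).
markets = {
    "Smartphones": (430, 1999),
    "PCs": (250, 600),
    "Smartwatches": (30, 150),
    "Total": (710, 2749),
}

for name, (size_2023, size_2033) in markets.items():
    print(f"{name}: {size_2033 / size_2023:.1f}x by 2033")
# Smartphones ~4.6x, PCs ~2.4x, Smartwatches ~5.0x, Total ~3.9x
```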


Just as PCs made computing power available to anyone with a computer, extended edge AI is making AI capabilities accessible to a wider range of devices and users, right on the device itself.


Extended edge AI also will embed AI operations into everyday objects and environments, enabling a range of new operations requiring immediate response (low latency). 


That will require more powerful processors and more storage, which can be a problem for smartphones and wearable devices with limited resources.


Increased power consumption also will be an issue. And AI models will have to be updated. 


Over-the-air updates, federated learning (where devices train a shared model without exchanging raw data), model compression and quantization, model pruning, knowledge distillation and adaptive learning are tools designers can use to ensure that AI models running on extended edge devices can be updated. 
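As one hedged example of the compression-and-quantization piece, the sketch below uses PyTorch's post-training dynamic quantization; the toy two-layer model is a stand-in assumption for whatever network actually ships on the device.

```python
import torch
import torch.nn as nn

# Post-training dynamic quantization: a minimal sketch, assuming PyTorch is
# the toolchain. The toy model is a placeholder for the real on-device network.
model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)
# Linear weights are stored as int8, shrinking the model and the size of any
# over-the-air update payload, at a small cost in accuracy.
```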


Model pruning techniques identify and remove redundant or less important connections within the AI model, reducing its complexity without significantly impacting performance. Knowledge distillation involves transferring the knowledge from a large, complex model to a smaller, more efficient model, preserving the original model's capabilities.
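To illustrate those two ideas concretely, here is a short, hedged PyTorch sketch: L1 unstructured pruning of a single layer, plus a soft-target distillation loss of the kind used to train a small student model from a large teacher. Layer sizes and the temperature value are arbitrary assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# --- Pruning: zero out the 40% of weights with smallest magnitude (sketch). ---
layer = nn.Linear(256, 128)                       # placeholder layer
prune.l1_unstructured(layer, name="weight", amount=0.4)
prune.remove(layer, "weight")                     # make the pruning permanent

# --- Knowledge distillation: student mimics the teacher's softened outputs. ---
def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    soft_targets = torch.softmax(teacher_logits / temperature, dim=-1)
    log_probs = torch.log_softmax(student_logits / temperature, dim=-1)
    # Cross-entropy against the teacher's soft targets, scaled by T^2.
    return -(soft_targets * log_probs).sum(dim=-1).mean() * temperature ** 2
```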


Adaptive learning algorithms enable AI models to continuously learn and adapt to changing environments and user behavior.


Will AI Disrupt Non-Tangible Products and Industries as Much as the Internet Did?

Most digital and non-tangible product markets were disrupted by the internet, and might be further disrupted by artificial intelligence as w...