Showing posts sorted by relevance for query 6G.

Wednesday, October 14, 2020

Work on 6G Might Actually be Starting Later than for 4G and 5G

With 5G just launching commercially, it might seem odd that we already are hearing talk about 6G. But there is an argument to be made that, historically, work on 6G might actually be starting later than it did for either 4G or 5G. 


Consider that the first 4G framework was set by the International Telecommunication Union in 1998, that what became Long Term Evolution was proposed in 2004, and that early commercial deployment began about 2010. 


So 4G early conceptual work to commercialization took about 12 years, complicated by the distraction of two major alternatives, WiMax and LTE. 


For 5G, early conceptual work began about 2008. The standard was largely solidified by 2017. South Korea launched commercial 5G in 2019. The point is that the time from early conceptual work to commercial deployment took about 11 years. 


Samsung believes 6G could be available commercially as early as 2028, with widespread availability by 2030. So early commercialization could happen in about seven years, with deployment at scale in about nine years.


Many of us would expect to see early 6G deployment by about 2030. If so, then work on 6G actually is starting later than was the case for either 4G or 5G. 


So two outcomes might be suggested. On one hand, 6G might arrive later than we presently expect. On the other hand, if 6G arrives about when we expect (2030), then the development process from conceptual work to standards completion and commercial deployment will happen faster than was the case for 4G and 5G.


As one example, the NextG Alliance, formed by the Alliance for Telecommunications Industry Solutions (ATIS), aims to “advance North American global leadership over the 5G evolutionary path and 6G early development” and will hold its first meeting in November 2020.


The NextG Alliance says it hopes to:

  • Create a Next G development roadmap

  • Develop a set of national priorities that will influence government applied research funding and promote incentivized government actions.

  • Align development with commercialization outcomes.


Skeptics might argue it is way too early to talk about 6G. But the history of 4G and 5G suggests we might be starting later in the 6G process. If early conceptual work is just starting now, then the full development process--compared to 4G and 5G--would be compressed by three to four years.

In some ways the 6G development timeline might be easier. There were two different versions of 4G proposed and adopted commercially. That arguably slowed the development process.

5G did not suffer from that problem, but did introduce some new concerns about capital investment cost, as the addition of millimeter wave spectrum for the first time raised new issues about the number of required cell locations and the cost of "X" haul traffic from radio heads back to the core network.

6G likely will not have the confusion of two competing proposed standards or as much concern about X haul or small cells, as much of that infrastructure will have been put into place to support 5G. If so, then a more-compressed development cycle is feasible.

As 5G built on 4G, so 6G is likely to build on 5G, both in terms of infrastructure and other architectural choices. The inclusion of millimeter wave spectrum should ease issues associated with a possible move to terahertz frequencies. 

New antenna technologies to support millimeter wave signals, advanced duplex technologies (TDD), dense fiber X haul, spectrum sharing and use of artificial intelligence all should apply to 6G as well. 


Monday, January 27, 2020

Applications and Use Cases are Big 6G Challenge

The whole point of any access network is to send and receive user and device data as quickly as possible, as affordably as possible, to and from the core network and all the computing resources attached to it. The future 6G network, no less than the new 5G network, is likely to feature advancements of that type.

Bandwidth will be higher, the network's ability to support unlimited numbers of devices and sensors will be greater, latency will be even lower and the distance between edge devices or users and computing resources will shrink.

The biggest unknowns are use cases, applications and revenue models, as has been true since 3G. The best analogy is gigabit fixed network internet access. Users often can buy service running at speeds up to a gigabit per second. Few customers presently have use cases requiring anything close to such speeds; most needs are far lower.

So it is likely that 6G, like 5G, often will feature capabilities that exceed consumer use cases, initially.

NTT Docomo has released a white paper with an initial vision of what 6G will entail. Since every mobile generation since 2G has increased speeds and lowered latency, while connection density grew dramatically between 4G and 5G, we might look first to those metrics for change.


Docomo suggests peak data rates of 100 Gbps, latency under one millisecond and device connection density of 10 million devices in each square kilometer would be design goals. 


Along with continued progress on the coverage dimension, 6G standards might extend to space, sky and sea communications as well. Docomo also believes quality of service mechanisms exceeding “five nines” and device performance (no charging devices, cheaper devices) would be parts of the standard. 




Looking at commercial, economic or social impact, since the 3G era we have tended to see a lag of execution compared to expectations. In other words, many proposed 3G use cases did not emerge until 4G. Some might say a few key 4G use cases will not flourish until 5G is well underway.


For that reason, we might also discover that many proposed 5G innovations will not actually become typical until the 6G era. Autonomous vehicles are likely to provide an example. 


So Docomo focuses on 6G outcomes instead of network performance metrics. Docomo talks about “solving social problems” as much as “every place on the ground, sky, and sea” having communications capability. Likewise, 6G might be expected to support the cyber-physical dimensions of experience. 


Also, 5G is the first mobile platform to include key support for machine-to-machine communication, instead of primarily focusing on ways to improve communication between humans. Docomo believes 6G will deepen that trend. 




It is worth noting that the 5G spec for the air interface entails availability higher than the traditional telecom standard of “five nines” (availability of 99.999 percent). 5G networks are designed to run at “six nines.” So 6G might well run at up to “seven nines” (99.99999 percent availability). 


The legacy telecom standard of five nines meant outages or service unavailability of 5.26 minutes a year. The 5G standard equates to less than 32 seconds of network unavailability each year. A seven nines standard means 3.16 seconds of unavailability each year. 
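The downtime figures above follow directly from the availability percentages. A quick sketch (using a flat 365-day year, which is why the seven-nines figure comes out at 3.15 rather than 3.16 seconds):

```python
# Downtime per year implied by an availability expressed as "N nines"
# (five nines = 99.999 percent, and so on).
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000

def downtime_per_year(nines: int) -> float:
    """Seconds of unavailability per year at the given number of nines."""
    unavailability = 10 ** (-nines)  # e.g. five nines -> 0.00001
    return unavailability * SECONDS_PER_YEAR

for n in (5, 6, 7):
    print(f"{n} nines: {downtime_per_year(n):.2f} seconds/year")
# 5 nines -> 315.36 s (about 5.26 minutes)
# 6 nines -> 31.54 s (under 32 seconds)
# 7 nines -> 3.15 s
```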




Some might say 4G was the first of the digital era platforms to design in support for internet of things (machines and computers talking to machines and computers) instead of the more traditional human mobile phone user. That trend is likely to be extended in the 6G era, with more design support for applications and use cases, and with artificial intelligence support as a key design goal. 


In part, the shift to applications and use cases becomes more important as wringing traditional performance gains out of the network matters less than new use cases that take advantage of those performance boosts. 


Just as almost no consumer users actually “need” gigabit speeds, much less speeds in the hundreds of megabits per second, so few human users or sensors will actually “need” 6G levels of throughput and latency. 


Architecturally, the evolution towards smaller cells will continue, in part to support millimeter wave frequencies, in part to assure better connectivity. Where traditional cell architectures have emphasized non-overlapping coverage, 6G networks might deliberately use overlapping (non-orthogonal) cells to assure connectivity. 


That almost certainly will require more development of low-cost beam forming and signal path control. Having cheap artificial intelligence is going to help, one might suggest.

Saturday, September 30, 2023

How Feasible is a Software-Only 6G Upgrade to 5G?

If telco executives get their way, 6G will be a software upgrade that does not require replacement of 5G network elements such as radios. In some ways, that will be challenging. 


"We believe that a software-only upgrade to 6G is the best way to meet the increasing demands of mobile users and businesses,” said Niklas Heuveldop, Vodafone CTO. 


"A software-only upgrade to 6G is essential for us,” said Hannes Ametsreiter, Deutsche Telekom CTO.


Perhaps surprisingly, even Rajeev Suri, Nokia CEO, has said "a software-only upgrade to 6G is the only way to meet the ambitious goals of the 6G roadmap.” It will be challenging. 


It is not clear whether the in-place radios are frequency-agile enough to handle huge new blocks of millimeter wave or terahertz frequencies. So it also is not clear whether virtualized or software-defined radios can be used with the existing 5G infrastructure to allow 6G upgrades without major upgrades or replacement of the existing radio infrastructure. 


Then there is the issue of whether existing 5G radio sites are compatible with the signal propagation characteristics of new millimeter wave or terahertz spectrum that might be added, and how many new radios or new small cell sites will be required. 


Easier to implement are new modulation techniques, for which there are a number of possible alternatives to the 5G orthogonal frequency-division multiplexing standard. 


What might make adaptive modulation possible--the ability to use different modulation methods depending on local conditions--is the existing 5G ability to change the modulation scheme dynamically, depending on channel conditions. 


That feature should support dynamic modulation that is more robust in areas where signal propagation is more challenging (though supporting less bandwidth), while supporting maximum throughput in areas with favorable signal propagation characteristics.
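The logic of adaptive modulation can be sketched simply: pick the densest constellation the measured signal quality can support, falling back to a more robust scheme when conditions degrade. The SNR thresholds below are hypothetical round numbers for illustration, not values from any 3GPP specification:

```python
# Illustrative adaptive modulation selector. Thresholds are assumed,
# illustrative values, not taken from any standard.
SNR_THRESHOLDS_DB = [  # (minimum SNR in dB, scheme, bits per symbol)
    (22.0, "256QAM", 8),
    (16.0, "64QAM", 6),
    (10.0, "16QAM", 4),
    (5.0, "QPSK", 2),
]

def select_modulation(snr_db: float):
    """Return the highest-order scheme whose SNR floor is met."""
    for floor, scheme, bits in SNR_THRESHOLDS_DB:
        if snr_db >= floor:
            return scheme, bits
    return "BPSK", 1  # most robust fallback, lowest throughput

print(select_modulation(25.0))  # strong signal -> ('256QAM', 8)
print(select_modulation(7.5))   # weak signal -> ('QPSK', 2)
```

The same trade-off the text describes is visible here: robustness in poor conditions costs bits per symbol; favorable conditions allow maximum throughput.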


6G is expected to use higher-order modulation schemes than 5G, such as 1024QAM and 4096QAM. This will allow more bits to be transmitted per symbol, increasing the spectral efficiency of the network. 
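The "more bits per symbol" claim is just log2 of the constellation size, which makes the incremental gain easy to quantify:

```python
import math

# An M-ary QAM constellation carries log2(M) bits per symbol.
def bits_per_symbol(m: int) -> int:
    return int(math.log2(m))

for m in (64, 256, 1024, 4096):
    print(f"{m}QAM: {bits_per_symbol(m)} bits/symbol")
# 64QAM: 6, 256QAM: 8, 1024QAM: 10, 4096QAM: 12
```

Each 4x step in constellation size adds only two bits per symbol, so moving from 256QAM to 1024QAM is a 25 percent spectral-efficiency gain, and it comes at the cost of requiring a much cleaner signal.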


But there also are a number of potential modulation approaches. 


Index modulation: Index modulation is a technique that uses the indices of active transmit antennas, subcarriers, or time slots to transmit additional information. This can be used to further increase the spectral efficiency of the network.


Non-orthogonal multiple access (NOMA): NOMA is a technique that allows multiple users to share the same spectrum resources at the same time, without causing interference. This can be used to improve the network capacity and support more connected devices.


Machine learning (ML)-based modulation: ML can be used to develop new modulation schemes that are more efficient and robust to interference.


Hybrid modulation schemes: Hybrid modulation schemes combine elements of different modulation schemes to achieve the best possible performance in different operating conditions.


Polar modulation: Polar modulation is a new type of modulation that is more efficient and robust than conventional modulation schemes. Polar modulation is expected to be used in 6G to achieve higher data rates and improve reliability.


MIMO modulation: MIMO modulation uses multiple antennas to transmit and receive data simultaneously. This can significantly increase data rates and improve reliability. 6G is expected to use MIMO modulation with a larger number of antennas than previous generations of cellular technology.


MIMO-OFDM: MIMO-OFDM is a multiplexing technique that uses multiple antennas at the transmitter and receiver to transmit and receive multiple data streams simultaneously. MIMO-OFDM is already used in 5G networks. 


In addition to OFDM, 5G networks can also use other modulation techniques, such as filter bank multicarrier (FBMC) and universal filtered multicarrier (UFMC). However, OFDM is the most widely used modulation technique in 5G networks today.
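Of the candidate approaches listed above, power-domain NOMA lends itself to a compact illustration. Two users share one resource at the same time; the far user gets more transmit power, so its symbol can be decoded first under interference, then subtracted (successive interference cancellation) so the near user's symbol can be recovered. This is a minimal sketch with BPSK symbols and a noiseless channel; the power split is an assumed, illustrative value:

```python
import math

P_FAR, P_NEAR = 0.8, 0.2  # power split favoring the far user (illustrative)

def superpose(s_far: int, s_near: int) -> float:
    """Superposition-code two BPSK symbols (+1/-1) onto one signal."""
    return math.sqrt(P_FAR) * s_far + math.sqrt(P_NEAR) * s_near

def sic_decode(y: float):
    """Recover both symbols via successive interference cancellation."""
    s_far = 1 if y >= 0 else -1               # decode the stronger signal first
    residual = y - math.sqrt(P_FAR) * s_far   # cancel it from the received signal
    s_near = 1 if residual >= 0 else -1       # then decode the weaker signal
    return s_far, s_near

# All four symbol combinations round-trip correctly through one shared channel:
for sf in (-1, 1):
    for sn in (-1, 1):
        assert sic_decode(superpose(sf, sn)) == (sf, sn)
print("both users' symbols recovered from one shared transmission")
```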


It is the existing 5G network’s ability to use adaptive modulation--supporting modulation schemes that can be changed dynamically depending on channel conditions--which will ease the path to 6G. 


It remains to be seen how much such approaches can support a software-only upgrade of 5G to support 6G. Many will guess that hardware upgrades will still be necessary, though on a perhaps-reduced level compared to earlier mobile network upgrades. 


That there is growing buyer resistance to the traditional hardware-based platform updates is obvious. Just as obviously, there are possible new opportunities for non-traditional suppliers, such as the hyperscale cloud computing providers.


Friday, July 31, 2020

As 5G Focuses on Enterprise Use Cases, 6G Might Focus on Virtualized and Self-Learning Networks

Mobile and fixed network operators constantly are challenged to reduce capital investment and operating costs as a way of compensating for low revenue growth, challenged profit margins and ever-increasing bandwidth consumption by customers whose propensity to pay is sharply limited. 

The very design of future 6G networks might work to help reduce capex and opex, while incorporating much more spectrum, at very high frequencies and basing core operations on use of machine learning (a form of artificial intelligence that allows machines to learn autonomously). 

New 6G networks might rely even more extensively on virtualization than do 5G networks, featuring now-exotic ways of supporting internet of things sensors that require no batteries, a capability that would dramatically reduce IoT network operating costs. 

It is possible 6G networks will be fundamentally different from 5G in ways beyond use of spectrum, faster speeds and even lower latency. 6G networks might essentially be “cell-less,” able to harness ambient energy for devices that require no batteries and feature a virtualized radio access network. 


The “cell-less” architecture will allow end user devices to connect automatically to any available radio, on any authorized network. Harvesting of ambient energy will be especially important for internet of things devices and sensors that might not require any batteries at all to operate, reducing operating cost. 


source: IEEE


The virtualized radio access network will provide better connectivity, at possibly lower cost, as user devices can use the “best” resource presently available, on any participating network, including non-terrestrial platforms (balloons, unmanned aerial vehicles or satellites). 


Backhaul might be built into every terrestrial radio, using millimeter wave spectrum both for user-facing and backhaul connections, automatically configured. That will reduce cost of network design, planning and backhaul. 


Researchers now also say such federated networks will be based on machine learning (artificial intelligence), which will be fundamental to the way 6G networks operate. Devices will not only use AI to select a particular radio connection, but will modify behavior based on experience. 


The network architecture might be quite different from today’s “cellular” plan, in that access is “fully user centric,” allowing terminals to make autonomous network decisions about how to connect to any authorized and compatible network, without supervision from centralized controllers.


Though machine learning arguably already is used in some ways to classify and predict, in the 6G era devices might also use artificial intelligence to choose “the best” network connection “right now,” using any available resource, in an autonomous way, not dictated by centralized controllers.  
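That autonomous, user-centric selection might be sketched as a terminal scoring every visible radio, terrestrial or otherwise, and attaching to the best authorized one without a centralized controller. The fields and weights below are hypothetical, purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class Radio:
    name: str
    signal_dbm: float  # received signal strength
    load: float        # 0.0 (idle) .. 1.0 (saturated)
    authorized: bool   # is the terminal allowed on this network?

def score(r: Radio) -> float:
    # Stronger signal is better; heavily loaded cells are penalized.
    # The weight of 30.0 dB per unit load is an assumed value.
    return r.signal_dbm - 30.0 * r.load

def choose(radios: list[Radio]) -> Radio:
    """Terminal-side decision: best authorized radio, no central controller."""
    return max((r for r in radios if r.authorized), key=score)

radios = [
    Radio("macro-cell", -85.0, 0.9, True),
    Radio("small-cell", -70.0, 0.3, True),
    Radio("satellite", -95.0, 0.1, True),
    Radio("private-net", -60.0, 0.1, False),  # strongest, but unauthorized
]
print(choose(radios).name)  # -> small-cell
```

A learning-based version would replace the fixed scoring function with one the device adapts from experience, which is the behavior-modification point made above.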


To be sure, in some ways those changes are simply extrapolations from today’s network, which increasingly is heterogeneous, able to use spectrum sharing or Wi-Fi access, using radio signal strength to determine which transmitter to connect with. 


Architecturally, the idea is that any user device connects to the radio access network as a whole, not to any specific radio or base station, say IEEE researchers Marco Giordani, Michele Polese, Marco Mezzavilla, Sundeep Rangan and Michele Zorzi. 

source: IEEE


Overall, many 6G features will be designed to reduce the cost and improve the efficiency of the radio access network, especially to create “pervasive” connectivity, not just to add more bandwidth and lower latency for end users and devices.


Wednesday, August 16, 2023

"You Get to Keep Your Business" Will be the Fundamental Driver of 6G

Most 5G infra suppliers and mobile operators have insisted that 5G would, to some extent, enable new use cases and novel applications and drive higher revenue. So far, those proponents have been “wrong,” but only to the extent that they also were wrong about 3G and 4G. 


Though some important new use cases have emerged in each digital generation (from 2G on), most of the innovation has not been of the sort mobile operators can directly participate in as equity owners. 


In other words, most of the new value and revenue from new use cases has flowed to third-party app developers. And if you think about it, that is what is “supposed” to happen when a layered app architecture is assumed. 


By definition, the internet is “permissionless.” App creators do not require a formal business relationship with an internet access provider to reach users and customers. 


Eventually, some new 5G use cases will develop. But infra suppliers and mobile operators have routinely “over-promised and under-delivered” in the area of new apps, use cases and value, for every digital mobile generation.


An SKT white paper says 5G failed to achieve its goals, among which were the rapid development of new use cases, apps and services that collectively would fuel mobile operator revenue growth. There was no “killer service.”


SKT also essentially argues that 5G “over-promised and under-delivered.” Customers expected much more than what was delivered. 


As was the case for 4G, 6G will enable “services that were difficult to fully implement with 5G.” Anybody who followed 4G will get this. The promises of one mobile generation often are not realized--if at all--until the subsequent generation. 


In other words, some use cases hoped for in the 3G era did not develop until 4G. Perhaps some 4G use cases will flourish during 5G. Perhaps some 5G innovations will happen when 6G arrives. 


Maybe the industry is simply collectively wishful, without sufficient basis in fact. What a given network can do is not the same as assurance customers will value the innovations, or pay to use them. 


Quite to the contrary, the very architecture of internet-based apps and services militates against the ability of access providers to capture the value of app development. 


Perhaps a comparison with home broadband will illustrate why the “over-promising” always happens. Over time, home broadband has moved capacity upwards from kilobits per second to megabits to gigabits per second. As with mobile platforms, home broadband networks have used different media to support those advances. 


But nobody actually argues that “faster home broadband” will directly lead to new use cases and value supplied by the internet service provider. People understand that virtually all of the development will be fueled by third parties. The faster internet access only enables use of those innovations. 


Mobile operators might argue that they have a more-embedded role, as they offer managed services including voice and messaging. True, but some fixed network suppliers also offer voice, as well as internet access. 


The point is that a mobile service provider, in its role as an ISP, supplies “internet access” but not apps. And the primary value of 5G is that it supports more capacity than did 4G, as 4G enabled more capacity than 3G. 


Such capacity increases are essential. But ISPs are not primarily the producers of application value. 


To be sure, ISPs and their infra suppliers have to argue that wonderful new apps will be possible. Otherwise, it is hard to convince regulators to grant use of more spectrum. But everyone also understands that the new apps will mostly be produced by third parties. 


5G and 6G are vital, nonetheless. As with home broadband networks, capacity must continually be increased. 


But the hard truth is that 5G mostly means “you get to keep your business.” It is a means of supplying needed capacity, primarily. Someday, 6G will be required to enable mobile service providers to stay in business.


But the claimed benefits will extend quite a bit beyond that. They always do. 


Prosaic though it might be, the next-generation mobile networks are the functional equivalent of increasing home broadband and fixed network capacity from kilobits per second to megabits to gigabits. “More capacity” is the value. 


4G, 5G, 6G and beyond are the means by which mobile operators are able to supply faster speeds and more capacity over time. It means they get to stay in business. But it generally does not mean the mobile operators themselves will be creating new apps and use cases. 


So expect 6G to be yet another example of “failure.” Proponents will again over-promise. To get additional spectrum, they almost have to do so. 


But do not be fooled. They need more capacity. The way they will get it is partly by adopting 6G. It is important; they need to do so. But most of the hype about new value, apps and use cases--as produced by the mobile operators themselves--will fail. 


The architecture ensures it. The whole point of internet access is to enable people and machines to use apps available to internet-connected devices. We need more capacity, over time. New mobile networks are how we get there. 


But think of 5G and 6G as a necessary precondition for remaining in business, as faster fixed network access also is a fundamental requirement. Proponents will emphasize bells and whistles. Ignore all that. It is about remaining in business, as that business requires more capacity over time.


Tuesday, December 5, 2023

Why Orange Will Not Market "6G"

It is a bit of a subtlety, but Orange is not sure it will “market 6G,” which is not the same thing as saying it will not use 6G. Unless something very unusual happens, such as the global industry deciding it does not want a “6G” standard created, 6G is going to happen, for the simple reason that mobile operators will continue to need additional bandwidth and capacity, and 6G will be needed to supply them.


Aside from all other matters, 6G will mean regulators must authorize additional spectrum for the platform, and additional spectrum is among the main tools mobile operators have for increasing capacity on their networks. 


Nor does such a stance really mean that Orange will stop investing in the latest generations of mobile networks. It does mean Orange will deemphasize “generation” as personal computer makers have deemphasized “clock speed” as a value driver or differentiator. 


Mobile phone suppliers, meanwhile, once marketed “smartphones” based on screen size, touchscreen interfaces rather than keypads and the ability to use the mobile internet and apps. 


These days, much more emphasis is placed on battery life and camera features. One can safely predict that artificial intelligence features will be the next marketing battleground. 


In similar fashion, personal computer makers once marketed their devices on “performance” and a few lead use cases (word processing or spreadsheets). So processor speed, storage and memory were key messages. 


Later, bundled apps, connectivity and user-friendly interfaces became more important. These days, mobility (weight, form factor), multi-function use or sustainability are more prominent messages. 


The point is that features once considered differentiators often lose their appeal as markets mature. 


Thursday, May 21, 2020

6G as Industrial Policy

It has been quite some time since the idea of national “industrial policy” has had much currency in the United States, but 6G mobile network platforms seem to be shaping up as one area where attitudes could change, especially in the area of an indigenous supply chain. To be sure, 3G and 4G have been viewed as arenas for industrial policy in other parts of the world, and 6G is viewed as an area of policy for China. 


Despite the growing interest in 6G standards, it might not be so clear how leadership leads to advantage that can be reaped by countries, suppliers, service providers or consumers. The Alliance for Telecommunications Industry Solutions speaks of “core technologies and recommended government actions,” “rapid innovation and development” and “common national purpose.”


Referring to 5G, ATIS notes the advantages of “development and early deployments” that, in a 6G context, might also confer leadership of “ideas, development, adoption and rapid commercialization of 6G.”


The idea is to focus on ways to “complement–not abandon or usurp–global standards in the ICT sector.” Key is “leadership of ideas.”


ATIS says “leadership begins with identifying a vision for the next decade,” although some related competencies include AI-Enabled Advanced Networks and Services, advanced antenna and radio systems, multi-access networks and likely a few key use cases. 


As a practical matter, that means “defining the technological breakthroughs that can lead the U.S. to sustainable technology leadership, with incentives for research and development and early investment.”


Those steps, in turn, are viewed as vital to promoting time to market and “wide scale commercial adoption.”


More tactically, ATIS suggests tax credits for development in areas where U.S. firms might lead, continued spectrum policy support and support for efforts to commercialize 6G use cases. 


None of that would sound unusual, in the context of government policy in the 3G, 4G and now 5G eras. It is a mix of policies to spur supply and demand. Similar approaches arguably were common when many other nations--China, Singapore, South Korea, Japan, Israel--likewise chose to target technology leadership as a path to economic growth, and as many others now also intend (Malaysia, Thailand, India and others). 


The methods will vary, but the idea is to focus effort, perhaps always easier on the supply than the demand side, but both have roles. 


It also is not too soon to argue what ultimately will matter most is not standards, which, by definition, will be global, but the ability to usefully deploy technology. By definition, every firm and nation will have access to the standards. 


But some firms, nations and regions might hope to create competencies in supply, or advantageous demand profiles. Scale, experience curves and intellectual property will matter. But so will skill at the application of new technology and leverage of existing assets.


Were that not the case, we should never see significant differences in productivity gains among countries. As we used to say, tele-density and economic development should be directly related. And yet benefits are differential, even when tele-density, or internet usage, or network speeds, are identical or similar. 


The point is that what matters is the ability to leverage technology for economic advantage. High rates of deployed technology are only proxies for what benefit those deployments are expected to bring. 


That is not to say standards are unimportant. 


Technology standards in computing and communications are said to provide benefits for enterprises by reducing cost, minimizing risk, increasing the range of suppliers and making possible standardized training for employees. Such standards historically have been crucial in the hardware realm, much more than in the applications arenas. 


For consumers, standards are expected to produce the best goods and services, more value, lower cost and therefore wide availability. 


Benefits might also accrue to particular suppliers when proprietary standards become consumer or enterprise commercial “standards,” as was true for IBM and became true for Microsoft and Apple, Cisco and others. 


“Open” standards have also grown more important in the software and protocol spaces, as Linux, Android and Transmission Control Protocol/Internet Protocol suggest. 


The world of applications is much less dependent on international standards. In the internet era, Google, Facebook, WeChat, Amazon, Alibaba, Netflix and other solutions have not established themselves so much through standards as because consumers simply prefer to use them. 


Broadly speaking, global standards reduce risk for infrastructure suppliers, as they create larger markets and more niches for original equipment manufacturers. 


What matters is productivity; the ability to wring value from investments. Industrial policy might help. Still, success will ultimately be determined by demand, not supply.


Thursday, October 26, 2023

Will 6G Try to Recreate Closed Networks?

Already, some are suggesting 6G will be different from 5G in a significant way: where 5G is still a connectivity mechanism, some tout 6G as a computing mechanism. It will “enable immersive, ubiquitous, and sensory digital experiences on a massive scale,” some argue. 


“Enable,” yes, in the same way that home broadband “enables” use of internet-delivered applications. “Embed,” in the sense of the network itself being the supplier of the features, probably not, and for good reasons. 


Modern computing is based on layers. That is what allows us to innovate faster and avoid monolithic solutions: functions are compartmentalized, independent objects rather than integrated processes. 


Even if infrastructure suppliers want us to accept new forms of function integration as a way of convincing us to buy their new platforms, we should resist the notion. We actually do want permissionless app creation, not “integrated” solutions. 


In other words, some are likely to argue for 6G standards that are more centralized and controlled than 5G networks. More “closed,” in other words. 


Some will argue this is necessary because 6G will need to support a wider range of complex and demanding applications, such as immersive virtual reality and real-time AI-powered services. 


We might want to resist that notion. It is a move in the direction of walled gardens, closed networks and app development controlled, to a larger extent, by the entities providing internet access. 


Is that going to be better? 


Some will argue for advantages such as enhanced security or privacy. But permissionless development enabled by the layered architecture has worked well. It’s easy to see why some in the value chain would prefer more closed, centralized networks. 


It recreates the experience of the public switched telephone network, where telcos controlled “all” the apps running on the network. 


To use the network, you needed permission from the network operator. Is that going to be better?


Every mobile generation gets hyped. Each will, it is said, enable and revolutionize the experience. Improvements happen, yes. Latency is reduced; bandwidth is increased; energy efficiency gets better. 


6G should “enable” immersive experiences such as the metaverse by staying out of the way. Embedding such features into the fabric of the network--beyond measures to control latency and supply lots of bandwidth--will be a mistake.


Wednesday, February 14, 2024

Each Next-Generation Mobile Network Since 2G Has Reduced Latency

As 5G core networks have shifted to a decomposed and virtual architecture, latency can become an issue, since functions can be performed remotely. But SKT and Intel say they have a way to reduce latency in virtualized 5G core networks substantially, by as much as 70 percent for transactions between the session management function (SMF) gateway and protocol data unit (PDU) session microservices. 


The approach also enables a 33 percent reduction in gateway CPU usage, the firms say in a white paper. 
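The headline numbers are easy to sanity-check: a 70 percent latency reduction leaves transactions completing in 30 percent of the original time, and a 33 percent CPU reduction leaves 67 percent of the original gateway load. The baseline figures below are hypothetical placeholders, not numbers from the white paper:

```python
# Illustrative arithmetic only; the 10 ms latency and 100% CPU baselines
# are made-up placeholders, not figures from the SKT/Intel white paper.
baseline_latency_ms = 10.0
baseline_cpu_pct = 100.0

latency_after = baseline_latency_ms * (1 - 0.70)  # 70% reduction
cpu_after = baseline_cpu_pct * (1 - 0.33)         # 33% reduction

print(f"latency: {latency_after:.1f} ms")  # 3.0 ms
print(f"gateway CPU: {cpu_after:.1f}%")    # 67.0%
```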


They believe the architecture will be useful for 6G, but the approach also works for 5G, illustrating the ways one mobile generation preps the way for the next, as key features and principles evolve. 


Mobile service providers would like nothing so much as a graceful evolution to “6G” performance, without disruptive changes to platform elements. Obviously, collaboration with device manufacturers, chip suppliers and other stakeholders will happen, to ensure device compatibility, standards alignment, and smooth integration of 6G technologies. 


But we should expect to see many other ways mobile operators will pursue an evolutionary 6G transition. As we have seen with 5G, existing spectrum will be leveraged, even if new spectrum allocations are made. 


Software-defined networks will facilitate network upgrades that avoid hardware replacements. 


Network slicing might also be used to enable the coexistence of diverse 5G and 6G services on the same infrastructure.


We might also see incremental upgrades, where 6G features and functionalities are introduced in stages, in much the way that 4G relied on 3G for voice and 5G relies on 4G for voice. More advanced features, such as network slicing, might be introduced later than basic functions such as new frequency bands for capacity boosts, as happened with 5G. 


Thursday, November 2, 2023

Problem: 5G Cost More than 4G; 6G Will Cost More than 5G

By some estimates, each successive mobile generation has gotten more expensive to build: 5G cost more than 4G, and 6G is expected to cost more than 5G. 


There are several reasons, including the cost of new spectrum; the need for greater numbers of small cells, each supported by optical fiber connectivity; the cost of more-complicated radios; perhaps higher-cost engineering and higher site acquisition costs. 


Technology        Cost per location     Cost per square mile
4G                $10,000-$50,000       $500,000-$2,500,000
5G                $25,000-$100,000      $1,250,000-$5,000,000
6G (estimated)    $50,000-$200,000      $2,500,000-$10,000,000
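The per-square-mile figures are consistent with roughly 50 cell sites per square mile across all three generations. That density is inferred from the table, not stated in the source; a quick check:

```python
# Back-of-envelope check: the table's per-square-mile ranges equal the
# per-site ranges times ~50 sites per square mile (an inferred density,
# not a figure stated in the source).
SITES_PER_SQ_MILE = 50

cost_per_site = {          # (low, high) per-location cost in USD
    "4G": (10_000, 50_000),
    "5G": (25_000, 100_000),
    "6G": (50_000, 200_000),
}

for gen, (low, high) in cost_per_site.items():
    per_sq_mile = (low * SITES_PER_SQ_MILE, high * SITES_PER_SQ_MILE)
    print(f"{gen}: ${per_sq_mile[0]:,} - ${per_sq_mile[1]:,} per square mile")
```

The implied constant density is worth noting, since 5G and 6G are generally expected to require denser grids than 4G, which would push the per-square-mile figures higher still.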


All of that drives mobile operator concern about the business model for 5G and 6G. Some of that concern is about revenue, but much of the issue is the ever-higher need for capacity. 


By general agreement, mobile operator capacity gains have historically been driven by use of smaller cells (network densification) and allocation of additional spectrum. But most observers would tend to agree that denser architectures have contributed the most. 


In addition to use of smaller cells and additional spectrum, Wi-Fi offload, better radio technologies and modulation techniques also have contributed. And the mix of contributors arguably has changed over time. For example, Wi-Fi offload was not a factor for 2G networks.


In the 4G and 5G era, Wi-Fi offload might represent as much as 75 percent of mobile device data (principally internet access), but rarely less than 45 percent of total mobile internet data. 


Country           Mobile phone traffic offloaded to Wi-Fi
United States     60%
China             70%
India             50%
Japan             65%
South Korea       75%
United Kingdom    55%
Germany           60%
France            50%
Brazil            45%
Russia            55%


As mobile executives resist the ever-growing amount of capital they must spend to increase capacity, data offload might be one of the most-fruitful ways to add effective capacity while containing capital investment, at least to a point.
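The leverage offload provides is easy to quantify: if a fraction f of device traffic moves to Wi-Fi, the mobile network carries only (1 - f) of demand, so its effective capacity is multiplied by 1/(1 - f). At the 75 percent offload figure cited for South Korea, that is a 4x multiplier. This is a simple model that ignores where and when offload actually occurs:

```python
# Effective-capacity multiplier from Wi-Fi offload: if fraction f of
# device traffic is offloaded, the mobile network serves (1 - f) of
# demand, so its capacity effectively stretches by 1 / (1 - f).
def offload_multiplier(f: float) -> float:
    assert 0 <= f < 1, "offload fraction must be in [0, 1)"
    return 1 / (1 - f)

for country, f in [("Brazil", 0.45), ("United States", 0.60), ("South Korea", 0.75)]:
    print(f"{country}: {f:.0%} offload -> {offload_multiplier(f):.2f}x effective capacity")
# South Korea: 75% offload -> 4.00x effective capacity
```

The nonlinearity explains the "at least to a point" caveat: moving offload from 45 percent to 60 percent adds less effective capacity than moving it from 60 percent to 75 percent, but each further point of offload gets harder to achieve.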


DIY and Licensed GenAI Patterns Will Continue

As always with software, firms are going to opt for a mix of "do it yourself" owned technology and licensed third party offerings....