Wednesday, November 18, 2020

Digital Redlining or Response to Demand?

Terms such as digital redlining imply that U.S. internet service providers upgrade neighborhoods able to pay for higher-speed internet access while underinvesting in poorer neighborhoods. At some level, it is hard to argue with that point of view, at least where gigabit internet access is concerned. 


Google itself pioneered the tactic of building where there is demonstrated demand, deploying Google Fiber first in neighborhoods (typically higher-income areas) where potential customers were most interested. Other gigabit service providers have used customer deposits for the same reason. 


And regulatory officials at the local level seem to now agree that “universal service” (building a gigabit network past every home and business) is desirable in some cases, but not absolutely mandatory in all cases. The thinking is that allowing new internet service providers or facilities to be built wherever possible is a better outcome than requiring ubiquity and getting nothing. 


Also, higher-speed facilities often are not found everywhere in a single market or city. CenturyLink does sell gigabit internet access in Denver, just not everywhere in the metro area. That is not necessarily “redlining,” but more likely reflects capital available to invest, expectations about financial return, customer density or some other combination of business issues that discourages investment in new access facilities. 


The economics of communication networks also are clear. Density and cost per location are inversely related. Mobile networks typically have 10 percent of cell sites supporting 50 percent of usage. About 30 percent of sites carry about 80 percent of traffic. That has been true since at least the 3G era.  


In fixed networks, network cost and density also are inversely related. So population density has a direct bearing on network costs. In the U.S. market, network unavailability is concentrated on the last couple of percent of locations.  


With cable operators already holding at least 70 percent share of the internet access installed base of customers, any new investment in faster facilities faces a tough challenge. Any new fiber to home network, for example, essentially is playing catch-up to a cable operator, as roughly 80 percent of U.S. households already also are reached by gigabit speed cable networks. 


And cable share has grown, up from possibly 67 percent in 2017. 


That noted, internet speeds do vary by geography: speeds in urban areas frequently are higher than in rural areas. But the argument that large numbers of U.S. households are underserved often is correct, depending on what standard one wishes to apply, and how one defines the supplier market.


Some claim 42 million U.S. residents are unable to buy broadband internet access, defined as minimum speeds of 25 Mbps in the downstream.  That actually is incorrect. 


Virtually every household in the continental United States is able to buy 25 Mbps or faster service from at least two different satellite providers. But those who claim “42 million” people cannot buy broadband simply ignore those choices, and focus only on the claimed availability of 25 Mbps service by fixed network providers. 


There are other estimates which also vary wildly. Roughly 10 percent of U.S. households are in rural areas, the places where it is most expensive to install fast fixed network internet access facilities, and where the greatest speed gaps--compared to urban areas--almost certainly continue to exist.


In its own work with TV white spaces, Microsoft has targeted perhaps two million people, or roughly a million households, that have no fixed network internet access. That assumes there are two people living in a typical household, which is below the U.S. average of roughly 2.3 to 2.5 per household.


Recall that the definition of broadband is 25 Mbps downstream. Microsoft has argued that 20 million people (about 10 million homes) or perhaps eight percent of the population (perhaps four percent of homes) cannot get such speeds from any fixed network service provider.


Microsoft also has cited figures suggesting 25 million people cannot buy broadband--presumably using the 25 Mbps minimum standard, most of those people living in rural areas. 


That conflicts with data from Openvault that suggests 95 percent of the U.S. population can buy internet access at a minimum of 25 Mbps, while 91 percent to 92 percent can buy service at a minimum of 100 Mbps. 


Using the average of 2.5 persons per U.S. household, that suggests a universe of about 10 million U.S. homes unable to purchase internet access at 25 Mbps from a fixed network supplier in 2018. What is not so clear is the percentage of households or persons who can do so using a mobile network. 


None of that explains urban areas with slow speeds, though. There the issue is more likely high construction costs where underground construction is necessary, along with demand expectations that are lower than in suburban areas. That is true whether it is electrical lines or communications networks being considered.   


But at least one Microsoft analysis suggests that about half of all U.S. households are not using 25 Mbps access. The claim is that 162.8 million people are “not using the internet at broadband speeds.” That seems to clearly contradict data gathered by firms such as Ookla and Opensignal suggesting that average U.S. speeds are in triple digits.


In 2018, the average U.S. broadband speed was 94 Mbps, according to the NCTA. That same year, Ookla reported the average U.S. speed was 96 Mbps. 


It is not quite clear how the Microsoft data was generated, though one blog post suggested it was based on an analysis of “anonymized data that we collect as part of our ongoing work to improve the performance and security of our software and services.” 


The claim of 162.8 million people “not using the internet at broadband speeds” (probably using 25 Mbps as the definition) equates to about 65 million households, using the 2.5 persons per household definition. That does not seem to match other data, including the statistics Microsoft itself cites. 
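As a sanity check, the people-to-households conversion used throughout these estimates can be sketched in a few lines of Python, using the 2.5 persons-per-household average cited above (all input figures are the estimates quoted in the text, not measured data):

```python
# Convert population-based broadband estimates into household counts,
# using the 2.5 persons-per-household average cited in the text.
PERSONS_PER_HOUSEHOLD = 2.5

def people_to_households(people_millions: float) -> float:
    """Convert a population figure (in millions) to households (in millions)."""
    return people_millions / PERSONS_PER_HOUSEHOLD

# 25 million people unable to buy broadband implies about 10 million homes.
print(people_to_households(25))               # 10.0

# 162.8 million people "not using broadband speeds" implies about 65 million homes.
print(round(people_to_households(162.8), 1))  # 65.1
```

That 65 million household figure is what makes the 162.8 million claim hard to reconcile with the other statistics cited.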


What remains difficult, but might explain the divergence, is whether the measured applications and services include apps run on smartphones as well as on PCs and other devices connected to fixed networks. That would explain the number of users, while usage on mobile networks might account for large numbers of sessions where 25 Mbps downstream speeds were not observed; or perhaps the upstream speed definition (a minimum of 3 Mbps) was the issue.  


Even then, average 4G speeds in 2018 were in excess of 40 Mbps downstream, so even that explanation is a bit difficult. 


Perhaps there are other ways to make sense of the data. There is a difference between users (people) and households. There is a difference between usage and availability; usage by device (mobile, PC, tablet, gaming device, sensor); application bandwidth and network bandwidth. 


Perhaps the issue is application performance on a wide range of devices including mobiles and untethered devices using Wi-Fi, which would reduce average experienced speeds, compared to “delivered access speed.” 


Methodology does matter. So do the costs and benefits of broadband capital investment under competitive conditions, in areas with high construction costs or low demand for advanced services, especially when newer platforms with better economics are being commercialized. 


Telecommunications is a business like any other. Investments are made in expectation of profits. Where a sustainable business case does not exist, subsidies for high-cost areas or universal service support exist. 


The point is that every human activity has a business and revenue model: it can be product sales, advertising, memberships, subscriptions, tax support, fees, donations or inheritances. Children have a “parents support me” revenue model, supported in turn by any of the aforementioned revenue models. 


But every sustainable activity has a revenue model, direct or indirect. The whole global communications business now operates on very different principles than the pre-competitive monopoly business prior to the 1980s. We still have a “universal service” low end, but we increasingly rely on end user demand to drive the high end. 


Our notions of the low end change--moving higher--over time. We once defined “broadband” as any data rate of 1.544 Mbps or higher. These days we might use functional definitions of 25 Mbps or 30 Mbps. Recall that 30 Mbps--in 2020--was called “superfast” as a goal for U.K. fixed network broadband. 


Few of us would consider 30 Mbps “superfast” any longer. Some might say the new “superfast” is gigabit per second speeds. But that is the change in real-world communications over just a decade. What was a goal in 2010 now is far surpassed. 


What some call “redlining” is simply a response to huge changes in the internet access and communications business. “Maximum” is a moving target that responds to customer demand. “Minimums” tend to be set by government regulators in search of universal service. 


As independent internet service providers cherry pick service areas where they believe the greatest demand for gigabit per second internet access exists, so do incumbents. 


Similar choices are made by providers of metro business services, builders of subsea connectivity networks, suppliers of low earth orbit satellite constellations and operators of fixed wireless networks. They build first--or pick customer segments--where they think the demand is greatest.


Saturday, November 14, 2020

Technology Displacement is Harder than it Seems

Technology displacement--new for older--is a tricky business. Sometimes a whole ecosystem has to be built before a key innovation can reach mass adoption.  


Not every feasible technology substitute actually displaces the other solutions with which it potentially competes, even when the argument is made that the substitute is “better” on some relevant performance metric. 


Sometimes the failures are the result of business execution, as when a promising startup runs out of money, grows too fast or too slowly. 


And customer adoption is almost always related to potential customers' underlying habits and preferences. Changing has costs, so the innovation must deliver value well in excess of the costs of changing. 


Customer experience, broadly defined, always is important. “Better” in some sense is offset by “hard to use,” “inconvenient” or “not worth the extra money.”


Politics and culture sometimes also play a key role. Is an existing way of doing things beneficial to important and powerful interests? Can they resist innovations that threaten those interests? 


Sometimes it is deemed too much hassle to displace an existing solution and ecosystem with a rival. Despite its inferiority to other keyboard layouts, we still use QWERTY, which originally was developed to slow down typing and prevent key jamming on mechanical typewriters.


Some call that path dependence, the idea that small, random events at critical moments can determine choices in technology that are extremely difficult and expensive to change.


“Innovation is more a human process than a technological one,” says Stacy Wood, North Carolina State professor. “Persuasion, environment, culture and context always matter.” 


If the primary end-use value of a smartphone is the expected ability to remain connected “anywhere,” on the go, then it makes sense that Wi-Fi--though a key part of the connectivity ecosystem and experience--is not a direct or convenient substitute.


For perhaps similar reasons, few of us use smartphones without cellular service, though some functionality is possible. 


In the mobile communications business, the service always is bundled: text messaging, voice and internet access being the foundations. It remains possible to purchase a basic bundle including only voice and messaging, but increasingly, the foundation package includes internet access. 


Decades ago, the emergence of Wi-Fi was touted as the potential foundation for mobile phone service, and so it has become, though not in the way some expected. Periodically, it has been suggested that Wi-Fi could be the sole connectivity mechanism for mobile phone service. 


Voice and text messaging still are required features of a 5G network, whether they directly generate lots of specific revenue or not. Customers might willingly buy a 5G-based home broadband service, without voice or texting capabilities. They might buy a 5G dongle for PC internet access. 


But it remains an open question whether smartphone service without voice and texting is viable, lawful or desirable. In principle, a smartphone can function without a “mobile” account enabling voice, using Wi-Fi and VoIP. 


There are some issues, such as inability to use a phone number or communicate easily with other users of the public telephone network. But think of a smartphone connected to Wi-Fi, with no subscriber identification module and mobile service, as a PC connected to the internet and using Zoom or any other messaging or VoIP service.  


It can be done, but the utility or value is not high, for most people, if the mobile service bundle of value also includes low-cost public network voice and messaging (for domestic communications, for example) as well as the ability to use Wi-Fi or the mobile network when roaming or conducting international communications.


Calling using VoIP over Wi-Fi is possible and useful. In a mobile device context the overall value of a mobile service might be high enough, and the cost low enough, that bothering with a Wi-Fi-only use of the phone is not worthwhile. 


Technology displacement often is quite a bit more complicated than it appears. 


Content Versus Distribution: Netflix Versus Disney: Where's the Value?

It has long been possible to get a reasonable debate on the respective values of content and distribution in the media business. Simply put, the issue is which part of the value chain is better positioned. Consider one illustration. The market value of Netflix is something on the order of $213 billion. Content creator Disney has a market cap of about $250 billion. 


Market cap is not everything, and both firms produce and distribute content. But it might be surprising that Netflix is so close to Disney, given its simpler business model, which includes original content production, but relies on distribution (subscriptions) for its revenue. 


Disney is far more complex, spanning movie and animation studios, broadcasting networks, content networks, theme parks, hotels, merchandise and video streaming operations.  


Think of Netflix as a content distributor, much as cable TV, satellite and telco video service providers, movie theaters, TV and radio broadcasters and, increasingly, many online services and apps now act. True, Netflix invests heavily in original content, as do some other leading streaming video services. But its direct revenue comes solely from subscriptions. 


Notice something about Disney, though. Direct-to-consumer, which includes the Disney streaming service, with 73 million paid subscribers, generates significant revenue, but negligible operating income. Granted, that is partially because the Covid-19 restrictions closed the theme parks, while Disney’s streaming service is in start-up mode, so operating income might not be expected for a bit. 

source: Investopedia 


In a non-Covid environment, the theme parks segment (which includes hotels and merchandise) would contribute vastly more to operating income, between 20 percent and 33 percent of revenue. 

source: Nasdaq 


Operating income from theme parks, hotels and merchandise might range as high as 37 percent. 


source: Valuewalk


The point is that direct-to-consumer, led by the streaming networks, should ultimately produce significant cash flow and operating income for Disney. The issue for some is how that cash flow and income might affect Disney's value, and whether some different legal status for direct-to-consumer might affect the value of that unit. 


To be sure, some are unsure Disney streaming businesses can approach the size of Netflix, which already has perhaps 201 million subscriptions. All Disney streaming properties collectively might reach about 100 million subscriptions. 


So the question is whether, someday, the value of Disney streaming is such that those assets would fetch a higher valuation if independent, as is Netflix.


Not Even FTTH Might Propel Non-Cable ISP Gigabit Share

One has to remain impressed with the commercial success of the hybrid fiber coax platform used by cable TV companies. While most of the rest of the telecommunications world has remained fixated on fiber to the home as the platform of the future, cable operators tweaked an inconsistently-available coaxial cable network into the leading supplier of home broadband connections in the United States and a few other countries. The economics arguably are better than FTTH for brownfield operations, with a more-graceful approach to network upgrades.


In 2016, the cable industry passed about four percent of U.S. homes with networks offering 1 Gbps internet access. By 2018, 80 percent of U.S. homes were able to buy gigabit per second service. 

source: CableLabs 


By way of comparison, all FTTH passings number something more than 50 million. There are about 141 million U.S. homes in total. So FTTH passes roughly 35 percent of U.S. homes. Not all those connections are capable of supplying gigabit connections at the moment, though. 


Assume there are 21 million active FTTH connections in the United States. Assume there are a total 103 million total broadband accounts. According to Openvault, in the third quarter of 2020 about five percent of U.S. customers bought gigabit service. 


That implies a total of no more than 5.15 million U.S. gigabit accounts in service. Assume all internet service providers other than cable operators have 30 percent of those accounts, implying about 1.5 million 1 Gbps ISP accounts sold by all firms other than cable operators. 


That further implies that gigabit FTTH accounts in service represent about seven percent of active FTTH connections. Cable gigabit connections are likely to be closer to five percent of total broadband accounts. 


source: Fiber Broadband Association, RVA 


The point is that we sometimes too casually equate physical media with speed, or physical media with specific speeds, such as gigabit per second connections. Neither FTTH nor HFC directly equate with gigabit service or availability. 


However, cable TV operators do claim about 70 percent of the broadband installed base, possibly representing 3.6 million active gigabit accounts. As early as 2009, at least 75 percent of the fastest U.S. broadband connections were supplied by cable TV operators.  
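The chain of estimates above can be laid out explicitly. A quick sketch, where every input is one of the assumed figures cited in this post rather than measured data:

```python
# Rough gigabit-share arithmetic, using this post's assumed figures.
total_homes = 141e6          # total U.S. homes
ftth_passings = 50e6         # homes passed by FTTH
ftth_active = 21e6           # active FTTH connections (assumed)
broadband_accounts = 103e6   # total U.S. broadband accounts (assumed)
gigabit_take_rate = 0.05     # share of customers buying gigabit (Openvault)
non_cable_share = 0.30       # assumed non-cable share of gigabit accounts
cable_base_share = 0.70      # cable share of the broadband installed base

gigabit_accounts = broadband_accounts * gigabit_take_rate   # about 5.15 million
non_cable_gigabit = gigabit_accounts * non_cable_share      # about 1.5 million
cable_gigabit = gigabit_accounts * cable_base_share         # about 3.6 million

print(f"FTTH passes {ftth_passings / total_homes:.0%} of U.S. homes")
print(f"Gigabit share of active FTTH connections: {non_cable_gigabit / ftth_active:.0%}")
print(f"Cable gigabit accounts: {cable_gigabit / 1e6:.1f} million")
```

Change any one of the assumptions and the seven percent and 3.6 million outputs move accordingly, which is why these should be read as rough estimates, not measurements.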


At the same time, cable HFC platforms arguably have proven more effective than FTTH at generating incremental new revenues for platform owners, while suffering less from demand changes as voice and linear video began to shrink, the former since 2000, the latter since 2012 or so. Telco voice lines have fallen as much as 70 percent since 2000. Cable linear video accounts are down less than 15 percent since about 2012.  


Friday, November 13, 2020

FTTH is the Platform of the Future, But Might "Always Be"

People often forget that, in the communications business, there is no platform that always is best for every use case. What matters are the particular advantages. In other words, it has proven largely pointless to argue whether mobile or Wi-Fi access is “better.” Both have their contributions to make. 


In the fixed networks business, the belief has been--for many decades--that fiber to the premises is the future of the next-generation fixed network. 


With the caveat that there always is a private interest corresponding to every public policy, one cable TV industry vice president decades ago quipped that “fiber is the technology of the future...and always will be.”


Keep in mind, that was said about 40 years ago. And while North American access platforms have a different pattern than in most parts of the world, 40 years later, hybrid fiber coax platforms have about 70 percent share of the U.S. installed base of broadband connections. 


That creates a huge stranded asset problem for brownfield fiber-to-home deployments. Assuming a new FTTH network is deployed at scale, it might find that up to 70 percent of the assets are not generating broadband revenue. 


To be sure, there is still some amount of voice, but the old copper access network works well enough for that application. Investing capital in FTTH does not necessarily improve user experience, value or features for voice customers. 


For decades, though, there was one clear assumed advantage to deploying FTTH: the ability to sell linear entertainment video. So the basic thinking was that FTTH would allow telcos to hold broadband share while losing voice share and gaining video account share. 


It sort of worked that way until about 2012 for video services.

source: Business Insider 


But the “voice” part of the model never worked well at all, as usage, lines sold and value began to drop in the U.S. market as early as 2000, the peak year for telco access lines and long distance revenue. 


But business cases matter, and in the U.S. market the business case for FTTH, always difficult, has become quite challenged with the dominance of cable operators, and their seeming ability to keep pushing commercial bandwidths ahead faster than FTTH platforms. Already, at least 80 percent of U.S. homes have the ability to buy 1 Gbps internet access service, provided by cable operators. Not even most telco FTTH networks can do that, yet. 


Nor does it seem likely the cable cost advantage can be overcome any time soon, as technologists already are working on ways to boost HFC bandwidths to 10 Gbps or beyond, symmetrically. Sure, FTTH can do that as well, but not at the cost per home that HFC can provide. 


And that leads to telco interest in fixed wireless access, using 4G and 5G. The issue is not whether “fixed wireless can match FTTH” in potential speeds. The issue is whether fixed wireless can create a positive business case in areas where a new FTTH build is not financially feasible. 


Globally, matters can be quite different, as it often seems as though only one nationwide broadband fixed network can be supported. Still, in many other countries a mix of platforms is called for, based on home density. And then there is the matter of how people use the internet. 


By 2020, mobile accounted for more than half of all of Internet access revenue in more than 75 percent of countries, researchers at PwC said early in the year. Some analysts noted mobile Internet access revenues already had surpassed fixed network broadband revenue as early as 2013 or 2014.


That trend is expected to continue. By 2024, consultants at PwC say, mobile revenue will account for 68 percent of global Internet access market revenues. In other words, more than two thirds of all internet access revenue globally will be generated by mobile internet access. 


source: PwC 


In many countries, FTTH is perhaps perpetually the next-generation platform. In some countries, though, a mix of platforms is likely for decades. In a few countries, FTTH seems to be infeasible for consumer accounts, if deployed by telcos on a ubiquitous basis. Hence the interest in mobile access, or fixed access supplied over the mobile network.


Thursday, November 12, 2020

U.S. Gigabit Take Rates Pass 5% for First Time

The percentage of U.S. fixed network broadband subscribers buying gigabit-speed connections surpassed five percent for the first time in the third quarter of 2020, reaching 5.6 percent, according to Openvault. That represents an increase of 124 percent from the third quarter of 2019, when take rates for gigabit services were 2.5 percent.


Gigabit service take rates reached 4.9 percent in the second quarter of 2020, says Openvault. Compare that to availability of gigabit services, which reach at least 80 percent of U.S. homes, counting only cable TV service provider facilities.


source: Openvault 


On the other hand, buyers of service at 10 Mbps and below also increased by 41 percent, from 4.1 percent to 5.8 percent, quarter over quarter, Openvault says. “Growth at the lower end tier may be pointing to subscribers looking to save money with lower end, lower cost broadband tiers,” Openvault says.
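Both growth percentages follow directly from the cited take rates; a quick check of the arithmetic:

```python
# Verify the growth figures against the cited take rates.
def pct_change(old: float, new: float) -> float:
    """Percentage change from old to new."""
    return (new - old) / old * 100

# Gigabit take rate: 2.5% in 3Q 2019 to 5.6% in 3Q 2020.
print(round(pct_change(2.5, 5.6)))  # 124

# 10 Mbps-and-below tier: 4.1% to 5.8%, quarter over quarter.
print(round(pct_change(4.1, 5.8)))  # 41
```

Note that both figures are growth in the take rate itself, not in subscriber counts, though with a roughly stable subscriber base the two track closely.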


Openvault says the “average” U.S. household (no idea whether this is median or mean) uses 384 GB per month, with average speeds downstream of 170 Mbps, upstream of 13 Mbps. 

source: Openvault

Half of all customers buy services providing downstream speeds between 100 Mbps and 400 Mbps. Nearly 30 percent of U.S. customers buy internet access at speeds ranging from 20 Mbps to 75 Mbps.

So it remains fair to say that there is a big difference between speeds consumers can buy, and speeds they actually do buy. Supply is one matter; demand another.

Impact of Covid on Telecom, IT Spending Not Yet Completely Clear

When all the data is available, it is likely we will find that the Covid-19 related work-from-home and other measures reduced global service provider revenues a couple of percentage points, though some firms might have seen far-worse hits, and some see higher revenues. 


Overall, full-year global results might show less than one percentage point of slippage in service provider revenue, IDC predicts. 


source: IDC 


And in many cases, it will be hard to separate specific Covid-19 impact from the underlying trend of declining growth. Something similar might happen for information technology spending, even for cloud services which have been on a steady growth path.


Though it might seem obvious that information technology executives increased cloud computing spend to support remote workers during the Covid-19 pandemic, it is not yet so clear how much spending actually might have changed. It is possible we might actually see spending fairly close to what had been predicted prior to the pandemic. 


An October 2020 survey of 230 information technology professionals by OpsRamp finds 60 percent of firms increased IT budgets while 22 percent reduced spending in the second and third quarters of 2020. Some 63 percent of respondents reported accelerated or maintained digital transformation initiatives because of Covid-19.


The survey included IT professionals in the United States and United Kingdom working for firms with at least 500 employees and $5 million in annual IT budgets. 


Reported priorities for IT leaders included information security and compliance (59 percent), remote work and collaboration (55 percent), public and multi-cloud infrastructure (50 percent) and monitoring and management (42 percent). 


Among capabilities acquired were artificial intelligence for IT operations (57 percent); digital experience monitoring (50 percent) and network performance monitoring and diagnostics (50 percent). 


A Computer Economics survey in August and September 2020 finds a split pattern of information technology operations spending: 30 percent increasing spending while 29 percent were decreasing. About 41 percent of respondents say there has been no change in spending. 


Back in April and May 2020, 41 percent of respondents reported unchanged spending. But 30 percent already had made moves to reduce IT budgets in response to Covid-19 work-from-home rules.


Fig. 1: Organizations Planning Operational Budget Changes

source: Computer Economics


The 30 percent of respondents who reported cutting spending seems not to have budged much. In the fall, the percentage of respondents reporting lower spending was still 29 percent, substantially the same as the 30 percent who reported cuts in April and May 2020. 


What changed was the percentage of respondent firms that boosted spending, probably for hardware, software or services related to supporting remote workers. “Of course, the big reason for increasing budgets is to respond to the increased need for employees to work from home,” said David Wagner, Computer Economics senior research director. 




It is hard to separate out the effect of Covid-19 remote work on cloud computing, which had been growing robustly before the work-from-home mandates were imposed. And even some firms that one might assume boosted cloud spending to support at-home workers eventually reduced spending from surge levels. 


Some studies suggest cloud revenue growth from 2020 to 2021 will be about 12.5 percent, perhaps lower than many would have expected, even under normal circumstances. The only issue is whether the work-from-home rules affected cloud computing uptake rates. And that might ultimately be a different matter from earlier expectations that cloud computing among smaller businesses, for example, would increase because of the pandemic. 


To some degree, the changes might hinge on whether customers are active about turning off or reducing payment for unused cloud services. It also will matter how much aggregate demand changed when workers shifted from offices to homes. In principle, that might have shifted the location of consumption, but not the volume of use. 


Though early in the pandemic many expected increased reliance on cloud computing, beyond growth already expected, we will have to wait and see what actually transpired. Covid impact might ultimately turn out to have been less of a change inducer than we expected. 

Tuesday, November 10, 2020

Alphabet's Taara to Deploy Free Space Optics in Kenya

Alphabet’s Project Taara is now working with Econet and its subsidiaries, Liquid Telecom and Econet Group, to commercially deploy free space optics trunking systems in Sub-Saharan Africa. The initial use cases will involve signal trunking across rivers or transmission through any other areas where it would be expensive, difficult or dangerous to lay physical cables, or where microwave radio would be the logical alternative.


source: Alphabet 


Taara’s links will begin rolling out across Liquid Telecom’s networks in Kenya first.


A single Taara link can cover distances up to 20 km and can transmit bandwidth of up to 20 Gbps or more, Taara says. The technology is a point-to-point trunking approach originally investigated to support Alphabet’s Project Loon internet access system using balloons.


In that use case, the problem to be solved was creation of data links between balloons that were flying over 100 km apart.


But free space optics has been proposed as a point-to-point trunking solution for decades in niche applications such as sending video security camera signals from remote locations to monitoring stations. 


The deployments illustrate how some technologies take decades to reach even relatively limited commercial use in the connectivity business.

When a Bug is a Feature

Sometimes a bug can be a feature. Consider atmospheric attenuation of radio frequency signals at 60 GHz. At that frequency, oxygen in the atmosphere attenuates about 98 percent of the emitted energy. That limits coverage. 


So it might be paradoxical that Terragraph is working to commercialize use of 60-GHz frequencies for internet access. At the same time, many internet access providers say they are going to use fixed wireless for internet access. 

source: RF Globalnet


One advantage of 60-GHz spectrum, in the U.S. market and in most countries, is that the spectrum is available for unlicensed use. That has huge business case implications for service providers compared to the alternative of building fiber-to-location networks. 


In the U.S. market, where consumer internet access is dominated by cable TV companies, fixed wireless might be the main way telcos can compete with cable offers. When the cable operator has about 70 percent of the installed base, the fiber to home cost per passing or cost per customer is a tough business case. 


In addition to there being no requirement to pay for a license to use the spectrum, the bandwidth is huge: about 14 GHz of capacity in the 57 GHz to 71 GHz band is available for U.S. unlicensed use. 


Another advantage is security. Because signal attenuation is so high, stray signals are confined. In an indoor Wi-Fi deployment, the signals will not often “leak” out of the building. In an outdoor access deployment, the signals can be highly directional, reducing the chances of accidental or purposeful unauthorized signal reception. 


There are interference protection advantages as well. In an indoor Wi-Fi deployment (local area network), adjacent 60-GHz transmitters are less likely to interfere with each other, interference that reduces experienced speeds and capacity. 


The same goes for line-of-sight, or near line-of-sight 60-GHz access transmitters: signals from one radio are less likely to interfere with signals from other transmitters.


Frequency reuse opportunities also are quite high, again a function of the “bug” of extremely-high signal attenuation. In the same way that small cells boost the intensity of usage for any specific block of spectrum, so 60-GHz signal attenuation allows a high degree of “cellular” frequency reuse. 

source: RF Globalnet


Though high signal attenuation would seem to be a major bug, it might also be a key feature for 60-GHz frequency internet access. 


Monday, November 9, 2020

No, Covid-19 Has Not Been Good for the Telecom Business

An argument I have frequently heard, from professionals in the telecom industry, is how the Covid-19 pandemic “must” be good for service provider revenues, since so many people were forced to work from home, and therefore were using more telecommunications. 


The reality is that revenues are down, as you would otherwise expect in a situation where whole economies are nearly shut down. The functional result is a recession, and recessions cause lower service provider revenue. 

 

source: IDC 


Also, lockdowns keeping people at home hit roaming revenue, and given that mobile services generate a clear majority of global service provider revenue, that has hit both revenue and earnings. 


Add to that bankruptcies of significant numbers of small businesses and organizations, and aggregate demand is reduced.


All of that happens within the context of a global business experiencing slow to no growth. 


source: IDC 


The point is that even professionals who work in the industry can have distorted views of what is happening in the business, especially the impact of the pandemic and economic lockdowns.


On the Use and Misuse of Principles, Theorems and Concepts

When financial commentators compile lists of "potential black swans," they misunderstand the concept. As explained by Nassim Taleb ...