Sunday, July 11, 2021

Rules of Thumb About Mobile Capacity Expansion Might be Changing

Some changes to the connectivity business model are obvious; others more subtle. The ubiquity of mobile services is obvious, as is the growth of internet access and the waning of fixed network voice and entertainment video.

But other changes happen over such long periods of time that a generation or two can live with a new reality without noticing the differences. There was, for example, a time when the internet did not exist; when PCs and mobile phones did not exist. 

Less obviously, the ways mobile network capacity gets created have changed. Some of those ways reduce capital investment and operating costs. 

Historically, mobile operators have created more capacity on their networks in three ways: acquiring new spectrum, using more spectrally efficient technologies and moving to smaller cell sizes. In the 4G era a fourth tool emerged: using unlicensed spectrum to offload traffic to local networks.


Buying additional spectrum and shrinking cell sizes obviously increase capex. Shrinking cell radii 50 percent quadruples the number of cells, for example. Deploying new radios and using new modulation schemes arguably is relatively neutral as a cost driver.
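
A quick geometry check makes the cell-splitting cost driver concrete. This is a minimal sketch, assuming idealized circular cells of uniform radius; real cell plans are irregular and overlapping, but the scaling holds:

```python
import math

# Cells needed to cover a fixed area scale with 1/r^2 for cells of radius r.
AREA_KM2 = 100.0  # an assumed coverage area

def cells_needed(radius_km: float) -> float:
    return AREA_KM2 / (math.pi * radius_km ** 2)

before = cells_needed(1.0)  # cells with a 1 km radius
after = cells_needed(0.5)   # radii shrunk by 50 percent
print(after / before)       # 4.0: halving the radius quadruples the cell count
```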


Use of unlicensed spectrum, on the other hand, clearly reduces both capex and operating expense. The spectrum does not have to be bought; the radios do not have to be installed or operated; and third parties pay for energy consumption.


5G brings advances in using unlicensed spectrum, particularly the ability to aggregate available unlicensed spectrum with licensed spectrum resources.


Prior to the 4G era, it can be argued, smaller cell sizes and advances in radio technology and modulation created more usable capacity than new spectrum allocations did. But widespread Wi-Fi offload has changed the toolkit. Wi-Fi offload might account for 30 percent to 40 percent of customer data consumption.


During the Covid pandemic, the percentage of consumption shifted to Wi-Fi certainly was much larger than that. In the 5G and succeeding eras, the ability to aggregate unlicensed spectrum with licensed spectrum will be an important new source of effective capacity.


source: Science Direct 


It is not yet clear how well that pattern will hold up in the 5G and coming eras. Though both network densification (smaller cells) and new spectrum resources will be applied, in addition to better radio technology and more advanced signal modulation, the new spectrum being allocated is discontinuously larger than past allocations.


From 1947 to 2017, allocated mobile spectrum doubled about every 8.6 years. The 5G auctions have broken that pattern.
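
For a sense of scale, the growth implied over that 70-year span can be checked with a few lines of arithmetic, a sketch assuming a constant 8.6-year doubling time:

```python
years = 2017 - 1947            # 70 years
doublings = years / 8.6        # about 8.1 doublings
print(round(2 ** doublings))   # roughly a 280x increase in allocated spectrum
```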


In large part, new spectrum allocations have been relatively small and incremental. The allocations for 5G are discontinuously larger, involving both larger amounts of spectrum per auction and also much more effective bandwidth per unit. 


Simply, capacity is related to frequency: the higher the frequency, the wider the available channels, and the higher the potential bandwidth.


source: Lynk 
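
The underlying relationship is the Shannon-Hartley limit, in which capacity scales directly with channel bandwidth. Here is a minimal sketch; the channel widths and the signal-to-noise ratio are assumptions chosen only for illustration:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + SNR), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

SNR = 100  # 20 dB, an assumed signal-to-noise ratio

low_band = shannon_capacity(10e6, SNR)   # an assumed 10 MHz low-band channel
mm_wave = shannon_capacity(400e6, SNR)   # an assumed 400 MHz millimeter-wave channel
print(f"{low_band / 1e6:.0f} Mbps vs. {mm_wave / 1e9:.2f} Gbps")
```

The wider channels available at higher frequencies, not the frequencies themselves, are what carry the extra capacity.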


Spectrum auction behavior also shows that price per unit decreases as frequency increases, with several drivers at work. Higher-frequency spectrum simply involves more capacity per unit, but also requires more-expensive (denser) networks. So spectrum value is partly the result of expected costs to deploy networks using that spectrum.


Historically, the highest prices were obtained for spectrum with good coverage capabilities, hence lower infrastructure cost. Business models also play a role. The problem mobile internet service providers face is that customers require more bandwidth every year, but are generally only willing to pay the same amount.


source: Lynk 

So additional bandwidth is a cost of remaining in business, not necessarily a driver of incremental revenue. Also, relative scarcity plays a role in setting value and prices per unit. Low-band spectrum was the most scarce. Mid-band spectrum is less scarce and high-band (millimeter and above) is relatively plentiful. As always, scarcity increases prices. Abundance reduces prices.

The point is that the traditional rules of thumb about how mobile network capacity gets increased might have changed. Better radios and modulation, new spectrum allocations and smaller cells still are three ways capacity gets increased.


But use of unlicensed network capacity has become a fourth tool. Even if, historically, smaller cell sizes have driven most of the capacity increase, there will be more balanced improvements in the future, relying much more on the use of additional spectrum, licensed and unlicensed. 

Saturday, July 10, 2021

Methodology Matters

Most of us--at least when it suits our purposes--believe decision making is enhanced by the availability of good data. And most of us likely would agree that methodology matters when gathering data. 


So notes Ookla in reviewing data on broadband speeds described in a recent report.  “Our concern with the rest of the report is that the network performance test results the report was derived from painted an inaccurate picture of what constituents were actually experiencing in the district.”


“The results presented greatly underestimated the speeds being delivered by the service providers throughout most of the study area while overestimating some others,” said Ookla, which compared its own data with that supplied by M-Lab in the report. 


“The speeds measured by Speedtest for the same areas and the same time period are dramatically higher in most areas, indicating that additional infrastructure investments are unnecessary where constituents can already achieve network speeds that meet FCC minimums,” said Ookla. 


There is more than one way to calculate an average.  The “mean” average is the sum of all measurements divided by the number of records used. “This number is valuable, but it can be influenced by a small portion of records that may be extremely high or low (outliers),” said Ookla. “As fiber is installed within an area, a significant number of tests from ultra-high-speed connections can skew mean averages up.”
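
The skew is easy to demonstrate. A minimal sketch with hypothetical speed-test results for a single ZIP code, mostly connections near 25 Mbps plus two new fiber subscribers:

```python
from statistics import mean, median

tests_mbps = [22, 24, 25, 26, 28, 30, 940, 950]  # hypothetical results

print(round(mean(tests_mbps), 1))  # 255.6: pulled up by the two fiber outliers
print(median(tests_mbps))          # 27.0: closer to the typical experience
```

Medians resist outliers, which is why the comparisons that follow are stated as medians.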


The opposite also can occur. “M-Lab vastly under-reported the network throughput in every single ZIP code represented in the congressional report,” Ookla said. 


“The ZIP code showing the least amount of difference by percentage between Ookla and M-Lab data was 13803 (Marathon) where M-Lab’s recorded median was 5.5 Mbps and the median from Ookla data was 14.5 Mbps,” Ookla noted. “So the typical speed in Marathon measured by Ookla’s Speedtest was over two and a half times as fast as the average measurement captured by M-Lab.”


“On the other end of the scale, in Whitney Point, M-Lab’s recorded median was 0.9 Mbps while Ookla measured a median of 71 Mbps, almost eighty times faster,” the firm said. 


“It is clear from these results that M-Lab’s performance test does not measure the full capacity of a network connection and thus does not accurately reflect the real-world internet speeds consumers are experiencing,” said Ookla. 


“These disparities in measured speed generally arise because some network data providers have low user adoption among consumers, limitations in their testing infrastructure, questionable testing methodologies, or inadequate geolocation resources to precisely locate where a given test was taken,” said Ookla. 


The B2B Sales Journey Has Changed

The business-to-business buyer journey has changed. As in the past, B2B transactions remain complex, involving multiple influencers and decision makers and requiring many rounds of research, evaluation and stakeholder engagement.

 

The Covid pandemic, when person-to-person meetings were largely impossible, has streamlined the B2B purchase journey. There is less distinction between marketing and sales. Timelines often are compressed. Buying authority is more decentralized, as “computing as a service” can be bought with a credit card. 

 

Buyers still must identify the business need, research solutions, evaluate options and reach a decision. But buyers are doing more of that online and on their own.


Enterprise sales have in the past largely relied on field sales. But change is happening. Perhaps a third of business-to-business buyers might be willing to conduct fully-virtual transactions for new products up to a value of approximately USD 500,000, according to a McKinsey report. 


And marketplaces, ecosystems and platforms can make a huge difference. PCCW Global, using a system automated from sales through settlements, “gained over 800 customers in the last 18 months, with growing traction, without any actual sales contact,” said Marc Halbfinger, PCCW Global CEO. 


“We don’t even have to know who the customer is,” he added. Sales come from third parties or online, direct from the trading platform PCCW Global uses. 


B2B sales have evolved as virtual marketing, sales, fulfillment and settlement evolve, using artificial intelligence and other digital tools. Those themes, and many more, are featured in a PTC Webinar Series: Frictionless Business™ episode on How B2B Sales Will Change, Post Covid.




Featured panelists included:

  • Matt Bramson, Founder & Managing Partner, Cloud Strategy Solutions, USA

  • Marc Halbfinger, Chief Executive Officer, PCCW Global, Hong Kong SAR China

  • Nancy Ridge,  Founder & President, Ridge Innovative, USA

  • Elmar Rode, Director Communications Industry Strategy Group, Oracle, Germany

  • Gary Kim, IP Carrier principal, acted as moderator


Available to PTC members on 12 July 2021, the episode will appear on YouTube in about 30 days. Other episodes in the series already are available for immediate viewing.

Hospitality Industry Changes, but Phone Systems Almost Do Not Matter

Most observers expect changes in the hotel and lodging experience as a result of the Covid pandemic that will last beyond the pandemic’s end. Various forms of “contactless” experience--ranging from keyless room entry to contactless check-in to an end to daily room cleaning--are among the expected changes. But supply chains, service elements and staffing levels are likely to be affected as well.  


And many expect cost-cutting measures to develop as well, given the slow travel rebound. Contactless experience will be among the ways lodging providers cut costs. 


Technology and analytics will be more important as human support and interactions are minimized.  


Other costs will be difficult to rein in, and might also not provide much upside to the operating cost or revenue models. 


In-room phone systems in the lodging industry are likely among the necessary costs of doing business, though few guests seem to use them. And some hotels have a line item revenue gain from in-room phones that most travelers consider a tax, not an amenity. 


source: PXF Hospitality Research


Of course, lodging establishments require phone systems for other reasons, including taking reservations. 

 source: PXF Hospitality Research


“From 2015 through 2019, total (hotel) operating expenses increased at a compound average annual growth rate (CAGR) of 2.2% at the properties in our study sample,” notes CBRE. “During this same period, the hotels’ cost for telecom service increased at a CAGR of 9.7 percent.”


“Individually, the cost of phone service rose by a CAGR of 5.7 percent, while the cost of internet service increased at an average annual pace of 16.1 percent,” says CBRE. 


“The 9.7 percent combined CAGR for telecommunications cost is more than three times the CAGR for any other individual hotel department cost during the same five-year period,” CBRE says. Costs grew faster than that at upper-midscale (CAGR 21.5 percent) properties and upscale (CAGR 13.9 percent) hotel chains. 
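
For readers who want to check the math, a CAGR is just a compound growth rate solved for the annual factor. A sketch using illustrative index values (2015 = 100), treating 2015 through 2019 as four years of growth:

```python
def cagr(begin: float, end: float, years: int) -> float:
    """Compound average annual growth rate."""
    return (end / begin) ** (1 / years) - 1

telecom_2015 = 100.0
telecom_2019 = telecom_2015 * (1 + 0.097) ** 4   # 9.7% CAGR, compounded
print(round(telecom_2019, 1))                    # 144.8: costs up ~45% overall
print(round(cagr(telecom_2015, telecom_2019, 4) * 100, 1))  # recovers 9.7
```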


On the other hand, phone system expenses are a small part of total operating cost: less than half a percent. 

 source: CBRE


source: CBRE


Friday, July 9, 2021

Travel Between U.S. and Asia Remains Mostly Closed

Though life in the United States is normalizing, normal travel between most of Asia and the United States remains difficult to impossible. Major travel restrictions remain in place across most of Asia when it comes to travel to and from the United States. 


source: Skyscanner 


Small Business Broadband Gaps?

What is true of U.S. consumers also is true of small businesses: some percentage of locations do not have access to internet service at the defined minimum speed of 25 Mbps.


According to two recent surveys by the National Federation of Independent Business and Google, around eight percent of U.S. small businesses, or about two million to three million, lack access to broadband, says the Government Accountability Office.


According to FCC’s 2021 Broadband Deployment Report, as of year-end 2019, about 96 percent of the U.S. population had access to broadband at FCC’s established minimum speed benchmark of 25/3 Mbps. But that of course leaves four percent of the population that cannot buy service at the defined minimum. 


source: GAO 


As always, it is rural areas that are most underserved. At least 17 percent of rural Americans lack access to broadband at speeds of 25/3 Mbps, compared to only one percent of Americans in urban areas. 


Also, a significant percentage of consumers who can buy service do not do so. About 31 percent of people nationwide who have access to broadband at speeds of 25/3 Mbps have not subscribed to it, GAO says. 


A Google-sponsored survey of small businesses with fewer than 250 employees found that eight percent of small businesses reported “poor internet access” as a barrier to improving digital engagement, GAO says. 


Assuming 32 million smaller businesses, this represents around two million to three million small businesses that potentially lack adequate access to broadband. The caveat, of course, is that some smaller businesses might not need broadband connectivity. 
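
The arithmetic behind that range is straightforward; the 32 million base is the assumption doing the work:

```python
small_businesses = 32_000_000  # assumed count of U.S. small businesses
share_affected = 0.08          # the ~8 percent reporting poor internet access
print(f"{small_businesses * share_affected:,.0f}")  # 2,560,000: "two to three million"
```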


A nationally representative survey of rural small businesses, sponsored by Amazon and the U.S. Chamber Technology Engagement Center (C-TEC), found that approximately 20 percent of rural small businesses were not using fixed network broadband.


That does not mean those businesses could not buy, only that they did not buy. About five percent were using the internet with a dial-up connection. That is not necessarily because they could not buy broadband; some might simply prefer dial-up. That could be the case for some retailers, for example, who use internet access only to process credit card and debit card charges. 


Also unclear are instances where mobile data access was used in place of fixed broadband. 


That noted, there likely are ever fewer businesses that can run their operations without broadband access.


Thursday, July 8, 2021

Back to Single-Product or Single-Purpose Networks?

One result of lower U.S. consumer interest in buying fixed network voice services or linear video subscriptions is a waning of the value of bundled service packages and an increase in single-product subscriptions for internet access on fixed networks. 


 

source: Parks Associates 


Oddly, that is something of a return to the application-specific networks of old. Fifty years ago, all networks were application specific: TV and radio broadcast, cable TV, satellite, telco and mobile networks.


That has implications for facilities-based providers of fixed network services, as the financial upside from consumer fixed networks increasingly relies on broadband internet access, with dwindling contributions from voice or video subscriptions. 


That makes the payback model harder, as in its heyday service providers could hope to sell two or three services per account. So where a triple-play account could generate $200 per line, broadband-only accounts generate $40 to $100 per line. 
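
A rough payback sketch shows why the loss of the bundle matters. All the figures here are assumptions chosen for illustration: the per-home connection capex, the contribution margin and the monthly revenue levels cited above.

```python
CAPEX_PER_HOME = 1000.0  # assumed cost to connect one home with fiber
MARGIN = 0.40            # assumed contribution margin on service revenue

def payback_months(monthly_revenue: float) -> float:
    return CAPEX_PER_HOME / (monthly_revenue * MARGIN)

print(round(payback_months(200), 1))  # triple play at $200 per month: 12.5 months
print(round(payback_months(60), 1))   # broadband only at $60 per month: 41.7 months
```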


As difficult as the fiber-to-home decision was in the days of triple-play leadership, such decisions now must turn on other values, such as access networks supporting business customers, edge computing or cell site backhaul. 


That is especially true in markets where two or three suppliers operate. In Knoxville, Tenn., for example, Xfinity covers 91 percent of locations; AT&T covers 81 percent of locations and WoW covers an additional 36 percent of locations. All three internet service providers sell gigabit speed services. 


source: Broadband Now 


The big difference is that AT&T sells symmetrical service, an attribute that could well be the key to further gains, for AT&T as well as other telcos. 


Can Broadband Definitions be Changed in a Platform-Neutral Way?

In principle, broadband speeds will keep increasing almost indefinitely. A reasonable projection is that headline speeds in most countries by 2050 will be in the terabits-per-second range. 


Though the average or typical consumer does not buy the “fastest possible” tier of service, the growth of headline tier speed since the time of dial-up access has been remarkably steady. 


And the growth trend--50 percent per year speed increases--known as Nielsen’s Law--has operated since the days of dial-up internet access. Even if the “typical” consumer buys speeds an order of magnitude less than the headline speed, that still suggests the typical consumer--at a time when the fastest-possible speed is 100 Gbps to 1,000 Gbps--still will be buying service operating at speeds not less than 1 Gbps to 10 Gbps. 


Though typical internet access speeds in Europe and other regions at the moment are not yet routinely in the 300-Mbps range, gigabit per second speeds eventually will be the norm, globally, as crazy as that might seem, by perhaps 2050. 


The reason is simply that the historical growth of retail internet bandwidth suggests that will happen. Over any decade period, internet speeds have grown 57 times. Since 2050 is three decades off, headline speeds of tens to hundreds of terabits per second are easy to predict. 

source: FuturistSpeaker 
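
The projection follows mechanically from the 50-percent-per-year rate. A sketch, assuming (hypothetically) a 2-Gbps headline tier in 2021 and uninterrupted Nielsen’s Law growth:

```python
print(round(1.5 ** 10, 1))  # 57.7: the ~57x growth per decade

headline_2021_gbps = 2.0    # an assumed 2021 headline tier
headline_2050_gbps = headline_2021_gbps * 1.5 ** (2050 - 2021)
print(f"~{headline_2050_gbps / 1e3:.0f} Tbps headline tier by 2050")
```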


Some will argue that Nielsen’s Law cannot continue indefinitely, as most would agree Moore’s Law cannot continue unchanged, either. Even with some significant tapering of the rate of progress, the point is that headline speeds in the hundreds of gigabits per second still are feasible by 2050. And if the typical buyer still prefers services an order of magnitude less fast, that still indicates typical speeds of 10 Gbps to 30 Gbps or so. 


Speeds of a gigabit per second might be the “economy” tier as early as 2030, when headline speed might be 100 Gbps and the typical consumer buys a 10-Gbps service. 


source: Nielsen Norman Group 


So there is logic to altering minimum definitions over time, as actual usage changes. In fact, typical speeds might be increasing faster than anticipated. 


Also, there is a difference between availability and actual customer demand. Fewer than 10 percent of potential customers who can buy gigabit service actually do so, at the moment, in the U.S. market. 


Perhaps 30 percent of customers buying service at 100 Mbps or higher believe it is more than they need. Perhaps 40 percent of customers buying gigabit services believe it is more than they need. 


The issue is that no definition can be technologically neutral, either for upstream or downstream speeds. Since 75 percent to 80 percent of U.S. customers already buy fixed network service at speeds of 100 Mbps to 1,000 Mbps, a minimum definition of “broadband” set at 100 Mbps is not unreasonable. 


The problem is that some platforms, including satellite, fixed wireless and digital subscriber line, will have a tough time meeting those minimums. 


Cable operators are virtually assured they can do so with no extra effort. 


Upstream bandwidth poses issues for most platforms other than modern fiber-to-home platforms, if the definition were set at 100 Mbps upstream, for example. 


It’s tricky, and almost impossible to reset broadband definitions in a platform-neutral way.


You Cannot Manage What You Do Not Measure

The adage that we cannot manage what we cannot measure is fair enough. So is the adage that "you get what you inspect, not what you expect."


Different eras in the connectivity business tend to bring distinct concepts and terms to the fore, and the changes in terminology reflect new business issues as much as efforts to measure progress in meaningful ways or enhance perceived firm value. 


Prior to the 1980s, a public company’s value might more often be stated in terms of equity valuation. Two decades later, “enterprise value” was more common--and useful--for startups and potential acquisition targets. Enterprise value is equity value plus debt, and provides a snapshot of the price to acquire a firm. 


In the 1980s, “revenue per line” still made good sense for consumer accounts. Most lines were producing revenue and revenue per account was still simple: voice only, and one line per house. Simply, consumer “revenue per line” and “revenue per account” were virtually identical. 


None of that makes sense in any market where serious facilities-based competition exists. 


Twenty years later, that metric was not useful. There was a significant difference between deployed assets and take rates: an active line passing a location might, or might not, have a paying customer. So “revenue per line” and “revenue per account” were no longer comparable. 


After U.S. cable companies started selling broadband access and voice, while telcos sold video entertainment and broadband, matters were even more complicated. A single house might buy voice, video entertainment or broadband, in any combination. 


Revenue contribution per product, and profit margin per product were distinct. Also, the question of how to define revenue contribution from any single location became complicated. So the concept of “revenue generating unit” arose. A house buying two services represented two RGUs. 


That concept replaced the concept of counting “subscribers.” RGU is a measure of units sold; a subscriber is an account. 


Similar issues arose with the concept of average revenue per user, in the mobile business especially, when the majority of accounts served multiple users on the same account. Average revenue per account and ARPU diverged. 


In the business segment, as competitive local exchange carriers emerged, upstarts wanted some way to describe sales in ways that generated larger numbers, since sales often were largely T-1 data lines. So the concept of the “voice grade equivalent” arose, representing the virtual number of voice grade circuits sold, at 64 kbps each. A single T-1 line represented 24 VGEs. 
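
The conversion is simple arithmetic, sketched below; a T-1’s 1,544-kbps line rate is the 24 voice channels plus 8 kbps of framing overhead:

```python
VGE_KBPS = 64     # one voice grade equivalent (a 64 kbps DS0 channel)
T1_CHANNELS = 24  # DS0 channels per T-1

payload_kbps = T1_CHANNELS * VGE_KBPS
print(payload_kbps)  # 1536 kbps payload; line rate is 1544 kbps with framing
```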


Undersea networks and other long-haul transport networks use dark fiber and lit fiber as metrics. 


Now Telstra uses a metric called “transacting minimum monthly commitment” (TMMC). It applies to postpaid mobile accounts.  TMMC is regarded as a lead indicator of ARPU trends, since actual ARPU might vary based on roaming revenue, handset revenue and out-of-bundle revenue. 


TMMC might be considered a measure of service contract value. 


The point is that new metrics reflect changes in the core business. Deploying lines or capacity is different from sales of capacity. Accounts and units sold are different. Revenue for products varies, as does profit margin. Contract value provides insight in a way that total revenues might not. 


Nearly all changes in terminology reflect changes in business models.


Wednesday, July 7, 2021

Small Firm Execs Say Work from Home Harmed Productivity

Big firms behave differently than small ones, in almost every respect. Big firms often can support and even benefit from regulations that add cost to running a business. Small firms have almost no ability to leverage regulatory overhead for business advantage. 


Large businesses can survive a significant number of underperforming employees. Few small businesses can do so. 


Some large businesses believe work at home for most employees will not harm productivity. But 45 percent of U.S. small business employers say employee productivity decreased while working from home, according to a survey conducted by Digital. 


source: Digital 


B2B Sales Journeys Now Begin Online

No matter how long a business-to-business sales process takes, perhaps 60 percent to 70 percent of potential business product buyers have conducted research on solutions to their problems before they contact a sales professional.  


In other words, “B2B customers today progress more than 70 percent of the way through the decision-making process before ever engaging a sales representative,” according to some studies.


The B2B sales journey in 2020 began online, with a search, about 90 percent of the time, according to Forrester Research; Google likewise says search is involved 90 percent of the time.  


In 2020, 80 percent of the average B2B buyer journey took place online, and whatever statistics you choose to believe, online research is a huge part of the buyer journey.  


The shift to remote selling, intensified by the pandemic, now encompasses 96 percent of B2B sales teams, according to McKinsey.


That includes checking a supplier’s website, doing online searches and reading user reviews, according to Marketing Charts. Some 94 percent of all potential business customers conduct online research before making any purchases valued at more than $100,000, for example.  


source: Marketing Charts 


Accenture notes that business buyers also increasingly expect a “consumer online” experience when buying B2B. In 2014, for example, about 40 percent of prospects conducted online research for a majority of goods priced under $10,000, and 31 percent did so at that frequency for goods costing at least $100,000.  


More importantly, perhaps, many B2B transactions take place online as well. 


source: Marketing Charts 


That remained true in 2020. More than half of prospects reported finding solutions online. And 70 percent of buyers defined their solutions before they ever contacted a sales professional. 


source: Demand Base 


Indirect Monetization of Language Models is Likely

Monetization of most language models might ultimately come down to the ability to earn revenues indirectly, as AI is used to add useful fe...