Sunday, July 11, 2021

Connectivity Networks are Becoming Computer Networks: What That Could Mean

“5G represents a paradigm shift, where the telecom industry is now taking substantial steps towards using the same building blocks as the IT industry,” says Ericsson. That is another way of saying telecom networks are becoming computer networks. 


And just as networking is organized using a layered model, business processes might be organized in layers as well. 


source: Lifewire 


Think of the analogy to the ways all lawful applications run on IP networks: they use access supplied by third parties, with no required business relationship between the app providers and the access providers. 


To be sure, one entity might own both the transport network and the app, but that is not required. Google owns YouTube, Google search and Google Maps, which in part are transported over Google’s own global IP network. But common ownership is not required.


In the same way, telcos and cable TV companies own some lead apps, and also own access networks. But the relationship is not mandatory: they could own apps as well as networks, and those apps could be delivered over third-party networks as well as their own. 

source: Ashridge on Operating Models


The point is that business operations are supported as layers on top of transport network layers. But those business and transport functions are logically separated. Ownership also is logically separated. 


In the future, that might allow different ways of structuring connectivity provider operations. In a sense, the way Comcast operates its theme parks, content studios and programming networks separately from its access networks provides an example. 


Each of those businesses runs independently of the access networks, though all have common ownership. 


source: Illinois Department of Innovation and Technology  


All that might have profound implications for the ways tier-one connectivity providers run their businesses. Connectivity providers run networks to support their core revenue-generating applications: broadband access, voice, business networks and content. 


As a practical matter, the network-operating functions increasingly are logically distinct from the application functions, as a Wi-Fi network is distinct from the apps using it. Perhaps the layers are not quite as distinct as they would be at Google or Facebook, where the app creation and business functions are logically distinct from the ownership and operation of core networks. 


But the principles are the same: all modern computer networks are based on separation of functions, with the logical separated from the physical, higher layers isolated from lower layers and applications separated from networks. 


The obvious implication is that, over time, connectivity operations will more closely mirror the way all other networks work: transport functions separated from application functions; network functions logically separated from application, use case and revenue models. 


Historically, connectivity providers have bundled their core app or apps with construction and use of the network. In the future, as computer networks, those relationships could change. 


Already, any broadband access network allows lawful apps to be run on the connectivity network, with no business relationship between app owner and network owner. In the future, that might be further developed.  


The perhaps-obvious line of development is to further isolate business operations from the network, as Google’s YouTube, search, messaging, maps, Android and other business units are separated from the operation of Google’s own network. 


source: CB Insights


Assume a future where whole businesses (Google Maps, search, Android, Nest, Chromebook; Verizon mobility, voice, internet access, enterprise and business operations) are run independently of the transport and access networks. 


“Networks” are a service provided to the businesses, not a direct revenue generator. That is precisely how current telco or cable operations are structured already. Revenue is generated by services and apps sold to customers. The network exists only to facilitate the creation and sale of those apps. 


In principle, there no longer is any reason why applications and services need to be--or should be--developed or created to run solely on “my” networks. The bigger opportunity is to own apps and services that run on anybody’s network. 


Few would consider it “better” to create internet of services apps, platforms or services that only work on a single access provider’s network. It clearly is “better” if the platform, apps and services run on any access network, anywhere. 


But that requires a change not only of mindset but of business strategy. Today, most effort is spent trying to create value for things done “on my network.” In the future, some might do better creating value for apps, services and platforms that work anywhere, on any network. 


That assumes the continued existence of multiple competitors able to pursue such strategies. If competition is not the future connectivity framework, few if any access and transport providers will be allowed to spend much energy developing platforms, services or apps that run anywhere, on any network.


Instead, effort will revert to pre-competitive, monopoly objectives: just create and operate a competent access network.


Can Lumen Find a Buyer for Rural and Copper Access Assets?

Lumen Technologies has said it is willing to consider divesting non-core assets that could include up to 18 million access lines, mostly found in rural and other lower-density areas. That might be a tall order. 


The traditional rule of thumb for fixed networks in the U.S. market is that service providers make money in dense urban areas, break even in suburban locations and lose money everywhere else, including rural areas. The same sort of logic applies to fiber-to-home facilities: FTTH makes the most sense in urban areas, sometimes makes sense in suburban areas and most often requires subsidies in rural areas. 


That likely still is a reasonable assumption, both in facilities-based competitive markets as well as those based on a single network and wholesale access.


So Lumen has to find a buyer willing to bet it can take the “least desirable” fixed network service territories and upgrade them for higher-performance broadband access, relying on that one anchor service to support the business model. 


That represents a key change in payback models for fiber to home investment. For a few decades, the business logic was that the FTTH upgrade would be driven by a few anchor services: broadband, voice and video entertainment. 


Ironically, the justification for fiber-based networks supporting internet protocol was that they could “support any media type.” The new assumption is that a new FTTH network will mostly be supported by broadband access. Most independent internet service providers, for example, have migrated away from offering voice plus broadband, or broadband plus video, to offering broadband only. 


And, of course, it is harder to create an attractive payback model based on a single service than on two or three relatively popular services. 


Whether in a wholesale or facilities-based competition model, where the hope once was that three anchor services  would support the business model, it increasingly is the case that a single revenue stream--internet access--anchors the payback model. That broadband-led model arguably requires much more stringent cost control than an incumbent cable or telco business model. 


To be sure, telcos and cable operators continue to earn significant revenues from either voice or video services. But internet access is viewed as the revenue driver going forward. All of which makes the facilities-based independent internet service provider business model so relevant. 


The issues include not just infrastructure cost but also competitive dynamics. In an overbuild situation any independent provider of new FTTH services must compete against two incumbents. Even when that is not the case, few telcos can expect to grab more than 40 percent to 50 percent market share of broadband connections. 


In the more-favorable two-provider scenario, it will be tough for a telco to justify an FTTH business case based primarily on the value of internet access services, though some independent ISPs, with lower cost structures, claim they can make a profit even at low housing densities. 


The total cost to build FTTH systems in rural Vermont was about $26,000 per mile, which included absolutely everything (NOC, main pass, laterals, drops and customer installs for six customers per mile) and a 12 percent contingency cushion, according to Timothy and Leslie Nulty of Mansfield Community Fiber. 


“Just six paying customers per mile can be profitable,” they argue. “Vermont’s density averages 12 people per mile, so six paying customers means a 50 percent take rate, which we always achieved by the end of the third year.”
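

A quick back-of-envelope check of those figures, as a sketch; the derived cost per paying subscriber is not stated in the source, only implied by the numbers above.

```python
# Rough check of the Mansfield Community Fiber figures (illustrative only).
cost_per_mile = 26_000           # all-in rural Vermont build cost per mile, as cited
density_per_mile = 12            # Vermont average density cited
subscribers_per_mile = 6         # "just six paying customers per mile"

take_rate = subscribers_per_mile / density_per_mile
cost_per_subscriber = cost_per_mile / subscribers_per_mile

print(f"Take rate: {take_rate:.0%}")                                         # 50%
print(f"Implied network cost per subscriber: ${cost_per_subscriber:,.0f}")   # ~$4,333
```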


Other estimates put costs in the $18,000 to $22,000 per mile range, so the Mansfield figures do not appear out of line. On a “homes passed” basis, some estimate a network cost of less than $700 per passing in urban and suburban areas. In rural areas the cost per home passed might be closer to $3,656. 


source: Cartesian


Those costs typically do not include the additional cost to serve a customer, which might double the full cost per customer. Most would agree equipment costs have declined over the past decade, though construction costs arguably have not. 


Assume costs in urban areas ranging between $670 per passing and $1,313 per passing, representing perhaps 70 percent of all households. 


Assume customer premises equipment and installation labor adds $600 to the cost of serving an internet access customer on an FTTH network. Assume 40 percent take rates and an average cost per passing of $1,000. 


That implies a cost per customer of about $2500. Assume internet access revenue is $80 a month, or $960 per year. Payback on invested capital might take a while, assuming 20 percent net margins after loading marketing, operations and other costs. Annual net profits might be as low as $192 per customer in that scenario, with break even happening in a decade and a half or so. 
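

Those assumptions can be strung together as a simple model. This sketch only restates the arithmetic above; whether the $600 of CPE and installation sits inside or on top of the roughly $2,500 figure is ambiguous in the assumptions, so both are shown.

```python
# Illustrative FTTH payback sketch using the assumptions above.
cost_per_passing = 1_000         # average cost per home passed
take_rate = 0.40                 # share of passed homes that subscribe
cpe_and_install = 600            # per-customer CPE plus installation labor

monthly_revenue = 80             # internet access revenue per month
net_margin = 0.20                # after marketing, operations and other costs

cost_per_customer = cost_per_passing / take_rate            # ~$2,500
all_in_per_customer = cost_per_customer + cpe_and_install   # ~$3,100

annual_profit = monthly_revenue * 12 * net_margin           # $192 per year

print(f"Network cost per customer: ${cost_per_customer:,.0f}")
print(f"All-in cost per customer:  ${all_in_per_customer:,.0f}")
print(f"Payback: {cost_per_customer / annual_profit:.0f} to "
      f"{all_in_per_customer / annual_profit:.0f} years")    # roughly 13 to 16 years
```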


That will be a tough proposition. Independent ISPs operate with higher margins because their costs are far lower. Telcos and cable companies, on the other hand, do have additional revenue streams (voice and video). 


All those are issues Lumen Technologies faces as it ponders the sale of its copper-based networks in rural and other lower-density areas. 


Rules of Thumb About Mobile Capacity Expansion Might be Changing

Some changes to the connectivity business model are obvious; others more subtle. The ubiquity of mobile services is obvious, as is the growth of internet access and the waning of fixed network voice and entertainment video.

But other changes happen over such long periods of time that a generation or two can live with a new reality without noticing the differences. There was, for example, a time when the internet did not exist; when PCs and mobile phones did not exist. 

Less obviously, the ways mobile network capacity gets created have changed. Some of those ways reduce capital investment and operating costs. 

Historically, there have been three ways mobile operators create more capacity on their networks: acquire new spectrum, use more spectrally efficient technologies and shrink cell sizes. In the 4G era a fourth tool emerged: use of unlicensed spectrum to offload traffic to local networks. 


Buying additional spectrum and shrinking cell sizes obviously increase capex. Shrinking cell radii by 50 percent quadruples the number of cells needed, for example. Deploying new radios and using new modulation schemes arguably is relatively neutral as a cost driver.
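

The cell-splitting arithmetic is simple area scaling: halving the cell radius quarters the area each cell covers, so covering the same territory takes roughly four times as many sites. A minimal sketch (the service area and radii are illustrative):

```python
import math

def cells_needed(area_sq_km: float, cell_radius_km: float) -> float:
    """Rough count of cells to cover an area, treating each cell as a circle."""
    return area_sq_km / (math.pi * cell_radius_km ** 2)

area = 1_000                       # illustrative service area, in square km
for radius_km in (2.0, 1.0, 0.5):  # each step halves the cell radius
    print(f"radius {radius_km} km -> ~{cells_needed(area, radius_km):,.0f} cells")
# Each halving of the radius roughly quadruples the cell count (and the site capex).
```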


Use of unlicensed spectrum, on the other hand, clearly reduces both capex and operating expense. The spectrum does not have to be bought; the radios do not have to be installed or operated; and third parties pay for energy consumption.


5G brings advances in the use of unlicensed spectrum, particularly the ability to aggregate unlicensed spectrum with licensed spectrum resources.


Prior to the 4G era, it can be argued, smaller cell sizes and advances in radio technology and modulation created more usable capacity than new spectrum allocations did. But widespread Wi-Fi offload has changed the toolkit. Wi-Fi offload might account for 30 percent to 40 percent of customer data consumption. 
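

As a rough illustration of why offload matters, this sketch uses 35 percent, simply the midpoint of the 30 percent to 40 percent range cited above:

```python
# Illustrative: effective mobile-network relief from Wi-Fi offload.
total_demand_gb = 100          # a customer's monthly data consumption
offload_share = 0.35           # midpoint of the 30%-40% range cited above

carried_on_mobile = total_demand_gb * (1 - offload_share)
capacity_multiplier = total_demand_gb / carried_on_mobile

print(f"Mobile network carries {carried_on_mobile:.0f} of every {total_demand_gb} GB demanded")
print(f"Effective capacity multiplier from offload: ~{capacity_multiplier:.2f}x")  # ~1.54x
```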


During the Covid pandemic the percentage of consumption shifted to Wi-Fi was certainly much larger than that. In the 5G and succeeding eras, the ability to aggregate unlicensed spectrum to licensed spectrum will be an important new source of effective capacity. 


source: Science Direct 


It is not yet clear how well that pattern will hold up in the 5G and coming eras. Both network densification (smaller cells) and new spectrum resources will be applied, in addition to better radio technology and more advanced signal modulation, but the amount of new spectrum being allocated is discontinuously larger than in the past.


From 1947 to 2017, allocated mobile spectrum doubled about every 8.6 years. The 5G auctions have broken the scale.


In large part, new spectrum allocations have been relatively small and incremental. The allocations for 5G are discontinuously larger, involving both larger amounts of spectrum per auction and also much more effective bandwidth per unit. 


Simply, capacity is related to frequency: the higher the frequency, the higher the potential bandwidth.  


source: Lynk 
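

One way to see why: usable channel width tends to be a modest fraction of the carrier frequency, so higher bands can host far wider channels. A sketch, assuming a purely illustrative one-percent fractional bandwidth:

```python
# Illustrative only: assume usable channel width is ~1% of the carrier frequency.
FRACTIONAL_BANDWIDTH = 0.01

bands_mhz = {
    "low-band (600 MHz)": 600,
    "mid-band (3.5 GHz)": 3_500,
    "millimeter wave (28 GHz)": 28_000,
}

for name, carrier_mhz in bands_mhz.items():
    channel = carrier_mhz * FRACTIONAL_BANDWIDTH
    print(f"{name}: ~{channel:,.0f} MHz of potential channel width")
# 600 MHz -> ~6 MHz; 3.5 GHz -> ~35 MHz; 28 GHz -> ~280 MHz
```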


Spectrum auction behavior also shows that price per unit decreases as frequency increases, with several drivers at work. Higher-frequency spectrum simply involves more capacity per unit, but also requires more-expensive (denser) networks. So spectrum value is partly the result of expected costs to deploy networks using that spectrum.


Historically, the highest prices were obtained for spectrum with good coverage capabilities, hence lower infrastructure cost. Business models also play a role. The problem mobile internet service providers face is that customers require more bandwidth every year, but are generally only willing to pay the same amount.


source: Lynk 

So additional bandwidth is a cost of remaining in business, not necessarily a driver of incremental revenue. Also, relative scarcity plays a role in setting value and prices per unit. Low-band spectrum was the most scarce. Mid-band spectrum is less scarce and high-band (millimeter and above) is relatively plentiful. As always, scarcity increases prices. Abundance reduces prices.

The point is that the traditional rules of thumb about how mobile network capacity gets increased might have changed. Better modulation and radios, new spectrum allocations and smaller cells still are three ways capacity gets increased. 


But use of unlicensed network capacity has become a fourth tool. Even if, historically, smaller cell sizes have driven most of the capacity increase, there will be more balanced improvements in the future, relying much more on the use of additional spectrum, licensed and unlicensed. 

Saturday, July 10, 2021

IBM Envisions all the World's Cloud Resources Easily Usable as Though it Were One Machine

Methodology Matters

Most of us--at least when it suits our purposes--believe decision making is enhanced by the availability of good data. And most of us likely would agree that methodology matters when gathering data. 


So notes Ookla in reviewing data on broadband speeds described in a recent report.  “Our concern with the rest of the report is that the network performance test results the report was derived from painted an inaccurate picture of what constituents were actually experiencing in the district.”


“The results presented greatly underestimated the speeds being delivered by the service providers throughout most of the study area while overestimating some others,” said Ookla, which compared its own data with that supplied by M-Lab in the report. 


“The speeds measured by Speedtest for the same areas and the same time period are dramatically higher in most areas, indicating that additional infrastructure investments are unnecessary where constituents can already achieve network speeds that meet FCC minimums,” said Ookla. 


There is more than one way to calculate an average.  The “mean” average is the sum of all measurements divided by the number of records used. “This number is valuable, but it can be influenced by a small portion of records that may be extremely high or low (outliers),” said Ookla. “As fiber is installed within an area, a significant number of tests from ultra-high-speed connections can skew mean averages up.”
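

A small example of the skew Ookla describes; the sample speeds below are made up for illustration, not drawn from either data set:

```python
from statistics import mean, median

# Hypothetical download speeds (Mbps) for one ZIP code: mostly cable-class
# connections plus a handful of newly installed gigabit fiber subscribers.
speeds = [25, 30, 35, 40, 45, 50, 55, 940, 950]

print(f"mean:   {mean(speeds):.0f} Mbps")    # ~241 Mbps, pulled up by the fiber outliers
print(f"median: {median(speeds):.0f} Mbps")  # 45 Mbps, closer to the typical experience
```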


The opposite also can occur. “M-Lab vastly under-reported the network throughput in every single ZIP code represented in the congressional report,” Ookla said. 


“The ZIP code showing the least amount of difference by percentage between Ookla and M-Lab data was 13803 (Marathon) where M-Lab’s recorded median was 5.5 Mbps and the median from Ookla data was 14.5 Mbps,” Ookla noted. “So the typical speed in Marathon measured by Ookla’s Speedtest was over two and a half times as fast as the average measurement captured by M-Lab.”


“On the other end of the scale, in Whitney Point, M-Lab’s recorded median was 0.9 Mbps while Ookla measured a median of 71 Mbps, almost eighty times faster,” the firm said. 


“It is clear from these results that M-Lab’s performance test does not measure the full capacity of a network connection and thus does not accurately reflect the real-world internet speeds consumers are experiencing,” said Ookla. 


“These disparities in measured speed generally arise because some network data providers have low user adoption among consumers, limitations in their testing infrastructure, questionable testing methodologies, or inadequate geolocation resources to precisely locate where a given test was taken,” said Ookla. 




The B2B Sales Journey Has Changed

The business-to-business buyer journey has changed. As in the past, B2B transactions remain complex, with multiple influencers and decision-makers, with many rounds of research, evaluation and stakeholder engagement work required. 

 

The Covid pandemic, when person-to-person meetings were largely impossible, has streamlined the B2B purchase journey. There is less distinction between marketing and sales. Timelines often are compressed. Buying authority is more decentralized, as “computing as a service” can be bought with a credit card. 

 

Buyers still must identify the business need, research solutions, evaluate options and reach a decision. But buyers are doing more of that online and on their own.


Enterprise sales have in the past largely relied on field sales. But change is happening. Perhaps a third of business-to-business buyers might be willing to conduct fully-virtual transactions for new products up to a value of approximately USD 500,000, according to a McKinsey report. 


And marketplaces, ecosystems and platforms can make a huge difference. PCCW Global, using an automated system for sales to settlements, “gained over 800 customers in the last 18 months, with growing traction, without any actual sales contact,” said Marc Halbfinger, PCCW Global CEO. 


“We don’t even have to know who the customer is,” he added. Sales come from third parties or online, direct from the trading platform PCCW Global uses. 


B2B sales are evolving as virtual marketing, sales, fulfillment and settlement come to rely on artificial intelligence and other digital tools. Those themes, and many more, are featured in a PTC Webinar Series: Frictionless Business™ episode on how B2B sales will change post-Covid.




Featured panelists included:

  • Matt Bramson, Founder & Managing Partner, Cloud Strategy Solutions, USA

  • Marc Halbfinger, Chief Executive Officer, PCCW Global, Hong Kong SAR China

  • Nancy Ridge,  Founder & President, Ridge Innovative, USA

  • Elmar Rode, Director Communications Industry Strategy Group, Oracle, Germany

  • Gary Kim, IP Carrier principal, acted as moderator


Available to PTC members on 12 July 2021, the episode will be posted to YouTube in about 30 days. Other episodes in the series already are available for immediate viewing.

Hospitality Industry Changes, but Phone Systems Almost Do not Matter

Most observers expect changes in the hotel and lodging experience as a result of the Covid pandemic that will last beyond the pandemic’s end. Various forms of “contactless” experience--ranging from keyless room entry and contactless check-in to an end to daily room cleaning--are among the expected changes. But supply chains, service elements and staffing levels are likely to be affected as well.  


And many expect cost-cutting measures to develop as well, given the slow travel rebound. Contactless experience will be among the ways lodging providers cut costs. 


Technology and analytics will be more important as human support and interactions are minimized.  


Other costs will be difficult to rein in, and might also not provide much upside to the operating cost or revenue models. 


In-room phone systems in the lodging industry are likely among the necessary costs of doing business, though few guests seem to use them. And some hotels have a line item revenue gain from in-room phones that most travelers consider a tax, not an amenity. 


source: PXF Hospitality Research


Of course, lodging establishments require phone systems for other reasons, including taking reservations. 

 source: PXF Hospitality Research


“From 2015 through 2019, total (hotel) operating expenses increased at a compound average annual growth rate (CAGR) of 2.2% at the properties in our study sample,” notes CBRE. “During this same period, the hotels’ cost for telecom service increased at a CAGR of 9.7 percent.”


“Individually, the cost of phone service rose by a CAGR of 5.7 percent, while the cost of internet service increased at an average annual pace of 16.1 percent,” says CBRE. 


“The 9.7 percent combined CAGR for telecommunications cost is more than three times the CAGR for any other individual hotel department cost during the same five-year period,” CBRE says. Costs grew faster than that at upper-midscale (CAGR 21.5 percent) properties and upscale (CAGR 13.9 percent) hotel chains. 
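

For reference, a compound average annual growth rate is just the geometric growth rate over the period; this sketch compounds the cited 9.7 percent telecom-cost CAGR from 2015 through 2019, starting from an illustrative index value of 100.

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound average annual growth rate over the period."""
    return (end_value / start_value) ** (1 / years) - 1

# Illustrative: index 2015 telecom cost at 100 and grow it at the cited 9.7% CAGR.
value = 100.0
for year in range(2016, 2020):
    value *= 1.097
    print(f"{year}: {value:.1f}")   # 109.7, 120.3, 132.0, 144.8

print(f"Check: {cagr(100.0, value, years=4):.1%}")  # 9.7%
```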


On the other hand, phone system expenses are a small part of total operating cost: less than half a percent. 



source: CBRE


Directv-Dish Merger Fails

Directv’s termination of its deal to merge with EchoStar, apparently because EchoStar bondholders did not approve, means EchoStar continue...