Monday, December 23, 2019

Overprovisioning is as Dangerous as Underprovisioning Capacity

Historically, connectivity providers have protected themselves from growing end user data demand by overprovisioning: building networks that supply bandwidth in excess of current demand.

In the past, that has meant building voice networks to handle consumption during the peak calling hour of the peak day, for example. It also means that, most of the time, networks have spare capacity.

The classic illustration is the variation of call volume, or data consumption, by time of day and day of week. That is true for business and consumer calling patterns, as well as for call centers, for example.

Both business and consumer calling volume drops on weekends and evenings, for example, while traffic to call centers peaks between noon and 3 p.m. on weekdays.
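A short sketch shows why dimensioning for the busy hour leaves capacity idle most of the day. The hourly figures below are invented for illustration, not measured traffic data:

```python
# Hypothetical hourly traffic profile for one weekday (arbitrary units).
# These numbers are illustrative only, not measurements.
hourly_demand = [2, 1, 1, 1, 1, 2, 4, 8, 12, 15, 18, 20,
                 22, 21, 19, 16, 14, 13, 15, 17, 14, 10, 6, 3]

peak = max(hourly_demand)          # the network must be built for this hour
average = sum(hourly_demand) / 24  # what the network carries in a typical hour

# Peak-to-average ratio: with this profile, roughly half the peak-hour
# capacity sits unused in the average hour.
ratio = peak / average
print(f"peak={peak}, average={average:.1f}, peak/average={ratio:.2f}")
```

With this made-up profile the peak-to-average ratio is about 2:1; real networks have their own ratios, but the structural point--build for the peak, run below it most of the time--is the same.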


Communications networks, however, no longer are dimensioned for voice. They are built to handle data traffic, and video traffic most of all. That still leaves service providers with the question of how much overprovisioning is required, and what it will cost, compared to the revenue to be earned. 

In principle, service providers tend to overprovision in relation to the pace of demand growth and the cost of investing in that capacity growth. It is one thing to argue that demand will grow sharply; it is quite another to argue that demand levels a decade from now must be provisioned today. 

That noted, there is a difference between the amount of data consumed and the speed of an access connection. The two are related, but not identical, and customers often confuse them. 

But they do tend to understand the difference between a usage allowance (“how much can I use?”) and the speed of a connection (“how fast is the connection?”). 
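A bit of arithmetic shows how loosely the two are related. The figures below are assumptions chosen for illustration, not customer data: a connection running flat out at 100 Mbps for a 30-day month would move about 32.4 terabytes, far more than typical household consumption.

```python
def monthly_volume_tb(speed_mbps: float, days: int = 30) -> float:
    """Data volume, in terabytes, if a link ran at full speed all month."""
    seconds = days * 24 * 3600
    bits = speed_mbps * 1e6 * seconds
    return bits / 8 / 1e12

ceiling = monthly_volume_tb(100)  # ~32.4 TB at a constant 100 Mbps
assumed_usage_tb = 0.4            # assumed 400 GB/month household (illustrative)
utilization = assumed_usage_tb / ceiling
print(f"ceiling={ceiling:.1f} TB, utilization={utilization:.1%}")
```

Under these assumed numbers, a household using 400 GB a month touches roughly one percent of what its 100 Mbps connection could theoretically carry--which is why a usage allowance and a speed tier are distinct products.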

Service providers must overprovision, it is true. But they tend to do so in stair-step fashion. Mobile network operators build a next-generation network about every decade. Fixed network operators, depending on the platform, likewise increase capacity in stair-step fashion, because doing so involves changes in network elements, physical media and protocols. 


The point is that capacity increases are matched to demand; to the pace of technology evolution; to the ability to sustain the business model; and to revenue, operating cost and profit margin considerations. 

While some might argue, as a rule, that service providers must “deploy fiber to the home,” build 5G or make some other stair-step move, such investments always are calibrated against the business case.


It is one thing to overprovision as needed to support the business model. It might impair the business model to overprovision too much, stranding assets. Some of you will recall the vast over-expansion of wide area network transport capacity in the 1998 to 2002 period. That provides a concrete example of the danger of excessive overprovisioning. 

Yes, some service providers have business models allowing them to create gigabit per second internet access networks today, even when typical customers buy service at 100 Mbps to 200 Mbps, and use even less. 

That does not necessarily mean every access provider, on every platform, in every market, can--or should--do so. 


AI Physical Interfaces Not as Important as Virtual

Microsoft’s dedicated AI key on some keyboards--which opens up access to Microsoft’s Copilot--now is joined by Logitech’s Signature AI mouse...