Tuesday, October 24, 2023

The Purpose of Deregulation Is to Cause Leaders to Lose Market Share

The functional purpose of any industry deregulation is to cause incumbents to lose market share. New regulation, similarly, often aims to restrict the share held by market leaders, eventually forcing them to cede ground. As regulators examine markets led by hyperscale app providers, that will be a relevant issue.


That is precisely what has happened in the connectivity business, and it is likely to happen again if regulatory restrictions are placed on hyperscale app providers.


Consider services purchased by enterprises. 


Though Verizon, AT&T and Comcast stand atop the market share rankings for sales of connectivity services to U.S. enterprises, about 40 percent of the market is shared by a number of other providers.


Company                                            Percentage
Verizon                                            25%
AT&T                                               20%
Comcast                                            15%
Other specialized connectivity service providers   40%


Among them are: 

  • Lumen Technologies

  • Cogent Communications

  • Windstream

  • Zayo Group

  • Equinix

  • Digital Realty

  • Telehouse

  • NTT Communications

  • GTT Communications

  • Tata Communications


To be sure, U.S. enterprise spending on information technology services and products has grown steadily since 2000, according to IDC estimates. The issue is market share, as many new competitors have entered the market. 


Year    Services    Products    Total
2000    242.0       295.0       537.0
2001    234.0       275.0       509.0
2002    227.0       250.0       477.0
2003    220.0       225.0       445.0
2004    213.0       200.0       413.0
2005    206.0       175.0       381.0
2006    200.0       150.0       350.0
2007    206.0       175.0       381.0
2008    212.0       200.0       412.0
2009    218.0       225.0       443.0
2010    224.0       250.0       474.0
2011    230.0       275.0       505.0
2012    236.0       300.0       536.0
2013    242.0       325.0       567.0
2014    248.0       350.0       598.0
2015    254.0       375.0       629.0
2016    260.0       400.0       660.0
2017    266.0       425.0       691.0
2018    272.0       450.0       722.0
2019    278.0       475.0       753.0
2020    284.0       500.0       784.0
2021    290.0       525.0       815.0
2022    296.0       550.0       846.0
2023    302.0       575.0       877.0

