Thursday, July 10, 2008

$4.5 Billion Annual Data Center Power Bills

Data centers in the United States consume more than 60 billion kWh of electricity each year, at an annual cost of $4.5 billion, according to the Environmental Protection Agency, and that consumption has doubled since 2000. Much of the spending goes to cooling systems that deal with the heat thrown off by the power-gobbling servers that are the muscle inside any data center.
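As a rough sanity check on those figures, the implied average electricity price works out to about 7.5 cents per kWh. The sketch below simply divides the quoted cost by the quoted consumption; both inputs are the EPA numbers cited above, while the derived price is an illustration, not a figure the agency reports.

    # Back-of-the-envelope check on the EPA figures quoted above.
    # Inputs are the numbers cited in this post; the price per kWh is
    # derived here for illustration only.
    annual_consumption_kwh = 60e9   # 60 billion kWh per year
    annual_cost_usd = 4.5e9         # $4.5 billion per year

    implied_price_per_kwh = annual_cost_usd / annual_consumption_kwh
    print(f"Implied average price: ${implied_price_per_kwh:.3f} per kWh")  # about $0.075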
Studies by many of today's largest corporations likewise conclude that new IT equipment must cut power consumption by 10 percent to 20 percent. That, and the inability of data centers to continue scaling operations with current technology, virtually assures new generations of power-efficient servers.
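To put that target in context, here is a hypothetical illustration of what a 10 percent to 20 percent reduction would be worth if it applied across the entire $4.5 billion industry power bill; in practice new equipment replaces only part of the installed base, so actual savings would phase in more slowly.

    # Hypothetical upper bound on industry-wide savings from a 10-20 percent
    # power reduction, assuming it applied to the full $4.5 billion annual bill.
    annual_cost_usd = 4.5e9

    for reduction in (0.10, 0.20):
        savings = annual_cost_usd * reduction
        print(f"{reduction:.0%} reduction: roughly ${savings / 1e9:.2f} billion saved per year")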
Some data centers are already finding that the key constraint on further growth in hosting capacity is the inability to draw any more power from the grid at their current locations.
Much the same can be said for end-user devices, especially mobiles. Broadband mobile applications consume more power, which means bigger or better batteries. And since device size is at a premium these days, that effectively means better batteries.
The problem, of course, is that processors and memory advance at much faster rates than battery technology does.
