Friday, July 22, 2016

Google Applies Artificial Intelligence to Cut Data Center Cooling Energy by Up to 40%

It is not yet clear how, when and to what extent machine learning and other forms of artificial intelligence will start to reshape the way customers buy and use communication services. For the moment, AI likely will make its mark on non-customer-facing processes.

Google’s data centers, for example, should soon be able to cut the energy they consume for cooling by up to 40 percent by applying machine learning.

Servers generate lots of heat, so cooling drives a large share of data center power consumption. But heat dissipation requirements are highly dynamic, complex and non-linear, Google DeepMind says, which makes them hard to manage with static, hand-tuned rules.

The machine learning system consistently achieved a 40 percent reduction in the amount of energy used for cooling, which DeepMind says equates to a 15 percent reduction in overall PUE (power usage effectiveness) overhead.
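
To see why a big cooling saving shows up as a smaller headline number, recall that PUE is total facility energy divided by IT equipment energy, so cooling is only one slice of the overhead above 1.0. The arithmetic below is a sketch in Python with invented figures (Google has not published this breakdown), chosen only so the numbers land on the reported ratio:

# Illustrative PUE arithmetic; all figures here are hypothetical.
it_load = 100.0        # IT equipment energy (normalized)
cooling = 7.5          # energy used for cooling
other_overhead = 12.5  # electrical losses, lighting, etc.

pue_before = (it_load + cooling + other_overhead) / it_load  # 1.20

cooling_after = cooling * (1 - 0.40)  # the reported 40% cooling cut
pue_after = (it_load + cooling_after + other_overhead) / it_load  # 1.17

# Overhead is the part of PUE above 1.0; the cut works out to 15%.
overhead_cut = (pue_before - pue_after) / (pue_before - 1.0)
print(f"PUE {pue_before:.2f} -> {pue_after:.2f}; overhead cut {overhead_cut:.0%}")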

“Because the algorithm is a general-purpose framework to understand complex dynamics, we plan to apply this to other challenges in the data centre environment and beyond in the coming months,” said Rich Evans, DeepMind research engineer, and Jim Gao, Google data center engineer.
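
DeepMind has described the approach as training neural networks on historical sensor data (temperatures, power, pump speeds and the like) to predict PUE, then using the model to pick better operating settings. The sketch below is a deliberately toy version of that predict-then-optimize pattern: a single synthetic control setpoint and a polynomial fit stand in for the real sensors and neural networks.

import numpy as np

# Toy predict-then-optimize sketch (not DeepMind's system): learn how
# PUE responds to one control setpoint from historical logs, then pick
# the setpoint the learned model predicts is best.
rng = np.random.default_rng(0)

# Hypothetical historical logs: chilled-water setpoint (deg C) and the
# PUE observed under each setting, with some sensor noise.
setpoints = rng.uniform(6.0, 14.0, size=500)
true_pue = 1.12 + 0.002 * (setpoints - 10.0) ** 2
observed_pue = true_pue + rng.normal(0.0, 0.005, size=500)

# "Train" a model of the plant: a quadratic fit stands in for the
# neural networks DeepMind describes.
coeffs = np.polyfit(setpoints, observed_pue, deg=2)
model = np.poly1d(coeffs)

# Optimize: search candidate setpoints for the lowest predicted PUE.
candidates = np.linspace(6.0, 14.0, 161)
best = candidates[np.argmin(model(candidates))]
print(f"recommended setpoint: {best:.1f} C, predicted PUE: {model(best):.3f}")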

Probably few would debate the potential of machine learning and artificial intelligence to improve industrial processes.

Sales processes, though, likely are not an area where most would expect big changes. Products sold to business customers and larger organizations generally are considered complex, requiring customization.

Many could argue that enterprise communications requirements are more complicated than data center power management. But are they? Google and DeepMind started with historical data, then applied AI on top of it, to develop new rules for managing a complex system.

In essence, do sales and engineering personnel not already possess a high degree of accumulated wisdom about the general problems, existing processes and typical software and hardware used by enterprise customers?

And where the typical solution involves recommendations for removing, adding or altering services and features to solve enterprise communication problems, are there not patterns designers and sales personnel can rely upon?

If so, might it not be possible to radically simplify the process of understanding and then “quoting” a solution? And if this cannot be done on a fully automated basis, might it still be done on a wide enough scale to deliver business value for a communications supplier?

In other words, could AI simplify substantial parts of the enterprise solutions business? Most who do such things for a living might argue the answer is “no.” But are enterprise solutions completely unique? Do engineers and network architects not, in fact, work with functional algorithms whose potential solutions are bounded?

And, if so, could not large amounts of the analysis, design and reconfiguration be done using AI? Airline reservation systems were, and are, quite complex. And yet consumers now use tools built on those systems to buy their own tickets.

Business communication solutions are complex. But they are not unmanageably complex. People can, and do, create solutions using what effectively are algorithms. We might call it experience or skill. So it is. But it is based on rules, formal or informal.

Rules-based systems can be modeled. And that could have huge implications for how business communications solutions are designed, provisioned and sold.
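
As a thought experiment, a first rules-based step toward automated quoting might look like the toy sketch below. Every product, price and sizing rule here is invented for illustration; the point is only that the logic is explicit enough to encode, and therefore explicit enough to learn from historical quotes.

# Toy rules-based quoting sketch; all products, prices and sizing
# rules are hypothetical, invented purely for illustration.
CATALOG = {
    "sip_trunk": 25.0,            # per concurrent call, per month
    "uc_seat": 12.0,              # per user, per month
    "contact_center_seat": 60.0,  # per agent, per month
    "sdwan_site": 150.0,          # per site, per month
}

def quote(users, sites, agents, peak_calls_per_100_users=12):
    """Apply simple, explicit rules to produce a monthly quote."""
    # Sizing rule: trunks scale with expected peak concurrent calls.
    trunks = max(2, round(users * peak_calls_per_100_users / 100))
    items = {
        "sip_trunk": trunks,
        "uc_seat": users,
        "contact_center_seat": agents,
        "sdwan_site": sites,
    }
    total = sum(CATALOG[name] * qty for name, qty in items.items())
    return items, total

items, total = quote(users=250, sites=4, agents=20)
print(items, f"${total:,.2f}/month")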
