Reducing energy consumption has been an important objective at Fandis over the years. But we are not the only ones! Google, with its immense data centers, faces the same problem, and it has found a very innovative way to tackle it: artificial intelligence.
Over the past 10 years, Google's super-efficient servers have required more and more energy for increasingly complex computations, and the company has invested heavily in green energy sources. The main innovation, however, came through DeepMind technology: artificial intelligence that can learn from Google's data centers, and that has reduced the amount of energy used for cooling by up to 40%.
One of the primary uses of energy in a data center, in fact, is cooling the servers. Just as a laptop generates heat, the Google data centers that power Search, Gmail, YouTube and other services generate a great deal of heat. Cooling is performed by large industrial equipment such as pumps, chillers and cooling towers, running continuously. Managing this cooling efficiently is hard because the equipment and the environment interact in complex, non-linear ways that classical engineering formulas fail to capture.
To solve this problem, Google has started using machine learning to manage its data centers more efficiently. In recent months, DeepMind researchers have also begun working with the Google data center team to significantly improve the operation of the machines. Using a system of neural networks trained on different operating scenarios and parameters within the data centers, they have created a more efficient and adaptable framework for understanding data center dynamics and optimizing efficiency.
How did they do it? It is not exactly easy to explain!
“We accomplished this by taking the historical data that had already been collected by thousands of sensors within the data center — data such as temperatures, power, pump speeds, setpoints, etc. — and using it to train an ensemble of deep neural networks. Since our objective was to improve data center energy efficiency, we trained the neural networks on the average future PUE (Power Usage Effectiveness), which is defined as the ratio of the total building energy usage to the IT energy usage. We then trained two additional ensembles of deep neural networks to predict the future temperature and pressure of the data center over the next hour,” explain Rich Evans and Jim Gao, engineers at DeepMind and Google respectively.
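To give a rough sense of the idea, here is a minimal sketch, not Google's actual system, of how one might train a small ensemble of neural networks to predict PUE from sensor readings. The sensor features, the fabricated relationship between cooling load and IT load, and the model sizes are all illustrative assumptions.

```python
# Illustrative sketch only: predict PUE from hypothetical sensor data with an
# ensemble of small neural networks. Data, features and model sizes are
# assumptions for demonstration, not Google's or DeepMind's actual setup.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical historical sensor readings: room temperature (°C), IT load (kW),
# pump speed (%), chilled-water setpoint (°C).
n_samples = 2000
X = np.column_stack([
    rng.normal(24, 3, n_samples),     # server room temperature
    rng.normal(500, 80, n_samples),   # IT load
    rng.uniform(40, 100, n_samples),  # pump speed
    rng.normal(18, 2, n_samples),     # chilled-water setpoint
])

# PUE = total facility energy / IT energy (always >= 1). The cooling-energy
# relationship below is fabricated so the example runs end to end.
it_energy = X[:, 1]
cooling_energy = 0.2 * it_energy + 4.0 * np.maximum(X[:, 0] - 20, 0) + 0.5 * X[:, 2]
y = (it_energy + cooling_energy) / it_energy + rng.normal(0, 0.01, n_samples)

# Train several small networks with different seeds and average their outputs,
# mirroring the "ensemble of deep neural networks" idea at toy scale.
ensemble = [
    make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=1000, random_state=seed),
    ).fit(X, y)
    for seed in range(5)
]

def predict_pue(readings):
    """Average the ensemble's PUE predictions for new sensor readings."""
    readings = np.atleast_2d(readings)
    return np.mean([model.predict(readings) for model in ensemble], axis=0)

print("Predicted PUE:", predict_pue([24.0, 500.0, 70.0, 18.0]))
```

Averaging several networks trained from different random starts is a simple way to make the prediction more robust; the real system reportedly used additional ensembles to forecast temperature and pressure an hour ahead, as the quote above describes.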
The implications are significant for Google's data centers, given the potential to dramatically improve energy efficiency and reduce overall emissions. It will also help other companies that use the Google cloud to improve their own energy efficiency. Through DeepMind, it will then be possible to use machine learning to consume less energy and help win one of the biggest challenges of all time: climate change.
Do you want to improve the performance of your enclosures, or are you looking for heating or cooling solutions? Visit fandis.it!
Photo: Google/Connie Zhou