Has Google cracked the data center cooling problem with AI?
- Data center cooling costs are becoming prohibitive amidst a global economic slowdown
- An AI system from Google’s DeepMind, built on techniques related to its AlphaGo program, helped cut data center cooling energy use by 40 percent
- The system’s findings could mean huge savings in energy consumption for data centers, and beyond
Cost-cutting has become an ever-present concern for beleaguered enterprises in the current global economy, which has been hard hit by the aftershocks of the COVID-19 pandemic.
Energy consumption at data centers is one of the prime examples of a heavy cost load for cash-strapped operators and clients, especially when it comes to cooling all of the running components. Dealing with excess heat is one of the biggest, most expensive factors involved in running a modern data center.
It’s such an issue, in fact, that the data center cooling market itself could be worth US$20 billion by 2024. Past approaches to the temperature problem have included siting data centers in cooler climates, such as northern Europe, or even submerging them in the ocean.
Solving this problem is a key focus for the world’s tech giants, whose services depend on these vast operations. Cutting cooling costs means more money goes into the bank, or into acquiring more users through enhanced services or better prices. There are plenty of avenues to explore here, but one of the most promising is simply making cooling systems more intelligent.
With that in mind, Google has been working with DeepMind, the British artificial intelligence (AI) company it acquired in 2014, to develop an AI system that decreases the power needed for cooling, without requiring expensive relocations of data centers.
The AI treats cooling as a power-management optimization problem. Leveraging deep learning techniques, it digests vast quantities of historical data from across the data center’s operations and uses predictive modeling to gauge how different settings affect energy usage.
The deep learning models infer likely relationships between individual pieces of cooling equipment and the IT systems they interact with; the resulting predictive models can then be fed varied power-usage parameters to study the effect on different systems, as the sketch below illustrates.
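To make that concrete, here is a minimal sketch of the predictive-modeling idea in Python. It is not DeepMind’s actual system: the telemetry features (outside temperature, IT load, fan speed, pump flow), the synthetic data, and the formula generating the efficiency target are all invented for illustration, with a small off-the-shelf neural network standing in for whatever models are used in production.

```python
# Minimal sketch: fit a small neural network on (hypothetical) historical
# sensor readings to predict energy efficiency (PUE), then query it with
# "what if" scenarios. All features and data here are made up.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(seed=0)
n = 5_000

# Hypothetical telemetry: outside air temp (°C), IT load (kW),
# chiller fan speed (%), coolant pump flow (l/s).
X = np.column_stack([
    rng.uniform(-5, 35, n),    # outside_temp
    rng.uniform(200, 800, n),  # it_load
    rng.uniform(20, 100, n),   # fan_speed
    rng.uniform(5, 50, n),     # pump_flow
])

# Synthetic PUE target: warmer weather and higher load hurt efficiency,
# more cooling effort helps, plus noise. A real system learns this
# relationship from logged data rather than a formula.
pue = (1.1 + 0.004 * X[:, 0] + 0.0003 * X[:, 1]
       - 0.0008 * X[:, 2] - 0.001 * X[:, 3]
       + rng.normal(0, 0.02, n))

X_train, X_test, y_train, y_test = train_test_split(X, pue, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
).fit(X_train, y_train)
print(f"held-out R^2: {model.score(X_test, y_test):.3f}")

# "What if" query: same conditions, fan speed doubled from 40% to 80%.
scenario = np.array([[25.0, 600.0, 40.0, 20.0],
                     [25.0, 600.0, 80.0, 20.0]])
print(model.predict(scenario))  # predicted PUE for each setting
```

The final two lines mimic the “what if” queries described above: holding everything else fixed, the model estimates how predicted efficiency changes when fan speed is varied.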
These AI-powered insights are also used to predict system overloads and the likelihood of overheating, functioning as an early warning system that administrators can rely upon.
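As a toy illustration of that early-warning idea (again, not Google’s implementation): extrapolate a rack’s recent temperature trend a few minutes ahead and raise an alert before a limit is breached. The threshold, horizon, and readings below are all hypothetical, and a production system would use the learned model’s forecasts rather than a straight-line fit.

```python
# Toy early-warning check: project the recent temperature trend forward
# and alert before a (hypothetical) safe limit would be exceeded.
import numpy as np

ALERT_LIMIT_C = 32.0  # hypothetical safe inlet temperature
HORIZON_MIN = 15      # how far ahead to look

def forecast_breach(temps_c: list[float], minutes_per_sample: float = 1.0) -> bool:
    """Fit a line to recent samples and project HORIZON_MIN minutes ahead."""
    t = np.arange(len(temps_c)) * minutes_per_sample
    slope, intercept = np.polyfit(t, temps_c, deg=1)
    projected = intercept + slope * (t[-1] + HORIZON_MIN)
    return projected >= ALERT_LIMIT_C

recent = [27.1, 27.4, 27.9, 28.3, 28.8, 29.4]  # made-up samples, one per minute
if forecast_breach(recent):
    print("early warning: projected inlet temp exceeds limit within 15 min")
```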
DeepMind’s system goes even further, tapping trial-and-error reinforcement learning (the same family of techniques behind AlphaGo, the program famous for defeating human players at the board game Go) to figure out which configuration of cooling infrastructure, such as fans and ventilation, will most effectively lower energy consumption.
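Here is a toy sketch of what that trial-and-error learning looks like for a cooling controller, using tabular Q-learning against an invented thermal simulator. The states, actions, rewards, and dynamics are all hypothetical stand-ins for the real thing, chosen only to show the shape of the technique.

```python
# Toy reinforcement-learning sketch: Q-learning picks a fan level for each
# discretized temperature band, trading cooling energy against overheating.
import numpy as np

rng = np.random.default_rng(seed=1)

N_TEMP_BANDS = 10             # states: coarse temperature bands
FAN_LEVELS = [0.0, 0.5, 1.0]  # actions: fan power fraction

def step(band: int, fan: float) -> tuple[int, float]:
    """Invented dynamics: heat load drifts temperature up, fans push it down."""
    drift = int(rng.integers(0, 2))       # random heat load
    cooling = int(round(fan * 2))         # stronger fan, more cooling
    nxt = int(np.clip(band + drift - cooling, 0, N_TEMP_BANDS - 1))
    energy_cost = fan                     # running fans costs energy
    overheat_penalty = 5.0 if nxt == N_TEMP_BANDS - 1 else 0.0
    return nxt, -(energy_cost + overheat_penalty)

Q = np.zeros((N_TEMP_BANDS, len(FAN_LEVELS)))
alpha, gamma, eps = 0.1, 0.95, 0.1  # learning rate, discount, exploration

band = 5
for _ in range(50_000):
    # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
    a = (int(rng.integers(len(FAN_LEVELS))) if rng.random() < eps
         else int(np.argmax(Q[band])))
    nxt, reward = step(band, FAN_LEVELS[a])
    # Q-learning update: nudge toward reward + discounted best future value.
    Q[band, a] += alpha * (reward + gamma * Q[nxt].max() - Q[band, a])
    band = nxt

# Learned policy: which fan level to use in each temperature band.
for b in range(N_TEMP_BANDS):
    print(f"band {b}: fan {FAN_LEVELS[int(np.argmax(Q[b]))]:.1f}")
```

The learned policy typically leaves fans off in cool bands and ramps them up near the hot end, which is the intuition behind letting a controller discover energy-saving configurations by trial and error.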
The system’s recommendations were then applied at Google’s data centers, leading to a 40 percent reduction in the energy used by the cooling systems.
DeepMind is now using the data gathered from Google’s data centers to help reduce cooling costs on other platforms, which could add up to millions in energy savings. Google also hopes the initiative will lower its energy consumption in the long run, reducing the company’s carbon footprint in the process.
For you, the business owner, it might mean lower cloud costs in the long run (if we’re lucky).