The rise of energy-efficient data centers
We are in the midst of an era of sweeping digital transformation. From the way we order food to the way apps are built -- almost everything has a digital touch. With every device now connected, demand for the underlying IT infrastructure has risen as well. Gartner, for instance, predicts that close to 20 billion IoT devices will be deployed by 2020. Not surprisingly, data centers are being built everywhere. A recent report by research firm Technavio says the global data center market is poised to grow by USD 284.44 billion during 2019-2023, progressing at a CAGR of more than 17% over the forecast period.
However, even as data centers are built aggressively, optimizing their power use remains a high-priority area for data center managers. For high-power-density servers, Gartner estimates that ongoing power costs are increasing by at least 10% per year, driven by rising cost per kilowatt-hour (kWh) and underlying demand. Gartner also estimates that power typically accounts for 10% of data center OPEX, a share that will only increase. The European Union believes that digital industries account for five to nine percent of the world's electricity use.
Data center operators therefore have a natural incentive to save energy. Currently, data centers account for three percent of the world's total power consumption. To reduce this, many leading data centers are adopting a host of energy-saving mechanisms to improve their efficiency.
These include methods such as decommissioning inactive servers, consolidating servers, replacing old, energy-inefficient IT infrastructure with new, optimized equipment, and managing airflow better. This matters because close to 25% of data center power is believed to go into cooling. If a company makes the right investments in cooling systems and in renewable sources of energy, there is a direct impact on profitability.
From Amazon to Google to Microsoft, all the leading hyperscale data center players have relentlessly pursued practices to squeeze energy savings out of their data centers. For example, Google's data centers today are almost twice as efficient as a typical enterprise data center. For the same electrical power, Google can deliver approximately seven times the computing power it could five years ago. Google also recently announced 18 new energy deals that will together result in 1,600 MW of power generated from renewable sources. The biggest player in the cloud space, Amazon Web Services, has announced plans to power all Amazon servers with 100% renewable energy, and has constructed several solar and wind farms. Its founder, Jeff Bezos, announced the creation of the Bezos Earth Fund, with an initial commitment of USD 10 billion. Similarly, Microsoft's current data centers run on 60% renewable electricity, and the company plans to raise this to 70% by 2023. In India, we are continuing to invest in renewable sources of energy, including wind and solar power plants. Last year, we partnered with Tata Power to build our first 50 MW solar power project in Solapur, Maharashtra.
The AI imperative
A data center comprises many interacting components, which makes energy optimization a complex activity. An optimal cooling strategy has to account for the ventilation system, the cooling towers, the chillers and more. A cooling system typically adjusts multiple temperature control variables to meet desired setpoints, and those variables shift dynamically with environmental conditions and workload. Because most cooling systems are sized and tuned for peak load, they overcool for much of the time, which drives up power costs and wastes enormous amounts of energy. Manual, human-driven tuning has typically proved inadequate and inefficient.
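To make the overcooling problem concrete, here is a toy back-of-the-envelope model, not drawn from the article: all load figures and the coefficient of performance (COP) are invented for illustration. It compares a cooling system that always runs as if the IT load were at its daily peak against one that tracks the actual hourly load.

```python
# Illustrative toy model: cooling energy when tuned for peak load vs.
# when following the actual IT load. All numbers are invented.

def cooling_energy_kwh(it_load_kw, cop=3.0):
    """Electricity needed to remove one hour of IT heat at a given
    coefficient of performance (COP). COP of 3.0 is an assumption."""
    return it_load_kw / cop

# Hypothetical hourly IT load (kW) over one day: quiet overnight, busy midday.
hourly_load = [220, 200, 190, 190, 200, 250, 320, 400,
               480, 540, 580, 600, 600, 590, 560, 520,
               470, 420, 380, 340, 300, 270, 250, 230]

peak = max(hourly_load)

# Peak-tuned scheme: cooling always runs as if the load were at its peak.
static_energy = sum(cooling_energy_kwh(peak) for _ in hourly_load)

# Load-following scheme: cooling tracks the actual load each hour.
dynamic_energy = sum(cooling_energy_kwh(load) for load in hourly_load)

waste_pct = 100 * (static_energy - dynamic_energy) / static_energy
print(f"peak-tuned cooling:     {static_energy:.0f} kWh/day")
print(f"load-following cooling: {dynamic_energy:.0f} kWh/day")
print(f"energy wasted by peak tuning: {waste_pct:.0f}%")
```

Even this crude sketch shows peak-tuned cooling wasting roughly a third of its energy on this invented load profile; real systems are far more complex, which is precisely why static tuning falls short.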
This is where AI can help. An AI-based system continually learns and re-learns from the massive volumes of energy-consumption data collected by a huge number of sensors. It then builds a model that seeks the most energy-efficient operating regime. A case in point is Google, which has used AI to improve the power efficiency of its data centers. Every five minutes, a cloud-based AI pulls a snapshot of the data center cooling system from thousands of sensors and feeds it into Google's deep neural networks, which predict how different combinations of potential actions will affect future energy consumption. The AI system then identifies which actions will minimize energy consumption while satisfying a robust set of safety constraints.
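The control loop described above can be sketched in a few lines. To be clear, this is not Google's actual system: the candidate actions, the linear "model" standing in for a trained neural network, and the safety rule are all invented for illustration.

```python
# Simplified sketch of the loop described above: snapshot the sensors,
# predict the energy impact of candidate actions, drop unsafe ones,
# and apply the cheapest remaining action. All values are invented.

CANDIDATE_ACTIONS = [  # hypothetical setpoint combinations
    {"chiller_setpoint_c": 18, "fan_speed_pct": 60},
    {"chiller_setpoint_c": 20, "fan_speed_pct": 50},
    {"chiller_setpoint_c": 22, "fan_speed_pct": 40},
    {"chiller_setpoint_c": 24, "fan_speed_pct": 30},
]

def predict_energy_kw(snapshot, action):
    """Stand-in for the trained model: a toy linear rule saying colder
    setpoints and faster fans cost more energy."""
    base = snapshot["it_load_kw"] * 0.3
    return (base
            + (26 - action["chiller_setpoint_c"]) * 5
            + action["fan_speed_pct"] * 0.8)

def is_safe(snapshot, action):
    """Toy safety constraint: the server inlet temperature implied by
    the action must stay below an assumed 27 C limit."""
    implied_inlet = snapshot["ambient_c"] - (26 - action["chiller_setpoint_c"]) * 0.5
    return implied_inlet <= 27

def choose_action(snapshot):
    safe = [a for a in CANDIDATE_ACTIONS if is_safe(snapshot, a)]
    return min(safe, key=lambda a: predict_energy_kw(snapshot, a))

snapshot = {"it_load_kw": 500, "ambient_c": 29}  # invented sensor snapshot
best = choose_action(snapshot)
print("chosen action:", best)
```

Note the ordering: safety constraints filter the action set first, and only then is energy minimized over what remains, so the optimizer can never trade safety for savings. A production system would use a learned model and far richer sensor data in place of these toy functions.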
With energy costs only set to rise, it is imperative for data center owners to start using AI to optimize energy usage. Failure to do so may cost them heavily. Gartner, for instance, predicts that more than 30% of data centers that fail to prepare for AI will no longer be economical to operate by 2020.
As more data centers are built, cooling methods need to become more intelligent, as the increase in power requirements and heat generated will prove challenging for data center players. Fortunately, they are rising to the occasion, using emerging technologies intelligently to improve utilization rates. For example, a recent report by the Lawrence Berkeley National Laboratory states that improved energy efficiency is almost canceling out capacity growth. In 2014, US data centers consumed 70 billion kilowatt-hours. The report states that if energy efficiency had remained at 2010 levels, data centers would today consume 160 billion kilowatt-hours; the estimate for 2020 is only 73 billion kilowatt-hours, thanks to the improved efficiency processes and sustainability practices followed by most big hyperscalers and private data center players such as NTT.
As one looks at the future, it is imperative that the clouds truly have a 'green' lining!