As the world becomes increasingly digital, the growing demand for data centres is raising challenges for service providers, energy companies and the environment. In this post, we look at the impact of data centre expansion and explore how AI-driven energy savings are enabling data centres to overcome these challenges.
Growing energy consumption
Data centres use an enormous amount of energy. In 2022, global data centre electricity consumption was estimated to be 240-340 TWh, equivalent to around 1-1.3% of global electricity demand.
The rapid rise in AI adoption is set to increase demand substantially, as AI tools use far more energy than their non-AI counterparts. According to the International Energy Agency, a Google search uses 0.3 watt-hours of electricity, while a ChatGPT query requires 2.9 watt-hours, almost ten times as much. The impact of this is considerable. In the UK, data centre power consumption is expected to increase fourfold by 2030, while Goldman Sachs claims US data centres will consume 8% of the country’s total electricity by 2030, up from just 3% in 2022.
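To put those per-query figures in perspective, here is a rough back-of-envelope calculation in Python. The per-query energy values are the IEA estimates quoted above; the daily query volume is a purely hypothetical assumption used for illustration.

```python
# Back-of-envelope comparison of per-query energy, based on the IEA
# figures quoted above. The daily query volume is a hypothetical
# assumption used purely for illustration.

GOOGLE_SEARCH_WH = 0.3          # watt-hours per Google search (IEA estimate)
CHATGPT_QUERY_WH = 2.9          # watt-hours per ChatGPT query (IEA estimate)
QUERIES_PER_DAY = 100_000_000   # hypothetical daily query volume

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
extra_wh_per_day = (CHATGPT_QUERY_WH - GOOGLE_SEARCH_WH) * QUERIES_PER_DAY
extra_mwh_per_day = extra_wh_per_day / 1_000_000   # Wh -> MWh

print(f"A ChatGPT query uses roughly {ratio:.1f}x the energy of a search")
print(f"At {QUERIES_PER_DAY:,} queries/day, that is ~{extra_mwh_per_day:,.0f} MWh/day extra")
```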
Is your website energy-efficient? Read: How to Give Your Website an Eco-Friendly Digital Footprint
Issues for the industry
Growing consumption affects operational costs, energy infrastructure and the environment. On the cost side, running AI workloads requires considerably more energy, so providers’ operating costs will spiral unless they find new ways to cut usage.
Another pressing issue is energy capacity. The anticipated growth in data centre energy consumption will push already struggling national grids to the limit. Goldman Sachs reports that utility companies in the US will need around $50 billion of investment in electricity generation if capacity is to meet data centre power demand.
Nuclear power is likely to play a key role going forward. The UK government is considering modular nuclear reactors for powering new data centre developments, while Google, Microsoft and Meta are taking a similar approach in the US.
With modular nuclear supply some years away, providers remain reliant on existing energy supplies. Though renewable sources are increasing, continued reliance on fossil fuels means an inevitable rise in CO2 emissions. According to Morgan Stanley, the boom in data centres is expected to produce about 2.5 billion metric tons of CO2 per year by 2030, more than double the amount produced today.
Are you taking advantage of AI? Read: Generative AI: What is it and How Can it Benefit Website Owners?
Hardware solutions
To address the issues caused by increased data centre power consumption, service providers are making use of new technologies. One area where this is bringing energy-saving benefits is hardware.
The key processors for AI are GPUs (Graphics Processing Units), which handle machine learning and deep learning workloads far better than CPUs. Advances in GPU design, such as higher core counts, enhanced memory management and improved power management, enable these chips to perform faster while consuming less energy for the same work.
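As a rough illustration of why this matters, the sketch below times the same matrix multiplication on a CPU and a GPU. It assumes the optional PyTorch package and a CUDA-capable GPU are available; the actual figures, and the energy used, vary widely with hardware and workload.

```python
# Minimal sketch: timing the same matrix multiplication on CPU and GPU.
# Assumes the optional 'torch' package and a CUDA-capable GPU are present;
# results vary widely with hardware and workload.
import time
import torch

def time_matmul(device: str, size: int = 4096, repeats: int = 10) -> float:
    a = torch.randn(size, size, device=device)
    b = torch.randn(size, size, device=device)
    if device == "cuda":
        torch.cuda.synchronize()          # ensure setup has finished
    start = time.perf_counter()
    for _ in range(repeats):
        _ = a @ b
    if device == "cuda":
        torch.cuda.synchronize()          # wait for GPU work to complete
    return (time.perf_counter() - start) / repeats

print(f"CPU: {time_matmul('cpu'):.3f}s per multiply")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.3f}s per multiply")
```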
The latest drives are also more efficient. SATA SSDs consume less than half the energy of HDDs, while NVMe SSDs use only a third of the energy of SATA models, meaning an NVMe drive can use as little as roughly a sixth of the energy of a comparable HDD. Together, these technologies have made today’s hardware significantly more energy efficient.
Hardware acceleration
In addition to better hardware, data centre providers are deploying optimisation technology that speeds up big data processing and further cuts energy use. Many providers use Apache Spark, an open-source framework for processing the huge datasets needed for AI and machine learning.
Accelerators for Apache Spark boost its performance, enabling it to run workloads more efficiently. According to NVIDIA, which developed the RAPIDS Accelerator for Apache Spark, it can reduce the carbon footprint of data analytics by up to 80%, speed up processing by 500% and cut computing costs by 75%.
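As a sketch of how a provider might switch the accelerator on for an existing PySpark workload, the configuration below loads NVIDIA's plugin when the Spark session is built. The jar path, version number and GPU resource settings are illustrative assumptions; the exact values depend on the cluster and on NVIDIA's current documentation.

```python
# Sketch: enabling the RAPIDS Accelerator when building a PySpark session.
# The jar path, version and resource figures below are illustrative
# assumptions; actual values depend on your cluster and GPU hardware.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("gpu-accelerated-analytics")
    # Load the RAPIDS Accelerator plugin so SQL/DataFrame work can run on GPUs
    .config("spark.plugins", "com.nvidia.spark.SQLPlugin")
    # Path to the RAPIDS Accelerator jar (hypothetical location and version)
    .config("spark.jars", "/opt/sparkRapidsPlugin/rapids-4-spark_2.12-24.04.0.jar")
    # Tell Spark each executor has one GPU and let tasks share it
    .config("spark.executor.resource.gpu.amount", "1")
    .config("spark.task.resource.gpu.amount", "0.25")
    .getOrCreate()
)

# Ordinary DataFrame operations are offloaded to the GPU where the plugin
# supports them, falling back to the CPU otherwise.
df = spark.range(0, 100_000_000).selectExpr("id % 1000 AS key", "id AS value")
df.groupBy("key").sum("value").show(5)
```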
The role of AI
While the technologies above help reduce the energy demand of AI, AI itself has a role to play in increasing data centre efficiency. Today, service providers are utilising AI-powered management systems that optimise energy usage in several ways:
- Cooling system optimisation: Cooling accounts for around 40% of data centre energy use. Alongside more efficient cooling technologies, such as liquid cooling, AI uses real-time data to adjust cooling systems and cut unnecessary consumption.
- Energy usage: AI-enabled power management tools monitor energy consumption in real time, identifying where efficiencies can be made. AI can also predict energy demand by analysing patterns in workload fluctuations, enabling resources to be allocated more efficiently.
- Server performance: Machine learning optimises server performance by adjusting workloads dynamically in real time, while energy is saved by putting idle servers into sleep mode and matching performance to workload demand.
- Consolidation and right-sizing: AI consolidates the workloads of underutilised servers onto fewer machines, while right-sizing matches the resources allocated to each workload to what it actually needs. These measures prevent over-provisioning and increase energy efficiency; a simplified sketch of the consolidation logic follows this list.
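To make the consolidation idea concrete, here is a deliberately simplified, rule-based sketch in Python. Real platforms rely on machine-learning demand forecasts and live telemetry; the thresholds, server names and utilisation figures below are entirely hypothetical.

```python
# Simplified sketch of workload consolidation: pack the workloads of
# underutilised servers onto fewer machines and flag the emptied servers
# for sleep mode. All thresholds and utilisation figures are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Server:
    name: str
    capacity: float = 100.0                       # arbitrary utilisation units
    workloads: list[float] = field(default_factory=list)

    @property
    def load(self) -> float:
        return sum(self.workloads)

def consolidate(servers: list[Server], low_watermark: float = 30.0) -> list[str]:
    """Move workloads off lightly loaded servers; return names of servers to sleep."""
    donors = [s for s in servers if 0 < s.load < low_watermark]
    targets = sorted((s for s in servers if s not in donors),
                     key=lambda s: s.load, reverse=True)
    to_sleep = []
    for donor in donors:
        for wl in list(donor.workloads):
            # Place the workload on the busiest server that still has headroom
            target = next((t for t in targets if t.load + wl <= t.capacity), None)
            if target:
                target.workloads.append(wl)
                donor.workloads.remove(wl)
        if donor.load == 0:
            to_sleep.append(donor.name)
    return to_sleep

fleet = [Server("web-01", workloads=[55.0]),
         Server("web-02", workloads=[10.0]),
         Server("web-03", workloads=[15.0])]
print("Sleep candidates:", consolidate(fleet))    # e.g. ['web-02', 'web-03']
```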
Is your business expanding? Read: Virtual Servers: The Scalable Solution for Growing Businesses
Conclusion
While the huge demand for AI is significantly increasing data centre power consumption, AI, together with the latest hardware and optimisation technologies, is helping to make data centres far more energy efficient. This mitigates the impact on costs, power grids and the environment. As more efficient technology evolves and cleaner energy supplies become more available, the negative effects of AI adoption could eventually be all but eliminated.
Looking for managed cloud hosting with a web host committed to net zero and energy efficiency? At Webhosting UK, we utilise the latest technologies to deliver greener data centres, enhanced performance and robust security. For more information, visit our Enterprise Cloud Hosting and Sustainability pages.