The Hidden Cost of AI: From Algorithms to Energy
From automation to art, AI is revolutionising the way we live and work. Its ability to analyse massive datasets, identify patterns and make predictions is propelling us into the future at a lightning-fast rate. But as we marvel at the capabilities of algorithms and the exciting applications AI allows, it is essential for us to look beyond the digital curtain and examine the environmental impact of our AI-driven world.
AI’s thirst for power
AI has a seemingly limitless appetite for electricity. Training and operating AI models, especially deep learning models, requires immense computational power. This means that data centres, packed with servers and high-performance GPUs, are churning away 24/7, consuming staggering amounts of energy.
A recent peer-reviewed study published in Joule offers one of the first attempts to quantify AI's fast-growing energy demand. If current trends in AI capacity and adoption continue, NVIDIA is projected to ship 1.5 million AI server units per year by 2027.
Running at full capacity, those 1.5 million servers would consume at least 85.4 terawatt-hours of electricity a year, which exceeds the annual energy consumption of many smaller countries, according to the analysis.
To put that in context, researcher Alex de Vries estimates in Joule that running a chatbot like ChatGPT could consume 564 MWh of electricity a day.
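A quick back-of-the-envelope check shows these figures are mutually consistent. The inputs below are the article's own numbers; the per-server wattage is derived from them, not sourced, and happens to land near the power draw of a typical high-end AI server.

```python
# Sanity-check the study's figures using only the numbers quoted above.
servers = 1.5e6           # projected annual NVIDIA AI server shipments by 2027
annual_twh = 85.4         # projected consumption at full capacity, TWh/year
hours_per_year = 24 * 365

# Implied continuous draw per server (derived, not a sourced figure)
watts_per_server = annual_twh * 1e12 / (servers * hours_per_year)
print(f"{watts_per_server / 1000:.1f} kW per server")  # about 6.5 kW

# The ChatGPT estimate, scaled to a year for comparison
chatgpt_mwh_per_day = 564
annual_gwh = chatgpt_mwh_per_day * 365 / 1000
print(f"{annual_gwh:.0f} GWh per year")  # about 206 GWh
```

Roughly 6.5 kW of continuous draw per server is plausible for a multi-GPU AI system, which suggests the headline terawatt-hour figure follows directly from the shipment projection.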
None of this is good news for the environment.
What is the AI environmental impact?
Unless it is generated from renewable sources like wind or solar, electricity production creates CO2, the primary greenhouse gas driving climate change. AI's environmental impact isn't limited to its electricity consumption either. Experts believe that electricity accounts for only around 10% of a data centre's CO2 emissions.
The manufacturing of specialised hardware components and cooling systems for data centres also requires substantial resources. Extracting and processing the minerals and materials used in these components further contributes to the ecological footprint.
Moreover, the heat generated by data centres necessitates cooling solutions, leading to even more energy consumption.
It’s a self-perpetuating cycle: as AI models grow larger and more complex, they require more computational power, which in turn escalates their energy demand.
What can be done to lower AI environmental impact?
Recognising the environmental costs of AI, the tech industry is beginning to address the issue. Efforts are under way to develop more energy-efficient algorithms and hardware.
The MIT Lincoln Laboratory Supercomputing Center (LLSC) is developing methods to reduce energy consumption in data centres. These range from simple but effective changes, such as power-capping hardware, to tools that can stop AI training runs early when further improvement is unlikely.
Importantly, the LLSC’s findings indicate that these techniques exert only a negligible influence on model performance. By optimising AI models to perform equally well with fewer computational resources, the industry can reduce its environmental impact.
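The early-stopping idea can be sketched in a few lines. This is an illustrative simplification, not the LLSC's actual tooling: training halts once validation loss has failed to improve for a set number of epochs, saving the energy the remaining epochs would have burned. (Power-capping, the other technique mentioned, is typically applied at the driver level, e.g. via NVIDIA's `nvidia-smi --power-limit` option, rather than in training code.)

```python
def train_with_early_stopping(step, max_epochs, patience=3):
    """Run step(epoch), which trains one epoch and returns validation
    loss, until the loss fails to improve for `patience` epochs.
    Returns the number of epochs actually spent."""
    best = float("inf")
    stale = 0  # consecutive epochs without improvement
    for epoch in range(max_epochs):
        loss = step(epoch)
        if loss < best:
            best, stale = loss, 0
        else:
            stale += 1
        if stale >= patience:
            return epoch + 1  # stop early, saving the remaining epochs
    return max_epochs

# Simulated run: validation loss plateaus after epoch 4,
# so training halts well before the full budget is used.
losses = [1.0, 0.6, 0.4, 0.35, 0.34, 0.34, 0.35, 0.34, 0.36, 0.34]
spent = train_with_early_stopping(lambda e: losses[e], max_epochs=10)
print(spent)  # 8 of the 10 budgeted epochs
```

In this simulated run, two of ten budgeted epochs are skipped; on large models, where each epoch can consume megawatt-hours, the savings from stopping hopeless runs early scale accordingly.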
In addition to creating greener AI models, there are initiatives to power data centres with renewable energy sources, such as wind and solar power. The largest data centres globally are operated by Google Cloud, Microsoft Azure, and Amazon Web Services (AWS). Each of these companies has set targets to power their data centres entirely with renewable energy, aiming to achieve this goal between 2025 and 2030.