AI has a fast-growing role to play in enhancing climate resilience and improving energy efficiency, but those benefits come at a significant cost in energy consumption and e-waste. Martin Allen-Smith evaluates the trade-offs
Generative AI is rapidly transforming the way we work and live, at a scale comparable to the emergence of the internet. Among its many applications, its potential to help us tackle some of the biggest climate change challenges is one of the most salient.
Responses to the 2024 cycle of the S&P Global Corporate Sustainability Assessment included examples such as a European bank using AI tools to analyse the environmental and climate risks associated with its loans and investments; a Latin American bank using AI to identify native forest areas with high carbon capture potential; a chemicals company using AI to help track compliance with environmental rules; and an energy company that built an AI platform to optimise water use with the goal of achieving water neutrality.
As with anything that sounds too good to be true, there is a flip side. AI’s progress is also heavily reliant on rapid technological upgrades, to chips as well as to the wider hardware infrastructure, and the refresh cycles needed to keep pace with the technology’s growth could compound existing e-waste problems if waste-reduction measures are not managed carefully. A study by researchers from the Chinese Academy of Sciences and Reichman University in Israel warns that this could generate between 1.2 million and 5 million metric tons of additional electronic waste by the end of this decade.
“There is still much we don’t know about the environmental impact of AI but some of the data we do have is concerning,” says Golestan Radwan, chief digital officer of the United Nations Environment Programme. “We need to make sure the net effect of AI on the planet is positive before we deploy the technology at scale.”
A recent UNEP report, Navigating New Horizons, examined the promises and perils of AI and highlighted a number of startling statistics. The datacentres on which large AI deployments rely take a heavy toll in materials: a 2kg computer requires around 800kg of raw materials. The centres also use water during construction and, once operational, to cool electrical components. According to one estimate from academics at the University of California and the University of Texas at Arlington, AI-related infrastructure and datacentres may soon consume six times more water annually than Denmark, a country of six million people.
A report by the International Energy Agency suggests that a request made through ChatGPT consumes up to 10 times the electricity of a Google search. While global data is sparse, the agency estimates that in the tech hub of Ireland, the rise of AI could see datacentres account for nearly 35 per cent of the country’s electricity use by 2026.
Driven in part by the explosion of AI, the number of datacentres has surged, and experts expect the technology’s demands on the planet to keep growing. Analysis by Deloitte predicts that datacentre electricity demand could reach 1,000 terawatt-hours a year by 2030, potentially climbing to 2,000 TWh by 2050. The 2030 figure alone would account for around three per cent of global electricity consumption, faster growth than in other emerging uses such as electric vehicles and green hydrogen production.
Even so, AI can be a powerful tool for climate resilience and energy efficiency, helping to speed up problem-solving, enhance understanding of our changing environment, and manage increasingly complex energy systems.
The UAE used IBM’s geospatial foundation model to identify urban ‘heat islands’ and make design decisions that lowered the temperature in those areas by three degrees Celsius. In early 2025, IBM also announced that it will build an AI model in collaboration with L’Oreal that will help uncover new, more sustainable formulas for beauty, skincare and hair products.
Christina Shim, chief sustainability officer at IBM, says: “To achieve these types of things in the most sustainable and cost-effective way, there are options available across the full stack of AI.
“Organisations should use foundation models and tune the smallest ones possible to meet their specific needs. They should be intentional about where they locate their processing, using tools to minimise extra ‘headroom’ and running workloads on renewable power sources whenever possible. They should choose infrastructure designed specifically to run AI, which can dramatically improve performance.”
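Shim’s point about right-sizing can be made concrete with a back-of-the-envelope comparison. The sketch below is purely illustrative, not IBM’s methodology: the per-request energy figures and request volume are hypothetical assumptions, but they show how tuning a smaller model for a specific workload changes the daily energy bill.

```python
# Illustrative sketch only: the per-request energy figures and request
# volume below are hypothetical assumptions, not measured vendor data.

MODELS_WH_PER_REQUEST = {
    "large_general_model": 3.0,   # assumed watt-hours per request
    "small_tuned_model": 0.3,     # assumed watt-hours per request
}

REQUESTS_PER_DAY = 1_000_000      # assumed daily workload


def daily_energy_kwh(wh_per_request: float, requests: int) -> float:
    """Convert per-request energy (Wh) into a daily total in kWh."""
    return wh_per_request * requests / 1_000


for name, wh in MODELS_WH_PER_REQUEST.items():
    print(f"{name}: {daily_energy_kwh(wh, REQUESTS_PER_DAY):,.0f} kWh/day")
```

Under these assumed figures, the smaller tuned model serves the same million requests for a tenth of the energy; the real saving depends on the workload and hardware, but the direction of the trade-off is the point.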
Data from Capgemini suggests that one-third of organisations are already using GenAI for sustainability initiatives. The same report found that 28 per cent of the insurance organisations surveyed have done so.
Satish Weber, head of sustainability, financial services at Capgemini, says: “In spite of its environmental footprint, GenAI can be used to accelerate sustainability objectives by improving ESG reporting and scenario planning, material optimisation and sustainable/circular product design. This, coupled with AI’s ability to drive efficiency, can have a significant impact on organisations’ energy usage and environmental footprint. As investors and consumers push for more comprehensive social, biodiversity and environmental disclosures, firms must ensure the integration of sustainability across every facet of their operations.”
However, Weber adds: “Organisations need to establish robust mechanisms for measuring and governing AI usage, including sustainable practices throughout its lifecycle. This would require making smarter choices about AI hardware, model architecture and energy sources for datacentres, as well as implementing sustainable usage policies.”
Measuring the environmental impact of AI is complex and multifaceted, but efforts are under way to develop standards for sustainable AI that will include methods for measuring energy efficiency, raw material use and water consumption, as well as practices for reducing AI’s impacts throughout its lifecycle.
IBM’s Shim notes: “Fortunately, with AI, sustainability is just good economics. It’s clear that real value comes from applying AI to important problems in efficient, targeted and cost-effective ways. There is a top line (business outcomes) and a bottom line (costs). Over the long term, it’s in everyone’s interests to use AI responsibly.”
Businesses should also consider using edge computing devices to reduce the energy associated with data transfer and distribution, and selecting cloud providers with energy-efficient datacentres. The location of those datacentres matters too: choosing a country where clean energy is readily available reduces the reliance on fossil fuels to power them.
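As a rough illustration of the location point, operational emissions scale with the carbon intensity of the local grid. The figures below are hypothetical placeholders rather than published data, but the arithmetic, energy used multiplied by grid intensity, is how such comparisons are typically made.

```python
# Placeholder grid carbon intensities (kg CO2e per kWh); real values
# vary by country, season and even hour of the day.
GRID_INTENSITY_KG_PER_KWH = {
    "mostly_renewable_grid": 0.05,
    "average_mixed_grid": 0.30,
    "fossil_heavy_grid": 0.70,
}

WORKLOAD_KWH_PER_MONTH = 50_000   # assumed monthly datacentre consumption


def monthly_emissions_tonnes(kwh: float, intensity: float) -> float:
    """Operational CO2e in tonnes: energy used multiplied by grid intensity."""
    return kwh * intensity / 1_000


for grid, intensity in GRID_INTENSITY_KG_PER_KWH.items():
    tonnes = monthly_emissions_tonnes(WORKLOAD_KWH_PER_MONTH, intensity)
    print(f"{grid}: {tonnes:.1f} tCO2e/month")
```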
Given all of these variables, how can companies transparently measure the environmental footprint of their AI initiatives? And what role do ethical considerations play in ensuring AI deployment aligns with global climate goals? Capgemini’s Weber says: “Our latest data shows that only 12 per cent of organisations are currently measuring the environmental footprint of their GenAI usage. While there are a lot of challenges around ensuring the transparency and accuracy of environmental metrics and analysis, accurate measurement, monitoring and tracking are paramount.”
She adds that, to achieve this, businesses should look at CO2 emissions across the full AI lifecycle, from tracking and quantifying the carbon footprint of GenAI applications to their datacentre infrastructure, hardware devices and AI model architecture. “Our recently launched sustainability trends book reveals that global regulations and directors’ and officers’ liability are increasing, while ESG integration reduces financial volatility. By 2025, 71 per cent of investors will incorporate ESG into their portfolios, and investors affirm that ESG factors contribute to more resilient portfolios with high returns.
“Organisations should also communicate their sustainability intentions clearly to stakeholders, disclosing emissions levels, detailing progress transparently and setting specific goals for improvement.”
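The kind of full-lifecycle tally Weber describes can be sketched in a few lines. Everything below is an assumed placeholder rather than Capgemini data; a real assessment would use metered energy, the facility’s actual power usage effectiveness (PUE), location-specific grid carbon intensity and supplier figures for embodied hardware emissions.

```python
# Minimal lifecycle sketch. All figures are assumed placeholders, not
# measured values from any organisation named in this article.

PUE = 1.4                       # assumed datacentre power usage effectiveness
GRID_KG_CO2E_PER_KWH = 0.35     # assumed grid carbon intensity


def operational_co2e_kg(it_energy_kwh: float) -> float:
    """Scale IT energy by PUE, then by grid carbon intensity."""
    return it_energy_kwh * PUE * GRID_KG_CO2E_PER_KWH


lifecycle_kg = {
    "model_training": operational_co2e_kg(120_000),      # assumed kWh
    "one_year_inference": operational_co2e_kg(400_000),  # assumed kWh
    "embodied_hardware": 25_000,                          # assumed kg CO2e from manufacturing
}

for stage, kg in lifecycle_kg.items():
    print(f"{stage}: {kg / 1_000:.1f} tCO2e")
print(f"total: {sum(lifecycle_kg.values()) / 1_000:.1f} tCO2e")
```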
Given the rapid pace of AI development, could future technology enhancements reduce its energy demands and align it more closely with global climate goals? Initiatives such as using small language models, which are trained on smaller, more specific datasets, along with energy-efficient hardware and code, can help reduce energy consumption.
“GenAI excels in data extraction, summarisation and processing, giving organisations an edge in ESG reporting by generating insightful content for human consumption,” Weber adds.
“Opting for green datacentres and optimised cooling systems is another way to promote sustainability. Emerging sustainable hardware solutions such as neuromorphic computers, which process information using artificial neurons and synapses, or new models of analogue computing can significantly reduce the energy requirements of GenAI applications. These solutions can also bring significant cost savings, so there is a strong business case for exploring more sustainable IT options beyond the clear environmental benefits.”
This article was published in the Q2 2025 issue of CIR Magazine.