Industry leaders like OpenAI, DeepSeek and Meta should be more transparent about the environmental impacts of AI, said experts at King’s College London’s (KCL) Festival of Artificial Intelligence.
While numbers vary depending on AI type and usage, the data centres that power AI consume vast amounts of electricity and water.
Cornell University found that OpenAI’s 700 million daily GPT-4o queries use enough energy to power 35,000 American homes, while an MIT review suggests that by 2028 AI could consume as much electricity annually as 22 per cent of all US households.
Daniel Summerbell, co-founder of CarbonRe, a company that uses AI to streamline the production of industrial materials, voiced his concerns about “how transparent AI companies are being about their energy costs”. He explained that the available figures are only estimates, because the leading firms, whose energy-intensive Large Language Models (LLMs) dominate the field, rarely publish the information.
Summerbell said CarbonRe’s investors require the company to track its climate footprint, so he knows it produces 100 tons of carbon emissions a year, and that its AI models – designed to improve efficiency at cement plants – save 10,000 tons. He said these calculations were necessary “so we can be sure we’re making the appropriate tradeoff”.
For Catherine Tilley, Impact Director at the Centre for Sustainable Business at KCL, leading companies could help people use AI more sensibly by being more transparent. “If I start using an LLM, there’s no indication of the potential environmental impacts,” she said.
Summerbell agreed. “We all end up in our own little bubbles,” he said, referring to people casually using AI “without questioning whether sharing pictures of themselves as an action figure was worth it”.
He added, however, that increased transparency should not come with so much red tape that it makes it hard for businesses, especially smaller ones, to operate.
In an essay last year, OpenAI CEO Sam Altman argued that AI’s accelerating capabilities will usher in an idyllic “Intelligence Age”, with the potential for triumphs like “fixing the climate”.
Still, Tilley said that “businesses should take more responsibility for their ethical decisions,” and pointed out that increasing transparency could force developers to think more critically and improve AI models.
She said that if a few pioneering companies started publishing their data, it could encourage others to follow suit until disclosure is regulated.
The KCL festival ran from 20-24 May, exploring the latest developments in AI.