Artificial intelligence is growing faster than the world’s power grids can keep pace with it. Every search, every generated image, every chatbot conversation draws from a deepening well of electricity, water, and rare materials. Now UNESCO has stepped forward with a concrete, research-backed response: a toolkit of practical strategies that could cut the energy consumed by AI systems by up to 90% without sacrificing performance. It’s a bold claim, and the science behind it is harder to dismiss than most green pledges.
The Scale of AI’s Energy Problem

A report jointly released by UNESCO and University College London warns that the energy demands of artificial intelligence, especially large language models, have reached unsustainable levels. The numbers are staggering. Research estimates that a single prompt to a generative AI model uses around 0.34 watt-hours. Multiplied across daily usage by more than a billion people, global demand balloons to an estimated 310 gigawatt-hours annually – roughly the electricity consumed by over 3 million people in a low-income country. That figure is not a forecast; it describes usage happening today.
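Those magnitudes are easier to evaluate with a quick back-of-envelope calculation. The sketch below takes the report’s per-prompt and annual figures at face value; note that the 310 GWh total implies roughly 2.5 billion prompts per day – several prompts per user – which is an inference from the stated numbers, not a figure in the report.

```python
# Back-of-envelope check on the reported figures (illustrative only).
# The 0.34 Wh-per-prompt and 310 GWh-per-year numbers are taken from
# the report as stated; the implied prompt volume is an inference.

WH_PER_PROMPT = 0.34      # reported average energy per generative-AI prompt
ANNUAL_TOTAL_GWH = 310    # reported annual total
DAYS_PER_YEAR = 365

# One prompt per day from 1 billion users:
gwh_per_year = 1e9 * WH_PER_PROMPT * DAYS_PER_YEAR / 1e9   # Wh -> GWh
print(f"1B single daily prompts: {gwh_per_year:.0f} GWh/year")        # ~124

# Daily prompt volume implied by the 310 GWh annual figure:
implied_daily = ANNUAL_TOTAL_GWH * 1e9 / DAYS_PER_YEAR / WH_PER_PROMPT
print(f"implied volume: {implied_daily / 1e9:.1f} billion prompts/day")  # ~2.5
```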
Data centers consumed an estimated 415 terawatt-hours of electricity in 2024 – about 1.5% of global electricity consumption – and their demand has grown by roughly 12% per year over the past five years. The trajectory is steep: global data-center consumption is projected to nearly double to around 945 TWh by 2030, growing at about 15% per year – more than four times faster than total electricity consumption from all other sectors. UNESCO has cautioned that the energy demand from generative AI is doubling every 100 days, straining global energy systems, water resources, and supplies of rare minerals.
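The near-doubling projection is consistent with simple compounding of the stated growth rate. Here is a minimal check, assuming the roughly 15% annual rate applies uniformly from 2024 to 2030:

```python
# Compound-growth check on the data-center projection (illustrative).
# Assumption: the ~15% annual growth rate holds uniformly 2024-2030.

base_twh = 415            # estimated 2024 consumption, TWh
annual_growth = 0.15      # reported growth rate for data centers
years = 2030 - 2024

projected_twh = base_twh * (1 + annual_growth) ** years
print(f"2030 at 15%/yr: {projected_twh:.0f} TWh")   # ~960, near the ~945 projected
```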
The UNESCO–UCL Report: “Smarter, Smaller, Stronger”

UNESCO released a groundbreaking report, developed in partnership with University College London, titled “Smarter, Smaller, Stronger: Resource-Efficient AI and the Future of Digital Transformation.” The research shows that small changes in how we build and use large language models can reduce energy consumption by up to 90% without compromising performance. This is not a theoretical document: the findings are based on real experiments and open-source data, and three innovations show that AI energy use can be cut dramatically without sacrificing performance.
Researchers found that three changes – reducing the precision of the numbers used in the models’ internal calculations (quantisation), shortening user instructions and AI responses, and using smaller AI models specialised for particular tasks – could together cut energy use by 90% compared with a large all-purpose model. Professor Ivana Drobnjak of UCL Computer Science, a member of the UNESCO Chair in AI at UCL, put it plainly: “Our research shows that there are relatively simple steps we can take to drastically reduce the energy and resource demands of generative AI, without sacrificing accuracy and without inventing entirely new solutions.”
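It is worth spelling out how the headline figure arises: independent savings compound multiplicatively, not additively. In the sketch below, the 44% and 50% figures come from the report’s quantisation and prompt-shortening results discussed in the next section; the 64% attributed to switching models is an assumed illustrative value, chosen to show how the combination lands near 90%.

```python
# How independent efficiency gains combine multiplicatively (illustrative).
# 0.44 (quantisation) and 0.50 (shorter prompts/responses) are reported
# figures; 0.64 for the model switch is assumed for illustration only.

savings = {
    "quantisation": 0.44,
    "shorter prompts/responses": 0.50,
    "smaller task-specific model": 0.64,  # assumed, not a reported number
}

remaining = 1.0
for technique, s in savings.items():
    remaining *= 1 - s
    print(f"after {technique}: {1 - remaining:.0%} total saving")
# final line: ~90%, matching the report's combined headline figure
```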
Three Technical Pathways to Efficiency

The report identifies three distinct methods that, when combined, produce the headline 90% reduction. The first is using smaller models tailored to specific tasks rather than relying on massive, general-purpose systems for every function: researchers found that these task-specific models can perform just as accurately as larger ones while consuming significantly less energy, with reductions of up to 90% recorded in some cases. The second is compression: techniques such as quantisation make AI models smaller and faster by storing and computing with lower-precision numbers, saving up to 44% in energy without losing accuracy.
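To make the quantisation step concrete, here is a minimal sketch using PyTorch’s post-training dynamic quantisation. The toy two-layer model and the library choice are illustrative assumptions; the report does not prescribe a specific implementation.

```python
# Minimal sketch of post-training dynamic quantisation (PyTorch).
# Linear-layer weights are stored as 8-bit integers instead of 32-bit
# floats, shrinking the model and speeding up inference on CPU.
import io

import torch
import torch.nn as nn

# Stand-in model; in practice this would be a trained transformer.
model = nn.Sequential(nn.Linear(768, 3072), nn.ReLU(), nn.Linear(3072, 768))

quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8   # quantise only the Linear layers
)

def size_mb(m: nn.Module) -> float:
    """Serialised size of a model's weights, in megabytes."""
    buf = io.BytesIO()
    torch.save(m.state_dict(), buf)
    return buf.getbuffer().nbytes / 1e6

print(f"fp32: {size_mb(model):.1f} MB, int8: {size_mb(quantized):.1f} MB")
```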
The third is shorter, more concise prompts and responses. More efficient communication between users and AI systems could cut energy consumption by at least half: simply reducing the length and complexity of interactions produces a substantial drop in energy demand without altering model architecture at all. UNESCO’s findings showed that shortening prompts from 300 to 150 words and switching from a large general-purpose model to a smaller, task-specific one yielded energy savings of nearly 90%, without any loss in output quality. Together, these three approaches form a practical toolkit that anyone, from a developer to an everyday user, can apply.
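As a rough illustration of the prompt-length lever specifically: if inference energy is treated as approximately proportional to the number of tokens processed – a simplifying assumption, as is the placeholder per-token constant below – then halving prompt and response length halves the estimated energy per exchange.

```python
# Illustrative estimate of the prompt-length effect. Assumes energy
# scales roughly linearly with tokens processed; the per-token constant
# is a hypothetical placeholder, not a measured value.

WH_PER_TOKEN = 0.001      # hypothetical per-token cost, illustration only
TOKENS_PER_WORD = 1.3     # rough average for English text

def exchange_energy_wh(prompt_words: int, response_words: int) -> float:
    """Rough energy for one prompt-response exchange."""
    tokens = (prompt_words + response_words) * TOKENS_PER_WORD
    return tokens * WH_PER_TOKEN

long_wh = exchange_energy_wh(300, 300)
short_wh = exchange_energy_wh(150, 150)
print(f"saving from halving: {1 - short_wh / long_wh:.0%}")   # 50%
```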
Industry Response: Smaller Models Gaining Ground

Major tech firms have already begun releasing miniaturised AI models to improve efficiency. Google’s Gemma, Microsoft’s Phi-3, OpenAI’s GPT-4o mini, and Mistral AI’s Ministral are all purpose-built to deliver strong performance at significantly lower energy cost. Still, deployment of efficiency-first strategies remains inconsistent across the industry: approaches that are well established in research remain marginal in commercial deployments.
Participants at UNESCO’s high-level panel discussions acknowledged growing interest in green AI solutions, especially where they create both cost savings and environmental benefits. Leona Verdadero, a Programme Specialist in UNESCO’s Digital Policies and Digital Transformation Section and a co-author of the report, summed up the mismatch: “Too often, users rely on oversized AI models for simple tasks – it’s like using a fire hose to water a house plant.” On the supply side, the IEA reports that renewable energy production for data centers is growing at an average of 22% per year and is expected to cover nearly half of the additional demand by 2030.
The Digital Divide: Who Bears the Cost

The energy crisis of AI is not evenly distributed. The digital divide is deepening: the benefits of generative AI are concentrated in regions with abundant compute, while many in developing countries lack the infrastructure to harness the technology – and 32% of the world’s population, some 2.6 billion people, remained offline altogether in 2025. In Africa, only 5% of AI practitioners have access to the computing resources they need to develop African models, and only a handful of nations in the region host supercomputing infrastructure suited to generative AI applications.
If left unchecked, the energy cost of AI could worsen climate change and deepen digital inequality, especially in countries already grappling with energy poverty. Efficiency is about more than reducing carbon or water use; it is about enabling access, especially for countries and communities that have too often been excluded from AI development. Limited infrastructure, skills, computing power, and governance capacity constrain AI’s potential benefits while amplifying its risks – job displacement, data exclusion, and indirect impacts such as rising global energy and water demand from AI-intensive systems.
Policy Frameworks, Transparency, and the Road to 2030

UNESCO has a mandate to support its 194 member states in their digital transformations, providing them with insights to develop energy-efficient, ethical, and sustainable AI policies. In 2021, the organisation’s member states adopted the UNESCO Recommendation on the Ethics of AI, a governance framework that includes a policy-oriented chapter on AI’s impact on the environment and ecosystems. The new research directly extends that framework. UNESCO also highlighted the environmental and ethical dimensions of artificial intelligence at the Adopt AI Summit on 26 November 2025, urging stronger international cooperation to ensure AI supports climate action rather than undermining it.
Like energy labels on appliances, visible indicators such as efficiency ratings or environmental impact disclosures can help users make informed choices and push developers toward sustainable innovation. Developers and operators should also commit to transparent reporting of energy use, carbon emissions, and water consumption. Reviewing 75 national AI strategies published as of 2025, the World Bank’s Senior Digital Specialist noted that while many address AI inclusion, very few explicitly address AI’s resource footprint. The report calls on governments and businesses to invest in the research and development of more efficient, ethical, and accessible AI, and in user education, so that people become aware of the energy consequences of their digital practices.
