October 10, 2025
OpenAI's newly launched Sora 2 makes AI's environmental impact impossible to ignore
Sadie Harley, scientific editor
Andrew Zinin, lead editor

OpenAI's recent rollout of its new video generator Sora 2 marks a watershed moment in AI. Its ability to generate minutes of hyper-realistic footage from a few lines of text is astonishing, and has raised immediate concerns about truth in politics and journalism.
But Sora 2 is rolling out slowly because of its enormous computational demands, which point to an equally pressing question about generative AI itself: What are its true environmental costs? Will video generation make them much worse?
The recent launch of the Stargate Project—a US$500 billion joint venture between OpenAI, Oracle, SoftBank and MGX—to build massive AI data centers in the United States underscores what's at stake. As companies race to expand computing capacity on this scale, AI's energy use is set to soar.
The debate over AI's environmental impact remains one of the most fraught in tech policy. Depending on what we read, AI is either an ecological crisis in the making or a rounding error in global energy use. As AI moves rapidly into video, clarity on its footprint is more urgent than ever.
Two competing narratives
From one perspective, AI is rapidly becoming a major strain on the world's energy and water systems.
Alex de Vries-Gao, a researcher who has long tracked the electricity use of bitcoin mining, noted in mid-2025 that AI was on track to surpass it. He estimated that AI already accounted for about 20% of global data-center power consumption, a share likely to double by year's end.
According to the International Energy Agency (IEA), data centers accounted for up to 1.5% of global electricity consumption last year, with their consumption growing four times faster than total global demand. The IEA predicts that data centers will more than double their electricity use by 2030, with AI processing the leading driver of growth.
Research cited by MIT Technology Review concurs, estimating that by 2028 AI's power draw could exceed "all electricity currently used by US data centers", enough to power 22% of US households each year.
'Huge' quantities
AI's water use is also striking. Data centers rely on ultra-pure water to keep servers cool and free of impurities. Researchers estimated that training GPT-3 consumed about 700,000 liters of freshwater at Microsoft's US facilities, and they project that global AI water demand could reach four to six billion cubic meters annually by 2027.
Hardware turnover adds further strain. A 2023 study found that chip fabrication requires "huge quantities" of ultra-pure water, energy-intensive chemical processes and rare minerals such as cobalt and tantalum. Manufacturing the high-end graphics processing units that power the AI boom carries a much larger carbon footprint than most consumer electronics.
Generating a single image uses about as much electricity as a microwave running for five seconds, while generating a five-second video clip can use as much as a microwave running for more than an hour.
The next leap from text and image to high-definition video could dramatically increase AI's impact. Early testing bears this out: energy use for text-to-video models roughly quadruples when video length doubles.
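The doubling-quadruples observation implies that energy grows with the square of video length. A minimal sketch of that scaling, assuming a 5-second clip costs about 1 kWh (the article's "microwave running for over an hour" comparison, taking a microwave to draw roughly 1 kW; both figures are illustrative assumptions, not measured values):

```python
def video_energy_wh(length_s: float,
                    base_length_s: float = 5.0,
                    base_energy_wh: float = 1000.0) -> float:
    """Estimate energy (Wh) for a generated clip of length_s seconds.

    Assumes the quadratic scaling reported in early testing:
    doubling the video length quadruples the energy, i.e.
    energy is proportional to length squared. The 1 kWh baseline
    for a 5-second clip is an illustrative assumption.
    """
    return base_energy_wh * (length_s / base_length_s) ** 2

# Doubling length from 5 s to 10 s quadruples the estimate.
print(video_energy_wh(10))   # 4x the 5-second baseline
# A one-minute clip is 12x longer, so 144x the energy under this model.
print(video_energy_wh(60))
```

Under this assumption, even modest increases in clip length dominate the total cost, which is one reason video generation could change AI's energy picture far faster than text ever did.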
The case for perspective
Others see the alarm as overstated. Analysts at the Center for Data Innovation, a technology and policy think tank, argue that many estimates about AI energy use rely on faulty extrapolations. GPU hardware is becoming more efficient each year, and much of the electricity in new data centers will come from renewables.
Recent benchmarking puts AI's footprint in context. Producing a typical chatbot Q&A consumes about 2.9 watt-hours (Wh), roughly 10 times the energy of a Google search. Google recently claimed that a typical Gemini prompt uses only 0.24 Wh and 0.25 mL of water, though independent experts note those numbers omit indirect energy and water used in power generation.
Context is key. An hour of high-definition video streaming on Netflix uses roughly 100 times more energy than generating a text response. An AI query's footprint is tiny, yet data centers now process billions daily, and more demanding video queries are on the horizon.
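Scale is what turns a negligible per-query figure into a grid-level load. A back-of-envelope aggregate, using the 2.9 Wh per query benchmark cited above; the daily query volume is an illustrative assumption, since the article says only "billions":

```python
# Per-query energy from the benchmark cited in the article.
WH_PER_QUERY = 2.9
# Assumed daily query volume, for illustration only.
QUERIES_PER_DAY = 2.5e9

# Aggregate daily energy, converted from Wh to MWh.
daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6
# Annualized, converted from MWh to GWh.
annual_gwh = daily_mwh * 365 / 1000

print(f"{daily_mwh:.0f} MWh/day, {annual_gwh:.0f} GWh/year")
# 7250 MWh/day, 2646 GWh/year -- about 2.6 TWh annually
```

Even at roughly 2.6 TWh a year under these assumptions, chatbot text is a small slice of projected data-center demand; the concern is that video queries costing hundreds of times more per request would change that arithmetic quickly.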
Jevons paradox
It helps to distinguish between training and use of AI. Training frontier models such as GPT-4 or Claude 3 Opus required thousands of graphics chips running for months and consumed gigawatt-hours of electricity.
Using a model consumes only a tiny amount of energy per query, but queries happen billions of times a day. Over a model's lifetime, the energy spent on use will likely surpass the energy spent on training.
The least visible cost may come from hardware production. Each new generation of chips demands new fabrication lines, heavy mineral inputs and advanced cooling. Italian economist Marcello Ruberti observes that "each upgrade cycle effectively resets the carbon clock" as fabs rebuild highly purified equipment from scratch.
And even if AI models become more efficient, total energy use keeps climbing. In economics, this is known as the Jevons paradox: in 19th-century Britain, more efficient steam engines made coal cheaper to use, and total coal consumption rose rather than fell. As AI researchers have noted, as costs per query fall, developers are incentivized to embed AI into every product. The result is more data centers, more chips and more total resource use.
A problem of scale
Is AI an ecological menace or a manageable risk? The truth lies somewhere in between.
A single prompt uses negligible energy, but the systems enabling it—vast data centers, constant chip manufacturing, round-the-clock cooling—are reshaping global energy and water patterns.
The IEA's latest outlook projects that data-center power demand could reach 1,400 terawatt-hours by 2030, the equivalent of adding several mid-sized countries to the world's grid. AI will account for about a quarter of that growth.
Transparency is vital
Many of the figures circulating about AI energy use are unreliable because AI firms disclose so little. The limited data they release often employ inconsistent metrics or offset accounting that obscures real impacts.
One obvious fix would be to mandate disclosure rules: standardized, location-based reporting of the energy and water used to train and operate models. Europe's Artificial Intelligence Act requires developers of "high-impact" systems to document computation and energy use.
Similar measures elsewhere could guide where new data centers are built, favoring regions with abundant renewables and water. Disclosure could also encourage longer hardware lifecycles instead of annual chip refreshes.
Balancing creativity and cost
Generative AI can help unlock extraordinary creativity and provide real utility. But each "free" image, paragraph or video has hidden material and energy costs.
Acknowledging those costs doesn't mean halting innovation. It means demanding transparency about how large the environmental costs really are, and who pays them.
As Sora 2 begins to fill social feeds with highly realistic visuals, the question won't be whether AI uses more energy than Netflix, but whether we can expand our digital infrastructure responsibly enough to make room for both.
Provided by The Conversation
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Citation: OpenAI's newly launched Sora 2 makes AI's environmental impact impossible to ignore (2025, October 10), retrieved 10 October 2025 from https://techxplore.com/news/2025-10-openai-newly-sora-ai-environmental.html