
As the demand for artificial intelligence (AI) surges in everything from business to science to art, the power-intensive data centers needed to run AI tools and applications are straining the nation’s aging electricity infrastructure. A recent National Academies webinar — part of the Climate Conversations webinar series — examined AI’s growing appetite for electricity and possible solutions for powering it sustainably.
“Even before this growth in demand … the U.S. grid was facing challenges related to aging infrastructure, long queues for interconnection of new renewable resources, the need for new pricing structures and market designs, and more,” said Kristin Hayes, senior director for research and policy engagement at Resources for the Future, and the webinar’s moderator. “How AI presents both new challenges but also new opportunities for the power sector is at the heart of our conversation today.”
Hayes was joined by Yury Dvorkin, associate professor at Johns Hopkins University and core faculty member of the Ralph O’Connor Sustainable Energy Institute (ROSEI), and Cooper Elsworth, Google’s AI and Cloud workload emissions program lead, for the discussion, which focused on how artificial intelligence is driving new electricity demand, how emissions play into each stage of the life cycle of AI products and services, and what it will take to answer society’s demands for AI while also meeting greenhouse gas emissions goals needed to tackle climate change.
A crossroads for the U.S. grid
AI’s rapid growth is putting unprecedented demands on the nation’s power grid. “For the first time since the 1950s and ’60s, we’re going to have electric demand growth greater than GDP,” said Dvorkin.
When it comes to deploying new data centers to support this growth, he emphasized three major challenges: time, cost, and coordination. “It’s not only about generation, it’s also about transmission,” he said. “How do we supply electricity from the places where it’s being produced to the places where it’s being consumed?”
Dvorkin emphasized that unlike electric vehicles — which are distributed across many regions and add relatively modest demand — data centers concentrate large, sudden power needs in specific locations, generating more acute challenges both in timing and geography for the impacted grid.
One solution could be to deploy renewable energy sources like solar, which could be used locally and brought online faster than other clean energy sources such as nuclear power plants. In addition, businesses and utilities could employ battery storage to help align peaks in renewable energy generation with the timing of AI-related electricity demand.
In fact, sustainability and business performance don’t have to be at odds when it comes to AI development, Elsworth said. “A lot of the objectives that we have on the sustainability side are actually fairly aligned with the business objectives of reducing the energy consumption and the cost associated with developing and serving these models,” he noted.
Understanding and optimizing energy use should be just one aspect of a broader sustainability strategy, said Elsworth. For example, innovations in transparency and emissions reporting could offer stakeholders a better understanding of AI’s environmental footprint and identify key areas for improvement, he said. And hourly emissions accounting would enable companies to match their energy consumption with clean energy production, providing a more granular view of when energy is actually being used rather than relying on averages that may mask real-time emissions patterns.
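To make that distinction concrete, here is a minimal sketch of hourly versus average-based emissions accounting. The consumption and grid carbon-intensity figures are hypothetical; a real implementation would draw on metered load data and published grid intensity feeds.

```python
# Hourly emissions accounting: pair each hour's consumption with that
# hour's grid carbon intensity instead of applying a flat average.
# All figures are hypothetical, for illustration only.

consumption_kwh = [120, 110, 100, 150, 200, 180]      # per-hour load
intensity_g_per_kwh = [450, 420, 300, 250, 380, 460]  # per-hour grid CO2

# Hourly method: emissions reflect when energy was actually used.
hourly_g = sum(c * i for c, i in zip(consumption_kwh, intensity_g_per_kwh))

# Average method: the same total load priced at the flat mean intensity,
# which can mask the real-time pattern described above.
mean_intensity = sum(intensity_g_per_kwh) / len(intensity_g_per_kwh)
average_g = sum(consumption_kwh) * mean_intensity

print(f"hourly accounting:   {hourly_g / 1000:.1f} kg CO2")   # 326.5 kg
print(f"flat-average method: {average_g / 1000:.1f} kg CO2")  # 323.9 kg
```

Even with these made-up numbers the two methods diverge, because more of the load falls in dirtier hours; hourly accounting surfaces exactly that pattern.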
“There’s a lot of demand by different customers [for] data center providers to be more transparent” about emissions data, said Elsworth. But “a lot of the transparency challenge comes from a lack of consensus around how to share these numbers in a way that is consistent across different companies.”
Coordinating clean power
In the midst of this rapid growth, new forms of collaboration between AI companies and electricity providers will be needed. “The key here is to better understand respective demands, to better understand [the] flexibility of AI loads, and how this flexibility can be leveraged to better coordinate data center operations with the power grid,” said Dvorkin. This type of flexibility will become increasingly valuable as grids grow more complex.
In many regions of the country, renewable energy is being curtailed because of transmission bottlenecks: not enough infrastructure is in place to deliver that energy to consumers. Placing data centers near these underused resources and deploying energy storage at those sites could help reduce waste and improve overall system efficiency.
Carbon-aware computing, which Elsworth described as shifting computing workloads to the times and places where cleaner energy is available, could also cut emissions. Some AI workloads, such as model training, are not time-sensitive and can be scheduled in advance to take advantage of those cleaner windows, as the sketch below illustrates.
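The snippet slides a deferrable job to the cleanest forecast window. The cleanest_start helper and the intensity forecast are hypothetical illustrations, not an API described in the webinar.

```python
# Carbon-aware scheduling: place a deferrable job (e.g., a training run)
# in the forecast window with the lowest average grid carbon intensity.
# The forecast values below are hypothetical, for illustration only.

def cleanest_start(forecast_g_per_kwh: list[float], job_hours: int) -> int:
    """Return the start hour whose window has the lowest mean intensity."""
    starts = range(len(forecast_g_per_kwh) - job_hours + 1)
    return min(
        starts,
        key=lambda t: sum(forecast_g_per_kwh[t : t + job_hours]) / job_hours,
    )

# 24-hour intensity forecast (gCO2/kWh); the midday dip mimics solar output.
forecast = [480, 470, 460, 450, 430, 400, 350, 300,
            260, 220, 200, 190, 195, 210, 250, 310,
            370, 420, 460, 480, 490, 495, 490, 485]

start = cleanest_start(forecast, job_hours=4)
print(f"schedule the 4-hour job at hour {start}")  # hour 10 with this forecast
```

A production scheduler would also respect deadlines, capacity limits, and data locality, but the core decision reduces to this comparison.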
“There are plenty of opportunities; we just need to target them in terms of need, location, and timing,” noted Dvorkin.
Tracing emissions upstream
To better assess the environmental impact of AI, life cycle analysis of tools and processes is also essential. Elsworth pointed to a recent study that assessed the full life cycle emissions of a single microchip used for running AI models, from raw material extraction and manufacturing to usage and eventual disposal.
The results showed that while upstream emissions — which include activities outside a company’s direct control, such as supply chain manufacturing and transportation — are not negligible, over 80% of emissions come from electricity used during the chip’s operation. These indirect emissions are associated with a company’s purchase of electricity, steam, heat, or cooling. As a result, said Elsworth, “Clean energy procurement ends up being a relatively straightforward lever for us to start reducing the emissions of those chips.”
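A back-of-envelope calculation shows why operational electricity dominates and why procurement is such a direct lever. Every figure below is hypothetical, chosen only so the split lands near the share reported in the study, and none comes from the study itself.

```python
# Illustrative life cycle split for an AI accelerator. All numbers are
# hypothetical; none come from the study discussed above.

embodied_kg = 1500                 # manufacturing, transport, disposal
lifetime_kwh = 4 * 365 * 24 * 0.5  # 4 years at an assumed 0.5 kW draw
grid_kg_per_kwh = 0.40             # an average-grid carbon intensity

operational_kg = lifetime_kwh * grid_kg_per_kwh
share = operational_kg / (operational_kg + embodied_kg)
print(f"operational share of life cycle emissions: {share:.0%}")  # ~82%

# Cleaner procurement shrinks only the operational term, which is why
# it is such a direct lever on the chip's total footprint.
clean_kg = lifetime_kwh * 0.05     # hypothetical low-carbon supply
print(f"with cleaner power: {clean_kg / (clean_kg + embodied_kg):.0%}")  # ~37%
```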
Investments in more efficient hardware and optimized algorithms can also result in dramatic reductions in per-query emissions, making life cycle insights even more essential for sustainable AI development. “As AI infrastructure and AI use in our society expands drastically, this small relative percentage turns into fairly large emissions numbers,” said Dvorkin.
Looking ahead
The AI boom presents challenges to an already strained power grid, but it could also provide a historic opportunity to drive change — not just in how data is processed, but in how energy systems evolve, the speakers noted. Whether through advanced emissions tracking, hardware and software innovation, or deeper collaboration with the energy sector, the road to sustainable AI will require long-term planning and shared accountability. The same technologies fueling demand could potentially play a role in managing it.
This story was written by Laura Lyon and originally appeared on the National Academies website. Watch the full conversation here.