By David Tuft
We recently learned that Google has blown past its emissions-reduction goals, due in large part to AI computing - and it's not alone. Harvest, Impulse, and The Abundance Institute co-hosted this panel discussion during our Gigaton Salon at SF Climate Week 2024. It focused on how tech companies can catalyze planetary-scale solutions to climate change, with a deep dive into the immense projected load growth from AI. Joining moderator Eli Dourado of the Abundance Institute on stage were Jenn Huffstetler, Chief Product Sustainability Officer at Intel, and Peter Freed, former Director of Energy Strategy at Meta.
Eli: The US has about 1.3 terawatts of electricity generating capacity. Thinking about growth over the next 20 years, how much capacity do you think we’re going to need?
Peter: There's been a lot of conversation lately about AI and the electric loads associated with it. In a sense, it's a bit of a distraction.
In the last 18 months, [we've passed] a lot of legislation designed to electrify as much as humanly possible. It's a core part of the climate strategy for the current administration. We're seeing electrification in industry, transportation, residential buildings, and mobility. Compute is certainly part of that. But it's not the only part of it.
We are in a moment of tremendous opportunity. I’m hoping that the narrative can shift to how exciting it is to see all this load coming onto the system - it represents economic development and job creation, which is amazing. And simultaneously, we need to make sure that load is decarbonized, reliable, and affordable for customers of all types.
But to specifically answer your question: A lot. Ultimately, what we are after is trying to figure out how much we can electrify. In terms of terawatts, it’s really a measure of how successful we are.
Jenn: I think there will be new sources of clean energy. We certainly have solar and wind today – and hydropower, which powers data centers where I'm from in Oregon.
But everybody is looking for how to deploy more dense solutions to meet these increased needs. You can't always bring utility baseload online as fast as you want. It's going to take investments across the industry to get some of these newer technologies deployed. And some of them might not be the cleanest – we're going to see natural gas being deployed in more places in the near term.
Eli: We're here at Impulse Labs, at an event co-sponsored with Harvest. Both companies do load shifting at the household level. Working on data centers and computing facilities, you have the opportunity to think about load shifting on a much bigger scale, maybe even a planetary scale. What can be done to shift load and help stabilize the grid?
Jenn: A lot of what you see with artificial intelligence was born out of high-performance computing – the workloads look fairly similar. You have higher-priority jobs and lower-priority jobs. You can pause model training, which is the more energy-intensive portion. It typically doesn't have to happen right now.
That's a technique some of the largest cloud providers in the Bay Area use to make sure everybody is able to receive power and avoid blackouts. More is needed in demand-supply communication. And there are probably software techniques, and even AI, that could communicate between the demand and supply sides. Absent policy, it's probably not going to happen at the scale that it could.
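[Editor's note: as a rough illustration of the pause-and-resume technique Jenn describes – not any provider's actual system – here is a minimal Python sketch of a training loop that checkpoints and waits whenever a simulated grid signal reports a constrained hour. The grid_is_constrained() check and the checkpoint format are hypothetical stand-ins for a real demand-response signal.]

```python
import json
import time


def grid_is_constrained(hour: int) -> bool:
    """Placeholder signal: treat late afternoon and evening as peak hours."""
    return 17 <= hour <= 21


def save_checkpoint(step: int, path: str = "checkpoint.json") -> None:
    with open(path, "w") as f:
        json.dump({"step": step}, f)


def train(total_steps: int = 100) -> None:
    step = 0
    while step < total_steps:
        if grid_is_constrained(time.localtime().tm_hour):
            save_checkpoint(step)  # persist progress before pausing
            time.sleep(60)         # wait out the peak, then re-check the signal
            continue
        # ... one training step would run here ...
        step += 1
    save_checkpoint(total_steps)


if __name__ == "__main__":
    train()
```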
When you talk about planetary load shifting, Google has certainly been a thought leader. They've published a lot of white papers about how they're moving load to follow the sun to have the greenest energy. While that’s certainly a possibility, it's pretty complicated for those who aren't the most savvy. These hyperscale data centers, they're running major applications like Search or Instagram. They have technical capabilities that most of the world doesn't have.
While they'll continue to employ that, what I think we're going to see is enterprises and others with smaller loads asking how to be more intentional in their consumption.
Peter: It's worth unpacking what we mean when we say load shifting, because ultimately we're not really talking about energy. In most cases, we're thinking about capacity in constrained moments. Historically, things like peak shaving – the kinds of things the folks at Impulse are thinking about – are great.
But there is also a lot you can do when you start thinking about the servers and batteries inside data centers – the backup batteries inside data centers are the largest energy storage installations on the grid today and have been for decades. What if you made those responsive to grid operators? There are a bunch of things we can be doing that move beyond using less in certain hours to "How do we provide a suite of services that grid operators need?"
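[Editor's note: a hedged sketch of the idea Peter raises – data-center backup batteries offering energy to the grid during a capacity event while never dipping below the reserve needed for backup duty. The Battery class and respond_to_event() interface are hypothetical, not any operator's or vendor's API.]

```python
from dataclasses import dataclass


@dataclass
class Battery:
    capacity_kwh: float
    charge_kwh: float
    backup_reserve_kwh: float  # energy that must stay available for outages

    def available_for_grid(self) -> float:
        """Energy the site can offer without touching its backup reserve."""
        return max(0.0, self.charge_kwh - self.backup_reserve_kwh)


def respond_to_event(battery: Battery, requested_kwh: float) -> float:
    """Return how much energy the site actually offers for a grid event."""
    offered = min(requested_kwh, battery.available_for_grid())
    battery.charge_kwh -= offered
    return offered


if __name__ == "__main__":
    ups = Battery(capacity_kwh=2000.0, charge_kwh=1800.0, backup_reserve_kwh=1200.0)
    print(respond_to_event(ups, requested_kwh=800.0))  # offers 600.0 kWh
```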
Historically, this has largely been a voluntary conversation. I'm not sure it's going to be voluntary for much longer. Understanding flexibility in large-scale loads may be part of the equation if you want to get interconnection capacity.
Eli: If you think about all the load growth we're expecting, is our grid ready for that?
Peter: No. The grid is barely ready for what we have today. It's been 15 years of flat to declining load across the country. We've been phenomenally successful at deploying energy efficiency into the built environment, which is awesome. We have a lot of folks scrambling right now to figure out how to lock up sufficient electrical capacity for what they want to do on the system.
We're very inefficient in the way that we use our wires. We're very inefficient in the ways we use our generation. We are struggling to figure out how to deploy enough generating capacity to accommodate it.
There's a lot of talk right now that we're going to deploy a ton of gas, and that's how we're going to solve this problem. True, we're going to have more gas. I would like to spend a lot of time thinking about how we have less gas – but also how we bring other technologies to the mix and get better utilization of existing infrastructure. Some of this is really dead simple stuff, like dynamic line ratings [adjusting transmission line limits in real time based on weather conditions rather than conservative static ratings].
Then we get into other questions about what your generation stack looks like for a clean energy future. We're looking at long-duration storage, SMRs (small modular reactors), and advanced geothermal. And it's probably an “all of the above” strategy that will take us to a place where we feel happy and comfortable with a fully-decarbonized generation stack.
I don't think we have an energy issue. We are dealing with capacity issues in most markets. When it's a peak hour and you don't have sufficient capacity, that's still a real problem.
Jenn: I was at an AI energy summit just last week in DC. The DOE is trying to figure out how to get the funds to support those clean energy technologies.
We have the technologies, but they're not being deployed. We need to find ways to reach the regulators who are protecting all of us – the ratepayers who are paying for that capital in our rates every day. But they don't understand the technology.
We have a solution called The Edge for Secondary Substations. It helps onboard more renewables to the grid.
These technologies exist but there's fear, uncertainty, and doubt about how that will impact the resiliency of the grid, the reliability of it, the security of it. And that's one of those areas that we need to work on.
We could probably get 2x more effective capacity out of the current grid and generation if we deployed those technologies today. That's just mind-boggling.
This interview was edited for length and clarity. You can watch the entire interview here.