One thing that listeners are hearing – from tech media, from developers, and from the general public – is concern about how much energy AI uses.
That makes sense, for a number of reasons. One major one is that AI is, by its nature, automated. It’s not just scalable; it scales itself, on its own, to run potentially all the time, everywhere. That’s a lot of energy. Then we see the actual proof – the massive demand for data centers, an unprecedented tack toward nuclear power in America, Memphis locals fighting xAI’s Colossus over water and electricity…
But as new models have come out during the course of 2025 – and boy, have they come out – there are some interesting wrinkles that make this conversation even more important.
Multimodal AI and Energy
You may remember the hand-wringing over studies suggesting that, at some point, a series of questions answered by a model like ChatGPT corresponded to a bottle of water, or a meaningful amount of electricity.
Okay, but now we have generative video – pixels, and frames, in a vast, oceanic trough of data that all has to be handled by energy-intensive systems.
This article from Katelyn Chedraoui at CNET goes into detail and reveals, for example, that generating AI video with a tool like Sora is a staggering 2,000 times more energy-intensive than generating text. It’s hard not to use an exclamation point here. But the numbers tell the story.
It’s clear, in light of these evaluations, that we desperately need an energy strategy for AI.
“An average energy-efficient LED lightbulb uses between 8-10 watts,” Chedraoui writes. “LCD televisions can use between 50-200 watts, with newer technology like OLEDs helping run them more efficiently. For example, the 65-inch Samsung S95F, CNET’s pick for the best picture quality of 2025, typically draws 146W, according to Samsung. So creating one AI video would be equivalent to running this TV for 37 minutes.”
That’s – a lot, especially if you assume that every instance of a genAI video model is going to be cranking these out, every day, all day.
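The article’s own figures are enough for some back-of-envelope arithmetic. The sketch below uses only numbers quoted above – the 146 W TV draw, the 37-minute equivalence, and the roughly 2,000x video-versus-text multiplier; the implied per-text-query figure is derived from those, not stated in the article.

```python
# Back-of-envelope check using the figures quoted above.
TV_WATTS = 146        # Samsung S95F typical draw, per Samsung
TV_MINUTES = 37       # TV runtime equivalent to one AI video, per CNET
VIDEO_VS_TEXT = 2000  # video vs. text energy multiplier, per CNET

video_wh = TV_WATTS * TV_MINUTES / 60  # watt-hours per generated video
text_wh = video_wh / VIDEO_VS_TEXT     # implied watt-hours per text query

print(f"One AI video = about {video_wh:.0f} Wh")       # about 90 Wh
print(f"Implied text query = about {text_wh:.3f} Wh")  # about 0.045 Wh
```

At roughly 90 Wh per video, a thousand generations a day is on the order of 90 kWh – roughly what three typical U.S. households use in a day.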
The Energy Pickle
In a recent panel at a Stanford event on AI, a group of seasoned professionals involved in the energy business talked about problems, and possible solutions.
“We’re seeing a huge surge in data center power demand,” said Nico Enriquez of Future Ventures, setting the stage (full disclosure: I am also affiliated with Future Ventures). “Big tech is spending hundreds of billions of dollars building data centers, and their main limiting factor is access to power. They’re collectively betting they can scale their compute and win the arms race, and the projections for where energy will go are uncertain.”
“What I know for sure is: humanity needs more electricity, period, full stop,” said Paige Crahan, general manager at Tapestry, Google’s electric grid moonshot. “We’re going to do a lot of things with it, we’re going to help our communities in many different ways.”
Crahan cited a 2023 IAEA report showing the dramatic growth in power demand on the horizon.
Mitesh Agrawal, CEO of Positron, agreed that the struggle is real. It requires planners to do two things, he suggested: build as much as (we) can, and get more intelligence out of every watt.
“For every watt that we’re consuming, it’s important that … we increase the number of watts we produce by any means,” he said.
Creating Vast Efficiencies
Panelist Nelson Abramson, CEO of Verrus, suggested that existing innovations have let us extend our energy horizon as needs scale exponentially. He mentioned, for example, LED lighting, which he called “a very underappreciated driver of underlying efficiency of energy.”
“That bought us 20 years,” he said, leading to today:
“The question is, how do you combine the idea that the energy consumption, the grid needs to be managed differently, at different times of day, different parts of the year, with the fact that the customers need to run the same thing all year round?” he asked.
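A toy model makes that tension concrete. Everything below is invented for illustration – the hourly prices, the load sizes, the per-hour cap – but it shows the shape of the answer flexible data centers point to: keep must-run load constant and schedule deferrable work into the cheapest, least-constrained hours.

```python
# Toy illustration of grid-interactive flexibility: a data center shifts
# its deferrable work into the cheapest hours. All numbers are invented.

prices = [0.04, 0.03, 0.03, 0.05, 0.12, 0.18, 0.15, 0.06]  # $/kWh by hour
base_load_mwh = 10     # must-run load, the same every hour
deferrable_mwh = 16    # flexible work to place, max 4 MWh extra per hour

# Greedy: put deferrable energy in the cheapest hours first.
schedule = [0.0] * len(prices)
for hour in sorted(range(len(prices)), key=lambda h: prices[h]):
    take = min(4, deferrable_mwh)
    schedule[hour] = take
    deferrable_mwh -= take
    if deferrable_mwh == 0:
        break

cost = sum((base_load_mwh + schedule[h]) * prices[h] * 1000
           for h in range(len(prices)))
print(f"Total cost with shifting: ${cost:,.0f}")
```

The must-run load never changes – only the flexible slice moves, which is exactly the combination the question above is asking for.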
Geopolitical Concerns
In terms of worldwide effort, Crahan suggested there’s some amount of coordination and shared incentives.
“This is a challenge everywhere in the world, so we’re sort of all in it together,” she said. “There’s a good exchange and transferability of lessons.”
Chronicling her company’s international energy research, she talked about a synergy in analyzing different types of biomes, observing regional trends, and generally working across borders.
“The optimism is that not every country has to learn these insights on its own,” she added. “We can really share these insights, once you instrument and run these algorithms.”
Agrawal added thoughts on the value of hardware implementation analysis.
“NVIDIA GPUs, AMD GPUs, Google TPUs, they’re brilliant pieces of silicon, they work very well for different types of applications, right?” he said. “And I think, just like in most aspects of life, we have really determined the most efficient use cases for each type of technology.”
Abramson explained a strategy for deeper governance.
“One of the things that we’re able to do by bringing flexibility and grid interactivity along with our data is the ability to drive up utilization for the utility itself across the whole rest of the year, where it’s not constrained,” he said, enumerating both capex and opex costs. “(Stakeholders) take all of their costs to build, run, operate the grid, and that’s a combination of very high fixed costs … the capex to build the wires and the substation and the power plants, and then the opex or fuel costs, and they divide it by the number of kilowatt-hours they sold. That’s your dollar per kilowatt hour, plus a guaranteed profit margin. That’s roughly the way that it works.”
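That rate-making arithmetic is simple enough to sketch. Every dollar figure below is a made-up placeholder – the point is the structure Abramson describes (fixed plus variable costs, divided by kilowatt-hours sold, plus a guaranteed margin), and why selling more kilowatt-hours across the same fixed costs pushes the per-unit rate down rather than up.

```python
# Hypothetical numbers illustrating utility cost recovery as Abramson
# describes it: (capex + opex) / kWh sold, plus a guaranteed margin.
capex_annualized = 400_000_000  # placeholder: wires, substations, plants ($/yr)
opex_and_fuel = 250_000_000     # placeholder: operations and fuel ($/yr)
kwh_sold = 5_000_000_000        # placeholder: annual kWh sold
margin = 0.10                   # placeholder: guaranteed profit margin

rate = (capex_annualized + opex_and_fuel) / kwh_sold * (1 + margin)

# If flexible loads soak up otherwise-idle capacity, more kWh are sold
# over the same fixed costs, and the per-kWh rate falls.
kwh_sold_flexible = 5_500_000_000
rate_flexible = (capex_annualized + opex_and_fuel) / kwh_sold_flexible * (1 + margin)

print(f"Base rate: ${rate:.3f}/kWh")                    # $0.143/kWh
print(f"With flexible load: ${rate_flexible:.3f}/kWh")  # $0.130/kWh
```

That falling per-kWh rate is the mechanism behind Enriquez’s closing point: bills don’t have to go up.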
“We could drop our electricity costs,” Enriquez said, wrapping up and elaborating on some of the panel insights. “They don’t have to go up, inevitably, if we can design the system right. And I think that’s really the dream of designing the grid better, using more flexible data centers, having more efficient chips, having these data centers soak up power, but also paying for it. We could drop our electricity bills. They don’t have to go up.”
A Race for Change
This makes sense. We have our burgeoning energy needs, balanced against our potential to build in systemic efficiencies and cut power demands monumentally. Will we be up to the challenge? That all depends on a number of factors – not least of which is the progress that companies like TerraPower make in ushering in clean, safe nuclear power in the form of local, co-located plants. Then there’s the value of edge engineering – having the power right next to the compute.
Let’s see how all of this takes shape in 2026.
