Companies like OpenAI are sucking up power at a historic rate. One startup thinks it has found a way to take pressure off the grid

By Geoff Colvin, Senior Editor-at-Large

Sam Altman, chief executive officer of OpenAI Inc., during a media tour of the Stargate AI data center in Abilene, Texas, on Tuesday, Sept. 23, 2025. Kyle Grillot/Bloomberg via Getty Images

The numbers are nothing short of staggering. Take Sam Altman, OpenAI's CEO.
He reportedly wants 250 gigawatts of new electricity—equal to half of Europe's all-time peak load—to run gigantic new data centers in the U.S. and elsewhere by 2033.
Building or expanding power plants to generate that much electricity on Altman’s timetable indeed seems almost inconceivable.
“What OpenAI is trying to do is absolutely historic,” says Varun Sivaram, Senior Fellow for Energy and Climate at the Council on Foreign Relations.
The problem is, “there is no way today that our grids, with our power plants, can supply that energy to those projects, and it can’t possibly happen on the timescale that AI is trying to accomplish.” Yet Sivaram believes Altman may be able to reach his goal of running multiple new data centers in a different way.
Sivaram, in addition to his position at the CFR, is the founder and CEO of Emerald AI, a startup that launched in July.
“I founded it directly to solve this problem,” he says—not just Altman’s problem specifically, but the larger problem of powering the data centers that all AI companies need.
Several smart minds like the odds of Sivaram’s company.
It’s backed by Radical Ventures, Nvidia’s venture capital arm NVentures, other VCs, and heavy-hitter individuals including Google chief scientist Jeff Dean and Kleiner Perkins chairman John Doerr.
Emerald AI’s premise is that the electricity needed for AI data centers is largely there already. Even big new data centers would confront power shortages only occasionally.
“The power grid is kind of a superhighway that faces peak rush hour just a few hours per month,” Sivaram says.
Similarly, in most places today the existing grid could handle a data center easily except at a few times of extreme demand.
Sivaram’s objective is to solve the problem of those rare high-demand moments the grid can’t handle. It isn’t all that difficult, at least in theory, he argues.
Some jobs can be paused or slowed, he explains, such as the training or fine-tuning of a large language model for academic research.
Other jobs, such as queries for an AI service used by millions of people, can’t be rescheduled but could be redirected to another data center where the local power grid is less stressed.
Data centers would need to be flexible in this way less than 2% of the time, he says; Emerald AI intends to help them do it by turning theory into real-world action.
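The logic Sivaram describes—pause deferrable work during grid stress, reroute latency-sensitive queries elsewhere—can be sketched in a few lines of Python. This is purely illustrative, not Emerald AI's actual software; the region names, stress scores, and threshold below are all hypothetical.

```python
# Conceptual sketch of demand-flexible AI workload scheduling as described
# in the article. All names, regions, and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    deferrable: bool  # e.g., model training can pause; live inference cannot

# Hypothetical grid stress per region (0.0 = idle, 1.0 = peak "rush hour")
grid_stress = {"phoenix": 0.95, "ohio": 0.40, "oregon": 0.30}

STRESS_LIMIT = 0.90  # assumed curtailment threshold

def schedule(job: Job, home_region: str) -> str:
    """Decide what to do with a job when its home grid is stressed."""
    if grid_stress[home_region] < STRESS_LIMIT:
        return f"run {job.name} in {home_region}"
    if job.deferrable:
        # Training or fine-tuning can be paused or slowed until demand eases.
        return f"pause {job.name} until {home_region} stress drops"
    # Latency-sensitive queries are redirected to the least-stressed grid.
    target = min(grid_stress, key=grid_stress.get)
    return f"redirect {job.name} to {target}"

print(schedule(Job("llm-training", deferrable=True), "phoenix"))
print(schedule(Job("chat-inference", deferrable=False), "phoenix"))
```

In this toy version, a training job in a stressed region is paused, while an inference job is routed to the region with the most headroom—the "less than 2% of the time" flexibility the article describes.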
The result, Sivaram says, would be profound: “If all AI data centers ran this way, we could achieve Sam Altman’s global goal today.” A paper by Duke University scholars, published in February, reported a test of the concept and found it worked.
Separately, Emerald AI and Oracle tried the concept on a hot day in Phoenix and found they could reduce power consumption in a way that didn’t degrade AI computation—“kind of having your cake and eating it too,” Sivaram says.
That paper is under peer review. No one knows if Altman’s 250-gigawatt plan will prove to be brilliant or folly. In these early days, Emerald AI’s future can’t be divined, as promising as it seems.
What we know for sure is that great challenges bring forth unimagined innovations—and in the AI era, we should brace for plenty of them.