Eye on AI: I met Sam Altman in Texas. He’s turning the race for AI into a gigawatt arms race

By Sharon Goldman, AI reporter at Fortune and co-author of Eye on AI, Fortune’s flagship AI newsletter. She has covered digital and enterprise tech for over a decade.

OpenAI CEO Sam Altman talks to reporters outside Building 2 at the Stargate data center site in Abilene, Texas. (Photo: Sharon Goldman)

Welcome to Eye on AI, with AI reporter Sharon Goldman.
In this edition…OpenAI’s gigawatt arms race is underway in Abilene, Texas…Nscale announces record-breaking $1.1 billion Series B…OpenAI and Databricks strike AI agent deal…Trump administration will provide Elon Musk’s xAI to federal agencies.
Sam Altman stood outside Building 2 at OpenAI, Oracle, and SoftBank’s flagship Stargate data center in Abilene, Texas.
He — along with the cluster of journalists peppering him with questions — looked small against the backdrop of the sprawling 800-acre site, swarming with thousands of construction workers and dotted with spools of fiber cable, steel beams, water pipes, and heavy machinery.
As I reported on Tuesday, we were there for a media event to tout the progress of their high-profile and ambitious “Stargate” AI infrastructure project.
They announced an expansion of the Abilene site, plus plans to build five massive new data center complexes across the U.S. over the next several years.
Altogether, the initiative represents hundreds of billions of dollars in investment — a project of mind-boggling scale.
In Abilene alone, a crew of 6,400 workers has already flattened hills by moving mountains of soil and laid enough fiber optic cable to wrap the Earth 16 times.
“We cannot fall behind in the need to put the infrastructure together to make this revolution happen,” Altman told reporters during the media event, which also included Clay Magouryk, one of Oracle’s two new CEOs, as well as Texas Senator Ted Cruz.
“What you saw today is just a small fraction of what this site will eventually be — and this site is just a small fraction of what we’re building.
All of that still won’t be enough to serve even the demand of ChatGPT,” he added, referring to OpenAI’s flagship product.
Building AI with brute industrial force

Altman and OpenAI have been relentless in their drive to “scale compute.” By this, they don’t mean chasing the next algorithmic breakthrough or elegant line of code.
They mean brute industrial force: millions of chips, sprawling campuses wired with fiber, and gigawatts of electricity — along with the gallons of water needed to help cool all that equipment.
To OpenAI, scaling compute means piling on ever more of this horsepower, betting that sheer scale — not software magic — is what will unlock not just artificial general intelligence (AGI), which the company defines as “highly autonomous systems that outperform humans at most economically valuable work,” but what it calls artificial superintelligence (ASI), which would hypothetically surpass human capabilities in all domains.
That’s why OpenAI keeps pointing to a number: 10 gigawatts of capacity across the Stargate project sites.
Ten gigawatts — enough to power roughly 7.5 million homes, or an entire regional grid — marks a shift in how AI capacity is measured.
At this scale, Altman explained to me with a quick handshake before the press gaggle, companies like OpenAI don’t even bother counting GPUs anymore.
The unit of measure has become gigawatts: how much electricity the entire fleet of chips consumes.
That number is shorthand for the only thing that matters: how much compute the company can keep running.
That’s why it was so striking to come home from Texas and read Alex Heath’s Sources newsletter the very next day.
In it, Heath revealed an internal Slack note Altman had shared with employees on the same day I saw him in Abilene.
Altman spelled out what he called OpenAI’s “audacious long-term goal”: to build not 10, not 100, but a staggering 250 gigawatts of capacity by 2033.
In the note, he disclosed that OpenAI started the year at “around” 230 megawatts of capacity and is “now on track to exit 2025 north of 2GW of operational capacity.” To put that into perspective: 250 gigawatts would be a quarter of the entire U.S.
electrical generation capacity, which hovers around 1,200 GW. And Altman isn’t just talking electricity.
The number is shorthand for the entire industrial system required to use it: the chips, the data centers, the cooling and water, the networking fiber and high-speed interconnects to tie millions of processors into supercomputers.
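The capacity figures in this section are easy to sanity-check with back-of-envelope arithmetic. A minimal sketch — note that the homes-per-gigawatt figure is my own rough assumption for average residential load, not a number from OpenAI:

```python
# Back-of-envelope check on the capacity figures cited above.
# Assumptions (mine, not OpenAI's): ~1,200 GW of total U.S. generation
# capacity, and roughly 750,000 homes served per gigawatt of average load.
US_CAPACITY_GW = 1200
STARGATE_GW = 10          # announced Stargate target
LONG_TERM_GOAL_GW = 250   # Altman's 2033 goal from the Slack note

share = LONG_TERM_GOAL_GW / US_CAPACITY_GW
homes = STARGATE_GW * 750_000

print(f"250 GW is {share:.0%} of ~1,200 GW")        # about 21%, i.e. roughly a quarter
print(f"10 GW could power about {homes:,} homes")   # about 7,500,000
```

The arithmetic lines up with the article’s framing: 250 GW is about a fifth to a quarter of current U.S. generation capacity, and 10 GW maps to roughly 7.5 million homes under that load assumption.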
‘A new core bet’ for OpenAI

Heath reported that Altman’s Slack note announced OpenAI is “formalizing the industrial compute team,” led by Peter Hoeschele, who reports to president Greg Brockman.
“The mission is simple: create and deliver massive usable compute as fast as physics allows, to power us through ASI,” Altman wrote.
“In several years, I think this could be something like a gigawatt per week, although that will require us to completely reimagine how we build compute.” “Industrial compute should be considered a new core bet (like research, consumer devices, custom chips, robotics, applications, etc.) which will hire and operate in the way it needs to run at maximum effectiveness for the domain,” Altman continued.
“We’ve already invested hundreds of billions of dollars, and doing this right will cost trillions.
We will need support from team members across OpenAI to help us move fast, unlock projects, and clear the path for the buildout ahead.” A quarter of the U.S. power grid. Trillions in cost.
Does that sound bonkers to you? It does to me — which is precisely why I hopped on a plane to Dallas, rented a car, and drove three hours through rolling hills and ranches to Abilene to see for myself.
The scale of this one site is staggering. Imagining it multiplied by dozens is nearly impossible.
I told Altman that the scene in Abilene reminded me a bit of a tour I recently took of Hoover Dam, one of the great engineering feats of the 20th century, which produces 2 gigawatts of power at capacity.
In the 1930s, Hoover Dam was a symbol of American industrial might: concrete, turbines, and power on a scale no one had imagined.
Altman acknowledged that “people like to pick their historical analogies” and thought the “vibe was right” to compare Stargate to Hoover Dam.
It wasn’t his own personal favorite, however: “A recent thing I’ve thought is airplane factories,” he said.
“The history of what went into airplane factories, or container ships, the whole industry that came around those,” he said.
“And certainly, everything that went into the Apollo program.”

The need for public awareness

That’s when I realized: whether you think Altman’s goals make sense, seem nuts, or feel downright reckless really comes down to what you believe about AI itself.
If you think supercharged versions of AI will change everything — and mostly for the good, like curing cancer — or you are a China hawk who wants to win the new AI ‘cold war’ with China, then Altman’s empire of data centers looks like a necessary bet.
If you’re skeptical, it looks like the biggest boondoggle since America’s grandest infrastructure follies: think California’s long-awaited high-speed rail.
If you’ve read Karen Hao’s Empire of AI, you might also be shouting that scaling isn’t inevitable — that building a ‘compute empire’ risks centralizing power, draining resources, and sidelining efficiency and safety.
And if you think AGI will kill us all, like Eliezer Yudkowsky? Well, you won’t be a fan. No one can predict the future, of course.
My greater concern is that there isn’t nearly enough public awareness of what’s happening here.
I don’t mean Abilene, with its mesquite shrubland ground into dust, or even OpenAI’s expanding Stargate ambitions around the US and beyond.
I mean the vast, almost unimaginable infrastructure buildout across Big Tech — the buildout that’s propping up the stock market, fueling a data center arms race with China, and reshaping energy, land, and labor around the world.
Are we sleepwalking into the equivalent of an AI industrial revolution—and not a metaphorical one, but in terms of actual building of physical stuff—without truly reckoning with its costs versus its benefits?
Even Sam Altman doesn’t think enough people understand what he’s talking about. “Do you feel people understand what ‘compute’ is?” I asked him outside of Building 2.
That is, does the average citizen really grok what Altman is saying about the physical manifestation of these mega data centers? “No, that’s why we wanted to do this,” he said at the Abilene media event.
“I don’t think when you hit the button on ChatGPT…you think of walking the halls here.” Of course, Hoover Dam, too, was divisive, controversial, and considered risky.
But I wasn’t alive when it was built.
This time I could see the dust rising in Abilene with my own eyes — and while Altman talked about walking the newly built halls filled with racks of AI chips, I walked away unsettled about what comes next.
With that, here’s the rest of the AI news.

Sharon Goldman
sharon.goldman@fortune.com
@sharongoldman

FORTUNE ON AI

Sam Altman’s AI empire will devour as much power as New York City and San Diego combined. Experts say it’s ‘scary’ – by Eva Roytburg

Exclusive: Startup using AI to automate software testing in the age of ‘vibe coding’ receives $20 million in new venture funding – by Jeremy Kahn

OpenAI plans to build 5 giant U.S. ‘Stargate’ datacenters, a $400B challenge to Meta and Microsoft in the relentless AI arms race – by Sharon Goldman

AI IN THE NEWS

Nscale announces record-breaking $1.1 billion Series B.
UK cloud infrastructure company Nscale announced a $1.1 billion funding round, the largest in UK and European history.
The Series B, led by Aker ASA with participation from NVIDIA, Dell, Fidelity, Point72, and others, will accelerate Nscale’s rollout of “AI factory” data centers across Europe, North America, and the Middle East.
The company, which recently unveiled partnerships with Microsoft, NVIDIA, and OpenAI to establish Stargate UK and launched Stargate Norway with Aker, says the funding will expand its engineering teams and GPU deployment pipeline as it races to deliver sovereign, energy-efficient AI infrastructure at massive scale.
OpenAI and Databricks strike AI agent deal.
OpenAI and data platform Databricks struck a multiyear deal expected to generate $100 million, the Wall Street Journal reported, making OpenAI’s models—including GPT-5—natively available inside Databricks so enterprises can build AI agents on their own data “out of the box.” The partnership includes joint efforts between the companies, and OpenAI COO Brad Lightcap says the two aim to “far eclipse” the contracted figure.
It targets a key adoption barrier—reliable, data-integrated agents—tapping Databricks’ 20,000+ customers and $4B ARR footprint (Mastercard is already using Databricks-built agents for onboarding/support).
The move sits alongside Databricks’ model partnerships (e.g., Anthropic) and a broader vendor push (Salesforce, Workday) to pair agent tooling with customer data, as OpenAI ramps its infrastructure ambitions with Oracle/SoftBank.

Trump administration will provide Elon Musk’s xAI to federal agencies.
According to the Wall Street Journal, Elon Musk's xAI will be available to federal agencies via the General Services Administration for just 42 cents—part of a broader effort to bring top AI systems into government.
The arrangement mirrors similar nominal-fee deals with Google (47 cents), OpenAI ($1), and Anthropic ($1), meaning Washington is now working with all four leading U.S.
model makers, each of which also holds a $200 million Pentagon contract.
Officials say the low-cost access is less about revenue than about securing a foothold in government AI adoption, where automating bureaucratic processes is seen as a major opportunity.
The move also highlights a thaw in Musk’s relationship with the White House, while underscoring the administration’s push to foster competition among frontier AI providers.

AI CALENDAR

Oct. 6-10: World AI Week, Amsterdam
Oct. 21-22: TedAI San Francisco. Apply to attend here.
Nov. 10-13: Web Summit, Lisbon
Nov. 26-27: World AI Congress, London
Dec. 2-7: NeurIPS, San Diego
Dec. 8-9: Fortune Brainstorm AI San Francisco. Apply to attend here.

EYE ON AI NUMBERS

$923 billion

That's how much AI-driven capital expenditures in fiscal year 2025 will translate into total U.S.
economic output, according to new economic modeling results released by IMPLAN.
The analysis, based on reported 2025 capital expenditure estimates, found that Amazon, Alphabet, Microsoft, and Meta are set to spend a record $364B on AI-driven capital expenditures in fiscal 2025—more than all new U.S.
commercial construction in 2023. The modeling showed that those dollars will generate $923B in total U.S. economic output, support 2.7M jobs, and add $469B to GDP.
Every $1 invested, the report said, yields $2.50 in impact, rippling from construction and chip manufacturing to retail and local services.
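The report’s multiplier claim is internally consistent with the figures quoted above; a one-line check using only the $364 billion capex and $923 billion output numbers:

```python
# Sanity check on the IMPLAN multiplier cited above.
capex_billion = 364    # reported Big Tech AI capex, fiscal 2025
output_billion = 923   # modeled total U.S. economic output

multiplier = output_billion / capex_billion
print(f"Each $1 of capex maps to about ${multiplier:.2f} of output")  # about $2.54
```

That works out to roughly $2.54 per dollar invested, in line with the report’s stated $2.50 figure.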
For policymakers, it’s a reminder: Big Tech’s AI buildout isn’t just data centers—it’s reshaping the broader U.S. economy.

Fortune Global Forum returns Oct. 26–27, 2025 in Riyadh.
CEOs and global leaders will gather for a dynamic, invitation-only event shaping the future of business. Apply for an invitation.