Amazon Web Services is building equipment to cool Nvidia GPUs as AI boom accelerates
Key Takeaways
AWS is trying to address the need for more data center cooling by developing its own hardware.
July 9, 2025, 10:40 PM
Originally published by CNBC
The In-Row Heat Exchanger that Amazon engineers designed will cool Nvidia's massive AI systems
The letters AI, which stands for "artificial intelligence," stand at the Amazon Web Services booth at the Hannover Messe industrial trade fair in Hannover, Germany, on March 31, 2025.
Julian Stratenschulte | Picture Alliance | Getty Images

Amazon said Wednesday that its cloud division has developed hardware to cool down next-generation Nvidia graphics processing units that are used for artificial intelligence workloads.
Nvidia's GPUs, which have powered the generative AI boom, require massive amounts of energy. That means companies using the processors need additional equipment to cool them down.
Amazon considered erecting data centers that could accommodate widespread liquid cooling to make the most of these power-hungry Nvidia GPUs. But that process would have taken too long, and commercially available equipment wouldn't have worked, Dave Brown, vice president of compute and machine learning services at Amazon Web Services, said in a video posted to YouTube.

"They would take up too much data center floor space or increase water usage substantially," Brown said. "And while some of these solutions could work for lower volumes at other providers, there simply wouldn't be enough liquid-cooling capacity to support our scale."

Rather than rely on those options, Amazon engineers conceived of the In-Row Heat Exchanger, or IRHX, which can be plugged into existing and new data centers.
More traditional air cooling was sufficient for previous generations of Nvidia chips.

Customers can now access the AWS service as computing instances that go by the name P6e, Brown wrote in a blog post.
The new systems accompany Nvidia's design for dense computing power. Nvidia's GB200 NVL72 packs a single rack with 72 Nvidia Blackwell GPUs that are wired together to train and run large AI models.

Computing clusters based on Nvidia's GB200 NVL72 have previously been available through Microsoft or CoreWeave.
AWS is the world's largest supplier of cloud infrastructure.

Amazon has rolled out its own infrastructure hardware in the past. The company has custom chips for general-purpose computing and for AI, and designed its own storage servers and networking routers. In running homegrown hardware, Amazon depends less on third-party suppliers, which can benefit the company's bottom line. In the first quarter, AWS delivered the widest operating margin since at least 2014, and the unit is responsible for most of Amazon's net income.
Microsoft, the second-largest cloud provider, has followed Amazon's lead and made strides in chip development. In 2023, the company designed its own systems called Sidekicks to cool the Maia AI chips it developed.
WATCH: AWS announces CPU chip, will deliver record networking speed