AWS & OpenAI sign $38Bn deal to power next-gen AI workloads

Amazon Web Services (AWS) and OpenAI have announced a multi-year strategic partnership under which OpenAI will run and scale its core artificial intelligence (AI) workloads on AWS infrastructure, effective immediately.

The $38 billion agreement, set to grow over the next seven years, gives OpenAI access to AWS compute comprising hundreds of thousands of NVIDIA GPUs, with scope to scale to tens of millions of CPUs to support rapidly growing agentic workloads. AWS said it has deep experience running large-scale AI infrastructure securely and reliably, with clusters exceeding 500,000 chips.

AWS said its leadership in cloud infrastructure, combined with OpenAI’s work in generative AI, will enable millions of users to continue benefiting from ChatGPT.

The surge in AI development has driven an unprecedented demand for compute resources. As AI developers push their models towards higher intelligence, AWS has become a preferred choice due to its performance, scalability, and security. OpenAI is expected to begin using AWS compute immediately, with all capacity scheduled for deployment by the end of 2026, and the potential for expansion into 2027 and beyond.

AWS’s infrastructure for OpenAI features an advanced architecture designed for efficiency and performance. By clustering NVIDIA GB200 and GB300 GPUs via Amazon EC2 UltraServers on the same network, the system enables low-latency performance across interconnected workloads. This configuration is built to handle a range of applications – from running ChatGPT inference to training next-generation models – with flexibility to adapt to OpenAI’s evolving needs.

OpenAI Co-Founder and CEO Sam Altman said: “Scaling frontier AI requires massive, reliable compute. Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”

Matt Garman, CEO of AWS, added: “As OpenAI continues to push the boundaries of what’s possible, AWS’s best-in-class infrastructure will serve as a backbone for their AI ambitions. The breadth and immediate availability of optimised compute demonstrates why AWS is uniquely positioned to support OpenAI’s vast AI workloads.”

The announcement continues the companies’ collaboration to advance AI technology for global use. Earlier in the year, OpenAI’s open-weight foundation models became available on Amazon Bedrock, offering additional model options to millions of AWS customers.

OpenAI quickly became one of the most widely adopted model providers on Amazon Bedrock, with thousands of customers – including Bystreet, Comscore, Peloton, Thomson Reuters, Triomics, and Verana Health – using its models for agentic workflows, coding, scientific research, and mathematical analysis.

Further information on OpenAI’s open-weight models in Amazon Bedrock can be found at: aws.amazon.com/bedrock/openai
