OpenAI's $38B cloud deal with Amazon takes ChatGPT maker further beyond Microsoft

ChatGPT maker OpenAI, exercising newfound freedom under its renegotiated Microsoft partnership, will expand its cloud footprint for training and running AI models to Amazon's infrastructure under a new seven-year, $38 billion agreement.
The deal, announced Monday, positions Amazon as a major infrastructure provider for Microsoft's flagship AI partner, highlighting seemingly insatiable demand for computing power and increasingly complex alliances among big companies seeking to capitalize on AI.
It comes as Microsoft, Amazon, and other big tech companies attempt to reassure investors who've grown concerned about a possible bubble in AI spending and infrastructure investment.
Under its new Amazon deal, OpenAI is slated to begin running AI workloads on Amazon Web Services' new EC2 UltraServers, which use hundreds of thousands of Nvidia GPUs. Amazon says the infrastructure will help run ChatGPT and train future OpenAI models.
Amazon shares rose nearly 5% in early trading after the announcement.
"Scaling frontier AI requires massive, reliable compute," said OpenAI CEO Sam Altman in the press release announcing the deal. "Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone."
Matt Garman, the AWS CEO, said in the release that Amazon's cloud infrastructure will serve as "a backbone" for OpenAI's ambitions.
In an interview with CNBC, Dave Brown, Amazon's vice president of compute and machine learning services, said the new agreement represents "completely separate capacity" that AWS is building out for OpenAI. "Some of that capacity is already available, and OpenAI is making use of that," Brown told CNBC.
Amazon has also been deepening its investment in AI infrastructure for Anthropic, the rival startup behind the Claude chatbot. Amazon has invested a total of $8 billion in Anthropic and recently opened Project Rainier, an $11 billion data center complex for Anthropic's workloads, running on hundreds of thousands of its custom Trainium 2 chips.
Microsoft has been expanding its own relationship with Anthropic, adding the startup's Claude models to Microsoft 365 Copilot, GitHub Copilot, and its Azure AI Foundry platform.
Up to this point, OpenAI has relied almost exclusively on Microsoft Azure for the computing infrastructure behind its large language models. The new deal announced by Microsoft and OpenAI last week revised that relationship, removing Microsoft's right of first refusal on new OpenAI workloads and giving OpenAI more flexibility to use other cloud providers.
At the same time, OpenAI committed to purchase an additional $250 billion in Microsoft services. Microsoft still holds specific IP rights to OpenAI's models and products through 2032, including the exclusive ability among major cloud platforms to offer OpenAI's technology through its Azure OpenAI Service.
OpenAI's new $38 billion deal with Amazon builds on a relationship that began earlier this year, when Amazon added OpenAI's first open-weight models in five years to its Bedrock and SageMaker services. Released under an open-source license, those models weren't bound by OpenAI's exclusive API agreement with Microsoft, letting Amazon offer them on its platforms.
The latest announcement is part of a series of deals by OpenAI in recent months with companies including Oracle and Google, committing hundreds of billions of dollars overall for AI computing capacity and raising questions about the long-term economics of the AI boom.