After securing substantial deals for AI processors from industry leaders Nvidia and AMD, OpenAI is taking an ambitious next step: designing and deploying its own specialized chips. The move comes as the company commits hundreds of billions of dollars to construct next-generation data centers.
The maker of the popular ChatGPT chatbot announced Monday a collaboration with Broadcom, a chip company based in San Jose, California. Starting in the second half of next year, the partnership aims to roll out custom chips that will eventually consume 10 gigawatts of electricity, enough to power millions of homes.
The new initiative complements OpenAI’s earlier commitments to deploy Nvidia and AMD chips, which are expected to consume a combined 16 gigawatts of power.
In a public statement, OpenAI’s chief executive, Sam Altman, emphasized the strategic importance of this development, stating, “Creating our own accelerators enriches the wider ecosystem of partners, all working to build the necessary capacity to advance the boundaries of artificial intelligence.”
The pact with Broadcom is part of a larger strategy by OpenAI to establish a global network of cutting-edge data centers. The company is already constructing a major facility in Abilene, Texas, with plans for more elsewhere in Texas as well as in New Mexico, Ohio, and other parts of the Midwest.
OpenAI joins a growing list of technology giants, including Amazon, Google, Meta, and Microsoft, collectively investing hundreds of billions—over $325 billion this year alone—into the foundational infrastructure for artificial intelligence: new data centers.
Nvidia, currently the world’s most valuable publicly traded company, holds a dominant position in the market for AI chips that power technologies such as ChatGPT. However, an increasing number of firms are now developing their own chip designs to contend with this supremacy. This includes major tech players like Google and Amazon, established chipmakers such as AMD, and innovative startups like Cerebras and Groq.
Notably, Google also maintains a partnership with Broadcom for the design of its own AI-focused chips.
OpenAI’s existing agreements with Nvidia and AMD also served to inject crucial capital for its data center construction. Nvidia, for instance, committed to investing up to $100 billion in OpenAI, and AMD agreed to grant OpenAI warrants for up to 160 million shares of its stock, roughly 10 percent of the chipmaker.
In contrast to those deals, Broadcom is not directly investing in OpenAI or offering stock. By developing its custom chips, OpenAI aims to lessen its reliance on external chip manufacturers like Nvidia and AMD, thereby enhancing its negotiating position in future partnerships and procurements.
(The New York Times sued OpenAI and Microsoft in 2023, accusing them of copyright infringement over the use of its news content in AI systems. Both companies have denied the allegations.)