OpenAI is leaving few stones unturned in the race to build compute capacity for its AI efforts. The ChatGPT maker on Wednesday said it had struck agreements with two of the world's biggest memory chip manufacturers, Samsung Electronics and SK Hynix, to make DRAM wafers for the Stargate AI infrastructure project and to build data centers in South Korea.

The companies signed the letters of intent following a meeting in Seoul between OpenAI CEO Sam Altman, South Korean President Lee Jae-myung, Samsung Electronics executive chairman Jay Y. Lee, and SK chairman Chey Tae-won.

Under the deal, Samsung and SK Hynix plan to scale their manufacturing to produce up to 900,000 high-bandwidth memory (HBM) DRAM chips per month for use in Stargate and AI data centers. SK Group noted in a separate statement that this would be more than double the current industry capacity for high-bandwidth memory chips.

Stargate is a massive infrastructure project by OpenAI, Oracle, and SoftBank that seeks to spend $500 billion to build data centers dedicated to AI development in the United States.

Wednesday's agreements follow a month of frenetic investment in AI compute capacity, and OpenAI has been the locus of much of that activity. Just a couple of weeks ago, Nvidia said it would invest up to $100 billion in OpenAI as part of a deal that would give the ChatGPT maker access to more than 10 gigawatts of compute capacity via Nvidia's AI training systems. The following day, OpenAI said it would build out five data centers with SoftBank and Oracle for the Stargate project, aiming to increase its total compute capacity to 7 gigawatts. Earlier in September, Oracle agreed to sell $300 billion of compute capacity to OpenAI over five years.

OpenAI said it is also working with the Korean Ministry of Science and ICT to find opportunities to build AI data centers outside Seoul, and that it had struck a separate deal with SK Telecom to build an AI data center. The AI company also signed a few other agreements with Samsung subsidiaries to explore building more data centers in the country. Samsung and SK Group will also integrate ChatGPT Enterprise and OpenAI APIs into their operations as part of the deal.