OpenAI CEO Sam Altman speaks to media following a Q&A at the OpenAI data center in Abilene, Texas, U.S., Sept. 23, 2025. Shelby Tauber | Reuters

Nvidia's massive investment in OpenAI, announced earlier this week, will put billions of dollars into the coffers of the artificial intelligence startup to use as it sees fit. But most of the money will go toward the use of Nvidia's cutting-edge chips.

The agreement between the two companies was big on numbers but thin on specifics. They said the investment would reach up to $100 billion, paid out as AI supercomputing facilities open in the coming years, with the first one coming online in the second half of 2026. The timing of the buildouts and the cost of each data center remain up in the air.

However, what's become clear is that OpenAI plans to pay for Nvidia's graphics processing units (GPUs) through lease arrangements, rather than upfront purchases, according to people familiar with the matter who asked not to be named because the details are private.

Nvidia CEO Jensen Huang, who described this week's deal as "monumental in size," has estimated that an AI data center with a gigawatt of capacity costs roughly $50 billion, with $35 billion of that used to pay for Nvidia's GPUs. By leasing the processors, OpenAI can spread its costs out over the useful life of the GPUs, which could be up to five years, a person said, leaving Nvidia to bear more of the risk (a rough illustration of that arithmetic appears at the end of this section). The Information previously reported on some aspects of the lease arrangement.

Nvidia agreed to invest over time as OpenAI's data centers get up and running. The initial $10 billion will be available to OpenAI soon and will help the company work toward deploying its first gigawatt of capacity, a source told CNBC.

While Nvidia's equity investment could help OpenAI with hiring, marketing and operations, the biggest single item it will be used for is compute, the people said. And that's almost entirely directed at Nvidia's GPUs, which are key to building and training large language models and to running AI workloads.

For OpenAI, a non-investment-grade startup that lacks positive cash flow, financing remains costly. OpenAI executives have called equity the most expensive way to fund data centers and said the company is preparing to take on debt to cover the remainder of the expansion.

In addition to offering a cost-efficient way for OpenAI to access chips, Nvidia's lease option and long-term commitment can help the company land better terms from banks when it comes to raising debt, a person said.

An Nvidia spokesperson declined to comment.

'They will get paid'

Speaking to CNBC in Abilene, Texas, home to the first new data center, OpenAI CFO Sarah Friar pointed to the role Oracle and Nvidia are playing in the financing. Oracle, one of OpenAI's partners on the Stargate project, is leasing the Abilene facility, and OpenAI will eventually pay for the operations.

"Folks like Oracle are putting their balance sheets to work to create these incredible data centers you see behind us," Friar said. "In Nvidia's case, they're putting together some equity to get it jumpstarted, but importantly, they will get paid for all those chips as those chips get deployed."

She said all the big partners are needed to help relieve a dramatic shortage of capacity.

"What I think we should all be focused on today is the fact that there's not enough compute," Friar said. "As the business grows, we will be more than capable of paying for what is in our future — more compute, more revenue."
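As a rough, hypothetical sketch of how leasing spreads those costs, the calculation below uses only the figures cited above: Huang's estimate of roughly $50 billion per gigawatt of capacity, about $35 billion of which goes to Nvidia GPUs, and a useful life of up to five years. The straight-line annualization and the names used here are illustrative assumptions; the actual lease terms have not been disclosed.

```python
# Back-of-the-envelope sketch based on figures cited in this article.
# The straight-line spread below is an illustrative assumption, not the
# actual (undisclosed) lease terms between OpenAI and Nvidia.

COST_PER_GW_TOTAL_USD = 50e9   # Huang's rough estimate: ~$50B per gigawatt of capacity
COST_PER_GW_GPUS_USD = 35e9    # portion of that attributed to Nvidia GPUs
GPU_USEFUL_LIFE_YEARS = 5      # upper end of the useful life cited by a source


def annualized_gpu_lease_cost(gigawatts: float) -> float:
    """Hypothetical straight-line spread of GPU costs over their useful life."""
    return gigawatts * COST_PER_GW_GPUS_USD / GPU_USEFUL_LIFE_YEARS


if __name__ == "__main__":
    gw = 1.0
    upfront = gw * COST_PER_GW_GPUS_USD
    per_year = annualized_gpu_lease_cost(gw)
    print(f"Upfront GPU purchase for {gw:.0f} GW: ${upfront / 1e9:.0f}B")
    print(f"Leased over {GPU_USEFUL_LIFE_YEARS} years: about ${per_year / 1e9:.0f}B per year")
```

On those assumptions, a single gigawatt of GPUs that would cost about $35 billion upfront works out to roughly $7 billion a year over a five-year lease, which is the sense in which more of the risk of the hardware's useful life shifts to Nvidia.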
The steel frame of data centers under construction during a tour of the OpenAI data center in Abilene, Texas, U.S., Sept. 23, 2025. Shelby Tauber | Reuters

Still, the OpenAI-Nvidia deal has raised some concerns about the sustainability of the AI boom. Nvidia's march to a $4.3 trillion market cap has been driven by GPU sales to OpenAI as well as to tech megacaps like Google, Meta, Microsoft and Amazon. OpenAI's path to a $500 billion private market valuation has been enabled by hefty investments from Microsoft and others that allow the company to burn billions of dollars in cash while building its AI models that power services including ChatGPT.

Jamie Zakalik, an analyst at Neuberger Berman, said the Nvidia deal is the latest example of OpenAI raising money that it pours right back into the company providing the capital.

Investors are concerned about the "circular nature of this deal goosing up everyone's earnings and everyone's numbers," said Zakalik. "But it's not actually creating anything."

Asked about those fears, Altman told CNBC the company is focused on driving real demand.

"We need to keep selling services to consumers and businesses — and building these great new products that people pay us a lot of money for," he said. "As long as that keeps happening, that pays for a lot of these data centers, a lot of chips."

— CNBC's Kif Leswing contributed to this report