
AMD CEO Lisa Su 'emphatically' rejects talk of an AI bubble — says claims are 'somewhat overstated'


AMD CEO Lisa Su used her appearance at WIRED’s Big Interview conference in San Francisco to push back against growing speculation that the AI sector is overheating. Asked whether the industry is in a bubble, Su replied no, “emphatically,” arguing that the concerns are “somewhat overstated” and that AI is still in its infancy. According to Su, AMD needs to be ready to provide chips for the future, because “there’s not a reason not to keep pushing that technology.”

Her remarks come as AMD prepares for several of its largest data-center commitments to date, including a multi-gigawatt accelerator deployment with OpenAI and the resumption of MI308 shipments to China under a new export-control framework.

OpenAI plans to deploy six gigawatts of Instinct GPUs over the next several years under an agreement the two companies announced earlier this year, with the first one-gigawatt block scheduled for the second half of next year. As part of that arrangement, OpenAI secured the option to buy up to 160 million AMD shares at a penny each as deployment milestones are met. AMD presented the structure as a way to align long-term incentives around infrastructure delivery rather than a short window of product availability.
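As a rough illustration of those warrant terms (a back-of-the-envelope figure, assuming the full grant eventually vests), the nominal exercise cost to OpenAI would work out to:

160,000,000 shares × $0.01 per share = $1,600,000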

Meanwhile, the company's operations in China have been shaped by a different kind of uncertainty. AMD has confirmed that it is prepared to pay a 15% fee on MI308 shipments under the revised export rules. Washington halted sales of the part in April before reopening a licensing process that allowed vendors to apply for restricted shipments.

AMD has told investors that the original controls would create up to $800 million in inventory and purchase-commitment charges, so re-entering the market on known terms is a positive step even with the additional fee. China will not be the main driver of AMD’s data-center revenue in the near term, but it remains one of the few regions with customers capable of absorbing large accelerator orders on short notice.

Su’s comments also addressed pressure from hyperscalers that are expanding their in-house silicon portfolios. She argued that AMD’s challenge is not matching any single rival but advancing its own roadmap quickly enough to capture the next wave of deployments.

In her view, each generation of AI models raises performance expectations, and the industry’s underlying trajectory supports sustained investment in training and inference clusters. For a company that has spent much of the past decade rebuilding its position in high-performance computing, the coming cycle will test how well that confidence translates into delivered hardware and long-term customer commitments.
