AI bubble? What AI bubble? If you ask Nvidia CEO Jensen Huang, we're in a "new industrial revolution."
Huang's company, of course, makes chips and computer hardware, the "picks and shovels" of the AI gold rush, and it has become the world's most valuable business by capitalizing on AI's growth, bubble or not. Speaking Wednesday during an earnings call, as his company reported revenue of $46.7 billion for the past quarter, Huang gave no indication that the generative artificial intelligence industry's remarkable growth will slow.
"I think the next several years, surely through the decade, we see really significant growth opportunities ahead," Huang said.
Compare that with recent comments from OpenAI CEO Sam Altman, who said he believes investors right now are "overexcited about AI." (Altman also acknowledged that he still believes AI is "the most important thing to happen in a very long time.")
Huang said his company has "very, very significant forecasts" of demand for more of the chips and computers that run AI, indicating the rush for more data centers is not stopping soon. He speculated that AI infrastructure spending could hit $3 trillion to $4 trillion by the end of the decade. (The gross domestic product of the US is around $30 trillion.)
That means a lot of data centers, which take up a lot of land and consume a great deal of water and energy. These AI factories have grown bigger and bigger in recent years, with significant impacts on the communities around them and a growing strain on the US electric grid. And the spread of generative AI tools that require even more energy could push that demand higher still.
More powerful and demanding models
One prompt on a chatbot doesn't always mean one prompt anymore. One driver of the increased demand for computing power is that newer AI models employing "reasoning" techniques use far more power to answer a single question. "It's called long thinking, and the longer it thinks, oftentimes it produces better answers," Huang said.
This technique allows an AI model to research across different websites, attempt a question multiple times to arrive at better answers and assemble disparate information into a single response.
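The "attempt a question multiple times" part of this can be sketched as a best-of-N loop: generate several candidate answers, score them, and keep the best. The sketch below is a toy illustration of that idea only; `attempt_answer`, `score_answer` and `long_think` are hypothetical stand-ins, not any vendor's actual API, and a real system would call a language model and a learned verifier where the placeholders sit.

```python
def attempt_answer(question: str, attempt_index: int) -> str:
    # Stand-in for one model pass; a real system would call an LLM here.
    # For illustration, it just cycles through canned drafts.
    drafts = ["short", "a medium draft", "a much longer, detailed draft"]
    return drafts[attempt_index % len(drafts)]

def score_answer(answer: str) -> int:
    # Stand-in scorer; real systems use a verifier or reward model.
    # Here, longer answers score higher, purely for demonstration.
    return len(answer)

def long_think(question: str, attempts: int = 5) -> str:
    # "Longer thinking" = more attempts, hence more compute per question.
    candidates = [attempt_answer(question, i) for i in range(attempts)]
    return max(candidates, key=score_answer)
```

The point of the sketch is the cost structure: answering once calls the model once, while "long thinking" calls it `attempts` times per question, which is why these models multiply demand for chips and data-center power.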