
Elon Musk wants foundry partners to build astounding '100–200 billion AI chips' per year — Musk says chipmaking industry can't deliver on his goals


It's no secret that Elon Musk has tremendous ambitions when it comes to artificial intelligence, but apparently, they are so tremendous that he wants more AI processors than the industry produces, or even can produce. As it turns out, Tesla might need '100–200 billion AI chips per year,' and if it cannot get them from its existing foundry partners, the company will consider building its own fabs, an idea Musk floated several weeks ago. Now he has elaborated on those goals further.

"I have tremendous respect for TSMC and Samsung, we work with both TSMC and Samsung at Tesla and SpaceX. They are great companies and we want them to make our chip as quickly as they can and scale up to the highest possible volume that they are comfortable doing," said Elon Musk, during his conversation with Ron Baron. "But it doesn't appear to be fast enough. When I asked how long it would take from start to finish to get a new chip fab built, they said five years to get to production. Five years for me is eternity. My timelines are one year, two years. […] I cannot even see past three years. This is not going to be fast enough. If they change their minds and say, yeah, they are going to go faster and they want to provide us with 100 billion, 200 billion AI chips a year in the time frame that we need them, that is great."

Starman @elonmusk joins our Founder and CEO @RonBaronAnalyst for a virtual fireside chat to discuss the future. Livestream starts at 1:05pm ET. https://t.co/6ceIb5OHTe (November 14, 2025)

Musk did not say when Tesla and SpaceX would require those 100 to 200 billion AI processors a year, but that number is pretty insane, assuming that he meant units, not dollars. To put it into context, the industry supplied 1.5 trillion semiconductor devices globally in 2023, according to the Semiconductor Industry Association. Yet, this number is a bit misleading, because the term 'chip' covers a wide variety of devices, ranging from tiny microcontrollers and sensors to memory chips and logic devices. Logic devices like Nvidia's H100 or B200/B300 AI GPUs are huge pieces of silicon that are hard to make, and thus have the longest lead times and production cycles.

Musk recently said he believed power consumption for his AI5 processors could drop to as low as 250W. The power rating (TDP) of a chip can often be used as a decent relative proxy for its size, and by comparison, Nvidia's B200 GPUs can consume up to 1,200W, or nearly five times more power, implying that the AI5 will be a much smaller chip. Regardless, there absolutely isn't enough production capacity to meet Musk's targets, even if his chips are much smaller.
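The TDP comparison above works out like this. A quick back-of-envelope sketch; the 250W and 1,200W figures come from the article, and treating power draw as linearly proportional to chip size is a simplifying assumption, not a die-size measurement:

```python
# TDP as a crude proxy for relative chip size.
# Figures as cited in the article; the linear power-to-size
# relationship is an assumption for illustration only.
AI5_TDP_W = 250    # Musk's projected power draw for Tesla's AI5
B200_TDP_W = 1200  # Nvidia B200 maximum power draw

ratio = B200_TDP_W / AI5_TDP_W
print(f"B200 draws {ratio:.1f}x the power of a projected AI5")  # 4.8x
```

That 4.8x ratio is where the article's "nearly five times" figure comes from.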

As one of the biggest clients of TSMC, Nvidia has supplied four million Hopper GPUs worth $100 billion (not counting China) throughout the active lifespan of the architecture, which was about two calendar years. With Blackwell, Nvidia has sold around six million GPUs, which equate to three million GPU packages (each Blackwell package combines two GPU dies), in the first four quarters of their lifespan.

If Musk indeed meant 200 billion units, then he would like to get orders of magnitude more AI processors than the industry (whose advanced logic output largely comes from TSMC) can build in a year. Yet, if he by any chance was referring to $100–$200 billion worth of AI processors, then TSMC and Samsung Foundry could certainly produce that volume in the coming years. However, given that Musk is not satisfied with how quickly TSMC and Samsung build fabs, it looks like he indeed thinks he needs more than these companies can supply.
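The "orders of magnitude" claim can be checked with the numbers already cited: 200 billion units against the 1.5 trillion devices of all kinds the industry shipped in 2023, and against the roughly three million Blackwell packages Nvidia moved in about a year. A rough sketch using only those article figures:

```python
# Orders-of-magnitude comparison using figures cited in the article.
musk_target_units = 200e9          # upper end of Musk's ask, per year
industry_all_chips_2023 = 1.5e12   # ALL semiconductor devices (SIA, 2023)
blackwell_packages_per_year = 3e6  # Nvidia Blackwell packages in ~4 quarters

share_of_everything = musk_target_units / industry_all_chips_2023
vs_blackwell = musk_target_units / blackwell_packages_per_year

print(f"Target vs. all 2023 chips: {share_of_everything:.0%}")   # 13%
print(f"Target vs. Blackwell output: {vs_blackwell:,.0f}x")      # 66,667x
```

In other words, the ask equals about an eighth of every chip of any kind made in 2023, and tens of thousands of times the yearly output of today's flagship AI GPU line.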

"We will be using TSMC fabs in Taiwan and Arizona, Samsung fabs in Korea and Texas," said Musk. "From their standpoint, they are moving like lightning. I am just saying that, nonetheless, it would be a limiting factor for us. They're going as fast as they can, but from their standpoint, it's 'pedal to the metal.' They just never had someone, a company, with our sense of urgency. It might just be that the only way to get to scale at the rate that we want to get to scale is to build up a real big fab, or be limited in output of Optimus and self-driving cars because of AI chip [supply]."

Whether Tesla and SpaceX really need 100–200 billion chips per year remains unclear. Tesla sold 1.79 million vehicles in 2024, so it does not need more than two million chips for its cars. Of course, the company might need millions more AI processors for its AI training efforts, though we have reasonable doubts that it can indeed build AI clusters powered by billions of GPUs any time soon. Also, while humanoid Optimus robots, also powered by Tesla's AI hardware, could be a big market, that market will take years to develop.
