Tech News

SpaceX CEO Elon Musk says AI compute in space will be the lowest-cost option in 5 years — but Nvidia's Jensen Huang says it's a 'dream'


In addition to hardware costs, power generation and delivery and cooling requirements will be among the main constraints for massive AI data centers in the coming years. X, xAI, SpaceX, and Tesla CEO Elon Musk argues that over the next four to five years, running large-scale AI systems in orbit could become far more economical than doing the same work on Earth.

That's primarily due to 'free' solar power and relatively easy cooling. Jensen Huang agrees about the challenges facing gigawatt- or terawatt-class AI data centers, but says that space data centers are, for now, a dream.

A terawatt-class AI data center is impossible on Earth

"My estimate is that the cost of electricity, the cost effectiveness of AI in space will be overwhelmingly better than AI on the ground so far, long before you exhaust potential energy sources on Earth," said Musk at the U.S.-Saudi investment forum. "I think even perhaps in the four- or five-year timeframe, the lowest cost way to do AI compute will be with solar-powered AI satellites. I would say not more than five years from now."

Jensen Huang, chief executive of Nvidia, notes that the compute and communication equipment inside today's Nvidia GB300 racks accounts for only a tiny fraction of the total mass: roughly 1.95 tons out of 2 tons is essentially a cooling system.

Musk emphasized that as compute clusters grow, the combined requirements for electrical supply and cooling escalate to the point where terrestrial infrastructure struggles to keep up. He claims that targeting continuous output in the range of 200 GW to 300 GW would require massive and costly power plants: a typical nuclear power plant produces around 1 GW of continuous output, while the U.S. generates around 490 GW of continuous power (note that Musk says 'per year,' but what he means is continuous power output at a given time), so devoting the lion's share of it to AI is impossible. Anything approaching a terawatt of steady AI-related demand is unattainable within Earth-based grids, according to Musk.
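The grid arithmetic above can be sanity-checked with the round figures quoted in the article (illustrative values, not official statistics):

```python
# Back-of-the-envelope check of the grid-scale figures cited above.
# All numbers are the article's round values, not precise statistics.
NUCLEAR_PLANT_GW = 1        # typical continuous output of one nuclear plant
US_GRID_GW = 490            # approximate continuous US generation
AI_TARGETS_GW = (200, 300)  # hypothetical AI demand range Musk cites

for target in AI_TARGETS_GW:
    plants = target / NUCLEAR_PLANT_GW   # equivalent 1 GW nuclear plants
    share = target / US_GRID_GW          # fraction of current US output
    print(f"{target} GW ~ {plants:.0f} nuclear plants, {share:.0%} of US grid")
```

Even the low end of the range would consume roughly two-fifths of current US continuous generation, which is the core of Musk's argument.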

"There is no way you are building power plants at that level: if you take it up to, say, [1 TW of continuous power], impossible," said Musk. "You have to do that in space. There is just no way to do a terawatt [of continuous power on] Earth. In space, you have got continuous solar. You actually do not need batteries because it is always sunny in space, and the solar panels actually become cheaper because you do not need glass or framing, and the cooling is just radiative."

Musk may be right that generating enough power for AI on Earth is a problem, and that space could be a better fit for massive AI compute deployments. But many challenges remain with putting AI clusters into orbit, which is why Jensen Huang calls it a dream for now.

"That's the dream," Huang exclaimed.
