This story originally appeared on Grist and is part of the Climate Desk collaboration. The companies frantically building and leasing data centers are well aware that they're straining grids, driving emissions, and guzzling water. The electricity demand of AI data centers in particular could increase as much as 165 percent by 2030. Over half of the energy powering these sprawling facilities comes from fossil fuels, threatening to reverse progress toward addressing the climate crisis.

Some of the biggest names in artificial intelligence say they have a solution: Just stick these colossal computer clusters in space. OpenAI CEO Sam Altman told manosphere podcaster Theo Von that he considers a massive expansion of data centers inevitable. "I do guess a lot of the world gets covered in data centers over time," he said. (This is not, in fact, inevitable, but the result of unfathomably wealthy companies choosing to invest unfathomably large sums of money. Altman has speculated that he would quite literally put trillions into it, and OpenAI is part of the consortium behind the $500 billion Stargate project.)

Altman is aware, however, that some people might not like this. "I've spoken with environmentalists," he said. Then, he offered a suggestion. "Maybe we put [data centers] in space," he said. "I wish I had, like, more concrete answers for you, but like, we're stumbling through this."

Now, the idea of hurling data centers, the largest of which can cover over a million square feet, into orbit may seem impractical. But Altman's not alone in considering it. Jeff Bezos and Eric Schmidt are also betting on the idea. Altman has proposed creating a Dyson sphere of data centers around the sun, referring to a hypothetical megastructure built around a star to capture much of its energy. The rather glaring downside to this is that building it would likely require more resources than exist on Earth, and could make the planet uninhabitable.
But somewhat more realistic plans are inching closer to reality. Startups like Starcloud, Axiom, and Lonestar Data Systems have raised millions to develop them. There are at least 5,400 data centers in the United States, ranging from micro-size to thousand-server "hyperscalers," and the number is growing fast. These facilities are expected to consume up to 12 percent of the nation's electricity by 2028. Putting them in space, then, can seem like a panacea: solving the energy-use problem with 24/7 solar power, and freeing communities from the burden of air, noise, and water pollution.

There's some real science behind this. Ali Hajimiri, an electrical engineer and professor with Caltech's Space Solar Power Project, sought a patent for a "massively parallel computational system in space"—as in, a data center—back in 2016. Since then, launch costs have gone down (to around $1,500 per kilogram, by one estimate) and solar panels have gotten lighter and more efficient. Hajimiri and his colleagues recently proposed a lightweight space-based solar power system that could generate electricity at 10 cents per kilowatt-hour, significantly cheaper at scale than comparable systems here on Earth. Such technology theoretically could power orbital data centers like those Altman imagines, though Hajimiri is still not sure when they could be built at the kind of scale companies like OpenAI demand. "I never want to say something cannot be done," he said. "But there are challenges associated with it."