Tesla and SpaceX CEO Elon Musk is no stranger to bold forecasts, and his latest prediction pushes the boundaries of how and where artificial intelligence could be built in the future. Speaking on the Dwarkesh podcast, Musk suggested that space may soon become the most cost-effective location to deploy AI infrastructure—possibly within the next 36 months.
During the wide-ranging conversation, Musk discussed the economics of orbital data centers, the growing difficulty of scaling power generation on Earth, and the long-term future of AI computing. According to him, the constraints faced by terrestrial data centers—ranging from land availability to energy inefficiencies—could make space-based AI infrastructure far more attractive than many currently believe.
“It’s actually harder to scale on the ground than it is to scale in space,” Musk said, outlining his reasoning for why the economics may soon favor orbital AI facilities.
Why Musk believes space makes financial sense
A major factor behind Musk’s prediction is energy efficiency. He explained that solar panels in space can generate significantly more power than equivalent panels on the ground. Without atmospheric interference, cloud cover, or seasonal variation, solar energy collection becomes far more efficient.
“You’re going to get about five times the effectiveness of solar panels in space versus the ground,” Musk said. On Earth, solar panels lose a substantial portion of potential energy due to atmospheric absorption alone—roughly 30 percent, according to Musk. Add to that the challenges posed by clouds, shorter daylight hours, and seasonal changes, and the gap widens further.
Another key advantage, he noted, is that suitably chosen orbits largely escape the day-night cycle. Space-based systems can receive near-continuous sunlight, eliminating the need for large-scale battery storage. “You also avoid the cost of having batteries to carry you through the night,” Musk explained, arguing that energy storage is one of the most expensive components of Earth-based renewable power systems.
Taken together, these factors lead Musk to a striking conclusion: “It’s actually much cheaper to do in space.”
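Musk’s “about five times” figure can be sanity-checked with rough arithmetic. The sketch below is illustrative only: the solar constant and clear-sky peak are well-established values, but the ground capacity factor and orbital duty cycle are assumptions, not numbers from the article.

```python
# Back-of-envelope check of the "~5x" claim for space vs. ground solar.
SOLAR_CONSTANT_W_M2 = 1361      # irradiance above the atmosphere (measured value)
GROUND_PEAK_W_M2 = 1000         # typical clear-sky peak at the surface (~30% atmospheric loss)
GROUND_CAPACITY_FACTOR = 0.25   # assumed: averages in night, clouds, and seasons at a good site
ORBIT_DUTY_CYCLE = 1.0          # assumed: near-continuous sunlight (e.g., a dawn-dusk orbit)

ground_avg = GROUND_PEAK_W_M2 * GROUND_CAPACITY_FACTOR   # average W/m^2 delivered on the ground
space_avg = SOLAR_CONSTANT_W_M2 * ORBIT_DUTY_CYCLE      # average W/m^2 delivered in orbit

ratio = space_avg / ground_avg
print(f"Space vs. ground average power per panel area: {ratio:.1f}x")
```

With these assumptions the ratio lands in the 5–6x range, which is consistent with the figure Musk cites; picking a sunnier or cloudier ground site moves the result accordingly.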
A timeline measured in months, not decades
While the idea of space-based data centers may sound futuristic, Musk believes the transition could happen far sooner than expected. He predicted that space would become “by far the cheapest place to put AI” within three years—or possibly even sooner.
“It will be space in 36 months or less,” Musk said, adding, “Maybe 30 months.”
This aggressive timeline aligns with Musk’s broader vision of rapidly scaling both space infrastructure and AI capabilities. SpaceX’s reusable rockets, particularly Starship, are central to this vision, as they aim to drastically reduce the cost of launching large payloads into orbit.
Addressing concerns about hardware reliability
One of the major concerns around space-based AI systems is hardware reliability, especially when it comes to GPUs used for large-scale AI training. Repairing or replacing failed components in orbit would be far more complex than on Earth.
Musk, however, downplayed these concerns. He argued that GPU failures are less common than many assume, particularly after the early stages of deployment. “It depends on how recent the GPUs are that have arrived,” he said.
According to Musk, most issues occur during the initial testing and debugging phase. Once GPUs move past that stage—whether Nvidia hardware, Tesla’s own chips, or other AI accelerators such as Google’s TPUs and Amazon’s Trainium—they tend to operate reliably over long periods.
“Once they start working and you’re past the initial debug cycle, they’re quite reliable past a certain point,” he said. As a result, Musk believes ongoing maintenance and servicing may not be as significant a challenge as critics suggest.
Implications for the future of AI infrastructure
If Musk’s prediction proves accurate, it could fundamentally reshape how AI infrastructure is designed and deployed. Today, AI data centers consume massive amounts of electricity and water, often straining local power grids and drawing environmental criticism. Moving such infrastructure to space could reduce pressure on Earth-based resources while unlocking unprecedented scale.
However, challenges remain. Regulatory frameworks, space debris concerns, and the technical complexity of orbital construction all pose significant hurdles. Still, Musk’s track record of pushing ambitious ideas—many of which later become reality—means his prediction is being taken seriously across the tech and space industries.
For now, the idea of AI data centers orbiting Earth may seem radical. But if Musk is right, space could soon become not just the final frontier for exploration, but also the cheapest and most efficient home for artificial intelligence.