Elon Musk made a striking claim in a recent podcast: because global electricity generation has stagnated, space will become the most cost-effective and efficient place to deploy artificial intelligence (AI) within the next three years. The remark has once again pushed the frontier topic of "space GPUs" to the forefront of global capital markets.

Musk pointed out that the world is facing a severe power bottleneck: while chip production is growing exponentially, electricity production has barely increased. He boldly predicted that by the end of 2026, humanity may find itself with plenty of chips but not enough power to switch them on. In space, solar panels generate roughly five times as much power as they do on Earth, and there is no need for expensive battery storage to get through the night, which he argues makes space deployment economically superior.
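The five-fold figure can be sanity-checked with a back-of-envelope capacity-factor comparison. The capacity factors below are illustrative assumptions, not numbers from the podcast:

```python
# Back-of-envelope check of the "5x" claim: energy delivered per day by
# one kW of panel in orbit versus on the ground.
PANEL_KW = 1.0

# Assumed capacity factors (hypothetical, for illustration only):
# a suitable orbit sees near-continuous sunlight, while a good ground
# site loses output to night, weather, and sun angle.
SPACE_CAPACITY_FACTOR = 0.99
GROUND_CAPACITY_FACTOR = 0.20

space_kwh_per_day = PANEL_KW * 24 * SPACE_CAPACITY_FACTOR
ground_kwh_per_day = PANEL_KW * 24 * GROUND_CAPACITY_FACTOR

ratio = space_kwh_per_day / ground_kwh_per_day
print(f"space: {space_kwh_per_day:.1f} kWh/day, "
      f"ground: {ground_kwh_per_day:.1f} kWh/day, ratio: {ratio:.1f}x")
```

Under these assumed capacity factors the ratio comes out near 5x; matching round-the-clock output on the ground would additionally require battery storage, which is the second cost term Musk cites.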

Beyond the energy advantage, Musk discussed the cumbersome approval processes for ground construction. He believes it is extremely difficult to expand power facilities on the ground, whereas space deployment sidesteps these administrative constraints. He expects the cost structure to invert completely within 30 to 36 months. On maintenance concerns, he said that once the hardware has been commissioned and tested on the ground, chip reliability is very high, so in-orbit maintenance will not be a core obstacle.

Turning to the operational pressures on today's data centers, Musk shared a detail from xAI's Memphis data center: cooling alone adds 40% to power consumption. He added that tariffs on solar equipment and limited domestic manufacturing capacity have constrained the expansion of ground-based power in the U.S. Finally, Musk said that to realize this vision, future TeraFab plants may need full in-house production, from logic chips to memory to packaging, to counter the sharp rise in memory chip prices.
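A 40% cooling overhead corresponds to a power usage effectiveness (PUE) of roughly 1.4. A minimal sketch of what that means for site-level draw, using a hypothetical 100 MW IT load not taken from the article:

```python
# If cooling adds 40% on top of the IT load, the site's power usage
# effectiveness (PUE) is about 1.4: total draw = IT load * PUE.
COOLING_OVERHEAD = 0.40          # the article's 40% figure
pue = 1.0 + COOLING_OVERHEAD

it_load_mw = 100.0               # hypothetical IT load, for illustration
total_draw_mw = it_load_mw * pue

print(f"PUE: {pue:.1f}; a {it_load_mw:.0f} MW IT load "
      f"draws {total_draw_mw:.0f} MW at the meter")
```

In other words, at this overhead the grid connection must be sized about 40% larger than the compute it actually powers, which is why cooling figures so heavily in Musk's cost argument.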

Key Points:

  • 🌌 Space Deployment Becomes the Most Cost-Effective: Musk predicts that within 36 months, deploying AI in space will be far cheaper than on the ground, thanks to stronger, uninterrupted sunlight in space and no need for battery storage.

  • ⚡ Global Power Bottleneck: Ground-based electricity growth cannot keep pace with chip production; by the end of 2026, large compute clusters may sit idle for lack of power.

  • 🏗️ Vertical Integration of the Supply Chain: Facing hurdles such as unobtainable gas turbines and high memory prices, Musk plans for his factories to produce logic and memory chips in-house and to handle packaging themselves.