What if the next generation of datacenters wasn’t built on Earth, but in orbit? The notion might feel like science fiction, yet it is drawing in today’s tech and space giants alike.
With the explosive demand for compute power driven by artificial intelligence and the mounting strains on terrestrial energy resources, the concept of space-based datacenters is gaining credibility.
The possibility of a merger between Elon Musk’s SpaceX and xAI illustrates the growing appetite for this approach. The promises are enticing—unlimited solar energy, natural cooling, a smaller carbon footprint—yet the challenges are just as formidable: launch costs, hardware reliability, maintenance that may be impossible to perform.
What exactly are we talking about?
Space-based AI datacenters would be computing infrastructures deployed in low Earth orbit or higher, combining servers, AI accelerators (GPUs, TPUs, ASICs) and expansive solar arrays. They would rely on hundreds of interconnected satellites to meet the massive compute needs of training and running inference on highly resource-intensive AI models.
Beyond the atmosphere, satellites would enjoy uninterrupted sun exposure and could dissipate heat directly into the vacuum of space, thereby removing two of the biggest challenges faced by terrestrial datacenters.
Several programs are currently shaping this still-emerging concept, signaling genuine industrial enthusiasm.
> Google and Project Suncatcher
Google is developing Project Suncatcher, an ambitious network of around 80 solar-powered satellites positioned about 400 km up, equipped with TPUs (tensor processing units) to run AI workloads. These satellites would be interconnected through optical links and would relay results back to Earth via high-bandwidth laser links. The first two prototypes are expected in 2027, in partnership with Planet Labs.
> The ASCEND Initiative
In Europe, the ASCEND project (Advanced Space Cloud for European Net zero emission and Data sovereignty), led by Thales Alenia Space and funded by the European Commission, concludes that orbital datacenters are feasible and could contribute to carbon neutrality goals and European digital sovereignty. It draws on a consortium blending environmental experts (including Carbone 4), cloud players (Orange Business, HPE, CloudFerro), launchers (ArianeGroup) and space agencies.
Thales Alenia Space is also testing Space Edge Computing on a smaller scale, deploying a hardened onboard computer carrying Microsoft Azure on the ISS to process Earth observation streams in orbit with AI applications like DeeperVision. This approach foreshadows hybrid architectures where part of the AI processing happens in orbit and the rest in terrestrial clouds.
> Starcloud and Nvidia: the “hypercluster” objective
Starcloud, backed by Nvidia and Google, took a major step last month by launching the Starcloud-1 satellite aboard a Falcon 9 rocket.
Equipped with an Nvidia H100 chip—the most powerful ever sent into orbit—it trains and runs Google’s Gemma model as a “proof of concept.” The company advocates for orbiting data centers powered around the clock by solar energy, with a promise to cut electricity costs by up to 90% and reduce CO2 emissions over the lifecycle by a factor of 10 compared with terrestrial datacenters, assuming optimized launches and robotic maintenance. It ultimately envisions a modular “hypercluster” delivering roughly five gigawatts of compute power.
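To put a five-gigawatt figure in perspective, a back-of-the-envelope estimate of the required solar array area can be derived from the solar constant. The panel efficiency and system overhead below are illustrative assumptions, not Starcloud figures:

```python
# Order-of-magnitude sketch: solar array area for a 5 GW orbital cluster.
# Efficiency and overhead values are assumptions for illustration only.

SOLAR_CONSTANT = 1361.0  # W/m^2, solar irradiance in Earth orbit

def array_area_km2(power_gw, panel_efficiency=0.22, system_overhead=1.3):
    """Array area (km^2) needed to deliver power_gw of electrical power.
    system_overhead covers conversion losses, degradation, and margin."""
    required_w = power_gw * 1e9 * system_overhead
    return required_w / (SOLAR_CONSTANT * panel_efficiency) / 1e6

print(f"{array_area_km2(5):.1f} km^2 of solar array")  # ~20 km^2
```

Even with generous assumptions, the answer lands in the tens of square kilometres, which is why the "hypercluster" vision presupposes both cheap heavy launch and robotic in-orbit assembly.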
The Japan-U.S. Alliance Against China
In Japan, Space Compass and Microsoft are exploring a network of optical relay satellites with edge computing capabilities to bring AI computing even closer to the orbital sensors and the Azure cloud.
China is not being left behind, having announced its intention to create a "space cloud" within the next five years. The China Aerospace Science and Technology Corporation has committed to building a gigawatt-class space-based digital intelligence infrastructure, in line with a five-year development plan.
Technological and architectural challenges
Putting an AI datacenter into orbit poses substantial technological hurdles.
> Launch and assembly
Modules must be designed to be rugged enough to withstand the violent vibrations of launch, then be assembled in orbit. An effort like EROSS IOD (European Robotic Orbital Support Services) aims to automate this task through European space robotics starting in 2026.
> Complex thermal management
While the vacuum of space eliminates convection, it paradoxically complicates heat rejection: with no air or water to carry heat away, dense AI workloads must shed their heat entirely through radiators, backed by finely tuned thermal engineering. Contrary to common belief, cooling in space is not automatic and requires sophisticated systems.
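The scale of the problem can be sketched with the Stefan-Boltzmann law, which governs purely radiative heat rejection. The heat load, radiator temperature, and emissivity below are illustrative assumptions, not figures from any of the projects mentioned:

```python
# Back-of-the-envelope radiator sizing via the Stefan-Boltzmann law.
# All parameter values are illustrative assumptions.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area(heat_load_w, radiator_temp_k, emissivity=0.9, sink_temp_k=3.0):
    """Minimum single-sided radiating area (m^2) to reject heat_load_w
    by radiation alone into a cold sink at sink_temp_k."""
    net_flux = emissivity * SIGMA * (radiator_temp_k**4 - sink_temp_k**4)
    return heat_load_w / net_flux

# Example: rejecting 1 MW of server heat with radiators held at 320 K.
print(f"{radiator_area(1e6, 320.0):.0f} m^2 of radiator needed")  # ~1900 m^2
```

Roughly half a watt per square metre of hardware heat translates into thousands of square metres of radiator per megawatt at server-friendly temperatures, which is why thermal design dominates these architectures.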
> Extreme hardware reliability
Servers and AI accelerators must be hardened against cosmic radiation and extreme thermal cycles, while remaining competitive in performance with terrestrial generations refreshed every 3 to 5 years. This is a major challenge in a sector where obsolescence is rapid.
> High-performance connectivity
Space datacenters rely on high-bandwidth optical links, both inter-satellite and ground-connected, to minimize latency and maximize throughput for distributed training and inference. Laser links are becoming essential to handle the colossal data volumes.
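For a sense of the physical floor on latency, the one-way propagation delay to a satellite scales with its altitude. A minimal sketch, using the ~400 km LEO altitude cited above and geostationary orbit for contrast (real paths add routing, processing, and slant-range overhead):

```python
# Straight-line, one-way propagation delay to satellites at different
# altitudes, ignoring slant range, routing, and processing overhead.

C = 299_792_458  # speed of light in vacuum, m/s

def one_way_delay_ms(altitude_km):
    """Idealized nadir propagation delay in milliseconds."""
    return altitude_km * 1_000 / C * 1_000

for name, alt in [("LEO (400 km)", 400), ("GEO (35 786 km)", 35_786)]:
    print(f"{name}: {one_way_delay_ms(alt):.2f} ms")
```

At LEO altitudes the propagation delay is on the order of a millisecond, comparable to intra-region terrestrial networking, which is what makes low orbits attractive for latency-sensitive inference.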
Economic and timing challenges
Despite the enthusiasm, space industry experts remain cautious. Several major obstacles loom on the path to this futuristic vision:
- Space debris represents a constant threat to any orbital equipment
- Launch costs remain substantial despite recent advances
- Maintenance is extremely limited once satellites are in orbit
- The pace of technological refresh raises questions in an environment where physical access is impossible
According to Deutsche Bank analysts, the first deployments of small orbital data centers are expected between 2027 and 2028. These pioneering missions will validate the technology and assess profitability. Larger constellations, potentially comprising hundreds or thousands of units, would emerge only in the 2030s, and only if these early experiments prove successful.
The business model rests on three pillars: rapid reductions in launch costs, maturity of orbital robotics, and the densification of AI chips. If these assumptions prove correct, orbital AI computing could become, in the medium term, competitive with or even more cost-effective than the endless expansion of ground-based datacenters in energy- and water-constrained regions.
Energy and environmental considerations: a mixed picture
AI datacenters today contribute to the growth of global electricity consumption, fueling concerns about grid saturation and mounting pressure on land, water, and renewable energy. In orbit, the combination of near-continuous solar flux (interrupted only by eclipses) and solar panels that yield more per square metre than on Earth opens a new avenue for energy optimization.
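That energy advantage can be roughly quantified by comparing the annual yield of a square metre of panel in a near-continuously lit orbit against a good terrestrial site. Apart from the solar constant, the figures below are generic illustrative assumptions, not project data:

```python
# Rough comparison of annual energy yield per square metre of solar panel
# in orbit versus on the ground. Availability and efficiency values are
# illustrative assumptions.

SOLAR_CONSTANT = 1361.0  # W/m^2 above the atmosphere
HOURS_PER_YEAR = 8766

def annual_yield_kwh(irradiance_w_m2, availability, efficiency=0.20):
    """kWh per m^2 per year for a panel of the given efficiency."""
    return irradiance_w_m2 * availability * efficiency * HOURS_PER_YEAR / 1000

orbit = annual_yield_kwh(SOLAR_CONSTANT, availability=0.99)  # near-continuous sun
ground = annual_yield_kwh(1000.0, availability=0.20)         # ~20% capacity factor

print(f"orbit:  {orbit:.0f} kWh/m^2/yr")
print(f"ground: {ground:.0f} kWh/m^2/yr")
print(f"ratio:  {orbit / ground:.1f}x")
```

Under these assumptions a square metre in orbit yields several times more energy per year than the same panel on the ground, which is the core of the economic argument, before launch costs are counted.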
According to ASCEND project proponents, despite the initial carbon footprint of launches, a space datacenter could, over its full lifecycle, show a carbon balance better than a terrestrial counterpart if certain power and lifetime thresholds are met. Players like Starcloud point to impressive figures: up to a 90% reduction in electricity costs, and a tenfold decrease in CO2 emissions over the lifecycle, assuming optimized launches and robotic maintenance.
However, the reality is nuanced. Each rocket launch emits hundreds of tonnes of CO2 and other compounds into the atmosphere, shifting part of the problem to the space sector and raising questions about how quickly such infrastructure can sustainably be deployed in orbit. Added to this are other worrying issues:
- Light pollution caused by satellite constellations, already criticized by astronomers
- Growing congestion in low Earth orbit, a source of collision risk
- The cumulative impact of thousands of launches on the atmosphere
The environmental debate remains open: do operational benefits truly offset the impacts of launch and deployment phases?
The Musk and Bezos ambition
For Elon Musk, the timing seems ideal. SpaceX is the most capable rocket builder in history and has already placed thousands of satellites into orbit as part of its Starlink internet service. This existing infrastructure could serve as a foundation for AI-capable satellites or facilitate the deployment of onboard computing capabilities.
At the Davos World Economic Forum earlier this month, he did not hide his optimism: “It’s clear we need to build solar-powered data centers in space… the cheapest place to deploy AI will be space, and this will be true in two years, three at most.”
SpaceX is reportedly considering an initial public offering this year, which could value the rocket-and-satellite enterprise at over $1 trillion. A portion of the funds raised would be used to finance the development of satellites for AI-centered data centers.
Meanwhile, Blue Origin and Jeff Bezos are pursuing their own space datacenter technology, drawing on Amazon’s expertise. The founder envisions that “giant, multi-gigawatt” data centers in orbit could be more affordable than their terrestrial counterparts within 10 to 20 years.