Orbital Data Centers Will "Bypass Earth-Based" Constraints

www.zerohedge.com

Last week, readers were briefed on the emerging theme of data centers in low Earth orbit, a concept now openly discussed by Elon Musk, Jensen Huang, Jeff Bezos, and Sam Altman. Energy availability and infrastructure constraints on land are increasingly emerging as major bottlenecks to data center buildouts through the end of this decade and well into the 2030s.

Nvidia-backed startup Starcloud has released a white paper outlining the case for operating a constellation of artificial intelligence data centers in space as a practical solution to Earth's looming power crunch, cooling woes, and land and permitting constraints.

Terrestrial data center projects will reach capacity limits as AI workloads scale to multi-gigawatt levels, while electricity demand and grid bottlenecks worsen over the next several years. Orbital data centers aim to bypass these constraints by using near-continuous, high-intensity solar power, passive radiative cooling to deep space, and modular designs, launched on SpaceX rockets, that can scale quickly.
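To give a sense of what "passive radiative cooling to deep space" implies in practice, here is a back-of-envelope sketch using the Stefan-Boltzmann law. The heat load, radiator temperature, and emissivity below are illustrative assumptions, not figures from Starcloud's white paper, and the model ignores solar and albedo loading on the radiator.

```python
# Rough estimate of radiator area needed to passively reject waste heat
# in orbit via thermal radiation (Stefan-Boltzmann law).
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area(heat_w, radiator_k, emissivity=0.9, sink_k=3.0):
    """Area (m^2) needed to radiate heat_w watts from a radiator at
    radiator_k kelvin into a deep-space sink near sink_k kelvin.
    Idealized: one-sided radiator, no incident solar/albedo flux."""
    net_flux = emissivity * SIGMA * (radiator_k**4 - sink_k**4)  # W/m^2
    return heat_w / net_flux

# Assumed example: 1 MW of GPU waste heat, radiator running at 300 K
area = radiator_area(1e6, 300.0)
print(f"~{area:,.0f} m^2 of radiator for 1 MW at 300 K")
```

At these assumed values the answer is on the order of a few thousand square meters per megawatt, which illustrates why the white paper pairs radiative cooling with large, modular deployable structures.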

"Orbital data centers can leverage lower cooling costs using passive radiative cooling in space to directly achieve low coolant temperatures. Perhaps most importantly, they can be scaled almost indefinitely without the physical or permitting constraints faced on Earth, using modularity to deploy them rapidly," Starcloud wrote in the report.

Starcloud continued, "With new, reusable, cost-effective heavy-lift launch vehicles set to enter service, combined with the proliferation of in-orbit networking, the timing for this opportunity is ideal."

Already, the startup has launched its Starcloud-1 satellite carrying an Nvidia H100 GPU, the most powerful compute chip ever sent into space. Using the H100, Starcloud successfully trained NanoGPT, a lightweight language model, on the complete works of Shakespeare, making it the first AI model trained in space.
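For readers unfamiliar with what "training a lightweight language model on Shakespeare" means, the toy below trains a character-bigram model on a short Shakespeare snippet and samples from it. This is purely illustrative: Starcloud's NanoGPT run used a transformer on the H100, not this counting scheme, and the corpus here is a stand-in.

```python
# Toy character-level "language model": count bigram frequencies in a
# text, then sample new text from those frequencies. Illustrative only.
from collections import defaultdict
import random

def train_bigram(text):
    """Count, for each character, how often each next character follows."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, start, n=40, seed=0):
    """Sample n characters, each drawn from the bigram distribution."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        nxt = counts.get(out[-1])
        if not nxt:
            break
        chars, weights = zip(*nxt.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)

corpus = "to be or not to be that is the question"  # stand-in corpus
model = train_bigram(corpus)
print(generate(model, "t"))
```

A real NanoGPT run replaces the bigram table with a small transformer and gradient descent, but the workflow is the same: fit a next-character distribution to a corpus, then sample from it.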

Starcloud is also running Google's open-source LLM Gemma in orbit, representing the first time a high-powered Nvidia GPU has been used to operate a large language model in space.

One solution for keeping pace with rapid advances in AI and the ever-increasing demand for power, at least before nuclear power generation ramps up, is to shift some of these data centers to low Earth orbit. That shift would itself spark a space-race-flavored investment theme, which is why SpaceX is planning to go public next year at a valuation of $800 billion. Starlink will likely provide the connectivity for these space-based data centers.

*  *  *

Read the full report: 
