
Computing power race heats up as Elon Musk hints at 100k H100 training cluster


Leave it to Elon to casually drop a bombshell about harnessing computing power at a scale that makes even today's biggest supercomputers look puny in comparison.

Elon's xAI startup is arming itself to the teeth with cash. xAI just secured a stupidly huge $6 billion Series B investment to pour gasoline on its bold vision of dethroning heavyweights like OpenAI. xAI's grandiose ambitions start with rapidly improving and commercializing its promising Grok AI model, which has already spawned more capable iterations like Grok-1.5V, with enhanced context and multimodal skills.


In response to a question about which company might build the world’s first data center packing a million H100 GPUs (Nvidia’s monster AI accelerator chips), Elon dropped a tantalizing teaser:

“Given the pace of technology improvement, it’s not worth sinking 1GW of power into H100s. The xAI 100k H100 liquid-cooled training cluster will be online in a few months.”
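To get a feel for what that "1GW of power into H100s" remark implies, here's a rough back-of-envelope sketch. The ~700 W figure is the commonly cited TDP of an H100 SXM module; real clusters draw substantially more once CPUs, networking, and cooling are counted, so treat these as GPU-only lower bounds:

```python
# Back-of-envelope power math for H100 clusters.
# Assumption: ~700 W per H100 SXM (GPU TDP only; ignores CPU,
# networking, and cooling overhead, which add substantially).
H100_WATTS = 700

# How many H100s could a full 1 GW power budget feed, GPUs alone?
gpus_per_gigawatt = 1_000_000_000 // H100_WATTS
print(f"~{gpus_per_gigawatt:,} H100s per GW")  # ~1,428,571

# GPU-only draw of the 100k-H100 cluster Musk describes:
cluster_megawatts = 100_000 * H100_WATTS / 1_000_000
print(f"100k H100s = {cluster_megawatts:.0f} MW (GPUs alone)")  # 70 MW
```

In other words, a hypothetical million-H100 data center really would sit in gigawatt territory, while the 100k cluster lands in the tens of megawatts, which helps explain Musk's framing.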

Yep, you read that right. According to Elon, xAI is gearing up to bring its own ridiculously powerful 100k-H100 GPU cluster online within the next few months.

Just let that jaw-dropping number marinate for a second – 100,000 cutting-edge accelerators is roughly ten times what Europe's fastest supercomputer, LUMI, fields, and LUMI runs on AMD GPUs rather than H100s.
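To put that scale in raw compute terms, a hedged sketch: assuming roughly 1 PFLOP/s of dense FP8 throughput per H100 (Nvidia's datasheet peak is just under that; sustained training throughput is considerably lower), the aggregate peak of a 100k-GPU cluster works out as follows:

```python
# Rough aggregate peak throughput of a 100k-H100 cluster.
# Assumption: ~1e15 FLOP/s (1 PFLOP/s) dense FP8 per H100 --
# a datasheet-peak figure, not sustained training throughput.
FP8_FLOPS_PER_H100 = 1e15

cluster_flops = 100_000 * FP8_FLOPS_PER_H100
print(f"Aggregate peak: {cluster_flops:.0e} FLOP/s")  # 1e+20, i.e. 100 exaFLOP/s
```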

A Sneak Peek at the Next AI Revolution?

Elon went on to tease that the 100k H100 training behemoth is just a precursor to an even bigger system coming “next summer”: around 300k B200 GPUs linked with Nvidia’s ultra-fast CX8 (ConnectX-8) networking.

Assuming the B200 in question is Nvidia’s Blackwell GPU, architected specifically for AI workloads like large language models, this points to xAI potentially preparing the launchpad for revolutionary new AI breakthroughs.

The sheer scale of these computing clusters practically beggars belief. As Elon noted, it’ll likely take that unfathomable level of processing might to drive the “next big step” in AI progress.

While he didn’t outright specify what groundbreaking AI advancements he has in mind, any new systems from xAI leveraging that monstrous 300k B200 muscle could blow past the current barriers of today’s most cutting-edge language models.

An AI Computing Arms Race Heats Up

Of course, knowing Elon Musk, this grandstanding teaser campaign could be partially motivated by trying to rally competitors to up their own AI game and investments even further.

No matter who wins out, it’s becoming abundantly clear that staying on the bleeding edge of large language models, computer vision, robotics, and other transformative AI domains will require computing horsepower once thought unimaginable.

So while Musk’s 100k H100 teaser feels equal parts tantalizing and outrageous, it could very well represent the new battleground where the next-gen AI revolution will be won.