Musk's Quest for the Perfect AI: 100,000 GPUs and $3 Billion for Grok 3

Jul 08, 2024
By NextMind

Elon Musk, the founder of xAI, has announced ambitious plans to develop the next generation of the company's language model, Grok.

The company plans to use 100,000 NVIDIA H100 GPUs for training Grok 3.

These resources far exceed those used by competitors: OpenAI's GPT-4, for example, is rumored to have been trained on 40,000 NVIDIA A100 GPUs, a chip now considered outdated compared to the H100.
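As a rough sense of scale, the gap can be sketched with a back-of-envelope calculation. The GPU counts below are the rumored figures cited above, and the per-GPU throughput numbers are NVIDIA's published peak dense BF16 specs, which real training runs never fully reach:

```python
# Back-of-envelope comparison of aggregate peak compute.
# GPU counts are the rumored figures from the article; per-GPU
# throughput uses NVIDIA's published peak dense BF16 numbers.

H100_BF16_TFLOPS = 989   # H100 SXM, peak dense BF16
A100_BF16_TFLOPS = 312   # A100, peak dense BF16

grok3_cluster = 100_000 * H100_BF16_TFLOPS  # xAI's planned H100 cluster
gpt4_cluster = 40_000 * A100_BF16_TFLOPS    # rumored GPT-4 A100 cluster

# 1 exaFLOPS = 1e6 TFLOPS
print(f"Grok 3 cluster: {grok3_cluster / 1e6:.1f} exaFLOPS peak")
print(f"GPT-4 cluster:  {gpt4_cluster / 1e6:.1f} exaFLOPS peak")
print(f"Ratio: ~{grok3_cluster / gpt4_cluster:.0f}x")
```

On paper, that is roughly an eightfold jump in peak cluster compute, before accounting for the H100's other architectural advantages.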

Musk recently tweeted that Grok 3 will be "significantly larger" than previous versions. He emphasized that training language models on internet data requires enormous computational resources.

Grok is being developed as an AI assistant for premium users of the platform X (formerly Twitter). xAI plans to release an intermediate version, Grok 2, in August, and then move on to developing Grok 3. Estimates suggest that training Grok 3 could cost around $3 billion. Musk also plans to acquire NVIDIA Blackwell B200 accelerators for around $9 billion.
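The $3 billion figure is plausible even as hardware cost alone. Assuming a commonly cited street price of roughly $30,000 per H100 (an assumption; NVIDIA's actual volume pricing is not public), 100,000 units land squarely in that range:

```python
# Sanity check on the ~$3 billion estimate, using an ASSUMED street
# price of ~$30,000 per H100 (actual volume pricing is not public).

GPU_COUNT = 100_000
ASSUMED_H100_PRICE_USD = 30_000

hardware_cost = GPU_COUNT * ASSUMED_H100_PRICE_USD
print(f"Hardware alone: ${hardware_cost / 1e9:.1f}B")  # ~$3.0B

# The full training bill would be higher still, once networking,
# power, cooling, and data-center construction are included.
```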

Experts believe that investment on this scale could make xAI a leader in the AI market. While competitors improve their models incrementally, xAI is betting on a qualitative leap driven by colossal computational resources.
