The 750,000-square-foot AI supercomputer in Memphis houses 100,000 Nvidia GPUs, and Super Micro says it is the largest data ...
This came after a post on X featured a meme of US President Donald Trump standing in front of a tanker with the words ...
Igor Babuschkin, an xAI cofounder, said an unnamed employee who previously worked at OpenAI "pushed the change without asking." ...
Grok 3 was trained on 200,000 Nvidia H100 GPUs, double the number used for Grok 2. The team said it took 92 days to expand its Memphis-based supercomputer, dubbed Colossus, to accommodate training for ...
Despite being introduced as an early-stage product, Grok demonstrated rapid iteration. The model was developed in just two months, and its initial beta phase laid the foundation for significant improvements, supported ...
AI is getting better at coding, data analysis, and image generation, but it's also improving in some unexpected areas.
The team said the new model is "an order of magnitude more capable" than Grok 2, indicating Grok 3 has 10 to 15 times more computing power behind it than Grok 2. They also claim that Grok 3 is more powerful than its AI ...
Industry experts have shared strongly positive opinions of Grok 3, the latest model of Elon Musk's AI chatbot, which he previously touted as the "smartest AI on Earth." ...
Musk's xAI claims its chatbot outperforms other AI models, though those assertions have not been independently verified.
Some tech influencers are sounding the alarm over Grok—the artificial intelligence chatbot developed by Elon Musk’s xAI—after social media users were easily able to gain potentially ...