‘Everyone and Their Dog is Buying GPUs,’ Musk Says as AI Startup Details Emerge

When Elon Musk was asked to confirm whether he was expanding Twitter’s computing power to develop a generative artificial intelligence project, he said that both Tesla and Twitter had purchased a large number of GPUs. Meanwhile, a Financial Times report says Musk’s AI venture will be a separate entity from his other companies, but will likely use Twitter content for training.
Musk’s AI project, which he reportedly began investigating earlier this year, would use Twitter content as data to train language models and could take advantage of Tesla’s computing resources, according to the Financial Times. This somewhat contradicts previous reports that claimed the AI project would become part of Twitter.
To build the new project, Musk is recruiting engineers from top AI companies and has already hired Igor Babuschkin and about six other AI specialists from DeepMind.
Musk is also reportedly in talks with various SpaceX and Tesla investors about potentially funding his latest AI effort, according to individuals with direct knowledge of the discussions, which would suggest that the project is not intended to be part of Twitter.
In a recent Twitter Spaces interview, Musk was asked about a report claiming Twitter has procured around 10,000 Nvidia compute GPUs. Musk conceded that everyone, including Tesla and Twitter, is buying GPUs for computing and AI these days. This is evident in both Microsoft and Oracle acquiring tens of thousands of Nvidia’s A100 and H100 GPUs in recent quarters for AI and cloud services.
“At this point, everyone and their dogs seem to be buying GPUs,” Musk said. “Twitter and Tesla are certainly buying GPUs.”
Nvidia’s latest H100 GPU for AI and high-performance computing (HPC) is very expensive. CDW lists Nvidia’s H100 PCIe card with 80 GB of HBM2e memory at $30,603 per unit, while sellers on eBay charge over $40,000 per unit if you want one right away.
Nvidia recently announced an even more powerful product, the H100 NVL, which bridges two H100 PCIe cards, each with 96GB of HBM3 memory, into a dual-GPU 188GB solution designed specifically for training large language models. While each unit certainly costs well over $30,000, it is unclear at what price Nvidia would sell such units to customers buying tens of thousands of boards for LLM projects.
Meanwhile, the exact position of the AI team in Musk’s corporate empire remains unknown. The prominent entrepreneur founded a company called X.AI on March 9, the Financial Times reported, citing Nevada business records. He also recently changed Twitter’s name in company records to X Corp., which could be part of a plan to build an “everything app” under the “X” brand. Musk is now the sole director of X.AI, with Jared Birchall, who happens to manage Musk’s assets, listed as secretary.
The rapid progress of ChatGPT, developed by OpenAI (which Elon Musk co-founded in 2015 but is no longer involved with), reportedly prompted him to explore the idea of a rival company. Meanwhile, his new AI venture is expected to be a separate entity from his other companies, presumably to ensure that the project is not confined to the framework of Tesla or Twitter.