Musk addresses the 'sad reality' of AI training


An X user suggested that large language models are being trained on one another's output, and Elon Musk admitted this is a sad reality.

On July 1, user Beff-e/acc commented that models training on each other's data would cause a "human centipede" effect - a phrase derived from a horror movie in which a giant centipede is created by connecting many people together.

In response, Elon Musk said that changing this reality would take a lot of effort, since it means separating the training of large language models (LLMs) from data scraped off the Internet.

"Grok 2, which will be launched in August, will be a major improvement in this regard," the billionaire revealed.

Billionaire Elon Musk in Paris, France in June 2023. Photo: Reuters

Grok is a large language model developed by Musk's company xAI that leverages the huge data source of the social network X; the current version is Grok 1.5. A few minutes later, Musk mentioned the next version: "Grok 3 will be released at the end of the year. After being trained with 100,000 H100s, it will definitely be something special."

This isn’t the first time the South African-born billionaire has mentioned 100,000 H100 GPUs. Back in May, The Information reported that Musk had spoken to investors about the number of specialized graphics cards his startup xAI needed to connect into a supercomputer to train the next version of its Grok chatbot.

The specific figure suggests the plan may be close to reality. According to Insider, Musk's remarks also hint at the cost of such an LLM project: with H100 GPUs currently averaging around $30,000-$40,000 each on the market (less if purchased in bulk), the chips alone could cost $3-4 billion, not counting other expenses.
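The back-of-the-envelope arithmetic behind that estimate can be sketched as follows; the per-unit price range is the one cited above, and the variable names are illustrative only:

```python
# Rough cost estimate for the reported 100,000-GPU cluster,
# using the $30k-$40k per-H100 price range mentioned in the article.
NUM_GPUS = 100_000
PRICE_LOW = 30_000   # USD per H100, low end of cited range
PRICE_HIGH = 40_000  # USD per H100, high end of cited range

low = NUM_GPUS * PRICE_LOW    # lower bound of chip cost
high = NUM_GPUS * PRICE_HIGH  # upper bound of chip cost

print(f"Estimated GPU cost: ${low / 1e9:.1f}B - ${high / 1e9:.1f}B")
```

Running this prints an estimate of $3.0B to $4.0B, matching the article's figure, and excludes networking, power, cooling, and data-center construction.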

But that’s not the biggest number yet. In January, Meta co-founder Mark Zuckerberg said he would buy about 350,000 Nvidia H100 GPUs by the end of 2024, bringing the company's total to 600,000 chips, including products from companies other than Nvidia.

As the AI race intensifies, the company with the most dedicated GPUs will gain the upper hand. From startups to the world’s largest technology corporations, everyone is actively buying AI chips. Currently, Nvidia GPUs are the most ordered, while names like AMD have also begun to launch similar products.

According to internal documents obtained by Insider, Microsoft plans to triple its existing GPU count. By the end of the year, the company aims to have 1.8 million AI chips, most of them made by Nvidia, though it may also buy more from other partners. Meanwhile, Meta claims it is “ready to build an AI training system that is likely to be larger than any single company.” Last year, Musk ordered 10,000 H100 chips for xAI. Chinese companies are also looking to buy Nvidia’s high-end chips, while developing their own dedicated chips to avoid being left behind in the AI race.



Website of Vietnam Union of Science and Technology Associations