What's Next?
Separately, the billionaire, who also co-founded OpenAI, said that a shortage of advanced chips was hampering the training of Grok's version 2 model, a constraint that also figures in the debate over AI regulation.
Musk said that training the Grok 2 model required around 20,000 Nvidia H100 graphics processing units (GPUs), adding that the Grok 3 model and later versions will require 100,000 of the chips.
According to a recent study conducted by researchers from the University of Cambridge, Harvard and Oxford, together with specialists from OpenAI, the creator of ChatGPT, one way to regulate this technology is to control the chips that power it.
The study highlights that although the technology's complexity has kept governments from establishing rules that apply uniformly across all territories, the hardware needed to make it work has characteristics that regulators can target to control its progress.
“AI-relevant computing is detectable, excludable, and quantifiable, and occurs through an extremely concentrated supply chain,” while large language models “are intangible, non-rival, and easily shareable goods, making them intrinsically difficult to control,” the document reads.
According to the specialists, the components needed to run AI are manufactured by a very small group of companies, which makes them a practical point at which governments can impose restrictive measures.