The constant buzz around OpenAI’s ChatGPT refuses to die down. With Microsoft now using the same technology to power its new Bing Chat, it’s safe to say that ChatGPT can continue this upward trend for a long time. That’s good news for OpenAI and Microsoft, but they’re not the only two companies to benefit.
According to a new report, sales of Nvidia’s data center graphics cards may be on the rise. With the commercialization of ChatGPT, OpenAI may need as many as 10,000 new GPUs to support the growing model – and Nvidia seems to be the most likely supplier.
The research company TrendForce shared some interesting predictions today, and the most interesting part concerns the future of ChatGPT. According to TrendForce, the GPT model that powers ChatGPT will soon require a major hardware upgrade to maximize growth.
“The number of training parameters used in the development of this autoregressive language model has increased from around 120 million in 2018 to around 180 billion in 2020,” TrendForce said in its report. Although it did not share any estimates for 2023, it is safe to assume that these numbers will only continue to increase as far as technology and budget allow.
The company claims that the GPT model required a whopping 20,000 graphics cards to process training data in 2020. As the model continues to expand toward commercialization, that number is expected to rise to more than 30,000. This could be good news for Nvidia.
These calculations are based on the assumption that OpenAI will use Nvidia’s A100 GPUs to power the language model. These ultrapowerful graphics cards are very expensive – in the ballpark of $10,000 to $15,000 each. They’re also not Nvidia’s top data center cards right now, so it’s possible that OpenAI will go for the newer H100 cards, which should offer up to three times the performance of the A100. These GPUs have skyrocketed in price, with a single card costing around $30,000 or more.
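To put those per-card prices in perspective, here is a rough back-of-the-envelope estimate of what a 10,000-GPU order might cost. The card counts and unit prices are the figures quoted above; the totals are illustrative, not reported numbers.

```python
def total_cost(num_gpus: int, price_per_gpu: int) -> int:
    """Total hardware spend for a given card count and unit price."""
    return num_gpus * price_per_gpu

NUM_GPUS = 10_000  # the order of magnitude OpenAI reportedly may need

# A100 cards at the quoted $10,000-$15,000 range
a100_low = total_cost(NUM_GPUS, 10_000)
a100_high = total_cost(NUM_GPUS, 15_000)

# H100 cards at roughly $30,000 each
h100 = total_cost(NUM_GPUS, 30_000)

print(f"A100 estimate: ${a100_low:,} to ${a100_high:,}")
print(f"H100 estimate: ${h100:,}")
```

Even at the low end, that works out to around $100 million in hardware, and roughly triple that if OpenAI were to opt for H100s instead.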
The data center GPU market does not only consist of Nvidia – Intel and AMD also sell AI accelerators. However, Nvidia has always been seen as the go-to solution for AI-related tasks, so it’s possible that it could score a lucrative deal if and when OpenAI decides to scale up.
Should gamers be concerned if Nvidia does, in fact, end up supplying 10,000 GPUs to power ChatGPT? It depends. The graphics cards OpenAI needs have nothing to do with Nvidia’s best GPUs for gamers, so we’re safe there. However, if Nvidia shifts some production capacity toward data center GPUs, we could see a limited supply of consumer graphics cards down the line. Even then, the effect wouldn’t be immediate – even if the 10,000-GPU prediction checks out, Nvidia wouldn’t have to ship all of those cards right away.