China's DeepSeek chatbot reportedly gets much more done with fewer GPUs but Nvidia still thinks it's 'excellent' news

An illustration photo shows the logos of DeepSeek and ChatGPT in Suqian, Jiangsu province, China, January 27, 2025. (Image credit: CFOTO/Future Publishing via Getty Images)

China's new DeepSeek R1 language model has been shaking things up by reportedly matching or even beating the performance of established rivals including OpenAI while using far fewer GPUs. Nvidia's response? R1 is "excellent" news that proves the need for even more of its AI-accelerating chips.

If you're thinking the math doesn't immediately add up, the stock market agrees, what with roughly $600 billion being wiped off Nvidia's market value this week.

Needless to say, Nvidia doesn't see it that way, lauding R1 for demonstrating how the so-called test-time scaling technique, which essentially means spending more compute while a model generates its answers rather than during training, can help create more powerful AI models. "DeepSeek is an excellent AI advancement and a perfect example of Test Time Scaling," the company told CNBC. "DeepSeek's work illustrates how new models can be created using that technique, leveraging widely-available models and compute that is fully export control compliant."

Nvidia Hopper GPU die

The H100 is Nvidia's cash cow. Did DeepSeek secretly manage to get hold of it? (Image credit: Nvidia)

The obvious worry for Nvidia is the sheer scale of current AI spending. Microsoft, for instance, expects to spend $80 billion on AI hardware this year, while Meta says it will shell out $60 to $65 billion. DeepSeek's R1 model seems to imply that similar results can be achieved with an order of magnitude less investment.


The question is, does that automatically mean fewer GPUs will be bought? Perhaps not. At the kind of investment levels demonstrated by the likes of Microsoft and Meta, the number of organisations that can get involved is necessarily limited. It's just too expensive.

Cut that to a tenth or less and suddenly the number of potential participants might explode. And they'll all want GPUs. It's a little like the move from mainframes to personal computing. Sure, each individual investment in computing was a lot smaller, but the computing business overall grew far larger.

So, maybe DeepSeek is an inflection point. From here on, AI development is more accessible, less dominated by a small number of hugely wealthy entities, and maybe even a bit more democratic.

Or maybe we'll find out that DeepSeek has used a mountain of H100s after all. Either way, Nvidia will be planning to make its own mountain—of cash.

Jeremy Laird
Hardware writer

Jeremy has been writing about technology and PCs since the 90nm Netburst era (Google it!) and enjoys nothing more than a serious dissertation on the finer points of monitor input lag and overshoot followed by a forensic examination of advanced lithography. Or maybe he just likes machines that go “ping!” He also has a thing for tennis and cars.