'It is basically DLSS. That’s the way graphics ought to be': Nvidia's Jensen Huang has a clear vision for the future of its gaming GPUs, and it's going to be all about neural rendering
I have questions, Nvidia. Many questions.
Perhaps to the surprise of no one, Nvidia's time at CES 2026 was all about one thing: AI. That said, PC gaming wasn't entirely ignored, as DLSS 4.5 was ninja-launched with the promise of '4K 240 Hz path traced gaming'. However, DLSS is still AI-based, and in a Q&A session with members of the press, CEO Jensen Huang made it clear that artificial intelligence isn't just for improving performance; it's how graphics needs to be done in the future.
This much we already know, as Nvidia banged its neural rendering drum starting at last year's CES and then throughout 2025, and it wasn't the only graphics company to do so. Microsoft announced the addition of cooperative vectors to Direct3D, which is pretty much required to implement neural rendering in games, and AMD's FSR Redstone is as AI-based as anything from Intel and Nvidia.
So, when PC World's Adam Patrick Murray asked Huang, "Is the RTX 5090 the fastest GPU that gamers will ever see in traditional rasterization? And what does an AI gaming GPU look like in the future?", it wasn't surprising that Nvidia's co-founder avoided the first question entirely and skipped straight to the topic of AI.
"I think that the answer is hard to predict. Maybe another way of saying it is that the future is neural rendering. It is basically DLSS. That’s the way graphics ought to be."
He then expanded with some examples of what he meant by this: "I would expect that the ability for us to generate imagery of almost any style from photo realism, extreme photo realism, basically a photograph interacting with you at 500 frames a second, all the way to cartoon shading, if you like."
The keyword here is generate. If one wishes to be pedantic, all graphics are generated, either through rasterization or neural networks. It's all just a massive heap of mathematics, broken down into logic operations on GPUs, crunching through endless streams of binary values. But there is one important difference with neural rendering: it requires far less input data to produce the same graphical output as rasterization.
Fire up the original Crysis from 2007, and all those beautiful visuals are generated from lists of vertices, piles of texture maps, and a veritable mountain of resources that are created during the process of rendering (e.g. depth buffers, G-buffers, render targets, and so on). That's still the case almost 20 years on, and the size and quantity of those resources are now truly massive.
As DLSS Super Resolution proves, though, they don't need to be in the era of AI graphics. Nvidia's upscaling system works by reducing the frame resolution for rendering, scaling the frame back up once finished, and then applying a neural network to clean up the artefacts. One idea behind neural rendering is to take that a step further: use lower resolution assets in the first place and generate higher quality detail as and when it's required.
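To make that 'render low, upscale, clean up' idea a little more concrete, here's a minimal, purely illustrative sketch. It is not Nvidia's actual DLSS code; the function names are made up and the 'neural cleanup' stage is a simple blur standing in for what, in a real system, would be a trained network.

```python
import numpy as np

def render_low_res(height, width):
    """Stand-in for rendering a frame at a reduced internal resolution."""
    return np.random.rand(height, width, 3)  # toy placeholder for a rendered frame

def upscale(frame, scale):
    """Naive nearest-neighbour upscale to the display resolution."""
    return frame.repeat(scale, axis=0).repeat(scale, axis=1)

def neural_cleanup(frame):
    """Hypothetical stand-in for the network pass that removes upscaling
    artefacts. Here it's just a 3x3 box blur; DLSS uses a trained model."""
    padded = np.pad(frame, ((1, 1), (1, 1), (0, 0)), mode="edge")
    h, w = frame.shape[0], frame.shape[1]
    return sum(padded[dy:dy + h, dx:dx + w]
               for dy in range(3) for dx in range(3)) / 9.0

# Render at half the display resolution, then upscale and clean up for output.
low_res_frame = render_low_res(540, 960)
display_frame = neural_cleanup(upscale(low_res_frame, 2))
print(display_frame.shape)  # (1080, 1920, 3)
```

The point of the sketch is simply that the expensive part, the rendering itself, happens at a quarter of the display's pixel count, and everything after it is post-processing on the finished frame.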
Does it ultimately matter how a game's graphics are produced as long as they look absolutely fine and run smoothly? I dare say most people will say 'no', but we don't have any games right now that use neural rendering for any part of the graphics pipeline, other than upscaling and/or frame generation. Everything else is still rasterization (even if ray tracing is used, raster is still there behind the scenes).
That means GeForce GPUs of the future, both near and far, will still need to progress in rasterization to ensure games of tomorrow look and run as intended. But with Nvidia being dead set on neural rendering (I don't think Huang said "That’s the way graphics ought to be" lightly), have RTX graphics cards reached a plateau in that respect?
Does the company now expect that all generational performance increments will come from better DLSS? Will GPUs of the future be nothing more than ASICs for AI? How would such chips process older graphics routines? Is PC gaming heading backwards in time to the era when you needed a new GPU for every major new game, because previous chips didn't support the tech inside?
Answers that generate more questions than they resolve certainly aren't a bad thing, but in this case, I wish Nvidia would give us a much clearer picture as to its roadmap for gaming GPUs and how it plans to support games of the past, present, and future.



