'It is basically DLSS. That’s the way graphics ought to be': Nvidia's Jensen Huang has a clear vision for the future of its gaming GPUs, and it's going to be all about neural rendering

Ray Reconstruction in Cyberpunk 2077 Update 2.0
(Image credit: CDPR)

Perhaps to the surprise of no one, Nvidia's time at CES 2026 was all about one thing: AI. That said, PC gaming wasn't entirely ignored, as DLSS 4.5 was ninja-launched with the promise of '4K 240 Hz path traced gaming'. However, DLSS is still AI-based, and in a Q&A session with members of the press, CEO Jensen Huang made it clear that artificial intelligence isn't just for improving performance; it's how graphics needs to be done in the future.

This much we already know, as Nvidia banged its neural rendering drum starting at last year's CES and then throughout 2025, and it wasn't the only graphics company to do so. Microsoft announced the addition of cooperative vectors to Direct3D, which are pretty much a requirement for implementing neural rendering in games, and AMD's FSR Redstone is as AI-based as anything from Intel and Nvidia.
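To make that cooperative vectors point a little more concrete: the feature exists to accelerate the small matrix-and-vector maths that neural networks are built from, right inside shaders. The sketch below is a plain Python/NumPy toy of that kind of workload, a tiny per-pixel MLP, with entirely made-up layer sizes and weights; it is not the Direct3D or HLSL API itself.

```python
# Conceptual sketch only: a tiny per-pixel MLP of the sort that cooperative
# vectors are designed to accelerate inside a shader. Plain NumPy is used for
# illustration, and the layer sizes and weights are made-up assumptions.
import numpy as np

rng = np.random.default_rng(0)

# A small two-layer network: 16 input features per pixel -> 32 hidden -> 3 (RGB)
W1, b1 = rng.standard_normal((32, 16)) * 0.1, np.zeros(32)
W2, b2 = rng.standard_normal((3, 32)) * 0.1, np.zeros(3)

def shade_pixel(features: np.ndarray) -> np.ndarray:
    """Evaluate the toy MLP for one pixel's feature vector."""
    h = np.maximum(W1 @ features + b1, 0.0)  # matrix-vector multiply + ReLU
    return W2 @ h + b2                       # output a rough RGB value

# One 'pixel' with 16 made-up input features (position, normal, UVs, etc.)
print(shade_pixel(rng.standard_normal(16)))
```

In a real game, those same matrix-vector multiplies would run per pixel inside a pixel or compute shader, which is exactly the workload cooperative vectors are meant to speed up.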

"I think that the answer is hard to predict. Maybe another way of saying it is that the future is neural rendering. It is basically DLSS. That’s the way graphics ought to be."

Nvidia's neural rendering demo Zorah at GDC 2025 (Image credit: Future)

The key word here is generate. If one wishes to be pedantic, all graphics are generated, whether through rasterization or neural networks. It's all just a massive heap of mathematics, broken down into logic operations on GPUs crunching through endless streams of binary values. But there is one important difference with neural rendering: it requires far less input data to produce the same graphical output as rasterization does.

Fire up the original Crysis from 2007, and all those beautiful visuals are generated from lists of vertices, piles of texture maps, and a veritable mountain of resources that are created during the process of rendering (e.g. depth buffers, G-buffers, render targets, and so on). That's still the case almost 20 years on, and the size and quantity of those resources are now truly massive.
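For a rough sense of scale, here's a back-of-the-envelope sketch of what just the per-frame render targets can cost at 4K. The buffer list and formats below are illustrative assumptions rather than any particular engine's actual layout:

```python
# Back-of-the-envelope sketch of how much memory a traditional set of 4K
# render targets can consume. The buffer layout and byte sizes are
# illustrative assumptions, not any specific engine's G-buffer.
WIDTH, HEIGHT = 3840, 2160
PIXELS = WIDTH * HEIGHT

# (name, bytes per pixel) for a plausible deferred-rendering setup
targets = [
    ("albedo (RGBA8)",          4),
    ("normals (RGB10A2)",       4),
    ("material params (RGBA8)", 4),
    ("motion vectors (RG16F)",  4),
    ("depth/stencil (D32S8)",   5),
    ("HDR colour (RGBA16F)",    8),
]

total = sum(bpp for _, bpp in targets) * PIXELS
for name, bpp in targets:
    print(f"{name:28s} {bpp * PIXELS / 2**20:7.1f} MiB")
print(f"{'total':28s} {total / 2**20:7.1f} MiB")  # roughly a quarter of a GiB
```

And that's just the per-frame scratch space, before a single vertex buffer or texture map is counted.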

As DLSS Super Resolution proves, though, they don't need to be in the era of AI graphics. Nvidia's upscaling system works by rendering the frame at a reduced resolution and then using a neural network to reconstruct a full-resolution image and clean up any artefacts. One idea behind neural rendering is to take that a step further: use lower resolution assets in the first place and generate higher quality detail as and when it's required.
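As a rough illustration of that 'render low, reconstruct high' shape of pipeline, here's a deliberately crude NumPy toy: a quarter-resolution frame is upscaled and then passed through a stand-in 'refinement' step. It isn't DLSS, and every number in it is an assumption; the point is only how little data goes in relative to what comes out.

```python
# Illustrative toy of the 'render low, reconstruct high' idea: draw the frame
# at a quarter of the target resolution, then let a (here untrained, randomly
# initialised) filter stand in for the network that rebuilds the full image.
import numpy as np

rng = np.random.default_rng(1)
TARGET_W, TARGET_H, SCALE = 3840, 2160, 4   # 4K output from a 960x540 input

# Stand-in for the rasteriser: a cheap low-resolution RGB frame
low_res = rng.random((TARGET_H // SCALE, TARGET_W // SCALE, 3)).astype(np.float32)

# Step 1: naive spatial upscale (nearest neighbour) back to the target size
upscaled = low_res.repeat(SCALE, axis=0).repeat(SCALE, axis=1)

# Step 2: a 'neural' refinement pass - here just one random 3x3 convolution,
# standing in for the trained network that would reconstruct fine detail
kernel = rng.standard_normal((3, 3)).astype(np.float32)
kernel /= kernel.sum() or 1.0
padded = np.pad(upscaled, ((1, 1), (1, 1), (0, 0)), mode="edge")
refined = sum(kernel[dy, dx] * padded[dy:dy + TARGET_H, dx:dx + TARGET_W]
              for dy in range(3) for dx in range(3))

print(f"{low_res.nbytes / 2**20:.1f} MiB rendered -> {refined.shape} reconstructed")
```

In the real thing, the refinement step is a trained network fed with data such as motion vectors and previous frames, rather than a random filter, but the economics are the same: far less rendered data per frame than the output resolution would normally demand.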

The original 2007 Crysis (top) still looks outstanding compared to the 2020 remaster (bottom) (Image credit: Crytek via Jonathan Bolding and Filip_7)

Does it ultimately matter how a game's graphics are produced as long as they look absolutely fine and run smoothly? I dare say most people will say 'no', but we don't have any games right now that use neural rendering for any part of the graphics pipeline, other than upscaling and/or frame generation. Everything else is still rasterization (even if ray tracing is used, raster is still there behind the scenes).


That means GeForce GPUs of the future, both near and far, will still need to progress in rasterization to ensure games of tomorrow look and run as intended. But with Nvidia being dead set on neural rendering (I don't think Huang said "That’s the way graphics ought to be" lightly), have RTX graphics cards reached a plateau in that respect?

Does the company now expect that all generational performance increments will come from better DLSS? Will GPUs of the future be nothing more than ASICs for AI? How would such chips process older graphics routines? Is PC gaming heading backwards in time to the era when you needed a new GPU for every major new game, because previous chips didn't support the tech inside?

Answers that generate more questions than they resolve certainly aren't a bad thing, but in this case, I wish Nvidia would give us a much clearer picture as to its roadmap for gaming GPUs and how it plans to support games of the past, present, and future.

Nick Evanson
Hardware Writer

Nick, gaming, and computers all first met in the early 1980s. After leaving university, he became a physics and IT teacher and started writing about tech in the late 1990s. That resulted in him working with MadOnion to write the help files for 3DMark and PCMark. After a short stint working at Beyond3D.com, Nick joined Futuremark (MadOnion rebranded) full-time, as editor-in-chief for its PC gaming section, YouGamers. After the site shut down, he became an engineering and computing lecturer for many years, but he missed writing. Cue four years at TechSpot.com covering everything and anything to do with tech and PCs. He freely admits to being far too obsessed with GPUs and open-world grindy RPGs, but who isn't these days?
