
Nvidia GTC live: Jen-Hsun lays out 'the future of real-time rendering' in today's keynote, and apparently it's 'gonna blow your mind'

Nvidia's GPU Technology Conference kicks off this week, with a leather-clad keynote from the company's CEO.

Video: NVIDIA GTC Keynote 2026 (YouTube)

Okay, that's some bombastic headlining right there, but when one of the people behind the success of Nvidia's DLSS feature reckons a new AI innovation in gaming is "gonna blow your mind" I'm prepared to listen. The Nvidia GPU Technology Conference starts today, with Jen-Hsun Huang hitting the stage at 11am PDT (6pm GMT) for the opening keynote, and the company's GeForce social channels have been promising that we'll get a look at "the future of real-time rendering" during the presentation.

But combine those tweets with something Bryan Catanzaro, VP of Applied Deep Learning Research—and someone who's worked on DLSS for the past ten years—said at GDC last week, and you can colour me excited to hear what's going on.

During a GDC panel discussion titled 'AI Trends of Today and Opportunities for Tomorrow: Ask Me Anything', Catanzaro was asked: "What has surprised you in terms of innovation with AI in games in the past one to three months?"

Having spoken a little about DLSS and Frame Generation in response, Catanzaro went on to call generative AI rendering the "most important update to the way that graphics are rendered in at least the last decade."

"Yeah, but to your question of, what have you seen in the past month?" he continues. "I can't tell you exactly what I've seen, but you'll find out very soon, and it's going to blow your mind."

So, er... join me as we have our minds collectively blown, I guess.

Live blog


A screenshot of Nvidia's Mega Geometry foliage tech in The Witcher 4

(Image credit: Nvidia/CD Projekt Red)

Introduced at CES 2025, alongside the RTX Blackwell series of GPUs, neural rendering—and neural shaders—promise big things. From enhanced AI compression techniques delivering "next-generation asset generation" as a potential VRAM crutch (useful at a time when memory is expensive) to RTX Mega Geometry, which is going to give The Witcher 4 lots of pretty trees, sticking AI into the graphics pipeline has the potential to hugely up the fidelity of PC games.

Our Nick had some good ideas about this earlier:

It would certainly make ray/path tracing a little lighter on GPUs if they only had to actually trace a couple of real rays in the whole scene, rather than a couple per pixel.
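To put Nick's idea in perspective, here's a quick back-of-the-envelope sketch. These are my own toy figures, not anything Nvidia has stated: I'm assuming a 4K frame and taking "a couple" literally as two rays.

```python
# Toy comparison of ray counts per frame (illustrative assumptions only):
# today's path tracers cast a few rays per pixel; a hypothetical AI-inferred
# pipeline might trace a handful of real rays per scene and let a neural
# model fill in the rest.

WIDTH, HEIGHT = 3840, 2160   # a 4K frame
RAYS_PER_PIXEL = 2           # "a couple per pixel"
RAYS_PER_SCENE = 2           # "a couple of real rays in the whole scene"

per_pixel_total = WIDTH * HEIGHT * RAYS_PER_PIXEL
reduction = per_pixel_total // RAYS_PER_SCENE

print(f"Rays per frame, per-pixel tracing: {per_pixel_total:,}")  # 16,588,800
print(f"Rays per frame, per-scene tracing: {RAYS_PER_SCENE}")
print(f"Reduction factor: {reduction:,}x")                        # 8,294,400x
```

Even with generous assumptions, that's a reduction of nearly seven orders of magnitude in traced rays per frame, which is the scale of saving that would make the idea attractive, if a neural model really could reconstruct plausible lighting from so little.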

Given how painfully expensive memory is for everything these days, however, I kinda think Nick ought to get on and patent RTX Ultra Memory.

This waiting music getting you all hyped, too?

Ciri rides into the port town of Valdrest in the Witcher 4 tech demo.

(Image credit: CD Projekt Red)

Here we go.

"This conference is going to cover every single layer of the five layer cake of artificial intelligence."

Nvidia GTC keynote slide

(Image credit: Nvidia)

"GeForce is Nvidia's greatest marketing campaign."

"This is the house that GeForce made... 25 years ago we made the programmable shader."

This is going somewhere... "GeForce brought CUDA to the world."

"AI is going to go back and revolutionise how graphics are made."

Nvidia GTC keynote slide

(Image credit: Nvidia)

Nvidia GTC keynote slide

(Image credit: Nvidia)

Aha, DLSS 5. Golly.

This is fascinating, as it's going into games that may not have traditionally had DLSS built into them.

To be honest, 'DLSS 5' kinda feels more like a filter overlay than a true new generation of DLSS.

Video: Announcing NVIDIA DLSS 5 | AI-Powered Breakthrough in Visual Fidelity for Games (YouTube)

Obviously the rest of the presentation has been purely about AI, and how great it is, and how transformative it's going to be.

But purely in gaming terms, I really don't know how well DLSS 5 as a transformative lighting filter is going to go down.
