Love it or hate it, frame generation is the one major graphics technology that really needs improving in 2026

Ray Reconstruction in Cyberpunk 2077 Update 2.0
(Image credit: CDPR)

When it comes to GPUs and graphics technology, I know what I'd like to see in 2026. Super-modern processors that are a substantial leap forward in terms of performance, capabilities, and affordability. We won't get that, of course, because we didn't get that this year. Or last year. But there is something GPU-related that can be a lot better, and that's frame generation.

Ever since GPU upscaling and frame generation first appeared (DLSS 1.0 in 2019, DLSS 3.0 in 2022), I've always held the opinion that if you couldn't tell they were working, other than from the performance lift, everyone would be happily using them by default. With the first iterations of both technologies, and their corresponding alternatives from AMD and Intel, you could very much tell they were being used, and that's only served to tarnish their reputations.


Benchmark: DLSS Quality upscaling + 3X frame generation (Ryzen 7 9800X3D, GeForce RTX 5090, 4K, Ultra preset)

There's no way around the latency issue with frame interpolation, unless you start delving into the world of predicting input changes (which is a whole different tech), but I can live with it in certain games. What I can't live with is when frame gen turns my perfectly rendered and upscaled graphics into a wonky, blurry mess.
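To see why that latency is baked in, here's a minimal timing sketch in Python. It isn't any vendor's actual pipeline; it simply assumes a 60 fps base render rate, 2X interpolation, and zero cost to generate the in-between frame, and shows that the newest real frame still has to be held back before anything can be displayed.

# A minimal timing sketch, not any vendor's actual pipeline: it just shows why
# interpolating between two rendered frames has to delay what you see.
# Assumptions: 60 fps base render rate, 2X interpolation, zero generation cost.

BASE_MS = 1000 / 60  # ~16.7 ms to render one real frame

def display_time_without_fg(n: int) -> float:
    # Real frame n can be shown as soon as it has finished rendering.
    return (n + 1) * BASE_MS

def display_times_with_fg(n: int) -> tuple[float, float]:
    # The in-between frame needs both frame n and frame n+1, so nothing new can
    # be shown until frame n+1 has rendered; the real frame is then held back so
    # the generated one can be displayed first, at an even pace.
    both_done = (n + 2) * BASE_MS
    return both_done, both_done + BASE_MS / 2  # (interpolated frame, real frame n+1)

print(f"no frame gen : frame 1 visible at ~{display_time_without_fg(1):.1f} ms")
interp, real = display_times_with_fg(0)
print(f"2X frame gen : interpolated frame at ~{interp:.1f} ms, frame 1 at ~{real:.1f} ms")
# Even in this idealised case, frame 1 turns up half a render frame later, and real
# implementations add generation and pacing costs on top. That delay is the latency
# you feel, no matter how smooth the output looks.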

If Nvidia et al can properly sort out the visual quality of frame generation in 2026, then I'll be happy to use it as intended: a simple switch that seriously boosts performance in games that already comfortably hit 60 fps. My worry is that we won't get that, because better frame generation doesn't usually help GPU companies sell more of their latest graphics cards. Instead, they tend to lean on tech like ray tracing or performance metrics such as FP32 TFLOPS to promote new GPUs.

FSR 4 Frame Generation being tied to RDNA 4 GPUs is the exception for AMD, not the norm, though Nvidia ties pretty much every new DLSS feature to a certain GeForce series. At least Intel's AI-powered interpolator works on every Arc card.

However, we're almost certainly not going to see any new GPU architecture in 2026. Even if Nvidia's Super refreshes miraculously appear, or Intel actually launches its big Battlemage G31-based card, neither will bring any actual physical changes inside the chips.

Nvidia's RTX 60-series is at least 14 months away, as RTX Blackwell only launched in January of this year. AMD's RDNA 4 architecture followed a few months later, and while Intel got the jump on both of them, releasing its Battlemage chips in December 2024, there's no sign of any plans for a successor to make an appearance next year.

This leaves the door open for everyone to stay in GPU news headlines by seriously updating their frame generation tech. Upscaling is about as good as it's ever going to get—refinements to the neural networks are all that's left to do—so frame interpolation is perfectly placed to get a nice overhaul.


Benchmark: FSR Performance upscaling + 2X frame generation (Ryzen 9 9900X, Radeon RX 9070 XT, 4K, RT Ultra preset)

While I genuinely hope we get one, I fear we're only likely to be fed more AI stuff with limited appeal or applications. For example, AMD has only just released its FSR 4 Frame Generation, so it's going to be concentrating on finalising FSR Radiance Cache for developers. Nvidia will probably push neural rendering a lot more in 2026, and Intel… well, who knows with Intel.

Right now, I'm not interested in any of that. Actually, I am, but that's just because I'm a graphics nerd, and have been for my entire computing life. As a PC gamer, though, I just want fully functional, works-as-intended performance boosters. I've already got solid upscalers like DLSS and FSR 4, and latency-reducing switches like Reflex. Now, I just want the same level of quality from frame generation.

It's not like I'm asking for cheap RAM or anything like that.

Nick Evanson
Hardware Writer

Nick, gaming, and computers all first met in the early 1980s. After leaving university, he became a physics and IT teacher and started writing about tech in the late 1990s. That resulted in him working with MadOnion to write the help files for 3DMark and PCMark. After a short stint working at Beyond3D.com, Nick joined Futuremark (MadOnion rebranded) full-time, as editor-in-chief for its PC gaming section, YouGamers. After the site shut down, he became an engineering and computing lecturer for many years, but missed the writing bug. Cue four years at TechSpot.com covering everything and anything to do with tech and PCs. He freely admits to being far too obsessed with GPUs and open-world grindy RPGs, but who isn't these days?
