AMD Linux devs jamming nearly 24,000 lines of RDNA 4-supporting code into the mainline graphics driver suggests a next-gen launch may be close at hand

A stylised image of AMD's RDNA 3 GPU design for its Radeon RX 7000-series graphics cards
(Image credit: AMD)

AMD's software engineers have been very busy of late, updating the company's Linux GPU kernel driver, shader compiler, and other components to support its next generation of graphics architecture. Now they've done the same for the RadeonSI OpenGL driver, adding almost 24,000 lines of code for GFX12, aka RDNA 4, to the Mesa open-source graphics library.

The big code merge was spotted by Phoronix and there's only one reason why engineers would be so busy updating their code base for an architecture that isn't currently on the market—it will be very soon.

RDNA 3, the current GPU chip design, is internally codenamed GFX11 by AMD, so anything referring to GFX12 is clearly about its successor. I'm not talking about RDNA 3.5, which is listed as GFX11.5 and will only bring minor changes when it appears in AMD's Strix Point and Strix Halo laptop APUs. All of the recent code merges are for RDNA 4, although the files themselves don't tell us much about what we can expect from the forthcoming architecture.
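If you want to see this naming scheme in the wild, Mesa's RadeonSI driver reports the GFX codename as part of the OpenGL renderer string. Here's a minimal sketch that queries it; it assumes a Linux system with Mesa drivers and the GLFW library installed, and the exact string format varies between Mesa releases:

```cpp
// Minimal sketch: print the OpenGL renderer string, in which Mesa's
// RadeonSI driver embeds the GFX IP codename (e.g. "gfx1100" on RDNA 3).
// Assumes GLFW is available; build with: g++ gfx_probe.cpp -lglfw -lGL
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);   // no need to show a window
    GLFWwindow* win = glfwCreateWindow(64, 64, "gfx-probe", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    // On an RDNA 3 card this prints something along the lines of:
    // "AMD Radeon RX 7900 XT (radeonsi, gfx1100, LLVM ..., DRM ...)"
    printf("Renderer: %s\n", (const char*)glGetString(GL_RENDERER));
    printf("Version:  %s\n", (const char*)glGetString(GL_VERSION));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```

Once GFX12 hardware ships and these merges land in a Mesa release, the same query should report a gfx12xx identifier.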

Thanks to dwindling stocks of certain RX 6000 models and various price cuts for the likes of the RX 7900 XT and 7800 XT, things are looking a little rosier, but overall AMD hasn't managed to capture a significantly better share of the discrete GPU market. So what can we expect, or hope, to see from RDNA 4? To put it another way: what needs to improve in AMD's next GPU architecture for sales to pick up?

As there's nothing wrong with the fundamental rendering performance of AMD's design (aka rasterization), it comes down to the areas where RDNA 3 trails Nvidia's Ada Lovelace-powered RTX 40-series: ray tracing, leveraging machine learning for better performance, and power efficiency. And that's exactly where the rumours have pegged RDNA 4 as being different from the current generation, with a new approach to ray tracing chief among the changes.

RDNA 3 GPUs have dedicated hardware for accelerating ray-box and ray-triangle intersection tests, but nothing for handling BVH traversal: that's all done by the same units that run ordinary shaders. It's a similar situation with the calculations behind deep learning neural networks. Where Intel and Nvidia have large, dedicated matrix units, AMD uses a combination of 'AI accelerators' and shader units to achieve the same thing.
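To make that division of labour concrete, here's a heavily simplified sketch of a BVH traversal loop, written in plain C++ as a stand-in for shader code. None of it is AMD's actual data layout or API; the point is that the two intersection tests are the operations RDNA 3 accelerates with dedicated instructions, while the surrounding loop, stack handling, and branching all run on the general-purpose shader units:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static float dot(Vec3 a, Vec3 b)  { return a.x*b.x + a.y*b.y + a.z*b.z; }

struct Ray  { Vec3 o, d; float tMax; };
struct Tri  { Vec3 v0, v1, v2; };
struct Node {                 // simplified two-wide BVH node
    Vec3 bmin[2], bmax[2];    // the two child bounding boxes
    int  child[2];            // >= 0: inner node index, < 0: ~triangle index
};

// The two tests below are what RDNA 3 accelerates with dedicated
// ray-intersection instructions; here they're ordinary functions.
static bool rayHitsBox(const Ray& r, const Vec3& lo, const Vec3& hi) {
    float t0 = 0.0f, t1 = r.tMax;                 // classic slab test
    float o[3]  = {r.o.x, r.o.y, r.o.z}, d[3]  = {r.d.x, r.d.y, r.d.z};
    float mn[3] = {lo.x, lo.y, lo.z},    mx[3] = {hi.x, hi.y, hi.z};
    for (int i = 0; i < 3; ++i) {
        float inv = 1.0f / d[i];                  // IEEE inf when d[i] == 0
        float tn = (mn[i] - o[i]) * inv, tf = (mx[i] - o[i]) * inv;
        if (tn > tf) std::swap(tn, tf);
        t0 = std::max(t0, tn); t1 = std::min(t1, tf);
    }
    return t0 <= t1;
}

static bool rayHitsTriangle(const Ray& r, const Tri& tri, float& t) {
    Vec3 e1 = sub(tri.v1, tri.v0), e2 = sub(tri.v2, tri.v0);  // Moller-Trumbore
    Vec3 p = cross(r.d, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < 1e-8f) return false;     // ray parallel to triangle
    float inv = 1.0f / det;
    Vec3 s = sub(r.o, tri.v0);
    float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return false;
    Vec3 q = cross(s, e1);
    float v = dot(r.d, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return false;
    t = dot(e2, q) * inv;
    return t > 0.0f && t < r.tMax;
}

// The traversal loop itself: on RDNA 3 this control flow runs on the
// general-purpose shader ALUs, which is the gap described above.
static float traverse(const Node* bvh, const Tri* tris, const Ray& ray) {
    float closest = ray.tMax;
    int stack[64], top = 0;
    stack[top++] = 0;                             // push the root node
    while (top > 0) {
        const Node& n = bvh[stack[--top]];
        for (int i = 0; i < 2; ++i) {
            if (!rayHitsBox(ray, n.bmin[i], n.bmax[i])) continue;
            if (n.child[i] >= 0) { stack[top++] = n.child[i]; continue; }
            float t;                              // leaf: test the triangle
            if (rayHitsTriangle(ray, tris[~n.child[i]], t) && t < closest)
                closest = t;                      // keep the nearest hit
        }
    }
    return closest;                               // == ray.tMax means a miss
}

int main() {
    // Two triangles facing the ray, at z = 5 and z = 9; one root node
    // whose two children are both leaves.
    Tri tris[2] = {{{-1,-1,5}, {1,-1,5}, {0,1,5}},
                   {{-1,-1,9}, {1,-1,9}, {0,1,9}}};
    Node root   = {{{-1,-1,5}, {-1,-1,9}},        // child AABB minima
                   {{ 1, 1,5}, { 1, 1,9}},        // child AABB maxima
                   {~0, ~1}};                     // both children are leaves
    Ray ray = {{0, 0, 0}, {0, 0, 1}, 100.0f};
    printf("nearest hit at t = %.2f\n", traverse(&root, tris, ray)); // 5.00
    return 0;
}
```

In a real GPU implementation, that outer loop is exactly the divergent, memory-hungry control flow that dedicated traversal hardware, of the sort Nvidia's RT cores provide, is designed to take off the shader cores.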


However, it's looking increasingly likely that we won't see a very high-end RDNA 4 graphics card, at least not at launch. There's been much talk of AMD skipping the halo sector for its next GPU release and focusing on the mid-range and mainstream markets instead, which points to sticking with a monolithic design or one with just a few chiplets.

Whatever changes AMD has in store for us with RDNA 4, at least it's got everything set from a driver perspective, because if it's all in place for the relatively small Linux market, it's almost certainly got the dominant Windows and DirectX market well in hand.

And best of all, it looks like we don't have much longer to wait to find out how good it is.

Nick Evanson
Hardware Writer

Nick, gaming, and computers all first met in the early 1980s. After leaving university, he became a physics and IT teacher and started writing about tech in the late 1990s. That resulted in him working with MadOnion to write the help files for 3DMark and PCMark. After a short stint working at Beyond3D.com, Nick joined Futuremark (MadOnion rebranded) full-time, as editor-in-chief for its PC gaming section, YouGamers. After the site shut down, he became an engineering and computing lecturer for many years, but the writing bug never went away. Cue four years at TechSpot.com covering everything and anything to do with tech and PCs. He freely admits to being far too obsessed with GPUs and open-world grindy RPGs, but who isn't these days?