The Alan Wake 2 mesh shader issue is a reminder that one day your mega-expensive GPU just won't be good enough to run the very latest games

A collection of different graphics cards
(Image credit: Future)

As reported just yesterday, if you're a proud owner of a Radeon RX 5700-series graphics card, you've got little chance of being able to play Alan Wake 2. That's because the game uses a feature of DirectX 12 called mesh shaders, and that particular GPU doesn't support them. And yet the GPU's architecture is only four years old.

Now the same is true if you've got an Nvidia card that's Pascal-based or older, but that's perhaps more understandable, as that design first appeared back in 2016, which is almost, but not quite, double the age of AMD's RDNA architecture.

But what if you'd splashed out on a fancy Titan X Pascal, the most powerful GPU you could get back then? With an MSRP of $1,199, it was considered ridiculously expensive at the time, though today's GPU prices make it look like a bit of a bargain. On paper, it looked like a card that would last you many years of gaming.

With 3,584 shaders, 224 TMUs, 96 ROPs, and 12GB of fast GDDR5X, it was seriously quick. I should know because I bought one, replacing a pair of GeForce GTX 980 Ti cards in SLI. That one GPU was faster than those two in almost every game and application, though my endless tinkering with heatsinks and voltage mods eventually killed it.

But if I still had it, I'd have no chance of using it in Alan Wake 2, as the game relies on mesh shaders, a feature of DirectX 12 introduced toward the end of 2019 to give developers far more control over geometry processing. Compute shaders and ray tracing may grab all the attention at the moment, but geometry is still an important part of it all.
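For anyone wondering how a game actually finds this out, the answer is a single Direct3D 12 feature query. The snippet below is a minimal sketch of that standard CheckFeatureSupport call, not Remedy's actual code, but it's the sort of path an engine would take to discover that an RX 5700 or a Pascal-era Titan X simply doesn't report mesh shader support:

```cpp
// Minimal sketch: querying Direct3D 12 for mesh shader support.
// Standard CheckFeatureSupport usage, not Remedy's actual startup code.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a device on the default adapter at feature level 12.0.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No Direct3D 12 device available.\n");
        return 1;
    }

    // Mesh shader support is reported in the OPTIONS7 feature block.
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &options7, sizeof(options7)))
        && options7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1)
    {
        std::printf("Mesh shaders supported (Tier 1).\n");
    }
    else
    {
        // This is the branch an RX 5700 or a Pascal Titan X ends up in.
        std::printf("Mesh shaders not supported on this GPU/driver.\n");
    }
    return 0;
}
```

On RDNA 2, Turing, and anything newer, that query comes back as Tier 1; on older architectures the driver reports it as not supported, and there's nothing a game patch can do about that.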

That's because games have moved from using textures and pixel shaders to simulate the lighting of a world to a far more complex approach that better reflects the real nature of light. Chunky, low-polygon models and environments don't suit this approach, which is why the likes of Unreal Engine 5 offer systems to massively increase the number of polygons used.

(Image credit: UL Solutions)

Mesh shaders can be used to seriously improve the performance of geometry processing, which is probably why Remedy is using them in Alan Wake 2. You can see the potential for yourself by running the mesh shader feature test in 3DMark, where enabling the tech more than doubles the frame rate.

Progress in graphics technology nearly always leaves hardware behind at some point, but PC gamers have been somewhat spoilt in recent years. Where the first GPUs to support vertex and pixel shaders were rapidly made obsolete once games started to heavily favour later shader models, something like a Radeon R9 290 or a GeForce GTX 970 offered support for almost every shader version.

I'm not suggesting that either of those cards could play the latest games without any problems, but the latter actually meets the minimum hardware requirements for Baldur's Gate 3. That particular game is a bit of an exception, though, as the development process for its graphics engine started many years ago. And just as BG3 is an exception, so is Alan Wake 2. For now.


Mesh shaders and sampler feedback are the last aspects of DirectX 12 Ultimate (which appeared in 2020) to be picked up and used in anger by game developers. Plenty of games already use the DirectX ray tracing API, as well as variable rate shading, and once all of these become commonplace in games, any GPU based on architectures older than AMD's RDNA 2 and Nvidia's Turing will be defunct.
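For completeness, all four DirectX 12 Ultimate features are reported through the same mechanism as the mesh shader check earlier. Here's a rough sketch of the tiers a GPU has to clear to tick the full DX12 Ultimate box, assuming a device created as in the previous example; the function name is just for illustration.

```cpp
// Rough sketch: the four DirectX 12 Ultimate feature checks, assuming
// 'device' is an ID3D12Device* created as in the earlier example.
#include <d3d12.h>

bool SupportsDX12Ultimate(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 o5 = {};  // ray tracing
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 o6 = {};  // variable rate shading
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 o7 = {};  // mesh shaders, sampler feedback

    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &o5, sizeof(o5))) ||
        FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &o6, sizeof(o6))) ||
        FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &o7, sizeof(o7))))
    {
        return false;  // older runtime/driver that doesn't know these feature blocks
    }

    // DirectX 12 Ultimate requires DXR 1.1, VRS Tier 2,
    // Mesh Shader Tier 1 and Sampler Feedback Tier 0.9.
    return o5.RaytracingTier          >= D3D12_RAYTRACING_TIER_1_1
        && o6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2
        && o7.MeshShaderTier          >= D3D12_MESH_SHADER_TIER_1
        && o7.SamplerFeedbackTier     >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;
}
```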

Of course, they'll still be able to play older games just fine, just as a GeForce GTX 970 can run something like Grand Theft Auto 5 without much fuss. GPU vendors want you to upgrade on a regular basis and, along with Microsoft, they want developers to employ the latest tech features in forthcoming releases to help promote new models.

If you currently own a Radeon RX 7900 XTX or a GeForce RTX 4090, the two most powerful GPUs you can buy for a gaming PC right now, you have to accept the fact that one day they'll be consigned to the pages of history. We fund and enjoy the fruits of technology research and progress, but eventually, all those shaders and all that power will just collect dust.

As it ever was in the world of PC gaming.

Nick Evanson
Hardware Writer

Nick, gaming, and computers all first met in 1981, with the love affair starting on a Sinclair ZX81 in kit form and a book on ZX Basic. He ended up becoming a physics and IT teacher, but by the late 1990s decided it was time to cut his teeth writing for a long defunct UK tech site. He went on to do the same at MadOnion, helping to write the help files for 3DMark and PCMark. After a short stint working at Beyond3D.com, Nick joined Futuremark (MadOnion rebranded) full-time, as editor-in-chief for its gaming and hardware section, YouGamers. After the site shut down, he became an engineering and computing lecturer for many years, but missed the writing bug. Cue four years at TechSpot.com and over 100 long articles on anything and everything. He freely admits to being far too obsessed with GPUs and open world grindy RPGs, but who isn't these days?