Resident Evil Requiem is one of the few games to use GPU data decompression but it's a bit hit and miss as to whether your GPU will ever actually use it

Leon and Grace amid the flames
(Image credit: Capcom)

Resident Evil Requiem is proving to be one of the standout releases of 2026 so far, but one aspect of the game puts it into a very exclusive club: the use of DirectStorage and GPU data decompression. However, what's going on behind the scenes is a bit of a mystery, because it appears to be random as to which GPUs actually use it, even ones that are theoretically fully capable.

I knew that RER was a DirectStorage-enabled game from my own performance testing a couple of weeks ago, but it was Compusemble that spotted the use of GDeflate, using the SpecialK tool to peek underneath the game's hood. If you're wondering just what GDeflate is, it's a data compression algorithm developed by Nvidia that can use the power of a GPU to decompress stuff very quickly.

The idea behind it all, along with the rest of the tech behind DirectStorage, is to streamline the flow of data from storage to system memory, and then to the graphics card's VRAM. In short, everything loads quicker.
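To give a rough sense of why compressed assets speed up loading at all: GDeflate is a GPU-friendly repackaging of the familiar DEFLATE format, so the round trip below (a toy illustration, not the DirectStorage API) uses Python's zlib, which implements standard DEFLATE, to show the compress-then-decompress step that DirectStorage can move onto the GPU. The payload is made up for the example.

```python
import zlib

# Hypothetical asset payload: repetitive data compresses well, which is
# why shipping assets compressed and decompressing them on load
# (on the CPU or the GPU) is a net win for load times.
asset = b"zombie_texture_block " * 4096

compressed = zlib.compress(asset, level=9)
restored = zlib.decompress(compressed)

assert restored == asset
print(f"original:   {len(asset)} bytes")
print(f"compressed: {len(compressed)} bytes")
```

Less data read from the SSD means less time spent waiting on storage; the decompression work then happens wherever it's cheapest, which is exactly the decision this game seems to be making inconsistently.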


Up until now, very few games have actively used it (off the top of my head, Ratchet & Clank: Rift Apart is one, Spider-Man 2 being another), but you can still use the system without shoving the algorithm through a GPU, as there's a CPU fallback option with GDeflate.
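The GPU-or-CPU choice can be sketched like this. To be clear, this is a toy illustration in Python, not the actual DirectStorage runtime, which makes this decision internally when a request is enqueued; the function name `decompress_request` and the use of zlib as a stand-in for GDeflate are my own inventions for the example.

```python
import zlib

def decompress_request(payload: bytes, gpu_available: bool):
    # Illustrative only: the real runtime picks the path internally.
    # Both branches use zlib here; in an engine, the first branch
    # would submit a GPU decompression job instead.
    if gpu_available:
        return "gpu", zlib.decompress(payload)
    return "cpu-fallback", zlib.decompress(payload)

blob = zlib.compress(b"inventory_icons" * 100)
path, data = decompress_request(blob, gpu_available=False)
print(path)  # which route handled the request
```

Either path produces identical data; the only question, and the one Resident Evil Requiem seems to answer unpredictably, is which processor does the work.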

What Compusemble noticed is that with an RTX 5090, RTX 5070, and RTX 5060, Resident Evil Requiem uses the GPU to decompress data. However, with an RTX 4060 laptop, the CPU fallback was used, despite the graphics chip fully supporting GPU decompression. More puzzlingly, a quick driver reinstall with the RTX 5090 resulted in the fallback being used.

Microsoft DirectStorage and GDeflate example with times

DirectStorage improves your PC's ability to load avocados super fast. And zombies. (Image credit: Microsoft)

Intrigued, I fired up RER on three different test PCs: an RTX 5070 rig, an RTX 4080 Super PC, and one with a Radeon RX 7900 XT. In all three cases, the CPU fallback option is being used, even though every single one of those GPUs is capable of handling the decompression.

Some games, such as Ghost of Tsushima, only ever use CPU decompression, because the developers deem it better to leave the GPU to do nothing but rendering, and because modern CPUs are more than up to the task anyway.

But none of this explains why Resident Evil Requiem seems to be randomly choosing where and when to enable GPU-powered GDeflate. Driver version doesn't seem to affect it, nor does having resizable BAR enabled or not (I've briefly checked), so either there are bugs in the game's code, or the detection mechanism for GPU decompression just isn't robust enough.

Fortunately, it doesn't really matter whether your CPU or your GPU is handling GDeflate, because the performance difference between the two is too small to be noticeable, at least according to Compusemble. I can't confirm or deny this with my own testing, because none of my test rigs will let me use GPU GDeflate.

Dare I say it? Yes, I will: I feel a tad deflated by this fact.

Nick Evanson
Hardware Writer

