I've been testing Nvidia's new Neural Texture Compression toolkit and the impressive results could be good news for game install sizes

[Image: a screenshot taken from the 2025 Nvidia tech demo Zorah (Image credit: Nvidia)]

At CES 2025, Nvidia announced so many new things that it was somewhat hard to figure out just what was really worth paying attention to. While the likes of the RTX 5090 and its enormous price tag were grabbing all the headlines, one new piece of tech sat to one side with lots of promise but no game to showcase it. However, Nvidia has now released a beta software toolkit for its RTX Neural Texture Compression (RTXNTC) system, and after playing around with it for an hour or two, I'm far more impressed with this than any hulking GPU.

At the moment, all textures in games are compressed into a common format to save on storage space and download sizes, then decompressed by the GPU when used in rendering. It can't have escaped your notice, though, that today's massive 3D games are…well…massive, and 100 GB or more isn't unusual.

RTXNTC works like this: the original textures are pre-converted into an array of weights for a small neural network. When the game's engine instructs the GPU to apply those textures to an object, the graphics processor samples the compressed data, and the neural network then reconstructs (in other words, decodes) what the texture looks like at the sample point.
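To make that more concrete, here's a minimal sketch of the idea in Python. To be clear, this is my own illustration, not Nvidia's implementation: the grid sizes, layer widths, and names are all invented, and a real decoder runs as GPU shader code, not NumPy.

```python
import numpy as np

# A minimal sketch of 'inference on sample' (illustrative only; the layer
# sizes, inputs, and names are assumptions, not Nvidia's actual design).
# The 'compressed texture' is a small grid of latent feature vectors plus
# the weights of a tiny multilayer perceptron (MLP) that decodes them.

rng = np.random.default_rng(0)

LATENT_DIM = 8          # features stored per latent-grid cell
HIDDEN_DIM = 16         # width of the tiny decoder network
OUT_CHANNELS = 4        # e.g. albedo RGB plus roughness packed together

# Stand-ins for the data a real NTC texture would ship with:
latent_grid = rng.standard_normal((64, 64, LATENT_DIM)).astype(np.float32)
w1 = rng.standard_normal((LATENT_DIM + 2, HIDDEN_DIM)).astype(np.float32)
w2 = rng.standard_normal((HIDDEN_DIM, OUT_CHANNELS)).astype(np.float32)

def sample_ntc(u: float, v: float) -> np.ndarray:
    """Reconstruct one texel's channel values at UV coordinates (u, v)."""
    # Fetch the nearest latent vector (a real decoder would interpolate).
    x = int(u * (latent_grid.shape[1] - 1))
    y = int(v * (latent_grid.shape[0] - 1))
    latent = latent_grid[y, x]

    # Concatenate the latent features with the sample position and run
    # the tiny MLP -- this is the 'inference' in Inference on Sample.
    inputs = np.concatenate([latent, [u, v]]).astype(np.float32)
    hidden = np.maximum(inputs @ w1, 0.0)   # ReLU activation
    return hidden @ w2                      # decoded channel values

print(sample_ntc(0.25, 0.75))  # one unfiltered texel, decoded on demand
```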

The system can only produce a single unfiltered texel per sample, so for the demonstration, RTX Texture Filtering (also called stochastic texture filtering) is used to interpolate between texels.
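The trick with stochastic filtering is that rather than decoding all four texels a bilinear filter needs and blending them, you decode just one, picked at random with a probability equal to its bilinear weight; averaged over many samples, the result converges on the same filtered value. Here's a rough sketch of that selection step, again my own simplification rather than Nvidia's library code:

```python
import numpy as np

rng = np.random.default_rng(1)

def stochastic_bilinear(u: float, v: float, width: int, height: int):
    """Pick ONE texel to decode, with probability equal to its bilinear
    weight, instead of decoding and averaging all four neighbours."""
    # Continuous texel-space position of the sample.
    x, y = u * (width - 1), v * (height - 1)
    x0, y0 = min(int(x), width - 2), min(int(y), height - 2)
    fx, fy = x - x0, y - y0

    # The four bilinear candidates and their usual blend weights.
    candidates = [
        ((x0,     y0    ), (1 - fx) * (1 - fy)),
        ((x0 + 1, y0    ), fx       * (1 - fy)),
        ((x0,     y0 + 1), (1 - fx) * fy),
        ((x0 + 1, y0 + 1), fx       * fy),
    ]
    coords, weights = zip(*candidates)
    choice = rng.choice(4, p=np.array(weights) / sum(weights))
    return coords[choice]  # decode only this texel for this sample

# Averaged over many frames, the randomly chosen texels converge on the
# same result a conventional bilinear filter would produce.
print(stochastic_bilinear(0.3, 0.6, 1024, 1024))
```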

Nvidia describes the whole thing using the term 'Inference on Sample,' and the results are impressive, to say the least. Without any form of compression, the texture memory footprint in the demo is 272 MB. With RTXNTC in full swing, that reduces to a mere 11.37 MB.

The whole process of sampling and decoding is pretty fast, though not quite as fast as normal texture sampling and filtering. At 1080p, the non-NTC setup runs at 2,466 fps, but this drops to 2,088 fps with Inference on Sample. Stepping the resolution up to 4K, the figures are 930 and 760 fps, respectively. In other words, RTXNTC incurs a frame rate penalty of 15% at 1080p and 18% at 4K, in exchange for a 96% reduction in texture memory.
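Those percentages fall straight out of the demo's numbers, if you want to check the arithmetic:

```python
# Quick sanity check of the figures quoted above (numbers from the demo).
baseline = {"1080p": 2466, "4K": 930}   # fps without NTC
ntc      = {"1080p": 2088, "4K": 760}   # fps with Inference on Sample

for res in baseline:
    penalty = 1 - ntc[res] / baseline[res]
    print(f"{res}: {penalty:.0%} frame rate penalty")

memory_saving = 1 - 11.37 / 272         # MB, from the texture footprints
print(f"Texture memory reduction: {memory_saving:.0%}")
```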

Those frame rates were achieved using an RTX 4080 Super, and lower-tier or older RTX graphics cards are likely to see a larger performance drop. For that kind of hardware, Nvidia suggests using 'Inference on Load' (NTC Transcoded to BCn in the demo), where the pre-compressed NTC textures are decompressed as the game (or demo, in this case) loads. They are then transcoded into a standard BCn block-compressed format, to be sampled and filtered as normal.
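Conceptually, this path just moves the neural decode to a one-off step at load time, then hands the GPU the sort of block-compressed texture its fixed-function hardware already samples today. Here's a rough outline of the flow, with stand-in functions of my own devising rather than anything from Nvidia's SDK:

```python
import numpy as np

# Illustrative outline of 'Inference on Load'; every function here is a
# stand-in written to show the flow, not an Nvidia SDK call.

def neural_decode_all_texels(weights, latents):
    """One-off decode of the whole texture at load time (dummy maths)."""
    return np.clip(latents @ weights, 0.0, 1.0)     # (H, W, 4) RGBA-ish

def transcode_to_bcn(texture):
    """Pretend BCn-style encode: real code would call a block compressor.
    Quantising to 8-bit here just stands in for that step."""
    return (texture * 255).astype(np.uint8)

rng = np.random.default_rng(2)
latents = rng.random((256, 256, 8)).astype(np.float32)  # small NTC payload
weights = rng.random((8, 4)).astype(np.float32)         # tiny decoder

decoded = neural_decode_all_texels(weights, latents)    # done once at load
bcn = transcode_to_bcn(decoded)                         # GPU-friendly format
print(bcn.shape, bcn.dtype)  # sampled/filtered by the hardware as normal
```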

The texture memory reduction isn't as impressive, but the performance hit is nowhere near as big as with Inference on Sample. At 1080p it runs at 2,444 fps, almost as fast as standard texture sampling and filtering, and the texture footprint is just 98 MB. That's a 64% reduction compared to the uncompressed format.
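Same arithmetic as before:

```python
print(f"Inference on Load: {1 - 98 / 272:.0%} smaller than uncompressed")  # -> 64%
```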

All of this would be for nothing if the texture reconstruction was rubbish, but as you can see in the gallery below, RTXNTC generates texels that look almost identical to the originals. Even Inference on Load looks the same.

Of course, this is a demonstration, and a simple beta one at that; it's not even remotely like Alan Wake 2 in terms of texture resolution and environment complexity. RTXNTC isn't suitable for every texture, either, as it's designed for 'physically based rendering (PBR) materials' rather than a single, basic texture.

It also requires cooperative vector support to work this quickly, and that's essentially limited to RTX 40- and 50-series graphics cards. A cynical PC enthusiast might be tempted to claim that Nvidia only developed this system to justify equipping its desktop GPUs with less VRAM than the competition.


But the tech itself clearly has lots of potential, and it's quite possible that AMD and Intel are developing their own systems to achieve the same result. Three proprietary algorithms for reducing texture memory footprints aren't what anyone wants to see, but if developers show enough interest in using them, one of them (or an amalgamation of all three) might end up as a standard part of DirectX and Vulkan.

That would be the best outcome for everyone, so it's worth keeping an eye on AI-based texture compression, because just as with Nvidia's other first-to-market technologies (e.g. ray tracing acceleration, AI upscaling), the industry eventually adopts them as the norm. I don't think this means we'll see a 20 GB version of Baldur's Gate 3 any time soon, but the future certainly looks a lot smaller.

If you want to try out the NTC demo on your own RTX graphics card, you can grab it directly from here. Extract the zip, head to the bin/windows-x64 folder, and run the ntc-renderer file.
