Actually, ray tracing is just getting started

(Image credit: Nvidia)

We're doing this then, are we? First-gen performance-tanking pretties are a waste of time... that's where we're at? People spending big bucks on Turing were always going to suffer from early adopter syndrome, but real-time in-game ray tracing had to start somewhere. And before you come at me asking why we need it in the first place, when devs have gotten mind-blowingly good at faking all manner of baked-in lighting effects, imma get to that in a minute.

RTX Off

We're not a hive mind here on PC Gamer—Alan has a very different take on this.

But trust that it had to happen. And Nvidia decided to be the company to take the hit, sacrificing the Turing generation of GPUs to the lords of the realistic photons. Luckily for Jen-Hsun, it had no real competition at the time.

AMD was still a long way from releasing Navi, or anything that could possibly compete with the RTX 20-series, and so Nvidia had the high-end graphics card market to itself. In the end there was no real hit to take; there was nowhere else for enthusiasts to pick up a new, more powerful GPU. And so Nvidia could price with impunity, and 20-series buyers became ray tracing's early adopters almost by default.

So why did real-time ray tracing need to happen in-game? For the same reason it's been used in movie studios for years: rapid iteration, and so companies don't have to spend time or money creating baked lighting effects that look almost as good as the real thing when some super-clever hardware can do it in real time by simulating the physics of light. When games were just long lines of corridors, faking lighting effects wasn't an issue, but now we've got vast open worlds, with dynamic day/night cycles and weather patterns, and faking all that lighting is a royal pain in some dev's ass.

When it all just works™, no-one will have to pre-bake lighting effects for game worlds, and development resources will be freed up to work on other areas. As the pitch for these sorts of tools always goes, it takes the painful grunt work away so devs can pursue other, more creative avenues. And it doesn't just potentially make development easier; ray traced effects genuinely look better, more like the real world.
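If you want the baked-versus-traced distinction made concrete, here's a deliberately toy sketch (every name, number, and shape in it is invented for illustration, nothing like a real renderer): a baked lightmap is a lookup frozen at build time, while a traced shadow is a question answered fresh every frame, so it stays correct when the sun or the scenery moves.

```python
import math

# Baked approach: light values are computed offline for one fixed sun
# position and stored per surface. Reads are nearly free, but the answer
# is frozen at build time. (Made-up surfaces and values.)
BAKED_LIGHTMAP = {"crate_top": 0.9, "alley_floor": 0.2}

def baked_shade(surface):
    return BAKED_LIGHTMAP[surface]  # wrong the moment the sun moves

# Traced approach: fire a shadow ray from the surface point towards the
# light every frame and see whether anything blocks it.
def segment_hits_sphere(p, q, centre, radius):
    """True if the segment p->q passes through the sphere."""
    d = [q[i] - p[i] for i in range(3)]
    f = [p[i] - centre[i] for i in range(3)]
    a = sum(x * x for x in d)
    b = 2 * sum(f[i] * d[i] for i in range(3))
    c = sum(x * x for x in f) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return False
    t = (-b - math.sqrt(disc)) / (2 * a)
    return 0.0 <= t <= 1.0

def traced_shade(point, light_pos, occluders):
    for centre, radius in occluders:
        if segment_hits_sphere(point, light_pos, centre, radius):
            return 0.1  # in shadow: ambient only
    return 1.0  # fully lit

# The same floor point goes from lit to shadowed as a crate slides over
# it; the baked value would never notice the crate at all.
floor, sun = (0.0, 0.0, 0.0), (0.0, 10.0, 0.0)
print(traced_shade(floor, sun, occluders=[]))                  # 1.0
print(traced_shade(floor, sun, occluders=[((0, 5, 0), 1.0)]))  # 0.1
```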

Cyberpunk 2077 will give RTX some love for sure. (Image credit: CD Projekt Red)

That's the future: ray traced effects as a ubiquitous standard in gaming, whether on PC or console, until we end up at a point where no-one even mentions it because it's simply called 'game lighting'.

But today, especially in this first generation of RTX GPUs, ray tracing is a resource hog. The computational demands are brutal, especially when you're going all-out on the fanciest of effects. Sure, some soft shadows, effective reflections, or a little gloomy global illumination here and there isn't going to change the world, but there's a realism to these effects that I really notice when they're missing.
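To put a rough number on 'brutal' (these are my own back-of-envelope assumptions, not anything from Nvidia), consider how quickly even a modest ray budget scales at 4K:

```python
# Back-of-envelope sums with assumed, deliberately modest numbers.
width, height = 3840, 2160   # 4K output
fps = 60                     # the frame rate we actually want
rays_per_pixel = 2           # say, one shadow ray plus one reflection ray
bounces = 2                  # each ray may spawn secondary rays

rays_per_second = width * height * fps * rays_per_pixel * bounces
print(f"{rays_per_second / 1e9:.1f} billion ray tests per second")
# ~2.0 billion, and each one is an incoherent scene query of exactly
# the kind GPUs were never traditionally built for.
```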

It's like I now see all the tricks devs have been pulling for years when I switch back to Battlefield V on my last-gen Nvidia card, or on my AMD RX 5700-powered office rig. To be honest, it's kinda like when I first tried G-Sync and suddenly saw the juddering I'd been unconsciously putting up with by enabling V-Sync all those years.

(Image credit: Nvidia)

But yeah, ray tracing isn't literally game changing, and I don't think it ever will be, at least not in terms of directly affecting gameplay. Ray tracing is about making a world more real, more immersive, and sadly the difference that turning on those effects makes is almost always subtle. Minecraft with RTX notwithstanding… The real problem right now is that turning on those subtle effects means a performance hit that isn't commensurate with the visual improvement for the end user, often even on a $1,200 RTX 2080 Ti.

Here's hoping Nvidia's Ampere architecture, and the RTX 3090, live up to the promise of negating the performance hit of enabling DirectX Raytracing effects. Now, I'm not saying just buy it, but I wouldn't be surprised if ray tracing performance gets a serious uplift in the next month. Just probably still not $1,400 worth of uplift, eh?

But real-time ray tracing is not a scam, it's just still early days for something that will eventually just be an accepted part of gaming. 

(Image credit: Nvidia)

The following copy was originally published as a separate counterpoint article, but we're adding it here so that everyone gets both sides of the argument in one place.

Ray tracing has failed to deliver on its promise

Ray tracing was the great hope for Nvidia's current generation of graphics cards, and the main justification for their high prices. Yet here we are at the dawn of the next generation, and I'm still waiting for a ray tracing game that I actually give a damn about.

Don't get me wrong, there have been some impressive examples of the tech: the reflections of Battlefield V, the shadows of... erm... Shadow of the Tomb Raider, and the global illumination of Metro Exodus are all good. But they're not jaw-droppingly awesome.

We even got one game that went all out to include as many of these effects as possible—I am of course talking about Control, which genuinely made for some beautiful scenes. Even here though, I didn't feel like ray tracing was actually doing much to improve, or even change, gameplay. 

And so, nearly two years after the release of the first RTX-capable card, the insanely expensive Nvidia GeForce RTX 2080 Ti, there still isn't anything that stands out as a must-have ray tracing experience. The biggest problem for ray tracing is that games have become remarkably good at faking what we see; every title mentioned above still looks great without the ray tracing cleverness.

Metro Exodus is a case in point. The engine has a lot of work to do when you enable global illumination, and while it looks great, the performance hit is significant. Turn it off and it's still a really good looking game. It helps that you spend a lot of time in the dark, of course, but even so, games are just really good at convincing us that we're in a real world.

The same is true in Shadow of the Tomb Raider. Yes, the ray traced shadows are really good, particularly in the party scene at the beginning of the game, but even here the non-ray traced shadows are fine. Run the game at the highest quality settings and you'll get a great visual experience, even if the shadows aren't always quite 'perfect'. More importantly, at no time did I feel the quality of the shadows impacted gameplay.

Shadow of the Tomb Raider's shadows look pretty good even if you turn the RTX cleverness off. (Image credit: Eidos)

One of the reasons for this is obviously that the market for ray tracing hardware is small compared to the PC market as a whole. Why would anyone create a game that the vast majority of people couldn't experience? You could argue that's exactly what Valve did with Half-Life: Alyx, but that's something of a special case. Although, as an aside, it is the one game I've played since the release of the RTX 2080 Ti where I genuinely wished it did have some ray traced reflections, shadows, and the like.

There's a problem there though, and that problem is the performance hit that calculating all those rays introduces. As Turing was the first generation of GPUs to feature real-time ray tracing hardware, it's reasonable that not everything runs optimally. Even so, the performance hit is significant, especially at higher resolutions, where even a $1,200 RTX 2080 Ti struggles to keep up.

Nvidia has a solution to this in the form of DLSS, which offsets the performance hit by rendering at a lower resolution and then upscaling using machine learning. The initial implementation suffered from numerous artefacts, but the second iteration, as found in the likes of MechWarrior 5: Mercenaries, Control, and Wolfenstein: Youngblood, does a much better job. In fact, DLSS 2.0 is one of the saving graces of this whole generation, but that's a piece for another day.
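For the curious, the core idea is simple enough to sketch. What follows is purely conceptual: the stand-in renderer and the dumb nearest-neighbour upscale are my own inventions, where the real DLSS uses a trained neural network fed with motion vectors and previous frames. The point is only the arithmetic: shade a quarter of the pixels, then reconstruct the rest.

```python
# A conceptual sketch of the DLSS idea, not Nvidia's implementation:
# shade fewer pixels, then let an upscaler fill in the gap.

def render(width, height):
    """Stand-in renderer; the point is that cost scales with pixel count."""
    return [[(x * y) % 256 for x in range(width)] for y in range(height)]

def upscale(image, factor):
    """Nearest-neighbour resize, standing in for the trained ML model."""
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in image for _ in range(factor)]

# Native 4K means shading 3840x2160 pixels every frame. The DLSS-style
# path shades 1920x1080 (a quarter of the work) and upscales 2x.
low_res = render(1920, 1080)
output = upscale(low_res, 2)
assert len(output) == 2160 and len(output[0]) == 3840
```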

Control looks pretty good even without all its RTX ray tracing cleverness. (Image credit: Remedy Entertainment)

The whole narrative that ray tracing is the future of gaming seems to have gone quiet lately too: we've had Call of Duty: Modern Warfare, Deliver Us the Moon and a few other minor titles, but very little so far in 2020. That is set to change with one of the most anticipated titles of the year, Cyberpunk 2077, and Vampire: The Masquerade - Bloodlines 2 could also do wonders, whenever that actually gets released. I'm still hopeful that Atomic Heart will pull something special out of the bag as well, but by the time these games hit, we'll probably be on the second generation of ray tracing cards.

To be fair, Nvidia hasn't been idle. It tried to inject some path traced loveliness into the veritable classic that is Quake II. Quake II RTX took the 1997 masterpiece and slashed framerates in order to add a few pretties to its boxy environments. Regardless of how rose-tinted your glasses may be, Quake II has not aged well, and you absolutely cannot convince me otherwise. It may have been great at the time, but the 23 years since its release have not been kind. Adding ray tracing to it seemed, frankly, ridiculous.

This made for one of those strange moments where I felt I was living in a completely separate reality to everyone praising it. I'm sorry, but giving bullets a subtle glow as they blaze down corridors is not something I've been crying out for. Admittedly, the water looked alright, and obviously much better than it did all those years ago, but that hardly had me rushing back to find keys, collect ammo, and shoot eight polygons masquerading as the Strogg.

Minecraft's ray tracing has potential, but being limited to prebuilt maps puts me off. (Image credit: Mojang)

Another classic getting the path tracing treatment is Minecraft. Minecraft with RTX seemed to have so much potential. The various screenshots and videos that were released prior to its Beta launch had me genuinely excited. The shafts of light coming through trees, the delightfully rendered water, and the glow of lava all made for an upgrade to everyone's favourite block-based game that genuinely seemed worth waiting for. 

And then it launched, with its limited maps and the now customary performance hit, and my interest dwindled almost instantly. It could still turn out to be something, but I'm not convinced.

There is the hope that the consoles getting ray tracing support will help things here, especially if you're looking forward to Spider-Man having a perfect reflection in a puddle. The fact that AMD and Intel will be joining the party should help as well, although we really need these to be more than half-arsed implementations if the technology is going to move forward. I fear AMD's and Intel's efforts will just be checkboxes on a feature list.

Frustratingly it's probably Nvidia that will drive ray tracing forward more than anyone, and charge us all a small fortune for the privilege. If the rumours for the RTX 3090 (or whatever it's called) are to be believed, and Nvidia is asking $1,400-$1,600 for a graphics card, then I need more than the promise of ray tracing. I need ray tracing that makes a difference.

Alan Dexter

Alan has been writing about PC tech since before 3D graphics cards existed, and still vividly recalls having to fight with MS-DOS just to get games to load. He fondly remembers the killer combo of a Matrox Millennium and 3dfx Voodoo, and seeing Lara Croft in 3D for the first time. He's very glad hardware has advanced as much as it has, though, and is particularly happy when putting the latest M.2 NVMe SSDs, AMD processors, and laptops through their paces. He has a long-lasting Magic: The Gathering obsession but limits this to MTG Arena these days.