Ray tracing tested: Battlefield 5 shows how demanding Nvidia's new tech really is


Nvidia announced its Turing architecture and its halo feature of real-time ray tracing hardware acceleration in August and it sounded promising, but there was one major concern: When would we actually see games that could utilize the new features? We were told they would be coming soon, and there are 25 announced games already slated to use DLSS and 11 games that will use ray tracing in some fashion. But we've gone through the RTX 2080 Ti, RTX 2080, and RTX 2070 launches with not a single actual game we could use for testing. That all changed yesterday with the arrival of the DXR patch for Battlefield 5.

I've spent a good chunk of yesterday and today trying to get a handle on what ray tracing does—and more importantly doesn't do—in Battlefield 5. As the first publicly available test case, there's a lot riding on this game, and let's be frank: it's not the end-all, be-all of graphics and gaming excellence right now. In fact, the only thing ray tracing is used for in Battlefield 5 is improved reflections. They look good, better than the non-DXR mode certainly, and it's pretty cool to see mirrors, windows, and vehicle surfaces more accurately reflecting their surroundings. But this particular implementation isn't going to revolutionize the gaming industry.

And that's okay. I've been playing games since the '80s, and I've seen a lot of cool technologies come and go over the years. To put real-time ray tracing in proper perspective, I think we need to look at commonplace techniques like anti-aliasing, shadow mapping, hardware transform and lighting, pixel shaders, and ambient occlusion. When those first became available in games, the tradeoff in performance vs. image quality was often massive. But those were all important stepping stones that paved the way for future improvements.

Anti-aliasing started out by sampling a scene twice (or four times), and in the early days that meant half the performance. Shadow mapping required a lot of memory and didn't always look right—and don't even get me started on the extremely taxing soft shadow techniques used in games like F.E.A.R. Hardware T&L arrived in DirectX 7 but didn't see much use for years—I remember drooling over the trees in the Dagoth Moor Zoological Gardens demo, but games that had trees of that quality wouldn't come until Far Cry and Crysis many years later. The same goes for pixel shaders, first introduced in DirectX 8, but it was several years before their use in games became commonplace. Crysis was the first game to use SSAO, but the implementation on hardware at the time caused major performance difficulties, leading to the infamous "will it run Crysis?" meme.

I point these out to give perspective, because no matter the performance impact right now (in one game, basically in beta form), real-time ray tracing is a big deal. We take for granted things like anti-aliasing—just flip on FXAA, SMAA, or even TAA and the performance impact is generally minimal. Most games now default to enabling AA, but if you turn it off it's amazing how distracting the jaggies can become. And games without decent shadows look… well, they look like they were made in 2005, before shadow mapping and ambient occlusion gained widespread use. The point is that we had to start somewhere, and all the advances taken together can lead to some impressive improvements in the way we experience games.

Battlefield 5 initial performance

Okay, enough rambling. How does Battlefield 5 perform, with and without ray tracing? And how does it look? The answer to the first part is easier. I ran some benchmarks on the three RTX GPUs, along with a GTX 1080 Ti as a reference point for Nvidia's previous generation hardware. I planned to test a few more settings / GPUs, but got locked out of the game with a message about using too many different display adapters. Damn DRM.

All the testing was done on a Core i7-8700K running at 5GHz, my standard GPU testbed. I ran the DirectX 12 version, with and without DXR enabled. I also tested a couple of GPUs in DX11 mode, which performed better on all Nvidia GPUs. Sigh. For ease of testing (i.e., no server issues to worry about, no multiplayer messiness where I keep dying during testing, repeatability, etc.), most of my testing was done in a singleplayer mission. But I did try playing multiplayer for a bit as well.

I'll start with the multiplayer, just to get that out of the way. In its present state, with DICE already pointing out some bugs and known issues and talking about future patches, it's far too early to make a final evaluation. But things are seriously messy. On an RTX 2070 at 1080p ultra, sometimes performance was okay, hovering around 40-50 fps, but I got a lot of framerate drops, often hitching for up to half a second or more, and 97th percentile minimums were 7 fps! That's not even remotely playable if you're trying to be competitive, though this is a bug to fix rather than something that's inherently part of the ray tracing hardware. Sometimes I'd go 10-15 seconds without any major stalls, other times it would happen every second or two.

Also, to be clear, the hitching isn't solely a problem with DXR enabled. Using the same 2070 GPU but in vanilla DX12 mode, I still saw periodic stalls where a single frame would take 0.1 seconds or longer to render, and at least a few half-second stalls, all in a period of a few minutes. Overall framerates were more than double those of DXR mode, but only because DXR had so many drops—and even without DXR, 97th percentile minimums were below 30 fps. These stalls occurred occasionally in the singleplayer campaigns as well, but it was more like one frame every minute or two, and it wasn't consistent either. So yes, the game is still buggy and needs some work, and that's why these are only my initial impressions rather than a full performance analysis.
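If you're wondering how those 97th percentile minimums are calculated, here's a minimal sketch of one common approach (not the exact capture tooling I use): take the per-frame render times from a run, find the frame time that only 3 percent of frames exceed, and convert that to fps. A handful of long stalls is enough to drag the number way down, which is exactly what happened above.

```python
# Minimal sketch: derive an average fps and a 97th percentile minimum fps
# from a list of per-frame render times in milliseconds. This illustrates
# the metric; it's not the actual benchmark tooling used for this article.

def fps_summary(frame_times_ms):
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s

    # 97th percentile frame time: only 3 percent of frames are slower.
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(round(0.97 * (len(ordered) - 1))))
    p97_ms = ordered[idx]
    min_fps = 1000.0 / p97_ms
    return avg_fps, min_fps

# Hypothetical run: mostly ~16.7 ms frames with a few long stalls mixed in.
frames = [16.7] * 960 + [120.0] * 35 + [500.0] * 5
avg, p97 = fps_summary(frames)
print(f"average: {avg:.1f} fps, 97th percentile minimum: {p97:.1f} fps")
```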

Let's move over to the singleplayer testing, which is where I see ray tracing being far more useful in the near term anyway. In multiplayer, especially a fast-paced shooter like Battlefield 5, I don't think many competitive gamers will sacrifice a lot of performance for improved visuals. I might, but then I'm not a competitive multiplayer gamer. Singleplayer modes are a different matter, because the pace tends to be slower and a steady 60 fps or more is sufficient. I play most games at maximum quality, even at 4k, simply because I have the hardware to do so and I want the games to look their best. Battlefield 5's War Stories mode has some impressive graphics and often looks beautiful, and adding ray tracing for reflections improves the overall look. But here's how it performs:

There's good news and bad news, obviously. The good news is that all the people claiming Battlefield 5 would need to be at 1080p and only get 30 fps on an RTX 2080 Ti (based on the early preview we got in August) were too quick to jump to conclusions. With DXR enabled and at the ultra setting, the 2080 Ti runs 1080p at over 90 fps, and even minimums stay above 60 fps. The RTX 2080 likewise averages 78 fps, with minimums still above 60. Only the RTX 2070 comes up short of a steady 60 fps at 1080p, and at 58 fps it's not a huge deficit.

The bad news is that performance, just like with the earlier mentioned new technologies, takes a big hit. Framerates aren't quite cut in half on the 2080 and 2070 at 1080p and even 1440p, but they're close—and at 4k you do get less than half the framerate. Even the beastly $1,200 RTX 2080 Ti only averages about 40 fps at 4k ultra, though that's a bit like taking a sports car to a winding, mountainous pass and complaining that you could only average 40 mph through the S-curves. At the middle ground of 1440p, the 2080 Ti does break 60 fps again (for minimums as well), while the 2080 comes up just a bit short.

I did limited testing with some of the other settings for DXR reflection quality, but there are some bugs right now that can affect the medium and high modes. Nvidia recommends using the low setting, which improves performance on all GPUs by about 25-30 percent compared to ultra DXR reflections, but the low setting also tends to miss a lot of the reflections that are present in the higher quality modes. In other words, even after an extra month or more of working on the game and engine, this is still a work in progress. Pretty much standard fare for any cutting-edge technology, I suppose.

Initial thoughts

Battlefield 5 performance with DXR is not where I'd like it to be, especially in multiplayer. But high quality graphics at the cost of framerates doesn't really make sense for multiplayer in the first place. How many Fortnite or PUBG players and streamers are running at 4k ultra compared to 1080p medium/high? Improving lighting, shadows, and reflections often makes it more difficult to spot and kill the other players first, which is counterproductive. Until the minimum quality in games improves to the point where ray tracing becomes standard, most competitive gamers will likely give it a pass. But that doesn't mean we don't need or want it.

Ray tracing isn't some new technology that Nvidia just pulled out of its hat. It's been the foundation of high quality computer graphics for decades. Most movies make extensive use of the technique, and companies like Pixar have massive rendering farms composed of hundreds (probably thousands) of servers to help generate CGI movies. Attempting to figure out ways to accelerate ray tracing via hardware isn't new either—Caustic, for example (now owned by Imagination Technologies), released its R2100 and R2500 accelerators in 2013 for that purpose. The R2500 was only moderately faster than a dual-Xeon CPU solution at the time, but it 'only' cost $1,000 and could calculate around 50 million incoherent rays per second.

Five years later, Nvidia's RTX 2070 is able to process around six billion rays per second. I don't know if the two figures are even directly comparable, but we're talking a couple of orders of magnitude faster, for half the price—and you get a powerful graphics chip as part of the bargain. As exciting as consumer ray tracing hardware might be for some of us, this is only the beginning, and the next 5-10 years are where the real adoption is likely to occur. Once the hardware goes mainstream, prices will drop, performance will improve, and game developers will have a great incentive to make use of the feature.
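As a rough sanity check on that comparison—and assuming the two ray rate figures measure anything like the same workload, which is a big assumption—the back-of-the-envelope math looks like this:

```python
# Back-of-the-envelope comparison, assuming the two ray rate figures are
# even measuring the same thing (they may not be). Prices are approximate.
caustic_rays_per_s = 50e6     # Caustic R2500: ~50 million incoherent rays/s
caustic_price = 1000          # launch price in dollars
rtx2070_rays_per_s = 6e9      # Nvidia's ~6 giga rays/s figure for the RTX 2070
rtx2070_price = 500           # roughly half the R2500's price

speedup = rtx2070_rays_per_s / caustic_rays_per_s          # ~120x
per_dollar = speedup * (caustic_price / rtx2070_price)     # ~240x rays per dollar
print(f"~{speedup:.0f}x the rays per second, ~{per_dollar:.0f}x the rays per dollar")
```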

For now, Nvidia's RTX ray tracing hardware is in its infancy. We have exactly one publicly available and playable game that uses the feature, and it's only using it for reflections (and maybe refractions). Other games are planning to use it for lighting and shadows, but we're definitely nowhere near being able to ray trace 'everything' yet. Which is why DirectX Raytracing (DXR) is built around a hybrid rendering approach where the areas that look good with rasterization can still go that route, and ray tracing can be focused on trickier stuff like lighting, shadows, reflections, and refractions.
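To make that hybrid split concrete, here's a toy sketch in plain Python (not actual DXR or HLSL, and not how Frostbite does it): rasterization fills a G-buffer as usual, and reflection rays are only spawned for pixels whose material is reflective enough to need them.

```python
# Toy illustration of hybrid rendering: rasterization produces a G-buffer,
# and the expensive ray tracing path only fires for reflective pixels.
# This is just the control flow, not real graphics API code.

from dataclasses import dataclass

@dataclass
class GBufferSample:
    base_color: tuple      # result of ordinary rasterized shading
    reflectivity: float    # 0.0 = fully diffuse, 1.0 = mirror

def trace_reflection(x, y):
    # Stand-in for the expensive part: casting a ray into the scene and
    # shading whatever it hits. Returns a placeholder color here.
    return (0.2, 0.3, 0.8)

def shade_pixel(x, y, sample, reflection_threshold=0.05):
    # Cheap path: most pixels keep their rasterized color untouched.
    if sample.reflectivity < reflection_threshold:
        return sample.base_color
    # Expensive path: blend in a ray traced reflection for shiny pixels.
    reflected = trace_reflection(x, y)
    w = sample.reflectivity
    return tuple((1 - w) * b + w * r for b, r in zip(sample.base_color, reflected))

# Hypothetical two-pixel "frame": a matte wall and a puddle.
gbuffer = {(0, 0): GBufferSample((0.5, 0.5, 0.5), 0.0),
           (1, 0): GBufferSample((0.1, 0.1, 0.1), 0.9)}
image = {xy: shade_pixel(*xy, s) for xy, s in gbuffer.items()}
print(image)
```

The point is simply that rays are only cast where rasterization falls short, which is what makes the hybrid approach feasible on current hardware.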

It's good to at least have one DXR game available now, even if it arguably isn't the best way to demonstrate the technology. We're still waiting on the ray tracing patch for Shadow of the Tomb Raider, and DLSS patches for a bunch of other games. Compared to a less ambitious launch, like AMD's RX 590 for example, GeForce RTX and DXR have stumbled a lot. Of course, performance without ray tracing is still at least equal to the previous generation, so it's not a total loss. And compared to adoption rates for AA, hardware T&L, pixel shaders, and shadow mapping, having a major game use a brand new technology just months after it became available is quite impressive. Let's hope other games end up being more compelling in how ray tracing looks and affects the game world.