Red Dead Redemption 2 is finally here on PC, and it has a ton of graphics settings to play with. It also has stability issues and requires a fast CPU (or a workaround to help eliminate stuttering)—it's been a rough launch on PC for many players. Your PC might be in need of an upgrade to run it well, in other words, and Black Friday deal season should be an ideal time for some holiday purchases. There's also the important choice between the DX12 and Vulkan graphics APIs, not to mention performance across popular CPUs and GPUs. It's a lot to cover, so let's get to it.
[Note: After posting this initial look at RDR2 performance, a patch has arrived that aims to improve performance, particularly minimum fps on certain CPU and GPU combinations (notably 4-core/6-core CPUs with Nvidia GPUs). However, there remain some clear issues, and more patches are forthcoming. I'll be retesting cards at some point and updating the charts and text, but the general observations about settings and performance remain true.]
A word on our sponsor
As our partner for these detailed performance analyses, MSI provided the hardware we needed to test Red Dead Redemption 2 on a bunch of different AMD and Nvidia GPUs, multiple CPUs, and several laptops. See below for the full details, along with our Performance Analysis 101 article. Thanks, MSI!
Looking at the PC features, the list of graphics settings is good if perhaps a bit overkill (see below). Resolution support is good—I was able to select widescreen, ultrawide, and doublewide resolutions, as well as old school 4:3 stuff like 1024x768. Except RDR2 doesn't properly handle those in fullscreen mode.
Borderless windowed and windowed modes are fine, but in fullscreen mode RDR2 simply stretches the chosen resolution to fill your monitor's native aspect ratio. So 2560x1080 on a 2560x1440 display looks wrong fullscreen but looks fine in windowed mode. Conversely, 1920x1080 on a 2560x1080 monitor in fullscreen mode also looks wrong. It's not terrible, and normally you'd use your display's native aspect ratio regardless, but RDR2 gets a yellow on aspect ratios. The FOV, meanwhile, can be adjusted for both first- and third-person cameras, but the range is quite limited, so it's another yellow.
Controller support and remapping the controls get green happy faces. The latter is pretty much required unless you have extra fingers and appendages, or like the brain-bending camera-relative horse controls for some reason.
Official mod support isn't really a thing, but there are already mods available for the singleplayer campaign, and more are likely to show up. Rockstar is taking the same stance as with GTA5: no mods for multiplayer or you might get banned, but for singleplayer use mods are generally okay. Just be careful if you're toying with mods and then launch Red Dead Online—you'll want to remove any extra files first.
One final piece of good news is that, like Grand Theft Auto 5, Red Dead Redemption 2 is officially component agnostic. Whether your graphics card comes from AMD or Nvidia, or your CPU comes from AMD or Intel, RDR2 generally doesn't care—Rockstar doesn't have a stake in the hardware vendor game. That doesn't mean all CPUs and GPUs are guaranteed to run flawlessly (more on this in a moment), and I encountered quite a few crash-to-desktop events in the testing that I've completed. But at least it wasn't specifically designed to favor one component vendor.
Red Dead Redemption 2 settings overview
Like GTA5, RDR2 has no presets for graphics. The game will attempt to auto-detect settings that it deems appropriate for your hardware, but you'll almost certainly end up wanting to tweak things. There's a slider labeled "Quality Preset Level" that might seem like a good starting point, but it has 21 tick marks available, many of them apparently overlap, and it has nebulous targets: 'favor performance,' 'balanced,' and 'favor quality.' The issue is that one PC's 'balanced' settings won't always be the same as another PC's, and many of the advanced settings (which are locked by default) get set to different values depending on your CPU and GPU.
Figuring out exactly how to reliably benchmark RDR2 took a bit of trial and error, but I've got that sorted out now. The simple solution is to manually configure every setting for each hardware combination I test, after cranking the variable "Quality Preset Level" to minimum—annoying, but it could be worse. I've included a large selection of current graphics cards and CPUs, and it's enough data to show that performance is clearly a problem right now.
I've standardized on the Vulkan API, which generally gives higher average fps but lower minimum fps (for now). Since I'm using a fast CPU for the graphics card testing, at least I'm not getting hit with massive stutters that require workarounds (see above boxout). DX12 can in some cases deliver more stable framerates (ie, better minimum fps), but Vulkan is the default API and indications are that Rockstar's focus will be on fixing Vulkan first. I've got some additional API test results below for those who are interested, but if you're using Vulkan and performance seems bad, try DX12—and vice versa.
Also note that 3GB cards can't even attempt to run with all settings at ultra, and 2GB cards are limited to low on many settings. If you have an older GPU with only 1GB or 1.5GB VRAM, I'm not sure what will happen, but you'll probably be locked into whatever settings the game decides to use. GTA5 has an "ignore memory limits" option, but RDR2 doesn't have an equivalent—presumably to mitigate instability—and won't allow you to exceed your GPU's VRAM.
If you just stick with the default settings on a 1080p display (or drop to 1080p on a higher resolution monitor), you'll probably be okay. And by okay, I mean that if you have at least a GTX 1060 / RX 570 or faster graphics card with 4GB VRAM, you can probably run RDR2 at 1080p and get 30-60 fps. Unless your CPU is a problem, or some other software is a problem, or the game keeps crashing, or … you get the point. I haven't had too many problems on my test PCs, but then I've been running benchmarks more than playing the game. Regardless, if you want to know how the various GPUs and CPUs stack up, that's what I'm here for.
Running through Red Dead Redemption 2's graphics settings, there are about 40 different options to adjust. As a baseline measurement of performance, I drop the Quality Preset Level to minimum, then unlock the advanced graphics settings and set everything in there to minimum. Then I set the main options to maximum quality but leave MSAA off. That's the starting point for my "ultra quality" and the above settings performance charts on the RTX 2060 and RX 5700. The difference between maximum and minimum on the Quality Preset Level when customizing all the other settings isn't massive, but it's measurable and it's best to be sure.
You can see specific image shots of my test settings in the above gallery, if you're interested. (Note that the latest patch has made a few minor changes in the advanced settings.) Let me also define my low, medium, and high benchmark settings while I'm here. Low is simple: take the settings above, but set everything in the top section to low/off/minimum. Medium and High use the medium and high values for the primary settings, with 2x and 8x anisotropic filtering, respectively (but leave MSAA and FXAA off). Again, images of all of these are in the above gallery.
There. Now we can finally talk about what settings actually matter, as well as performance.
Most of the settings only cause a minor dip in framerates. Reflection Quality and Volumetrics Quality are the two major settings to adjust if you're looking to improve framerates. Global Illumination Quality, Shadow Quality, Screen Space Ambient Occlusion, and Texture Quality can also provide a modest boost to performance—though lower resolution textures are very noticeable and I'd leave them at ultra or at least high on any GPU with 4GB or more VRAM. I'd also leave SSAO on medium or higher if possible.
In terms of advanced settings that can reduce performance, enabling 4x MSAA causes more than a 25 percent drop in framerates. Enable MSAA at your own peril—your GPU almost certainly can't handle it. Unless you're reading this in 2025, in which case, I hope your RTX 5080 or RX 8700 or whatever is awesome. In contrast, TAA is more than sufficient and causes an imperceptible 1 percent dip. Likewise, enabling 4x Reflection MSAA causes performance to drop around 8 percent, and it's not an effect you're likely to notice while playing.
Elsewhere, setting Parallax Occlusion Mapping Quality to ultra can cause a modest 4-5 percent drop in performance. Depending on your GPU, the Tree and Grass LOD settings can also drop performance a few percent—but setting trees to max makes them look nicer and is probably worth the hit. And finally, setting Soft Shadows to ultra reduces performance a few percent but is worth considering.
The remaining advanced settings mostly cause a very small dip in performance—1-2 percent at most each. It adds up if you crank everything to max, but individually the various options don't matter much, and based on my testing that's basically true of every setting I didn't specifically call out: 22 different options that each change performance by 1-2 percent at most. Maybe some of those settings matter more on an older or slower GPU, as I only checked the 2060 and 5700, but if you're at that point you should probably just drop the resolution or use resolution scaling first.
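Those small per-setting costs compound multiplicatively rather than adding up, which is worth keeping in mind. A quick back-of-the-envelope check (assuming each setting's cost is independent, which is a simplification) shows why the measured ~15 percent total for maxing the advanced settings implies most of them cost well under 1 percent each:

```python
# Back-of-the-envelope: small per-setting costs compound multiplicatively.
# If each of 22 advanced settings cost a full 2 percent on its own,
# enabling all of them would cost far more than the ~15 percent measured
# total, which implies the typical per-setting cost is well under 1 percent.

def compound_loss(per_setting_cost, n_settings):
    """Fractional fps loss from n independent settings, each costing
    per_setting_cost (e.g. 0.02 for a 2 percent hit)."""
    return 1.0 - (1.0 - per_setting_cost) ** n_settings

def implied_per_setting(total_loss, n_settings):
    """Average per-setting cost implied by a measured total loss."""
    return 1.0 - (1.0 - total_loss) ** (1.0 / n_settings)

worst_case = compound_loss(0.02, 22)      # ~0.36: a 36 percent drop
typical = implied_per_setting(0.15, 22)   # ~0.007: roughly 0.7 percent each
```

In other words, if every advanced setting really cost 2 percent, maxing them all would drop performance by about a third, not 15 percent.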
Overall, going from my ultra to low 'presets' improves performance substantially—more than double the fps in my testing—while increasing all of the advanced settings from the defaults (the 'maximum' setting at the bottom of the settings charts) causes a modest 15 percent loss of performance. No single GPU is currently able to maintain a steady 60 fps at 4K ultra, and even 1440p ultra is a stretch, so I've only tested those resolutions at high quality settings.
Red Dead Redemption 2 system requirements
The official RDR2 system requirements are pretty tame. Rockstar lists some relatively old hardware for its minimum recommendation, but given the amount of crashing and other problems users have reported, you should probably err on the side of higher-end components. Rockstar also doesn't state what level of performance you should expect, and I'm guessing it's 30 fps at 1080p low with the minimum setup, while the recommended PC hardware is probably aiming for 30 fps or more at 1080p high. Either way, you're going to need a lot of storage space.
Minimum PC specifications:
- OS: Windows 7 SP1
- Processor: Intel Core i5-2500K / AMD FX-6300
- Memory: 8GB
- Graphics Card: Nvidia GeForce GTX 770 2GB / AMD Radeon R9 280 3GB
- Storage Space: 150GB
Recommended PC specifications:
- OS: Windows 10 April 2018 Update (v1803 or later)
- Processor: Intel Core i7-4770K / AMD Ryzen 5 1500X
- Memory: 12GB
- Graphics Card: Nvidia GeForce GTX 1060 6GB / AMD Radeon RX 480 4GB
- Storage Space: 150GB
Those specs don't look too bad, but the CPU specs in particular are suspect—or at least, a workaround was required to eliminate lengthy stalls and stuttering on lower end CPUs. There's a patch that partially addresses the problem, but right now having a PC that greatly exceeds the minimum specs is a good idea. Especially if you're hoping for a smooth 1080p high at 60 fps, in which case you're probably looking at an RX 5700 or RTX 2060 Super with a 6-core/12-thread CPU or better.
Red Dead Redemption 2 graphics card benchmarks
PC GAMER RDR2 TESTBED
That brings us to actual performance, and I continue to use my standard testbed for graphics cards (see the boxout to the right). Red Dead Redemption 2 includes its own benchmark tool, which was used for all of the benchmark data. The built-in benchmark runs through five scenes, the first four of which are fairly static and don't really represent areas of the game where slowdowns are likely to occur or matter. Each lasts about 20-25 seconds and none are particularly demanding, while the final sequence is a 130 second robbery followed by a horse ride through town, with some shooting—a much better test sequence that's more representative of play.
I'm collecting frametimes from the last portion, using FrameView (an Nvidia variant of PresentMon). Each GPU is tested multiple times to verify the results, though variability between runs is relatively small. Needless to say, I've watched the benchmark a few too many times already. At one point, Arthur fires off up to 14 shots from his six-shooter without reloading—because he's overclocked I guess. Anyway, the benchmark only looks at performance in one area of the game. Other areas will perform better, some will perform worse, but it at least gives a reasonable baseline measurement of the performance you can expect.
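For anyone curious how numbers like "97 percentile minimums" fall out of a frametime log: FrameView and PresentMon record the milliseconds between presented frames, and the statistics are derived from that list. Here's a minimal sketch of the math (the sample numbers are illustrative, not measured data, and this is my own simplified calculation rather than FrameView's exact method):

```python
# Sketch: derive average fps and a "97th percentile minimum" from
# per-frame times (milliseconds between presents), as logged by
# PresentMon-style tools. Sample data below is illustrative.

def fps_stats(frametimes_ms):
    """Return (average fps, 97th percentile minimum fps).

    Average fps = total frames / total seconds.
    The "minimum" is the frame rate at the boundary of the slowest
    3 percent of frames: sort frametimes, take the 97th percentile
    frametime, and convert back to fps.
    """
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s

    ordered = sorted(frametimes_ms)  # fastest to slowest frames
    idx = min(len(ordered) - 1, round(0.97 * (len(ordered) - 1)))
    p97_ms = ordered[idx]
    min_fps = 1000.0 / p97_ms
    return avg_fps, min_fps

# Mostly 16.7 ms frames (60 fps) plus a handful of 33.3 ms hitches:
sample = [16.7] * 95 + [33.3] * 5
avg, p97_min = fps_stats(sample)  # ~57 fps average, ~30 fps minimum
```

This is why a run can average near 60 fps while the reported minimum sits far lower—a few slow frames barely move the average but dominate the percentile.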
All of the discrete GPU testing is done using an overclocked Intel Core i7-8700K with an MSI MEG Z390 Godlike motherboard, using MSI graphics cards. AMD Ryzen CPUs are (or at least will be) tested on MSI's MEG X570 Godlike, except the 2400G which uses an MSI B350I board since I need something with a DisplayPort connection. MSI is our partner for these videos and provides the hardware and sponsorship to make them happen, including three gaming laptops: the GL63 with RTX 2060, GS75 Stealth with RTX 2070 Max-Q, and GE75 Raider with RTX 2080.
I used the presets I defined earlier, along with the latest AMD and Nvidia drivers available at the time for testing: AMD 19.11.1 and Nvidia 441.12, both of which are game ready for Red Dead Redemption 2. I will eventually test—or at least try to test—Intel and AMD integrated graphics at 720p low. I'm not holding my breath that the former will work.
At low / minimum quality, Red Dead Redemption 2 looks okay, but the texture quality is really poor and the world in general looks very bland and blurry. There's still plenty of geometry and objects to pretty things up, and distant surfaces look okay, but anything close to the camera starts to look like it has textures from the original Deus Ex. There's a massive difference between low, medium, and high texture quality, and a modest difference between high and ultra.
Even at minimum quality settings, performance is nothing special. The GTX 1060, in both 3GB and 6GB variants, can average 60 fps, and so can the RX 570, but anything slower is going to struggle. Cards like the GTX 1050 only hit 40 fps, and with just 2GB of VRAM, many settings can't be raised any higher anyway.
If you're looking at the chart and wondering what the hell is going on with minimum fps, you're not alone. There appears to be a bug right now (which still exists in the latest patch, indicated by the "2080 Ti new" result) where higher average fps ends up causing more framerate instability. Basically, anything averaging over 120 fps starts to get a bit sketchy. But you wouldn't want to run minimum quality settings on any of the GPUs above the 1060 level regardless.
Besides the oddities with the fastest GPUs, where minimum fps actually improves as the settings are increased, there are also problems with budget and midrange cards. Minimum fps on such GPUs can fall well below 60, and Rockstar's engine is definitely not built to hit high framerates. The fastest cards can just barely break 144 fps, but dips into the sub-100 fps range are plenty common.
AMD GPUs do better on both averages and minimums for a change. The RX 570, for instance, nearly stays above 60 fps, with a 58 fps minimum in the benchmark. The 1060 6GB in contrast has 97 percentile minimums of just 48 fps. After the sketchy launch performance for AMD GPUs in Ghost Recon Breakpoint and The Outer Worlds, both AMD-promoted games, I wouldn't have expected AMD to come back swinging in RDR2. Then again, it's been out for a year on consoles, which use AMD hardware, and it does use low-level APIs that traditionally have favored AMD. Either way, it's a nice change of pace for Team Red (Dead).
Bumping everything up to medium quality (except the advanced options, as noted earlier), performance drops about 15-20 percent on the slower cards, while the fastest cards are still mostly CPU limited. Also notice how minimum fps on some GPUs (eg, 2080 Super and 2080 Ti) is better at 1080p medium than 1080p low, and it will be better still at 1080p high. Yeah, there are some performance anomalies with RDR2 on fast PC hardware.
As far as image quality goes, even the medium quality textures still don't look great up close, but there's a definite improvement vs. minimum quality overall.
AMD GPUs continue to lead their closest Nvidia counterparts as well—the 570 is 13 percent faster than the 1060 6GB, and 29 percent faster than the 1060 3GB. For reference, there are many games where the 1060 3GB actually comes out ahead of the 570. It's a bit ironic to see AMD GPUs perform this well in a game that's supposedly vendor agnostic, and perhaps drivers and patches will change things, but this is how things stand right now.
Of the cards I've tested so far, the RX 570 and GTX 1060 6GB still clear 60 fps averages, though minimum fps is far below that. A wide gap between the average and 97 percentile average framerates usually indicates plenty of stuttering/micro-stuttering and framerate dips, which is definitely happening in RDR2. To smooth things out, you'll want at least GTX 1660 Ti (a 1070 should also suffice) or RX Vega 56 level hardware—and a fast CPU, but more on that below.
Switching to high quality settings drops performance another 20-25 percent relative to medium, unless you're on an ultra-fast card like a 2080 Ti. This is probably as high as most people should go on current hardware, reserving ultra quality for the future. It's not like the slight change in ultra quality reflections and volumetrics is really noticeable.
Finally, the GPU rankings and minimum fps start to make more sense at 1080p high. As I mentioned above, there appears to be some poor coding where faster fps results in more stuttering and framerate instability. That's probably due to the console origins, where fps was never going to get much above 60, never mind 120. At 1080p high and beyond, average framerates are low enough that the problem is mostly gone.
The newer Nvidia Turing and AMD Navi architectures offer some clear advantages. Notice how the GTX 1650 beats the 1060 3GB and comes relatively close to the 1060 6GB? The same goes for AMD's RX 5700 series compared to the Vega and Radeon VII.
Hitting 60 fps without high-end hardware gets a bit more difficult, with the RX 590 and GTX 1660 Ti getting there but, again, with relatively poor minimum fps. (I should note that the GTX 1070 was tested after the latest patch, so minimums may have improved on other GPUs as well.) To get a steady 60 fps for minimums as well as averages, you're looking at the RX 5700 or RTX 2060 Super—the vanilla RTX 2060 falls just a hair short (pre-patch).
These are also the last settings where I can test the 4GB cards, as ultra quality requires a bit too much VRAM. Actually, I can still do 1440p at high quality on 4GB cards, but first let's look at 1080p ultra.
Ultra quality is simply too demanding for most of today's graphics cards. The difference in visual fidelity is also pretty small—slightly better textures, lighting, shadows, etc. And the settings aren't even fully maxed out in my tests, as there are several advanced options that can still be cranked up and drop performance another 10-15 percent.
Sure, the RTX 2080 Ti can still handle 1080p ultra at more than 60 fps, and a handful of other GPUs will average 60 fps as well, but minimums are going to be lower. Otherwise there's not much to say here. If you want to try pushing one or two options to ultra, that's fine. Just leave reflections and volumetrics quality at high or even medium, because you don't really need them. The discernible difference between each level is minimal.
1440p at high settings is actually less demanding than 1080p at ultra, which makes it potentially viable for the high-end cards. The problem is maintaining 60 fps at 1440p, as usual.
AMD's minimum fps are generally worse than the top Nvidia cards now, though the 6GB 2060 also looks pretty weak (pre-patch). Of the cards I've tested so far, only the 2070 Super and above keep minimums above 60 for Nvidia, while AMD doesn't have any GPUs that manage a steady 60+ fps. Average fps still favors AMD on most matchups, however, with the 5700 XT beating the 2070 Super by a hair.
If you're only looking to hit 30 fps, the GTX 1660 Ti and above should be fine, and maybe even the RX 590. The RX 570 4GB does average 40 fps, but the dips into the low 20s are definitely noticeable and something I wouldn't recommend. Nvidia's 1060 likewise feels very choppy—maybe 1440p medium would be okay, but high quality isn't.
Finally, 4K at high quality is as far as I tried to push things. Ultra quality drops performance about 25-35 percent, depending on your GPU, which means nothing comes close to averaging 60 fps at maximum quality and 4K in RDR2. But 4K high still looks crisp and clean, and at least one GPU—the RTX 2080 Ti—can average more than 60 fps. But that average comes with a 43 fps minimum (52 fps with the latest patch) indicating plenty of dips, unfortunately.
I'm reminded of my early GTA5 testing, where I maxed out everything including the advanced settings. Back then, the GTX 980 Ti was the king of the graphics cards, but 4K and max quality on a single GTX 980 Ti simply wasn't going to cut it. In fact, GTA5 at maxed settings (including 4xMSAA) plugged along at just 24 fps on the then-fastest GPU.
So if you're looking at RDR2 and wondering how not even the RTX 2080 Ti can handle 4K at maximum quality, this isn't really anything new. Of course, multi-GPU support was still more of a thing back in 2015, whereas SLI and CrossFire support is practically gone these days.
Shockingly (to me, anyway), RDR2 actually does have explicit multi-GPU support under Vulkan. I don't have the necessary hardware to test this (specifically, I need an NVLink connector), but I've been told by Nvidia that any NVLink enabled GPUs (2070 Super and above) will work, with the latest patch that just came out.
So if you have a pair of 2070 Super or faster GPUs, maybe playing RDR2 at 4K, maximum quality, and 60 fps is possible. Certainly it should be possible with dual 2080 Ti cards. But for the mere mortals, it's not happening.
After all the initial testing results, I also wanted to show some of my API testing numbers. The above gallery represents a lot of benchmarks, all to basically reach the conclusion that the choice of which API to use has no clear winner. Some cards at some settings do better with Vulkan, others do better at DX12. I suspect additional patches will change things quite a bit as well, but the above is how RDR2 was running in its initial state. Regardless of which API you choose, RDR2 has bugs and performance anomalies that need to be squashed.
Red Dead Redemption 2 CPU benchmarks
I've already mentioned stuttering on CPUs with lower core and thread counts, but it's now time to show just how bad things can get. As usual, I'm testing with the fastest consumer graphics card currently available, the RTX 2080 Ti, in order to show as much of a difference between the CPUs as possible. I've also run the Core i3-8100 with and without the stuttering workaround (limiting RDR2.exe to 98 percent of the CPU).
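The 98 percent limit itself was applied with a third-party CPU limiter, but a related, easily scripted approach is trimming the game's processor affinity so one logical core always stays free for the OS and background tasks. Here's a sketch of the affinity mask arithmetic—my own illustration of the idea, not Rockstar's or anyone's official fix; applying the mask to a live process on Windows would use something like psutil, which I deliberately don't do here:

```python
# Sketch of an affinity-based approximation of the stutter workaround:
# instead of capping RDR2.exe at 98 percent CPU (the original fix, done
# with a third-party limiter), leave one logical core out of the game's
# affinity mask so background work always has somewhere to run.
# The mask is a standard bitmask of logical CPUs; this code only
# computes it and does not touch any running process.

def reserved_affinity_mask(logical_cpus, reserve=1):
    """Bitmask selecting all logical CPUs except the last `reserve`.

    e.g. 8 logical CPUs, reserve 1 -> 0b01111111 (cores 0-6 usable).
    Always leaves at least one core usable.
    """
    usable = max(1, logical_cpus - reserve)
    return (1 << usable) - 1

mask_8t = reserved_affinity_mask(8)  # 0x7F: 7 of 8 threads for the game
mask_4t = reserved_affinity_mask(4)  # 0x07: 3 of 4 threads
```

Note that on a 4-thread CPU this reserves 25 percent rather than 2 percent, so it's a blunter instrument than the original limiter—but the goal is the same: stop the game from starving everything else on the machine.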
I'll be adding more Ryzen CPU results soon enough. I've tested the Ryzen 9 3950X, which does fine, but I'm holding off on additional tests for a bit. The main difficulties seem to be related to CPUs with fewer than eight threads, so a 4-core/8-thread CPU actually works better than a 6-core/6-thread CPU in this game, even if the latter is usually faster in other games. The latest patch is supposed to help address this (via a launch parameter for now).
Also note that the issues with faster hardware getting worse minimum fps is definitely a problem at 1080p low and medium settings. Basically, results are all over the map at those resolutions. Once things settle down, I'll look at retesting the CPUs. Until then, here are the CPU testing results, with some CPUs tested in both DirectX 12 and Vulkan modes:
The Core i9-9900KS mostly ends up at the top of the charts, but the choice of API is still a factor. Note that the Ryzen 9 3950X was tested after the latest patch (and I also retested the 8700K OC), so not all of the results are currently on equal footing. Anyway, right now I'd hold off on trying to draw any firm conclusions as to what CPUs are best for RDR2.
Much of the drop in performance from the 8700K to the 8400 comes from the lack of Hyper-Threading on the latter. I've tried several tweaks to improve the i5-8400 results, which are currently prone to stuttering at times, but I haven't found a magic bullet yet. It's weirdly inconsistent—at 1080p ultra it actually beat the 8700K OC in DX12 mode, and it's not clear why it does well in certain combinations and poorly in others.
That stands in contrast to the i3-8100, where minimum fps is so bad using the default settings as to render RDR2 almost entirely unplayable. But the stuttering fix does wonders, relatively speaking, and while minimum fps is still well below 60, it's better than the alternative of not playing at all. Maybe.
Keep in mind that the Core i3-8100 is going to be very similar to any of Intel's previous generation of Core i5 parts. It has four cores and no Hyper-Threading, which doesn't sit well with RDR2. A stuttering fix is pretty much required for anyone using this sort of CPU. Rockstar says the latest update helps with performance on such CPUs, but I haven't had time to retest yet.
Red Dead Redemption 2 laptop benchmarks
What about laptops? Considering the problems with CPUs with fewer cores and threads, I was a bit worried about how RDR2 would perform on the GL63. It has a 4-core/8-thread Core i5-8300H mobile CPU, with lower clocks than you'll typically see from the desktop i5-8400. The other two laptops have 6-core/12-thread Core i7-8750H processors, which should be less of a concern.
Turns out, my misgivings were mostly unwarranted. There were no massive stalls on the GL63, or any of the other laptops. On the other hand, the lower clockspeeds and fewer threads definitely won't help performance.
The mobile RTX cards aren't able to keep up with the desktop models. Part of that is because the mobile GPUs are clocked lower (especially the Max-Q variants), but the slower mobile CPUs are certainly a factor. Keeping minimum fps above 60 is going to be difficult on midrange gaming laptops, but if you go whole hog on something like the GE75 you should be okay.
The GL63 does fall a bit behind the GS75, but it's difficult to say whether that's the CPU or the GPU slowing it down. The 2060 and 2070 Max-Q usually perform about the same, but the latter also has 8GB VRAM, which can help.
All three of my test laptops from MSI are also equipped with 32GB of system RAM. That normally wouldn't matter for games, but RDR2 seems to push memory use beyond what I'd normally consider sufficient, so I wanted to check. As a quick test, I slapped an additional 16GB of RAM into my desktop and retested the 2080 Ti. At least in my testing, the extra system RAM didn't appear to be a factor, though it might help during longer play sessions.
Parting thoughts and port analysis
Desktop PC / motherboards / Notebooks
MSI MEG Z390 Godlike
MSI Z370 Gaming Pro Carbon AC
MSI MEG X570 Godlike
MSI X470 Gaming M7 AC
MSI Trident X 9SD-021US
MSI GE75 Raider 85G
MSI GS75 Stealth 203
MSI GL63 8SE-209
MSI RTX 2080 Ti Duke 11G OC
MSI RTX 2080 Super Gaming X Trio
MSI RTX 2080 Duke 8G OC
MSI RTX 2070 Super Gaming X Trio
MSI RTX 2070 Gaming Z 8G
MSI RTX 2060 Super Gaming X
MSI RTX 2060 Gaming Z 8G
MSI GTX 1660 Ti Gaming X 6G
MSI GTX 1660 Gaming X 6G
MSI GTX 1650 Gaming X 4G
MSI Radeon VII Gaming 16G
MSI Radeon RX 5700 XT
MSI Radeon RX 5700
MSI RX Vega 64 Air Boost 8G
MSI RX Vega 56 Air Boost 8G
MSI RX 590 Armor 8G OC
MSI RX 580 Gaming X 8G
MSI RX 570 Gaming X 4G
MSI RX 560 4G Aero ITX
Thanks again to MSI for providing the hardware for our testing of Red Dead Redemption 2. To put things bluntly, this has been a bungled launch on PC. The cynics among us will point at the delayed Steam release as proof that Rockstar knew the PC launch of RDR2 was premature. Even if Rockstar didn't know, it's surprising to see big stability problems in a marquee PC game as good looking as RDR2.
Check back next month when the Steam release arrives, and I won't be surprised if the stability and performance woes are a thing of the past. Also, it would be lovely if Rockstar just axed the first four scenes from the built-in benchmark; all they're doing for me (and others) is doubling the amount of time it takes to run my tests.
For now, RDR2 generally needs a good graphics card for 1080p at high settings, but just as important is a CPU with sufficient cores and threads. I'm looking to test more of AMD's Ryzen parts as well, and early indications are that the 6-core/12-thread and higher models are going to do fine in RDR2. Old AMD parts like the FX series wouldn't be my first pick, but then they never were—I haven't had an FX PC around for testing since the first Ryzen CPUs shipped, and I don't miss it at all.
Overall, AMD's graphics cards do quite well in RDR2, all things considered. Minimum fps at higher settings drops off, but for 1080p medium or high, particularly on midrange GPUs, the Radeon models definitely hold up better than the previous gen GTX cards.
Let me also talk briefly about the PC port of RDR2, which shows clear signs of having been designed for console hardware. There are several indications that RDR2 has code that makes assumptions about your PC's hardware, and those assumptions end up being wildly inappropriate at times. Two shining examples of this are the CPU stuttering problems and the minimum fps results on high-end GPUs.
The horrible initial performance on 4-core/4-thread and 6-core/6-thread CPUs is weird, considering the official minimum spec CPU is an i5-2500K. Why would an i5-8400 do so much worse than an i7-8700K, with seconds-long pauses at times? The PS4 and Xbox One both have 8-core AMD Jaguar processors, and it looks as though the PC code incorrectly spawns too many threads on CPUs with fewer than eight threads. Or at least it allocates processing time very poorly, which is why the original workaround was to limit RDR2 to 98 percent of CPU time.
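If the engine really does assume an 8-thread console CPU, the failure mode is easy to picture: a worker pool sized for eight hardware threads oversubscribes a 4- or 6-thread desktop part. Here's a hypothetical sketch of the difference between a hard-coded pool and one sized to the host—the function and constant names are mine for illustration, not anything from Rockstar's code:

```python
# Hypothetical illustration of the suspected bug: sizing a worker pool
# for the consoles' 8-thread Jaguar CPUs instead of the host machine.
# Names are my own invention, not Rockstar's.
import os

CONSOLE_WORKERS = 8  # PS4 and Xbox One both expose 8 hardware threads

def naive_pool_size():
    # What a straight console port might do: always spawn 8 workers,
    # oversubscribing a 4- or 6-thread desktop CPU.
    return CONSOLE_WORKERS

def adaptive_pool_size(logical_cpus=None):
    # What the PC build should do: size to the host, keeping at least
    # one thread free for the render/submit thread and the OS.
    if logical_cpus is None:
        logical_cpus = os.cpu_count() or 1
    return max(1, min(CONSOLE_WORKERS, logical_cpus - 1))

# On a 4-thread i3-8100: 8 workers vs 3. With 8, the workers fight each
# other for CPU time and any long-running task can stall a whole frame.
```

Eight workers on four threads means constant context switching and priority inversion, which lines up with the multi-second stalls people reported on 4-thread CPUs.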
The other clear indication of "console portitis" is the minimum fps results at 1080p low and medium. The RTX cards are all over the place, with the best minimum fps result coming from the 2070 and 2060 Super (95.8 fps), and the worst results are on the 2080 Super and 2080 Ti—and the latter gets worse after the latest patch, dropping from 86.7 in my initial testing to 75.5 fps now. Maybe it's the extra 3GB VRAM, maybe it's too much raw speed... or most likely it's some code that doesn't work very well with high framerates.
Considering RDR2 launched over a year ago on console, the current state of the PC launch is shocking. Especially in light of the fact that GTA5 continues to sell like hotcakes on Steam. It's not like Rockstar doesn't have the funds to do a proper port, or even the expertise.
But hey, at least Red Dead Redemption 2 made it out on PC right in time for Black Friday deals, which I'm sure had absolutely nothing at all to do with it releasing in its current state. And I fully expect patches and driver updates to improve the situation. Six months from now, hopefully PC gamers won't have to worry about using utilities or launch parameters to make RDR2 run okay.
Anyway, that's how things stand right now. Rockstar says it's working with hardware partners to improve things, and additional patches are on the way. That means these benchmarks may be more of a snapshot in time rather than something to look back on for months to come, but I'll cross that bridge when I come to it.