How to get the best performance out of PlayerUnknown's Battlegrounds

PlayerUnknown's Battlegrounds is one of the hottest multiplayer games right now, having spawned the battle royale genre and boasting concurrent player counts north of 1.5 million. However you play it, the game is a constant adrenaline rush. Unless, of course, your hardware isn't up to the task.

When I looked at PUBG last year (here's the original PUBG PA video), there were some clear problems with performance. Besides network code and lag, the engine strongly favored Nvidia GPUs. But I've retested with the retail release (currently 3.6.8.2) on new hardware and with the latest drivers. It appears time heals all wounds, as the game is now far more GPU-agnostic, with performance up to 30 percent faster than last year.

That doesn't mean running PUBG is a cakewalk, however, as it's still far more demanding than games like Counter-Strike: Global Offensive and Overwatch. Part of that is due to Unreal Engine 4, which has a good reputation for image quality but can also tax even the best systems.

Provided you're not trying to run at ultra-high resolutions and maximum quality, however, PUBG's system requirements aren't too bad. The minimum GPU recommendation is a GeForce GTX 660 2GB or Radeon HD 7850 2GB, but you can get by with less... just not at a smooth 60fps. CPU requirements are even more modest, with Core i3-4340 / AMD FX-6300 listed as the minimum, and I didn't see a lot of difference between the various CPUs I tested.

Quickly running through the features checklist, Battlegrounds has plenty of graphics options, and it checks most of the right boxes. Resolution support is good, and aspect ratio support works properly in my testing, with a change in the FOV on ultrawide displays. There's also an FOV slider, but that only affects the FOV if you're playing in first-person perspective.

There are a couple of negatives. First, the 144fps cap is still present, and I wish it would go away. Then again, the network tickrate reportedly runs well below 20Hz at the start of a match and can be quite variable, so maybe Bluehole should fix that first. As for modding support, that's officially out—not that there aren't plenty of unauthorized hack mods floating around, though PUBG has been quite active about banning cheater accounts.

A word on our sponsor

As our partner for these detailed performance analyses, MSI provided the hardware we needed to test PlayerUnknown's Battlegrounds on a bunch of different AMD and Nvidia GPUs, multiple CPUs, and several laptops—see below for the full details. Thanks, MSI!

Fine tuning PUBG settings

While the global preset is the easiest place to start tuning performance, sometimes you'll want to tweak things to find a better balance. Using a GTX 1070 with limited testing at 1440p, I got the following results:

  • Ultra preset gets 61 fps
  • High preset gets 81 fps
  • Medium preset gets 90 fps
  • Low preset gets 101 fps
  • Very low preset gets 136 fps (and may be hitting the fps cap)

Or if you prefer percentages, the high preset runs about 33 percent faster than the ultra preset, medium adds another 11 percent, low bumps it up 12 percent more, and very low gives a sizeable 35 percent increase (with the largest drop in visual quality).
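
The preset-to-preset percentages above follow directly from the raw averages. As a quick sanity check, here's a short Python sketch (not from the article) that derives each gain from the fps numbers quoted for the GTX 1070 at 1440p:

```python
# Percent gain of each PUBG preset over the next-higher quality preset,
# using the 1440p GTX 1070 averages quoted above.
fps = {"ultra": 61, "high": 81, "medium": 90, "low": 101, "very low": 136}

order = ["ultra", "high", "medium", "low", "very low"]
for faster, slower in zip(order[1:], order[:-1]):
    gain = (fps[faster] / fps[slower] - 1) * 100
    print(f"{faster} vs {slower}: +{gain:.0f}%")
```

This reproduces the rounded figures in the text: +33, +11, +12, and +35 percent.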

If you're hoping to tweak the individual settings to better tune performance, I did some testing to see how much each setting affects framerates, using the ultra preset as the baseline. I then dropped each setting to its minimum (very low) value and measured the impact.

Screen Scale: The range is 70-120, and this represents undersampling/oversampling of the image. It's like tweaking your resolution by small amounts, but I mostly recommend leaving this at the default 100 setting.
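
Since screen scale effectively changes the render resolution, it's worth seeing how the pixel-count cost grows. The sketch below assumes the scale applies per axis, as Unreal Engine's screen percentage setting does (that per-axis behavior is my assumption about PUBG's implementation, not something stated in the article):

```python
# Approximate render resolution and relative pixel load for a given
# screen scale value, assuming the scale applies per axis (as with
# Unreal Engine's screen percentage setting).
def render_load(width, height, scale):
    s = scale / 100
    rw, rh = int(width * s), int(height * s)
    return rw, rh, s * s  # relative pixel count vs. the default 100

for scale in (70, 100, 120):
    rw, rh, load = render_load(2560, 1440, scale)
    print(f"scale {scale}: {rw}x{rh}, {load:.2f}x pixels")
```

Note how quickly this compounds: 120 renders about 1.44x as many pixels as the default, while 70 renders roughly half, which is why large swings on this slider hit performance harder than most individual quality settings.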

Anti-Aliasing: Surprisingly not a major factor, but this is because Unreal Engine requires the use of post-processing techniques to do AA. If you want better AA, you could set screen scale to 120 to get a moderate form of super-sampling. Going from ultra quality AA to very low quality AA had a negligible impact on performance.

Post-Processing: A generic label for a whole bunch of stuff that can be done after rendering is complete. This has a relatively large impact on performance—going from ultra to very low improved framerates by 15 percent.

Shadows: This setting affects ambient occlusion and other forms of shadow rendering, and going from ultra to very low improved performance by just over 16 percent.

Texture: Only a minor impact on performance, provided you have enough VRAM. Dropping from ultra to very low increased framerates by 5 percent.

Effects: This setting relates to things like explosions, among other elements. Interestingly, in earlier testing it didn't appear to affect performance much, but now it's the single most demanding setting in the game. Dropping to very low improves performance by up to 25 percent.

Foliage: Given all the trees and grass, you might expect this to have a larger impact on performance, but I only measured a 2 percent (1 fps) difference after setting it to very low.

View Distance: This appears to have a greater impact on CPU performance than on graphics performance, so if your CPU is up to snuff you can safely set it to ultra. Even on a Core i3 system, dropping to very low only made a 3 percent difference in framerates.

Motion Blur: There's a reason this is off by default, right? Spotting enemies while moving around is more difficult with motion blur enabled. But if you like the effect, turning it on causes about a 2 percent drop in framerates.

For the benchmarks, I've used my standard choice of 1080p medium as the baseline, and then supplemented that with 1080p, 1440p, and 4K ultra. I know some people will want to run at minimum graphics quality, except for the view distance, to try to gain a competitive advantage. It looks ugly, but it may be easier to spot people hiding in the grass or shadows.

MSI provided all of the hardware for this testing, mostly consisting of its Gaming/Gaming X graphics cards. These cards are designed to be fast but quiet, though the RX Vega cards are reference models and the RX 560 is an Aero model.

My main test system uses MSI's Z370 Gaming Pro Carbon AC with a Core i7-8700K as the weapon of choice, with 16GB of DDR4-3200 CL14 memory from G.Skill. I also tested performance with Ryzen processors on MSI's X370 Gaming Pro Carbon. The game is run from a Samsung 850 Pro SSD for all desktop GPUs.

MSI also provided three of its gaming notebooks for testing: the GS63VR with GTX 1060 6GB, GE63VR with GTX 1070, and GT73VR with GTX 1080. The GS63VR has a 4K 60Hz display, while the GE63VR and GT73VR both have 1080p 120Hz G-Sync displays. For the laptops, I installed the game to the secondary HDD storage.

PUBG benchmarks

Starting at 1080p medium, the 144fps cap comes into play on AMD's Vega 56 and higher, or Nvidia's GTX 1070 and higher. The top six cards are effectively tied due to the cap, though there are still minor variations in minimum fps. Not that you'd really want to use any of the high-end GPUs for 1080p medium gaming.

Continuing down the chart, we get to the mainstream cards. The 1060 6GB and 1060 3GB outperform the RX 580 8GB and RX 570 4GB in PUBG, where in other games AMD's cards often come out ahead. At lower quality settings, the engine appears to scale better on Nvidia's GPUs. This holds for the GTX 1050 vs. RX 560 as well, where the 560 falls well short of 60fps, even though it has 4GB VRAM, while the 1050 card with 2GB VRAM just edges past the 60fps mark. Older GPUs like the GTX 770 end up falling right in line with the GTX 1050.

What about integrated graphics? I've only tested with Intel's latest HD Graphics 630, which is about 20-30 percent faster than the HD Graphics 4600 found in 4th gen CPUs. It's not pretty, as even the RX 560 is over four times faster than the HD 630. On the other hand, if you drop to minimum quality and 720p, the HD 630 does manage a somewhat tolerable 25fps. AMD's upcoming Ryzen APUs should look much better when they arrive later this month.

Moving up to 1080p ultra drops performance by about 33 percent relative to medium quality. Most mainstream and above cards are playable, though you may want to adjust a few settings on cards like the GTX 1060 or RX 570 to hit 60+ fps. I've also included results from Nvidia's halo card, the Titan V, just for fun (thanks Falcon Northwest).

Nvidia's 1080 and above claim top honors, and you can also see the results for the GTX 1080 tested on the new desert map. Whether or not you like the new map is a different matter, but at least performance is consistent. The Vega cards end up placing just ahead of the GTX 1070, while the 1060 cards trade blows with the RX 570/580. At the bottom, the GTX 1050 still remains a much better choice than the RX 560.

Compared to my early access testing, performance has improved on all of the GPUs, but the biggest gains have been on AMD hardware and on Nvidia's high-end offerings. AMD GPUs are up to 30 percent faster, and the 1080 Ti is 40 percent faster, but other cards like the GTX 1060 only see a modest 5-10 percent improvement.

The pattern for 1440p ultra is virtually the same as 1080p ultra, with a couple of small changes. The RX 580 8GB now beats the 1060 6GB, and the RX 560 4GB also beats the GTX 1050 (though the latter is admittedly a pyrrhic victory). The Vega cards continue to fall between the 1070 and 1080, and if I had tested it, the 1070 Ti should basically match the Vega 64. For 60fps, however, you'll want at least a GTX 1070 or Vega card.

If you're hoping to push the limits of a 1440p 144Hz display, about the only way you might get close is with GTX 1080 SLI or a $3,000 Titan V. Yeah, that's a bit nuts, and you're better off lowering your expectations a bit. 70 to 80 fps is a great target for most of us, particularly if you own a G-Sync or FreeSync display.

You can also hit 60fps and above on mainstream cards like the 1060 and 570, provided you reduce some of the settings to medium or high. The big ticket items for improving framerates are shadows, post-processing, and effects, so focus on those three if you're looking to boost performance.
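
If the individual gains measured earlier (post-processing ~15 percent, shadows ~16 percent, effects ~25 percent) stacked multiplicatively, dropping all three to very low would look roughly like this. That's an assumption for the sake of a ballpark estimate, since settings interact and won't compose exactly:

```python
# Rough combined speedup if per-setting gains stacked multiplicatively
# (an assumption; real settings interact and won't compose exactly).
gains = {"post-processing": 0.15, "shadows": 0.16, "effects": 0.25}

combined = 1.0
for setting, gain in gains.items():
    combined *= 1 + gain
print(f"estimated combined speedup: +{(combined - 1) * 100:.0f}%")
```

In practice the real-world gain will land somewhere below that naive estimate, but it illustrates why these three settings are the first place to look when chasing higher framerates.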

4K ultra, as usual, proves too much for most of the GPUs, though the Titan V still chugs along happily at over 75fps. That's with a modest overclock, basically matching the sort of factory overclock you get on MSI's Gaming X cards. More aggressive overclocking of a GTX 1080 Ti should also get you above 60fps, though you're probably better off just tweaking a few settings instead.

Using dual GPUs via SLI is also supported now, though I didn't include test results as we don't generally recommend SLI any longer. There are simply too many major games that don't support the technology (more than half of the big releases in 2017 didn't), and usually it doesn't help minimum fps much. I may revisit this chart with SLI in the near future, though, just for the sake of completeness.

PUBG CPU performance

What about the CPU side of things—how many cores does Battlegrounds need to run properly? I've used the GTX 1080 Ti for all of these tests in order to create the biggest difference in CPU results you're likely to see. And the gap is... decidedly narrow. This is another area where PUBG has improved over the past year.

I didn't include any older Core i3 parts (the 2-core/4-thread models from earlier generations), but 4-core/4-thread parts and above are very nearly equal to the 6-core/12-thread and 8-core/16-thread models. Even clockspeeds don't seem to matter much, judging by the minor difference between the 3.6GHz i3-8100 and the 4.3GHz i7-8700K.

Of course, if you're doing other things while playing PUBG (like livestreaming), you might want a more potent CPU than the i3-8100. But if you're running lean (as in, not a ton of background tasks), even the $100 modern CPUs will suffice.

Older processors should also be fine, though I wouldn't necessarily upgrade to a 1080 Ti if you're running a 3rd or 4th Gen Intel chip. Either way, most of these CPU limitations are only visible with an ultra-fast graphics card. Using a slower mainstream card like a GTX 1060 3GB, basically all of the CPUs are more than sufficient.

PUBG notebook performance

Shifting gears to notebook testing, the mobile CPUs aren't quite able to keep the fastest GPUs fully fed with data at 1080p medium. The slower mobile RAM may also be a factor. Once we move to 1080p ultra, however, the only real difference between mobile and desktop performance comes down to GPU clockspeeds.

That gives a relatively small gap of only 10-15 percent, and Nvidia's 10-series notebook GPUs (not the Max-Q versions) are very close to desktop performance levels. If you have a Max-Q laptop, performance can be about 15 percent slower than the non-Max-Q mobile GPU.

One thing I do want to mention is that while PUBG is installed to a Samsung 850 Pro SSD for the desktops, due to the limited storage capacity on the laptop SSDs I opted to use their HDD storage. There's a massive difference in some of the stuttering and pop-in, particularly during the initial air-drop into the level where everything is far worse with a hard drive. That goes double if you have a 5400RPM HDD.

If you're experiencing a lot of intermittent stuttering in PUBG, I highly recommend upgrading to a larger SSD. Then again, PUBG is only about 13GB in size, so you should be able to make room on even a modest SSD.

PUBG benchmarking methodology

Benchmarking Battlegrounds is a completely different beast from singleplayer games, thanks to the randomized starting locations. Trying to get a repeatable benchmark sequence with multiple people in the vicinity just isn't going to happen—if anyone is nearby, I'm likely to wind up dead long before I can finish testing. Plus multiple players in the same area tends to drop framerates, so if I ended up with too many people, I'd just quit/suicide and restart.

The urban settings in Battlegrounds have more polygons and other objects to render, resulting in lower framerates than when you're out in the grassy countryside. However, cities and buildings are where much of the action takes place, and I eventually settled on a benchmark location in Yasnaya Polyana. It's not quite as popular with the locals as some of the other cities, but it's still reasonably accessible.

You can see the actual test location in the video, and I basically run laps around a building in the northeast section of the city, logging framerates at the various settings and resolutions. I tested each setting twice, to control for variables like other players, using the best result (and sometimes running one or two more tests if the first two results were wildly different).

Thanks again to MSI for providing the hardware. All the updated testing was done with the latest Nvidia and AMD drivers at the time, Nvidia 390.77 and AMD 18.1.1. Assuming graphics card prices return to sane levels, Nvidia's 1080 and 1080 Ti are the fastest options and also the most expensive (MSRP), while the Vega 56/64 and 1070/1070 Ti are more closely matched (assuming Vega cards ever get down to AMD's suggested prices of $400/$500).

While previously Nvidia GPUs held a clear advantage, things are far closer these days, and really you can play on just about any decent graphics card with the right settings. I'd lean a bit more toward Nvidia cards right now, particularly for budget builds, but if you're playing other games that favor AMD cards there's no harm in going with team Red.

Now that Battlegrounds is out of early access, we likely won't see massive upgrades in performance. The game now runs well on a large variety of hardware, and it offers plenty of settings to tweak. Now if only the developers would see fit to remove that framerate cap….