Fortnite Battle Royale performance and settings guide

After a slow start for the original Fortnite, Epic made some changes and added a free battle royale mode, turning a struggling game into an overnight sensation. There's clearly a fine line between homage and copying, but while there are many similarities to PlayerUnknown's Battlegrounds, Fortnite takes a decidedly sillier approach to the battle royale genre. Key to this is a willingness to experiment, with limited time game modes, holiday events, and more—not to mention plenty of aesthetic customizations that can be earned/purchased.

Battle Royale mode is technically still in Early Access, but with a player base of over 20 million—and as already noted, it's free to try—we felt it was worth checking out how performance is shaping up. Given the somewhat cartoony graphics, you might expect performance to be a non-issue, but that's not entirely true. At the lowest settings, Fortnite can run on just about any PC built in the past five years. Officially, the minimum requirements for Fortnite are an Intel HD 4000 or better GPU and a 2.4GHz Core i3. The recommended hardware is quite a bit higher: GTX 660 or HD 7870, with a 2.8GHz or better Core i5. But what does that mean?

In practice, a minimum-spec system is going to struggle to maintain 30 fps at 720p and minimum quality, while the recommended build should manage closer to 60 fps at 1080p and medium quality. But crank up the quality and resolution beyond that point and the framerate will plummet. Epic quality is brutal—it runs even worse than PUBG at ultra quality. But do you even need to run Epic quality? No, and turning down some of the settings can give a healthy boost to performance with little impact on visuals.

Our recommendation for hardware is a bit higher than Epic's GTX 660 and Core i5. On the CPU side, any relatively recent Core i5 or better CPU (including Ryzen processors) should be fine, but for the graphics card, Nvidia GPUs are currently doing much better than AMD's parts. Yes, it's the same old "not quite fully optimized" story, though with a few tweaks you can still easily get any decent GPU into the smooth 60+ fps range.

Starting with our features checklist, Fortnite had a bit of a rough start, but things have improved in the past couple of months. Previously, everyone was locked into the same FOV, with vertical cropping for widescreen and ultrawide resolutions. There's still a bit of odd behavior in that Fortnite only lists resolutions for your display's native aspect ratio if you're in fullscreen mode (meaning, on a standard 4K display you'll only see 16:9 resolutions), though you can get around this using Windowed Fullscreen mode in a pinch. FOV will adjust based on your resolution, but you're stuck with whatever FOV Epic has determined to be 'correct'—there's no FOV slider. That's better than pre-December when FOV was completely locked, though.

A word on our sponsor

As our partner for these detailed performance analyses, MSI provided the hardware we needed to test Fortnite on a bunch of different AMD and Nvidia GPUs and laptops—see below for the full details. Thanks, MSI!

The number of settings to tweak is a bit limited, with only six primary settings plus a couple more on/off options for grass and motion blur. About half of these only have a small impact, however, with only three options that significantly alter the way the game renders and performs. There's also no option to toggle off the HUD for taking nice screenshots—you can turn off some of the HUD elements, but the compass, user name, and timer/players/kills counter always remain visible (at least as far as I could tell).

None of these are showstoppers, though fans of modding will be disappointed yet again with the lack of support. As is often the case these days, modding is currently out as it would potentially make it easier to create cheats/hacks—not that this hasn't happened anyway. Perhaps more critically (though Epic hasn't said this directly), mods would likely cut into the profitability of the item store. Why buy an outfit from the store if you could just create your own? Either way, modding isn't supported, and that's unlikely to change.

Screenshot gallery: the four presets (Low, Medium, High, and Epic) at 1440p

Fine-tuning Fortnite settings

The global Quality preset is the easiest place to start tuning performance, with four levels along with 'Auto,' which will attempt to choose the best options for your hardware. Note that most of the presets will also use screen scaling (3D Resolution) to render at a lower resolution and then scale that to your display resolution; for testing, I've always set this to 100 percent, so no scaling takes place. Doing some limited testing with a GTX 1080 Ti and an RX 580 8GB at 1440p, the presets yield the following results (1080 Ti first, 580 second):

  • Epic: 105 fps vs. 39 fps
  • High: 137 fps vs. 52 fps
  • Medium: 191 fps vs. 79 fps
  • Low: 253 fps vs. 149 fps

Depending on your GPU, dropping from the Epic preset to High will boost performance by around 25-35 percent. Going from High to Medium can improve performance another 40-50 percent. And finally, dropping from Medium to Low gives the potential for another 30-90 percent boost to framerates. Wait, what's with the huge margin on that last drop? Basically, you might start to bump into CPU limits for framerates on the fastest GPUs at low quality.
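
If you want to check those figures against the preset results listed above, the math is just the ratio of the two framerates. Here's a quick Python sketch that reproduces the percentages from the 1080 Ti and RX 580 averages; it's purely an illustration of the arithmetic, not part of the test setup.

# Percent gain when stepping down one preset, using the 1440p averages listed
# above (GTX 1080 Ti and RX 580, 3D Resolution at 100 percent).
presets = ["Epic", "High", "Medium", "Low"]
fps = {"GTX 1080 Ti": [105, 137, 191, 253], "RX 580": [39, 52, 79, 149]}

for gpu, results in fps.items():
    for i in range(len(results) - 1):
        gain = 100.0 * (results[i + 1] / results[i] - 1)
        print(f"{gpu}: {presets[i]} -> {presets[i + 1]}: +{gain:.0f} percent")
# GTX 1080 Ti: +30, +39, +32 percent; RX 580: +33, +52, +89 percent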

You can see screenshots of the four presets above for reference, but the Low preset basically turns off most extra effects and results in a rather flat-looking environment. Medium adds a lot of additional effects, along with short-range shadows. High extends the range of shadows significantly, and then Epic… well, it looks mostly the same as High, perhaps with more accurate shadows (i.e., ambient occlusion).

Screenshot gallery: the 1440p Epic preset compared with shadows, textures, and effects individually set to low

For those wanting to do some additional tuning, let's look at the individual settings, using Epic as the baseline and comparing performance with each setting at low/off.

View Distance: Extends the range for rendered objects as well as the quality of distant objects. The overall impact on performance is relatively minimal, however, with framerates improving by only 4-5 percent when dropped to minimum. I recommend you leave this at Epic if possible.

Shadows: This setting affects shadow mapping and is easily the most taxing of all the settings. Going from Epic to Low improves performance by around 50 percent. I wouldn't turn this off, but dropping to the high or medium option can still yield a sizeable improvement in framerates.

Anti-Aliasing: Unreal Engine 4 uses post-processing techniques to do AA, with the result being a minor hit to performance for most GPUs. Turning this from max to min improves performance by around 5 percent.

Textures: Provided you have sufficient VRAM (2GB for up to high, 4GB or more for Epic), this only has a small effect on performance. Dropping to low increases framerates by 3-5 percent.

Effects: Among other things, this setting affects ambient occlusion (a form of detail shadowing), some of the water effects, and whatever shader calculations are used to make the 'cloud shadows' on the landscape. It may also relate to things like explosions and other visual extras. Dropping to low improves performance by 20-25 percent, though I recommend sticking with medium or high quality if possible.

Post-Processing: Things get a lot darker when you turn this all the way down, so among other elements it seems to include contrast/brightness scaling and dynamic range calculations. This controls various other post-processing effects (outside of AA), and can cause a fairly large drop in performance. Turning it to low improves framerates by 15-20 percent, though I recommend medium for most decent GPUs, or high on faster PCs.

Show Grass: Despite the name, this doesn't actually disable all the grass—at least not yet? It's a recent addition (within the last week or so), and in testing didn't appear to change performance or visuals at all. Maybe it's a work in progress. Fully disabling the grass might make it more difficult for other players to hide, though the grass isn't really an effective hiding spot.

Motion Blur: This is off by default, and I suggest leaving it that way. Motion blur can make it more difficult to spot enemies, and LCDs create a bit of motion blur on their own. If you want to enable this, it causes a 2-3 percent drop in framerates.
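
If you want a rough idea of what a custom mix of the above settings buys you relative to the Epic preset, one simple approach is to treat each setting's gain as an independent multiplier. That's only an approximation (the gains overlap, and CPU limits can cap the result), but here's a short Python sketch using the Epic-to-low percentages described above.

# Rough estimate of the framerate gain versus the Epic preset, treating each
# setting's Epic-to-low gain (taken from the descriptions above) as an
# independent multiplier. This is an approximation, not a measured result.
epic_to_low_gain = {
    "view_distance": 0.05,
    "shadows": 0.50,
    "anti_aliasing": 0.05,
    "textures": 0.04,
    "effects": 0.22,
    "post_processing": 0.17,
}

def estimate_gain(dropped_settings):
    multiplier = 1.0
    for setting in dropped_settings:
        multiplier *= 1.0 + epic_to_low_gain[setting]
    return (multiplier - 1.0) * 100.0

# Example: dropping only shadows and effects to low suggests a boost on the
# order of 80 percent over the full Epic preset.
print(f"{estimate_gain(['shadows', 'effects']):.0f} percent")  # ~83 percent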

For the benchmarks, I've used my standard choice of 1080p medium as the baseline, supplemented with 1080p, 1440p, and 4k epic. Some players choose to run at minimum graphics quality (aside from view distance), since that can potentially make it easier to spot opponents, but I prefer image quality—and let's be real, most of us don't play at a level where image quality is what impairs our ability to win.

MSI provided all the hardware for this testing, consisting mostly of its Gaming/Gaming X graphics cards. These cards are designed to be fast but quiet, though the RX Vega cards are reference models and the RX 560 is an Aero model.

My main test system uses MSI's Z370 Gaming Pro Carbon AC with a Core i7-8700K as the primary processor, and 16GB of DDR4-3200 CL14 memory from G.Skill. I also tested performance with Ryzen processors on MSI's X370 Gaming Pro Carbon. The game is run from a Samsung 850 Pro SSD for all desktop GPUs.

MSI also provided three of its gaming notebooks for testing: the GS63VR with GTX 1060 6GB, GE63VR with GTX 1070, and GT73VR with GTX 1080. The GS63VR has a 4Kp60 display, while the GE63VR and GT73VR both have 1080p120 G-Sync displays. For the laptops, I installed the game to the secondary HDD storage.

Fortnite benchmarks

Right from the start, the Nvidia performance advantage is clear. While it's not catastrophic, for budget cards where you'd want to run 1080p medium, the 1050 and 1050 Ti are vastly superior to the RX 560. Interestingly, that gap doesn't quite hold for previous generation cards, as the R9 390 places ahead of the GTX 970—and so does the GTX 770, though only at lower settings. The GTX 1060 cards also easily surpass the RX 570 and 580, by about 25 percent. Beyond that, the gap gets to be meaningless, though I'm happy to see there's no framerate cap at all. AMD needs to do some work on its drivers for Fortnite, particularly on the newer RX series.

Shifting over to integrated graphics, I've only tested Intel's latest HD Graphics 630, which is about 20-30 percent faster than the HD Graphics 4600 found in 4th gen CPUs. The official minimum GPU is an HD 4000 (3rd Gen Intel Core), but that might be a stretch. At 1080p medium, the HD 630 gets less than one fourth the performance of the RX 560, but that improves to 20 fps at 1080p low, and a respectable if somewhat blocky 37 fps at 720p low. With AMD's Ryzen APUs I expect performance to fall somewhere between the HD 630 and the RX 560—not bad for a $100-$170 processor.

1080p epic quality cuts performance by more than half on all the GPUs, and as I've mentioned already, most players will want to use the high preset instead unless you have GPU power to spare. The 1060 6GB and above break 60fps, while the budget and mainstream cards generally fall in the 20-50 fps range. For 144Hz displays, you'd need at least a heavily overclocked GTX 1080 Ti to max out the refresh rate—or a Titan V if money is no object (thanks Falcon Northwest).

The relative gap between similar AMD and Nvidia parts grows at the epic preset. The 1060 cards now lead the RX 570/580 by around 35-40 percent, and the GTX 1080 is nearly 25 percent faster than the RX Vega 64. Given the popularity of Fortnite Battle Royale, AMD should have plenty of incentive to work on improving its GPUs' performance.

1440p epic is basically the same ranking order, only with even lower framerates. Now even a GTX 1070 struggles to maintain 60fps. Realistically, you'd drop a few settings to high and then you'd be set, but if you simply must max out every setting you'll need a top shelf graphics card. 1440p 144Hz displays are mostly out of reach, since SLI and CrossFire aren't supported by Fortnite, though G-Sync and FreeSync displays make that less of a problem. Even the overclocked Titan V falls short of 144fps at these settings.

What does it take to push more than 60fps at 4k epic? A $3,000 Titan V will do the trick, otherwise you're looking at dropping some of the settings. Not that we really recommend 4k gaming—we've repeatedly recommended 1440p 144Hz G-Sync/FreeSync displays as the best overall gaming solution, or perhaps a 3440x1440 ultrawide 100Hz display if you prefer the wraparound experience. Surprisingly, given the cartoony style, Fortnite is quite a bit more demanding at maximum quality than PUBG. Part of that is likely the Early Access nature of Fortnite BR, but we'll have to wait and see if things continue to improve.

Fortnite CPU performance

I've also done CPU testing with Intel's latest i3, i5, and i7 parts, and most of AMD's current Ryzen parts. Note that these tests all use the same MSI GTX 1080 Ti Gaming X 11G graphics card, to emphasize CPU differences. If you're running a mainstream GPU like a GTX 1060, you can expect nearly identical performance from any of these CPUs.

Here's where things take a turn for the strange. The Core i3-8100 and i5-8400 beat the higher-clocked, Hyper-Threaded i7-8700K at all four test settings. Best guess is that right now, Fortnite runs into some resource contention when Hyper-Threading or SMT is enabled, and that it's tuned more for quad-core processors. The i7-8700K is clocked 20 percent higher than the i3-8100, with 50 percent more cores and three times as many threads, and it's all for naught.

AMD's Ryzen processors look a bit more sensible at 1080p medium, where the Ryzen 3 1300X is the slowest of the bunch and the 1800X takes top honors, but the 1300X moves to the top once epic quality is engaged. Not that it's a huge difference, as the gap is only a few fps and that's basically margin of error, but clearly Fortnite isn't making full use of higher thread count CPUs.

Fortnite notebook performance

Wrapping up the benchmarks, I've only run 1080p medium and epic on the notebooks—the GS63VR is the only unit that has a 4k display, and frankly its hardware isn't sufficient for 4k gaming. At 1080p medium, the desktop graphics cards easily beat all the mobile GPUs, thanks to higher clockspeeds and a faster CPU. At 1080p epic, things change and the mobile 1080 takes top honors. Part of that might be the quad-core optimizations I just talked about, but the GT73VR does have an overclocked 1080 and it's basically on par with the desktop 1080.

The 1070 and 1060 are in 15.6-inch notebooks, with less cooling potential (especially for the thin and light GS63VR), leading to a larger gap in performance. The desktop 1070 is about 10 percent faster than the mobile 1070, while the desktop 1060 6GB is just over 40 percent faster. Nvidia's Max-Q notebook parts drop clockspeeds even further, to reduce power requirements, typically shedding an additional 10-15 percent in performance. But the good news is that all the notebooks can handle 1080p gaming at 60fps or more, with a bit of tuning for your settings.

Fortnite benchmarking methodology

As with PUBG, benchmarking Fortnite can be a bit problematic thanks to the randomized Battle Bus flight path. I had to try multiple times with some graphics cards just to reach my test location without getting killed. The test sequence involves running around in Wailing Woods in the northeast corner of the map, and while other areas may be slightly more or less taxing, it's a fair representation of overall performance. I'm also running on the 2.0 version of the map, so places like Tilted Towers and Junk Junction are present.

While I tried to ensure no other players were nearby during my testing (since that can introduce some other variables), margin of error for the benchmarks is going to be higher than in single-player games. However, each test was repeated at least twice to confirm consistent results, and sometimes additional testing was done if the first two attempts showed discrepancies.

I've used my standard 1080p medium and maxed-out 1080p/1440p/4k epic settings for these tests. Both AMD and Nvidia cards show about a 25 percent improvement in framerates if you drop to high quality, however, and the change in visuals is extremely hard to notice. There's also potential for future updates to make epic quality look better than it currently does, so these tests represent more of a worst-case scenario.

If you want to run your own comparable benchmarks against my results, the performance analysis video shows the path I've used. I logged frametimes using OCAT or FRAPS, with CPU affinity for the process set to the last CPU core in Windows Task Manager. 97th percentile minimums are calculated by finding the 97th percentile frametime (the point where the frametime is worse than 97 percent of frames), then averaging all frames with a worse result. The real-time overlay graphs in the video are generated from the frametime data using custom software that I've created.
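
For reference, here's a minimal Python sketch of that 97th percentile minimum calculation, working from a list of frametimes in milliseconds such as OCAT or FRAPS would log. It illustrates the method described above; it's not the custom overlay software used for the video.

# Average fps and 97th percentile minimum fps from a list of frametimes (ms).
def fps_stats(frametimes_ms):
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

    # 97th percentile frametime: slower (worse) than 97 percent of all frames.
    ordered = sorted(frametimes_ms)
    cutoff = ordered[int(0.97 * len(ordered))]

    # Average every frame at or beyond that cutoff, then convert to fps.
    worst = [ft for ft in frametimes_ms if ft >= cutoff]
    return avg_fps, 1000.0 * len(worst) / sum(worst)

# Example: mostly 10ms frames (100 fps) with a few slow spikes near the end.
sample = [10.0] * 97 + [25.0, 30.0, 40.0]
print(fps_stats(sample))  # roughly 94 fps average, ~32 fps 97th percentile minimum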

Thanks again to MSI for providing the hardware. All the updated testing was done with the latest Nvidia and AMD drivers at the time, Nvidia 390.77 and AMD 18.1.1. Nvidia's GPUs are currently the better choice for Fortnite, though if you have an AMD card you don't really need to worry—just drop settings down a notch or two and you should be fine. Now all we need to do is wait for graphics card prices to return to sane levels, which may be in progress as the current cryptocurrency market appears to be heading south again.

These test results were collected in early February 2018. Since the Battle Royale mode is officially still in Early Access, things are likely to change in the coming months. Epic is putting a lot of manpower into Fortnite, given the wild success of the BR mode, so consider these results a snapshot in time rather than the final word on Fortnite performance.