Tom Clancy's The Division – Benchmarks and optimization guide


Welcome to the End of the World

Early last week, The Division hit the ground running, with generally favorable initial impressions. Time hasn't been that kind to the game, but there are a lot of things Ubisoft could do to change the experience. Our full review has the details if you're interested in the gameplay, story, and other elements, but what about performance? A game that won't run well on your system—or will only run well if you drop the graphics quality so far that it looks like a different game—isn't any fun. So we're here with benchmarks and a full investigation of the graphics settings.

We'll start with the benchmarks before we get to the optimization guide, as it's good to know where hardware falls in terms of performance before we start trying to tweak settings. Let's clear the air on a few other items as well, though. The Division can nominally be called an "Nvidia" title, as it makes use of some GameWorks libraries—nothing new there, since Ubisoft has been rather quick to pick up GameWorks libraries. Interestingly, however, this is not a "The Way It's Meant To Be Played" title—or at least, there's no Nvidia branding when you launch the game. Make of that what you will, but the presence of GameWorks means there's at least one setting that isn't available on AMD's GPUs (Nvidia's HFTS shadows, which require a Maxwell 2 GPU).

As far as performance goes, there's some good news and bad news when it comes to The Division; let's start with the good. First, it has a built-in benchmark sequence, which in our experience churns out consistent results. That means anyone and everyone can run similar tests to what we're using, and use that to compare their hardware to ours—feel free to sound off in the comments.

The bad news is that the benchmark numbers seem to be a bit off—we've looked at the results and used FRAPS to log frames over the test sequence, and the reported results are almost always higher than what FRAPS gives us. Also, the game reports a 99th percentile FPS, which is sort of the opposite of what we normally want to see—we like to see the lowest one percent (or in our case, three percent) of frame rates, not everything except the lowest one percent.
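To make the distinction concrete, here's a minimal sketch of how a percentile-low figure can be computed from logged frame times. The function name and the sample numbers are our own illustration, not the game's or FRAPS's actual math:

```python
def fps_stats(frametimes_ms, low_pct=1.0):
    """Return (average FPS, low-percentile FPS) from frame times in ms.

    The low-percentile figure averages only the slowest `low_pct` percent
    of frames. That's the opposite of a "99th percentile" number, which
    throws away the worst one percent and reports the rest.
    """
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    # The slowest frames are the ones with the largest frame times.
    count = max(1, int(n * low_pct / 100))
    worst = sorted(frametimes_ms, reverse=True)[:count]
    low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_fps

# Example: 97 smooth 16.7 ms frames (60 fps) plus three 50 ms stutters.
# The average looks fine (~56.5 fps), but the 3 percent low is 20 fps—
# which is what you actually feel during the hitches.
times = [16.7] * 97 + [50.0] * 3
avg, low3 = fps_stats(times, low_pct=3.0)
```

A 99th percentile stat on the same data would simply drop the worst frame and still look healthy, which is why the minimum-percentile view is more informative about stutter.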

The other piece of bad news is that the benchmark, while consistent, is not necessarily a great indication of actual gameplay. It's a scripted sequence, and we've noticed that in some scenes frame rates can be lower than the reported averages by about 25 percent. So, know that the benchmark results are not a worst-case result, but more an indication of relative performance. Furthermore, our results will differ from the game's reported FPS by up to 15 percent, usually for the worse, since the start and end of the benchmark sequence typically run at higher frame rates.

We've used the following hardware for testing:

PC Gamer's 2016 GPU Test Bed

CPU: Intel Core i7-5930K (6-core with Hyper-Threading, overclocked to 4.2GHz)
Motherboard: Gigabyte GA-X99-UD4
GPUs:
AMD R9 Fury X (Reference)
AMD R9 390 (Sapphire)
AMD R9 380X (Sapphire)
AMD R9 380 (Sapphire)
AMD R9 290X (Gigabyte)
AMD R9 285 (Sapphire)
Nvidia GTX 980 Ti (Reference)
Nvidia GTX 980 (Reference)
Nvidia GTX 970 (Asus)
Nvidia GTX 960 2GB (EVGA)
Nvidia GTX 950 2GB (Asus)
Nvidia GTX 770 2GB (Reference)
SSD: Samsung 850 EVO 2TB
Memory: G.Skill Ripjaws 16GB DDR4-2666
Cooler: Cooler Master Nepton 280L
Case: Cooler Master CM Storm Trooper
OS: Windows 10 Pro 64-bit
Drivers: AMD Crimson 16.3 Beta; Nvidia 364.51


Jarred got his start with computers on a Commodore 64, where he has fond memories of playing the early AD&D Gold Box games from SSI. After spending time studying computer science and working in the IT industry, he discovered a knack for journalism, which he’s been doing for more than a decade. This enables him to play with (aka “test”) all of the latest and greatest hardware; it’s a tough life, but someone has to do it. For science.