Anthem performance benchmarks reveal CPU bottlenecks, difficulty hitting 60 fps

Last week, Anthem's VIP test was full of issues. Many of those problems have been smoothed over in the currently running open beta (Feb. 1-3), but if you were expecting excellent framerates on your PC, you're likely to be disappointed. I've benchmarked Anthem using a short outdoor (i.e., free play) sequence on a collection of graphics cards and CPUs, and hitting 60 fps is proving difficult.

Top-tier hardware like an RTX 2080 Ti delivers playable framerates, sure, but even it can't guarantee 60 fps at 4K ultra. Drop to a less extreme GPU and 1440p medium quality may be your best bet. It's not just the graphics cards getting pummeled, either: an overclocked i7-8700K can still be a CPU bottleneck, averaging 140 fps at 1080p low with the RTX 2080 Ti. Mainstream hardware like a GTX 1060 6GB, meanwhile, can manage 1080p at low or medium quality at more than 60 fps, but not much beyond that.

Of course this is a beta, and we don't know how much things will change between now and the game's official Feb. 22 release. Could performance improve a lot with further tuning? Sure. But if you're playing the beta this weekend, here's a look at the performance I gathered during a more demanding outside scene. (Some indoor areas and caverns show substantially higher framerates, but the sequence I selected represents more of the low-end of the spectrum.)

A few things to note before we get to the charts. Due to the limited time and the three 'systems' per account limit, plus the beta nature of this testing, I didn't run every possible GPU or CPU—not even close. I selected a few Nvidia GPUs including popular cards like the GTX 1060 6GB, GTX 1070, and GTX 1080, plus the newer RTX 2060 and the top-end RTX 2080 Ti, along with an AMD RX 580 8GB as a point of reference, but that's it for now. For CPUs, I've used the Intel i7-8700K at 5.0GHz, an i5-8400, and an i3-8100, plus Bo did some testing on an i7-5820K. I also ran one set of tests with a Ryzen 7 2700X, and Corbin tested on a Ryzen 5 1600. All the CPU testing used an Nvidia GTX 1080 at reference clocks, while the GPU testing elsewhere uses MSI graphics cards.

Also of note is that the world of Anthem features a dynamic weather system, and while it doesn't appear to have a major impact on performance, it certainly causes more variability between benchmark runs than I'd like. I didn't have time to run multiple tests on every setting and GPU, so consider these preliminary results a quick snapshot of what to expect rather than the final word. (You'll see a few oddities as well, like slower CPUs sometimes beating faster CPUs, which I chalk up to the dynamic weather and such.)

Starting with the low bar of 1080p at minimum quality, the tested graphics cards and processors do reasonably well. Everything breaks 60 fps, but then we're talking about pretty capable hardware here. A GTX 1050 Ti probably isn't going to break 60 fps, at least not consistently, and more modest hardware will clearly struggle.

I did try running Intel's UHD Graphics 630 at 720p low just for kicks, and while Anthem loaded and I could walk around Fort Tarsis (at 10-20 fps), heading out into the mission area crashed the game, with graphics that didn't render properly (if at all).

At least the CPU side of the equation isn't horrible. Everything from the i3-8100 and up breaks 60 fps, provided you have a sufficiently fast graphics card. CPU cores do help a bit, but it looks like clockspeed is a bigger factor; witness the i3-8100 often outperforming the i7-5820K. That's a 4-core/4-thread CPU running at a static 3.6GHz, with a newer architecture, compared to a 6-core/12-thread part running at 3.3-3.6GHz. Only AMD's slower Ryzen CPUs appear to struggle with maintaining 60 fps when paired with a decent GPU.

Kicking up the graphics settings to medium drops performance by about 25 percent where the GPU is the limiting factor, but faster GPUs see smaller dips because the CPU becomes the bottleneck. The GTX 1070 and RTX 2060, for example, appear to hit a CPU limit (performance only drops about 12 percent), and the RTX 2080 Ti dips by just 9 percent.

Meanwhile, going from an i7-8700K at 5GHz to an i5-8400 at 3.8GHz with the GTX 1080 causes a similar 10 percent dip, and the i3-8100 runs about 20 percent slower. AMD's Ryzen 2700X is about equal to the i7-5820K, while Corbin's testing of the Ryzen 1600 shows much lower performance.

I'm skipping the high preset, as in limited testing its performance was close to the ultra preset, only about 1-5 percent faster. Going from medium to ultra reduces performance by about 25 percent on the midrange GTX 1060, with a larger 37 percent drop on the GTX 1070 and RTX 2060. Only the 1070 and above average 60 fps.

CPU performance is still a factor, but less so this time, with the 8700K outperforming the i3-8100 by 10 percent, while i5-8400 performance is nearly the same. The 2700X does fine as well, but the 5820K and 1600 drop well below 60 fps.

1440p is arguably the practical limit for most PCs, and ultra quality is only really viable on the fastest GPUs like the 2080 Ti (and probably the 1080 Ti and 2080 as well). 144Hz displays won't be used to their full potential, obviously, though FreeSync and G-Sync can still be helpful. Only the 1080 and above manage to average 60 fps, while the 1060 only gets 55 fps even at medium quality (not shown).

The CPU results continue to be a bit perplexing. I'd expect things to be fully GPU limited now, but the 8100, 2700X, and particularly the 5820K show relatively low performance. That might be the weather or other factors coming into play.

And given the above results, 4K ultra at 60 fps clearly isn't happening, with even the RTX 2080 Ti periodically dipping below that mark. The medium preset boosts performance by about 40 percent, though, so if you're willing to drop the settings there are still GPUs that can handle 4K. A GTX 1070 at 4K low, however, still only averaged 52 fps.

The CPUs are mostly at the same level here, with a few still coming up short. Platform differences on Bo's 5820K testbed, or other software, might be to blame, and clearly you can't expect every PC with similar hardware to perform the same.

That's it for now. Anthem in its current beta state clearly taxes PC hardware in different ways than some other games. Generally speaking, however, the performance I'm seeing is typical of games with relatively large open-world settings. Assassin's Creed Odyssey also struggles to hit higher framerates, for example.

There are plenty of other issues as well, besides just performance. Getting dropped to the desktop happened multiple times during testing, and reconnecting to an in-progress expedition failed at least five times for me. Others have reported similar issues, but given the improvements between the first VIP beta and the current open beta, there's at least some hope the game will be ready by Feb. 22.

I haven't discussed this above, but looking at AMD vs. Nvidia, the RX 580 8GB slightly outperforms the GTX 1060 6GB in some cases, and loses just barely in others. Basically, the two cards are pretty close to equal, so it doesn't look like either side is inherently better right now (though of course AMD doesn't have anything to compete against the top Nvidia cards). That's good news at least, considering Anthem is one of the games Nvidia is currently offering for buyers of RTX graphics cards.

As for whether or not the game is any good, I'll leave that for others to debate. The best thing you can do is give it a try this weekend, but with the limited areas and story missions available right now, the Anthem demo can certainly feel disjointed. Let's hope the full game rectifies that.