
What does it mean when a game has AMD or Nvidia branding?

PC gamers are an incredibly diverse audience, and I'm not just talking about race or gender. For PC gaming enthusiasts, beyond their favorite games, there's the hardware used to run those games. Everything from the CPU, graphics card, motherboard, SSD, memory, case, monitor, keyboard, mouse, and even power supply can be a personal statement. It's not surprising, then, to see hardware companies helping to promote certain games, and the biggest names in gaming hardware are AMD and Nvidia—and Intel as well, currently mostly for CPUs, though that could change next year. So, what does it mean when a game includes specific branding from one of those companies?

One thing it means is money, and that can come in a variety of forms. Advertising and marketing dollars are a common approach, and there may be other related deals like giving out 'free' copies of a game to people who buy a new graphics card. Providing hardware and software to game developers is another option—what better way to ensure a new game runs properly on a specific set of hardware than to provide that hardware to the developers? But perhaps more important than the hardware is expertise in optimizing a game for that hardware, in the form of software developer expertise.

The Batman games have been one of the few instances where Nvidia's PhysX was put to good use. All those streamers and papers are only available with an Nvidia GPU.

All three of the major CPU/GPU companies have in the past provided software engineering resources to game developers, and I expect that to continue. The forms this takes can vary quite a bit, however, and when we speak of a game being "built for AMD" or "built for Nvidia," usually it's the engineering resources that end up being the most impactful.

We've seen in the past (AMD Gaming Evolved, Nvidia's The Way It's Meant To Be Played, and Intel Inside), and will certainly see in the future, games that are heavily optimized for a particular GPU and/or CPU architecture. Ideally, this merely provides the best possible performance on one set of hardware without significantly impacting performance on other hardware brands, but it can lead—and has led—to some questionable performance comparisons.

The Witcher 3 uses Nvidia's HairWorks library to create the luscious locks on Geralt and Roach.

To cite a few examples of the latter: Crysis 2 and its DX11 patch was one of the first instances of heavy tessellation use in a game. Crysis 2 was promoted by Nvidia, and Nvidia GPUs at the time had faster tessellation hardware than AMD's offerings. Analysis by various hardware review sites uncovered often extreme uses of tessellation—e.g., flat surfaces with no real geometry to speak of still ended up broken into thousands of tessellated polygons, and there were tessellated water meshes under the ground that weren't even visible.

What's the big deal with extra tessellation? It can hurt performance on GPUs that don't have the hardware resources to handle all the additional polygons. In the case of Crysis 2, that meant worse relative performance on AMD GPUs, often with no change in visuals. To be fair, without Nvidia's help the DX11 and tessellation patch for Crysis 2 might never have existed in the first place, but it certainly favored Nvidia GPUs.
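To get a feel for the numbers, here's a quick back-of-envelope sketch. The tessellation factor and triangle counts are illustrative assumptions, not measurements from Crysis 2:

```python
# Rough illustration of why over-tessellation hurts: a flat quad that could
# be drawn with just 2 triangles gets subdivided by the tessellator instead.
# For a quad patch, triangle count grows roughly as 2 * factor^2.

def quad_triangles(tess_factor: int) -> int:
    """Approximate triangles produced by tessellating one quad patch."""
    return 2 * tess_factor * tess_factor

flat_quad = quad_triangles(1)          # no tessellation: 2 triangles
over_tessellated = quad_triangles(32)  # hypothetical factor of 32

print(flat_quad, over_tessellated)  # 2 2048
# The GPU processes ~1000x the geometry for zero visual difference on a
# flat surface--the pattern reviewers reported in Crysis 2's DX11 patch.
```

A GPU with weaker tessellation hardware has to grind through every one of those extra triangles, which is why the performance hit lands unevenly across brands.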

That's one example for Nvidia, but I could give others: The Witcher 3 and HairWorks, many of the Assassin's Creed games, the Batman Arkham games, and more. Pretty much anything that has supported one or more of Nvidia's GameWorks libraries is suspect, and it's difficult to say how much those libraries help or hurt performance—or the game developers. In short, GameWorks is designed to highlight Nvidia's latest GPUs, making them look more enticing.

Lara Croft has had the option for beautiful hair as well, since the reboot in 2013, courtesy of AMD's PureHair library.

We've seen similar things from AMD, though as the underdog in the graphics world it's usually given a pass. Still, many DirectX 12-enabled games have been promoted by the company, and some, like Total War: Warhammer and its sequel, only received "beta" DX12 support. Even now, a couple of years later, Total War: Warhammer tends to run poorly in DX12 mode on Nvidia GPUs compared to DX11.

The reasons for AMD's superior DX12 performance are twofold: AMD GPUs have some extra hardware features (specifically, asynchronous compute) that tend to be useful in low-level APIs like DX12 and Vulkan, and low-level APIs require more significant developer resources to be properly optimized for each GPU architecture. Combine the two, and it's not too difficult to make a game using a low-level API where performance on one GPU is great while performance on a different GPU is quite poor.

Even Intel gets in on the action sometimes, like in Grid 2 with these special options that can only be enabled with HD Graphics.

Making games costs a lot of money, and developers don't set out to lose customers or create problems. It happens on occasion, but more often we end up with games that have extras tacked on, sometimes months after release. These extras might bring a big change in performance or visuals, or only a minor one. Often they're a proof of concept for techniques that could become common in the future. We've seen this with PhysX, PureHair, HairWorks, tessellation, various DirectX releases, and more. The latest hot topics include ray tracing, DLSS, and VRS—all features Nvidia added with its Turing architecture GPUs.

Ray tracing, the new hotness for realistic reflections, shadows, and global illumination. Also great for tanking framerate.

Of course, developers and publishers generally want to support as many PCs as possible, so we rarely see large chunks of older hardware locked out. All of the ray tracing and DLSS enabled games so far (Battlefield 5, Metro Exodus, and Shadow of the Tomb Raider) support Nvidia's new RTX GPUs, but they still run on non-RTX cards without all those fancy extras. I don't expect that to change any time soon. But while ray tracing support may not be critical today, in five to 10 years DirectX Raytracing (DXR) could become the primary API for Windows games. Getting a head start on a new technology while receiving some advertising and marketing help for your game? That can be almost too good to pass up.

CPU specific optimizations in games have become more common since AMD released Ryzen.

That's the graphics side of things, but CPU optimizations are also possible. Intel CPUs have been the standard for … well, pretty much the entire history of PC gaming. But AMD also makes processors, and often what's best for Intel isn't best for AMD. This is especially true with the latest AMD Ryzen processors, where higher core and thread counts mean games designed for 4-core/4-thread Intel CPUs (the old "typical" PC) don't run as well as they could.

There has been a steady push over the past couple of years to move games beyond 4-core designs, but it's a more difficult problem than making a game that can benefit from a faster graphics card. GPUs are already handling most of the work that can be spread among hundreds and even thousands of simultaneous calculations, leaving CPUs with the tasks that typically don't lend themselves to being handled by a GPU—things like processing user input, AI, and game world updates.

That's not to say games can't make use of more CPU cores and threads, but it's more difficult, and few games would be able to benefit from running on something like a 56-core/112-thread or 64-core/128-thread CPU.
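The diminishing returns from piling on cores can be put in rough numbers with Amdahl's law. The 80 percent parallel fraction below is an illustrative assumption, not a measurement from any particular game engine:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of
# work that can run in parallel and n is the number of cores.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup over a single core for a given parallel fraction."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Suppose 80 percent of a game's CPU work parallelizes cleanly (an assumption).
for cores in (4, 8, 64, 128):
    print(cores, round(amdahl_speedup(0.8, cores), 2))
# 4 cores: 2.5x, 8 cores: 3.33x, 64 cores: 4.71x, 128 cores: 4.85x
```

With that assumed workload, doubling from 64 to 128 cores buys almost nothing, because the serial 20 percent (input handling, game logic ordering, and so on) dominates. That's the math behind why a 64-core/128-thread CPU is overkill for gaming.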

This branding triple threat will never happen, thanks to marketing.

Back to the original question, then: What does it mean when a game has branding for a specific hardware company? There's no single clear-cut answer. It could mean marketing and promotional help with a minor amount of software engineering resources—that's the minimum in most cases. Or it could mean extensive optimizations to make use of new GPU features (e.g., ray tracing, DLSS, tessellation, DX12) or new CPU architectures (more cores/threads, different latencies, etc.).

Something else to note is that branding from multiple competing companies generally doesn't happen. So when a game goes with AMD branding, that locks out Nvidia and vice versa. Intel and Nvidia branding may have been possible in the past, but I don't think we're likely to see that either going forward.

Whatever the company or game, however, marketing materials claiming a game has been designed specifically for one company's products are often just that: marketing. I've seen AMD promoted games that end up running better on Nvidia GPUs, or Nvidia promoted games that end up performing better on AMD GPUs … and I've also seen AMD or Nvidia promoted games where performance is 30-40 percent faster than on the competition's hardware. There are no guarantees.

So when a game like Borderlands 3 says it's "optimized for top performance and incredible gaming experiences on AMD Radeon graphics cards and Ryzen CPUs," we don't actually know what the end result will be. Will it run better on AMD hardware than on Intel or Nvidia hardware? Maybe, maybe not. We'll have to wait until the game is out, which is why I run benchmarks.

Jarred's love of computers dates back to the dark ages when his dad brought home a DOS 2.3 PC and he left his C-64 behind. He eventually built his first custom PC in 1990 with a 286 12MHz, only to discover it was already woefully outdated when Wing Commander was released a few months later. He holds a BS in Computer Science from Brigham Young University and has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.