Do games at 1080p look worse on a higher resolution monitor?

LG gaming monitor with Tech Talk logo
(Image credit: LG)

PC hardware is incredibly diverse. Some gamers have brand-new, top-of-the-line gaming PCs sporting the latest graphics cards, CPUs, and fast SSDs. Others get by with far less. The potential solutions to poor performance in nearly every game are the same: upgrade your hardware, turn down the settings, and/or drop the resolution. It's that last bit I want to discuss today.

In a perfect world, you want to run all of your games at your monitor's native resolution. I started gaming back when we hooked up bulging TVs to our computers (C-64), and we were happy to play at 320x200. These days, I have multiple 4K and ultrawide monitors, and the difference in graphics quality is amazing. Still, there are plenty of new games where even the fastest current hardware simply can't manage 4K, maximum quality, and 60 fps. Look no further than Control, Gears 5, and Borderlands 3 if you want examples.

Depending on the game, it might be possible to play at 4K with a lower quality setting, and the difference between ultra and high settings is often more of a placebo than something you'd actually notice without comparing screenshots or running benchmarks. Dropping from ultra to medium, on the other hand, might be too much of a compromise for some. There are also games like Rage 2 where even going from maximum to minimum quality will only improve framerates by 50 percent or so.

Outside of upgrading hardware, that leaves dropping the resolution, and this can massively boost performance. 1440p typically runs nearly twice as fast as 4K at the same settings (if you're GPU limited), and 1080p is usually around 30-40 percent faster than 1440p. Playing at 1080p instead of 4K, even on a 4K monitor, will more than double your framerate on everything short of a 2080 Ti. This is why we continue to recommend 1440p as the best gaming solution—it's often within reach of midrange or lower high-end graphics cards and still looks great.
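If you want a feel for where those numbers come from, the raw pixel counts are a decent first-order guide. Here's a quick Python sketch (my own, purely illustrative) comparing them; actual in-game scaling lands below these ratios whenever the CPU or memory becomes the bottleneck.

```python
# Pixel counts for the three common gaming resolutions; the ratios are an
# upper bound on GPU-limited performance scaling, not a guarantee.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["4K"] / pixels["1440p"])    # 2.25  -> 4K pushes 2.25x the pixels of 1440p
print(pixels["1440p"] / pixels["1080p"]) # ~1.78 -> 1440p pushes 1.78x the pixels of 1080p
print(pixels["4K"] / pixels["1080p"])    # 4.0   -> 4K pushes exactly 4x the pixels of 1080p
```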

But what if you want a 4K or 1440p display for general use—for work and maybe movies—but you also want to play games on it? Does it look worse to play at 1080p on a 4K or 1440p display than if you simply used a 1080p monitor instead? The answer is yes, it does look a bit worse (mostly it's blurrier), but for many people it doesn't matter that much. Disclaimer: I'm one of those people.

Sony's GDM-FW900 was one of the last great CRTs
(Image credit: Sony)

Back when we all used CRTs, running at a lower resolution than your monitor's maximum was commonplace. However, CRTs were inherently less precise and always had a bit of blurriness, so we didn't really notice. I hated dealing with pincushion distortion, trapezoidal distortion, and the other artifacts caused by CRT technology far more than any added blurriness from running at a lower resolution.

When we shifted to LCDs and digital signals, suddenly every pixel sat on a fixed grid of perfect squares, and running at anything other than the native resolution presented more visible problems.

Consider a simple example: a 160x90 image with a diagonal black stripe running through it, stretched to 256x144. Because 256/160 (and 144/90) works out to 1.6, a non-integer factor, source pixels can't map cleanly onto destination pixels, so the image has to be resampled. One option is nearest neighbor interpolation; the alternatives are bilinear or bicubic interpolation. There are pros and cons to each, but all of them look worse than the native image.
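If you want to recreate that example yourself, here's a minimal sketch using Python and Pillow (the filenames are my own placeholders, not assets from this article): it draws the striped 160x90 image and resamples it to 256x144 with each of the three filters mentioned above.

```python
from PIL import Image, ImageDraw

# Build the 160x90 source: a white background with a black diagonal stripe.
src = Image.new("RGB", (160, 90), "white")
ImageDraw.Draw(src).line((0, 0, 159, 89), fill="black", width=3)
src.save("stripe_160x90.png")

# 256x144 is 1.6x the source in each direction, so pixels can't map 1:1 and
# the image has to be resampled one way or another.
for name, resample in (("nearest", Image.NEAREST),
                       ("bilinear", Image.BILINEAR),
                       ("bicubic", Image.BICUBIC)):
    src.resize((256, 144), resample).save(f"stripe_256x144_{name}.png")
```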

I've taken a source image at 160x90 and blown it up to 1280x720 (using both nearest neighbor and bicubic interpolation) in Photoshop. Then I've scaled the 160x90 original to 256x144 with nearest neighbor and bicubic interpolation, and afterward blown those up to 720p via NN (so you can see what the individual pixels would look like). The result is a closeup of what "native" rendering looks like on an LCD, compared to the two different scaling algorithms for non-native resolutions.
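For anyone without Photoshop, a rough Pillow equivalent of those steps might look like this, picking up the files saved by the sketch above (again, the filenames are mine, not the article's actual images).

```python
from PIL import Image

# "Native" reference: the 160x90 source blown straight up to 720p (an exact 8x).
src = Image.open("stripe_160x90.png")
src.resize((1280, 720), Image.NEAREST).save("reference_native_nn.png")
src.resize((1280, 720), Image.BICUBIC).save("reference_native_bicubic.png")

# Non-native simulation: the 256x144 rescales blown up to 720p (an exact 5x)
# via nearest neighbor, so each scaled pixel remains clearly visible.
for name in ("nearest", "bicubic"):
    small = Image.open(f"stripe_256x144_{name}.png")
    small.resize((1280, 720), Image.NEAREST).save(f"closeup_256x144_{name}.png")
```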

LCDs and video drivers have to take care of the interpolation, and while the results can look decent, there's no denying the deficiencies of both nearest neighbor and bicubic scaling. NN results in some pixels getting doubled and others not, while bicubic scaling causes a loss of sharpness. I intentionally started with an extreme example using black on white—with game images, it's far less problematic. The other factor is what resolution you're using relative to native.
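To see why nearest neighbor gets uneven at a non-integer stretch, you can count how often each source pixel is reused. A simplified sketch of the 160-to-256 case (not any driver's actual resampling code) shows 96 of the 160 source columns get drawn twice while the other 64 appear only once, which is exactly the doubled-pixel effect described above.

```python
from collections import Counter

# Map each of the 256 output columns back to one of the 160 source columns,
# roughly what a nearest-neighbor 1.6x stretch does (simplified: real scalers
# add a half-pixel offset, but the unevenness is the same).
counts = Counter(x * 160 // 256 for x in range(256))

values = list(counts.values())
print(values.count(2), "source columns drawn twice")  # 96
print(values.count(1), "source columns drawn once")   # 64
```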

Running 1080p on a 4K display means rendering one fourth as many pixels—exactly half the native width and height. If your graphics card drivers support integer scaling, you can simply double the width and height and get a "sharper" picture. Intel and Nvidia now support integer scaling, though it requires an Ice Lake 10th Gen CPU for Intel (laptops) or a Turing GPU for Nvidia. Otherwise, you get the bicubic fuzziness that some people dislike.
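As a quick sanity check on that math, here's a tiny helper (my own illustration, not any vendor's driver API) that reports when integer scaling can apply at all: the native resolution has to be an exact whole-number multiple of the render resolution in both dimensions.

```python
# Returns the whole-number scale factor if one exists, else None, in which
# case the GPU or display has to interpolate instead.
def integer_scale_factor(render, native):
    rw, rh = render
    nw, nh = native
    if nw % rw == 0 and nh % rh == 0 and nw // rw == nh // rh:
        return nw // rw
    return None

print(integer_scale_factor((1920, 1080), (3840, 2160)))  # 2    -> clean 2x2 pixel doubling
print(integer_scale_factor((1920, 1080), (2560, 1440)))  # None -> 1.33x, must be filtered
```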

It's perhaps better to show what this looks like with real resolutions, like 1080p scaled to 1440p and 4K, using integer scaling (nearest neighbor) vs. bicubic filtering. Integer scaling is a great feature for pixel art games, but it's often less important (and perhaps even undesirable) when dealing with other games and content.

Why am I talking about integer scaling and various driver filtering techniques when I started by discussing playing games at lower-than-native resolutions? Because the two topics are intertwined. Integer scaling can sometimes be used while running at lower resolutions, if you have the right hardware, or you can pass the signal along to the display and let it handle the scaling. Most displays these days do some form of bicubic upscaling. Some really old LCDs did nearest neighbor interpolation instead, which could look horrible (e.g., when scaling high-contrast content like the black-line-on-white example above), and that approach has largely disappeared.

If you look at the above images, you can definitely see the differences between scaling modes, and when you're not running at your display's native resolution you'll usually end up with some form of interpolation—either from the GPU or from the display's scaler. Running at native resolution is almost always the best option, provided your hardware can deliver playable performance. But if you have a 1440p or 4K monitor, that often won't be the case.

So why not just buy a 1080p display? Because most people don't use their PCs purely for gaming. If you've used a 1440p or 4K display for web browsing and office work, you know how much nicer the extra screen real estate is: having multiple windows open side by side makes you more productive, and you quickly get used to it. Dropping back down to 1080p on the desktop can be painful. And the higher your display resolution, the less likely you are to notice scaling artifacts in games—the pixels become small enough that you simply don't see them.

In short, while running at your display's native resolution is ideal, it's often not practical for games. 1080p remains ubiquitous for a reason. However, many people still want a higher resolution display for other tasks. If you have a 4K monitor and end up running games at 1440p or 1080p, unless you've got really sharp eyes you probably won't notice after a bit—especially when the game is in motion. And if you do notice and it's too bothersome, I guess you can either upgrade your graphics card or buy a second monitor.

Jarred Walton

Jarred's love of computers dates back to the dark ages when his dad brought home a DOS 2.3 PC and he left his C-64 behind. He eventually built his first custom PC in 1990 with a 286 12MHz, only to discover it was already woefully outdated when Wing Commander was released a few months later. He holds a BS in Computer Science from Brigham Young University and has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.