Have you ever had to downgrade after an upgrade?

Agent 47 on a tiny computer
(Image credit: Eidos Interactive)

"I can never go back to seeing 60 images per second", a member of the PC Gamer staff who shall remain nameless recently said. "From here on over 100 images must be shot at my face each second or I will whine and complain. In 10 years I will refuse to use anything less than 1000Hz." While that was said with tongue in cheek, the sentiment is relatable. It's hard to go back after an upgrade. There are some conveniences you get so used to that a step backward seems unthinkable, like cleaning plates with your actual human hands after owning a dishwasher, or seeing a mere 1080p after you've upgraded to 1440, or 4K, or something well out of my price range.

Have you ever had to downgrade after an upgrade?

Here are our answers, plus some from our forum.

Wes Fenlon, Senior Editor: I've been a wireless mouse advocate for a good number of years now: they no longer have the performance issues they did a decade ago, and battery life has gotten much better. I'm still using the faithful Logitech G900 I got in 2016, and I maintain it's the best gaming mouse ever designed. But for the past six months I've shamefully been using it plugged in all the time, because I misplaced the wireless receiver. After years of flying free, I've shackled myself to the earth once again.

It's fine, you know. There's nothing really wrong with a wired mouse. It's just nice not to have to ever worry about the cable getting snagged or collecting a whole bunch of dust or getting in the way of the knick-knacks on my desk. It hasn't quite bothered me enough to figure out what the hell I did with that receiver or order a replacement, but the thought does sit in the back of my head and occasionally surface when I'm playing games: I could have it better. I did have it better. Someday I'll upgrade to Logitech's wireless charging mousepad and never have to worry about plugging my mouse in ever again.

Destiny 2

(Image credit: Bungie)

Phil Savage, Editor-in-Chief, UK: I played Destiny 2 on PS4 for years—shackled to the platform despite the PC launch, because I didn't want to lose my progress to that point. When cross-save arrived, I was finally free to upgrade. Except cross-save was introduced in 2019 and full crossplay wouldn't arrive until 2021. That meant for almost two years, I regularly had to dip back into the PS4 version to play with friends who didn't make the jump to PC—and it sucked. On PC, I'm running the game at 144 fps, on a 1440p monitor. On PS4, there's a 30 fps cap. Before I upgraded, it felt bearable. It wasn't as smooth as the shooters I was playing on PC, sure, but I didn't have a direct 1:1 comparison of how much better it could be. After I found out, I couldn't go back: it felt like a lurching, ugly mess. It was an unpleasant space to be in.

Compounding the problem, once I'd got used to playing the game on mouse and keyboard, my abilities with the controller—honed over years across both Destiny 1 and 2—completely disappeared. I was playing worse in a game that looked worse. In the end, I just stopped going back altogether, friends be damned.

EPOS GSP600 sale

(Image credit: Epos)

Chris Livingston, Features Producer: I've bought a couple different pairs of quality gaming headsets over the past few years, but I always go back to just using cheap old wired earbuds. I just don't like having giant cans clamped over my ears, and while blotting out the noise of everything but the game is nice, I don't really like being shut off from important sounds happening in my house (when you have pets, it's useful to know when they're destroying something or trying to kill each other). And I don't play competitive multiplayer games where I need crystal clear sound to hear enemy footsteps. I'm usually doing stuff like baking cookies in a city builder or baking cakes in a farming simulator or deep-frying everything in a restaurant game. Earbuds are just fine for that.

Evan Lahti, Global Editor-in-Chief: When I switched to an ultrawide 3440x1440 display a year or two ago, I shelved my 16:9 TN panel gaming monitor. First-world PC gamer problems, I know.

But as an FPS player, leaving behind the TN panel did have a noticeable impact—as we explain in this guide to monitor tech, VAs aren't as fast as TNs, and ghosting can be an issue. Still, the bigger real estate has been worth it, and I much prefer running one big desktop canvas over staring down a Dell double-barrel every day.

Person holding Razer DeathAdder V2 Pro mouse

(Image credit: Razer)

Andy Chalk, NA News Lead: I decided some years ago that I wanted a "better" mouse. Something with angles and buttons and funky lights that would set me apart as a True Gamer. I tried a few different models—the Razer DeathAdder, a Logitech G-something, one from SteelSeries I think—and they were good! They were nice. Perfectly fine. And as each eventually gave up the ghost in one way or another, I'd think about my old, timeless Logitech MX518, the mouse that for a while I thought might even outlive me: Simple, unassuming, utterly reliable.

I eventually decided to pick up one of the new MX518s—Logitech brought them back in 2019—and stepping back from those swanky gamer mice to this much more "basic" unit has only deepened my appreciation for it. It fits right, it feels good, it's got enough buttons, and the black-on-black colour scheme is perfectly unobtrusive—I can barely see it when I'm not using it. Less is more, as the saying goes. I still feel an urge to try something fancy every now and then, but I think that this is probably it for me.

From our forum

Colif: I intentionally downgraded from a 4K monitor to 2K as well... desktop icons are really hard to read at 4K, so even Windows defaulted scaling to 150%. I have a 2K monitor now, so at least games don't ignore Windows scaling and load in 4K native.

I think using an HDD now would drive me mad. Start the PC, walk away for a few minutes while it loads the desktop—compared to NVMe, where it's start the PC, wait a few seconds, and then log on.

Refresh rate is silly, I expect people will still imagine they can see flicker at 1000Hz or more. Much of it's in their head, and if displays at 2000Hz show up, they'll wonder whether that would be better. People would probably still see flicker at infinite refresh rates.

(Image credit: THQ)

Brian Boru: A few short spells while traveling, I had to make do with one monitor—oh the horror! On a similar note, I still chafe at the lack of dual monitor support in almost all games—I so enjoyed that in Supreme Commander.

WoodenSaucer: I think people probably downgrade their expectations more than their hardware. Like people spouting off about high frame rates and resolutions, and then RTX comes out and throws a wrench in that. New, cutting-edge effects tend to keep us held back all of the time in the resolution and frame rate departments. Rather than getting games to run at beastly speeds on modest hardware, they want to keep giving us new eye candy, and keep us crippled.

Far Cry 6

(Image credit: Ubisoft)

DXCHASE: I upgraded my GPU (3070 Ti), but it went into a system that's roughly 7 years old. And I know I get a crazy bottleneck in some games (Far Cry 6). Does that count?

Zloth: I never had to do that—but I can see how I could. If my video card died right now, I would need to get a new one fast. If the stores didn't have a comparable one, I would need to get something cheaper to keep using until I could get a good one. Quite a bit cheaper, given that I wouldn't be using it for long.

Pifanjr: I've been making do with one monitor (a laptop one, even!) since I started working from home. It hasn't bothered me much, though once I return to two monitors I'll probably enjoy all the extra room.

We do have a desk, but it was buried under stuff upstairs in our old apartment and is buried now in the living room in our new apartment. That should be temporary until we can get some more cabinets to put stuff in though.

ZedClampet: The closest I've come to this is dropping the resolution to 720p on an old laptop to get a game running acceptably, but I've never actually gone backward in hardware. I tend to keep my hardware far longer than your average PC gamer, so when I need something new, it usually isn't even possible to downgrade. Well, anything is possible these days with people selling 20-year-old computer parts on eBay, but I've never purchased used hardware, either.

A man with a grenade for a nose

(Image credit: CD Projekt)

Sarafan: While I thankfully haven't had occasion to downgrade in terms of hardware, I often play older games. My PC is quite capable, but that doesn't stop me from returning to titles that were released many years ago. So basically I'm a witness to graphical downgrade on a daily basis. One day I can play marvelous-looking Cyberpunk 2077, while the next I enjoy pixelated Quake 1. And I don't have a problem with that. I can switch at any moment from a visually stunning game to some old gem that was released before 3D accelerators became a thing. I adapt to the situation in a matter of minutes. Switching from 60 fps to 30 fps is a little more painful, but it's only a matter of time to get used to it. I don't even mind the blurry image caused by running a game at a resolution lower than my monitor's native resolution. Looks like I'm bulletproof when it comes to graphical downgrades...

PC Gamer

The collective PC Gamer editorial team worked together to write this article. PC Gamer is the global authority on PC games—starting in 1993 with the magazine, and then in 2010 with this website you're currently reading. We have writers across the US, UK and Australia, who you can read about here.