Today's GPUs are supremely capable and graphics in the latest games are utterly spectacular, so why do I not get the same wow factor that I got 26 years ago?

A screenshot of the Steel Nomad graphics test in 3DMark
(Image credit: UL Benchmarks)
Nick Evanson, Hardware writer

This month I've been testing: Ghost of Tsushima. Nixxes has done one heck of a job porting it to the PC, especially its support for the PS5 DualSense controller. Oh, and I've also been delving into a Ryzen 7 5700X3D as an upgrade for a 5600X. More on this soon.

Last week, UL Benchmarks released Steel Nomad, a new graphics test for 3DMark, with the ambition that it will eventually replace Time Spy Extreme as the most-used benchmark for GPUs. I've been running it a fair bit of late on a variety of systems, but I first tried it briefly before launch, while collating a performance analysis of Ghost of Tsushima.

Steel Nomad and Ghost of Tsushima both have superb graphics, whether through a combination of the very latest rendering techniques and high-resolution assets or because the art direction is top-notch.

But as I was writing up the analysis, it got me thinking about the days of the Final Reality benchmark, the precursor to 3DMark, and the first Unreal game. Back then, I had an Intel Pentium II 233 MHz gaming PC, with an Nvidia Riva TNT paired with a 3dfx Voodoo2 graphics card.

By modern standards, it was all incredibly basic stuff, but I used to get goosebumps seeing the use of multi-texturing and lighting in Unreal. Final Reality was far less pretty than some examples from the '90s demo scene, but it was exciting to see how well my PC could run it.

Each year, new benchmarks and games would get released and raise the graphics bar to another level. Like so many of the PC crowd I hung around with then, I found the Nature test in 3DMark2001 to be a real 'wow' moment, as was the Advanced Pixel Shader test.

A screenshot of the Nature test in the 3DMark2001 benchmark tool

(Image credit: UL Benchmarks)

I couldn't believe that such graphics were possible on everyday hardware, and at a fair frame rate too. Successive generations of GPUs offered ever more features, and both ATI and Nvidia would make demos to show off what their chips could do.

Fast forward to now and you have mid-range graphics cards capable of achieving high frame rates at high resolutions, all without getting too stressed. Some of the more recent games we've seen possess visuals of such detail and intricacy that they wouldn't look out of place in a movie or TV show from just a few years ago. But as much as I like them all, none of them gives me quite the same feeling the Nature test or the Voodoo2 running Unreal gave me.

Even the introduction of real-time ray tracing in games didn't move me like seeing per-pixel water reflections for the first time. Cyberpunk 2077, with all the bells and whistles turned to their maximum values, looks staggering and it's a great benchmark for grinding any GPU to dust.

And yet, while I admire its technical achievements, I haven't spent anything like the time I used to spend in old games just staring at the graphics.

Hogwarts Legacy fast travel

(Image credit: Portkey Games)

My partner has been gaming for most of her life, but I've recently introduced her to PC gaming. Her current game of choice is Hogwarts Legacy, and our reactions to the textures, lighting, and overall details couldn't be more different. Where I felt the developers did a good job of capturing the whole Harry Potter vibe but fell short of giving it great graphics, her opinion has been the complete opposite.

"Have you seen this? Just look at that! Wow, that is so cool…"

That's the kind of giddy excitement I had with 3DMark2001, Unreal, Quake III Arena and countless others, so I don't think my subdued feelings have anything to do with the poor environment readability of many of today's games, where artists pack so much detail into the world that scenes become too busy and too complex for any one aspect to really stand out.

So I guess it must come down to familiarity. Graphics and games have been a part of my life, either professionally or merely for entertainment, for over 40 years. While it's certainly not a case of 'familiarity breeds contempt' or the like, I suspect that it's harder to surprise someone who's seen so much of it, for so long.

Don't get me wrong, though. I'm thoroughly enjoying Ghost of Tsushima, both its graphics and gameplay. I'm also really looking forward to seeing what AMD, Intel, and Nvidia will do in their next generation of GPU architectures, even though I know there won't be any significant breakthroughs in terms of design, performance, or features. I know full well that the days of seeing a 50%+ increase in rendering power between successive chip releases are long gone, just as they are for single-thread performance in CPUs.

A screenshot from Ghost of Tsushima, showing a wide landscape, full of trees and hills, with the setting sun casting a red hue across the view

(Image credit: Sony/Sucker Punch)

Advances in hardware and software technology have pushed chip makers towards fairly homogeneous designs, and while there are still some fundamental differences between an AMD and an Nvidia GPU, they mostly concern things like shader occupancy or cache hierarchies: stuff that affects overall performance rather than what the GPU can or can't do.

Pick up any new graphics card and it'll fully support the Direct3D and Vulkan graphics APIs, a far cry from the early days of GPUs.

What transpired 26 years ago was ground-breaking and both game developers and hardware engineers were constantly stepping into uncharted territory. Explorers of a new world, so to speak. I guess it's just not new for me anymore and although I can enjoy the easy living in this world that coders and engineers have made, I can't experience the wonder of seeing it new for the first time.

But watching my partner beam with delight upon seeing a glorious 3D environment, replete with meshes, textures, lights and shadows, I know that there will always be fresh arrivals in this world who've yet to fully experience what it has to offer.

And that encourages me greatly, even though the entry fee to this PC Wonderland is as high as it has ever been. Games and graphics might not be a quantum leap better in another 26 years, but I can't wait to see what they'll be like—because they'll still give someone that 'wow' factor.

Nick Evanson
Hardware Writer

Nick, gaming, and computers all first met in 1981, with the love affair starting on a Sinclair ZX81 in kit form and a book on ZX Basic. He ended up becoming a physics and IT teacher, but by the late 1990s decided it was time to cut his teeth writing for a long-defunct UK tech site. He went on to do the same at MadOnion, helping to write the help files for 3DMark and PCMark. After a short stint working at, Nick joined Futuremark (MadOnion rebranded) full-time as editor-in-chief of its gaming and hardware section, YouGamers. After the site shut down, he became an engineering and computing lecturer for many years, but missed writing. Cue four years at and over 100 long articles on anything and everything. He freely admits to being far too obsessed with GPUs and open-world grindy RPGs, but who isn't these days?