A futuristic chopper passing over the Lingshan Islands by moonlight. Gruff men in nanosuits prepare to drop into the combat zone below. After quoting half of Predator at each other, they jump. You jump with them. You reach the sands after a speedier descent than planned, turn your flashlight on, and press W. Then—then I don't know. Nobody does.
At least, nobody did in 2007 when Crysis first came out, because this moment in the opening level was about as far as we could stand to endure at 7fps. The entire rest of Crytek's innovative sandbox shooter was a complete mystery. We'd been told to expect high system requirements and a rough ride for our GPUs but truthfully, we didn't really believe that running Crysis on even the lowest settings would be such a struggle. Or that attempting to do so on max settings would be impossible. All the stern talk about its technical demands was probably just cautionary bluster, many of us thought, intended for students who thought they could run it on their Lenovo Thinkpads during lectures. And then we ran to the shops, bought a boxed copy on release day, ran home again, entered our CD keys in the install menu like overgrown, far less cute children on Christmas morn, whacked the graphics up to max, and cried into our hands at the slide show on our monitors.
This was certainly my experience, and the solution was obvious: I needed a better PC. This was the platform's next seminal title, a seismic moment like Doom or Half-Life, and I would not be witnessing it on medium settings or at sub-30fps. That would be like getting married in a Burton suit or watching the World Cup final in 144p. Like a great many gamers in 2007, I was not a man of means. Having just graduated university, I'd taken the cunning decision to work my way back down through the education tiers and enrolled at the local college. Three night shifts per week at the nearby Sainsbury's, stacking shelves on the meat aisle, accounted for 100% of my total income. So I did what anybody would do: I upgraded my RAM and hoped for the best.
It's been a truism since the neolithic era that RAM offers the best bang-for-buck ratio of all potential upgrades, and that was perhaps truer in the noughties than it's ever been. DDR2 was making its way into systems, and a 2x1GB kit of it at decent speeds would set you back a mere £60—a single night shift's worth of toil for this upgrader. I went from 2GB to 4GB, clicking the new modules into place and fully expecting a playable Crysis to be waiting for me when I got the case panel back on.
There was a noticeable improvement in performance, and in retrospect, knowing that the very best PCs of the late 2010s still struggled to hold Crysis to a firm 60fps in some areas, it's remarkable that this upgrade was even perceptible. But I'd saved up for that RAM and I wanted more than a few extra token frames here and there. The graphics card would have to go next. The GeForce 8800 GTX that sang like a bird when you loaded Team Fortress 2 or World in Conflict was no longer fit for purpose.
It wasn't until the following summer of 2008 that I found the £400 for a GTX 280, Nvidia's next generation of DirectX 10 card. It cost the majority of my pay packet. I'd squirrelled a bit away, month by month, and finally the new silicon was inside my PC. By that point I'd tried every combination of graphics options in Crysis' menus, trying to strike a magical balance of performance saving and gorgeousness that couldn't be found. I'd overclocked my Core 2 Quad Q6600 for all it was worth, and squeezed the last drop out of my RAM speeds. There was nowhere else to look. This GPU had to do it.
It was probably cognitive dissonance that led me to become satisfied with the result: part knowing that running Crysis in the mid-20s on high settings didn't justify the money I'd just spent on upgrades, part certainty that I'd never spend another penny in the name of this accursed game. On balance, I just accepted the outcome and finally, a year after release, began to play beyond Crysis' opening five minutes.
This was, of course, one of PC gaming's great unifying experiences, our Woodstock. Dylan going electric, except it was the consumer, not the artist, being sold out. CryEngine's lighting, rendering, and postprocessing techniques continued to confound hardware for a further decade, and in the end, was it even that good once you got off the beach and started fighting aliens? Irrelevant: the real game was trying to run the game, and that's now deep in our culture.
Phil 'the face' Iwaniuk used to work in magazines. Now he wanders the earth, stopping passers-by to tell them about PC games he remembers from 1998 until their polite smiles turn cold. He also makes ads. Veteran hardware smasher and game botherer of PC Format, Official PlayStation Magazine, PCGamesN, Guardian, Eurogamer, IGN, VG247, and What Gramophone? He won an award once, but he doesn't like to go on about it.