In a perfect world, the hardware experts at PC Gamer would accompany you on a shopping trip to pick up your next graphics card. We'd happily share our experience and tell you what to watch out for, what to avoid, and what you need from a GPU to squeeze the highest number of frames per second out of your gaming rig. Then again, would you really want to spend an afternoon with our posse of hardware-obsessed game addicts? The good news is you can receive the same benefit by reading our new buyer's guide below. When you're done, you don't even have to shake our clammy, mouse-worn hands.
AMD are in a strong position right now, thanks to the presence of their GPUs in both of the 'next-gen' consoles. Yesterday, they revealed the next step in 'Operation: Make All The Graphics', which I assume is their codename for the global graphical domination they're so clearly chasing. It's called Mantle, and it's a new low-level API that gives developers direct access to GPUs using AMD's Graphics Core Next architecture.
PC graphics have come a long way since the year 2000, when Deus Ex was our Game of the Year, people were emerging from panic about the breakdown of Western society, and the Lord of the Rings was still but a wonderful book. The YouTube channel Perfect Hand Videos has released a compilation of more than fifty tech demos from the decade-plus since, showcasing the progression of particle effects, shaders, and shadows. It's sort of like that chart of evolution from apes to humans, but with polygons.
Recently I’ve been spending my days (and some fevered nights’ dreams) benchmarking pretty much every GPU from this latest generation of graphics cards. One card has stood out as the absolute best. We’ve been recommending the standard 2GB HD 7850 as the go-to graphics card of today, as its combination of impressive price point and brilliant 1080p performance is tough to argue against. But I’ve been playing around with the slightly-hobbled 1GB version recently and I’ve got to say I barely notice the difference from the halved frame buffer.
You’d think that in the high-resolution stakes, where the 2GB HD 7850 performed surprisingly well itself, this 1GB version would struggle. But the GPU itself taps out before the lack of graphics memory actually has an impact on performance.
For £120 then the HD 7850 1GB is an absolute bargain. So, what if you get a pair of them in your rig?
Graphics card and motherboard manufacturer EVGA is offering European customers the chance to upgrade any product they buy from it for up to six months after purchase. As part of its standard Step-Up program, folk that register their new EVGA graphics card, or mobo, have the option to trade in for a better model and just pay the difference.
Normally you’re only able to take advantage of the service for 90 days after purchase and only if you’ve bought into the extended warranty, but EVGA is upping the ante and doubling that to 180 days and waiving the warranty shenanigans. The offer is only running until the end of December, but that does mean you’re looking at being able to swap out your graphics card anytime up until June.
Should you happen to be in the market for a $500/£420 graphics card, AMD has launched a new contender for your cash today. I say new, but it's more a makeover: the Radeon HD7970 Gigahertz Edition is physically identical to the current AMD flagship, the HD7970, with a couple of minor tweaks to earn it that big-sounding suffix.
The first is, as the name suggests, an increase to the core clockspeed from 925MHz to 1000MHz. To go with that, the 3GB of memory has been accelerated to a full six gigahertz equivalent speed, putting it on a par with NVIDIA's top-end GTX 680. More interesting, however, is the fact that AMD has also caught up with NVIDIA by introducing a feature that boosts the HD7970's clocks depending on processing load and chip temperature, something the latest GeForce cards were widely praised for.
Ubisoft's DRM isn't exactly known for its gentle, loving caress in matters near and dear to PC gamers' hearts, but the latest tightening of the cuffs seems a bit overkill-ish even by Ubi's standards. In attempting to review Anno 2070's performance on a range of hardware configurations, Guru3D made an extremely disappointing discovery: The second the site switched out a GTX 580 for a GTX 590, Anno demanded another, separate activation. On top of that, the game gives you a whopping three whole activations to work with, so think carefully before spelunking around in your machine's brittle innards.
I've fired off a mail to Ubisoft asking whether this is an intentional piece of extra armor plating for its DRM Voltron, or merely a glitch the publisher plans on patching out. Fingers crossed for the latter, though precedent's not exactly on our side.
AMD's dropped an almost unexpected Christmas present into our laps this morning: the launch of the company's latest flagship graphics card, the Radeon HD7970. As well as stealing the 'fastest single chip graphics card' title back from NVIDIA for the time being, the HD7970 is the first card manufactured on a microscopic 28nm process and is the first to use the all-new 'Graphics Core Next' (GCN) architecture.
But what does that mean, and is it any good for gaming?
I often get emails asking for advice on upgrades, and I try to answer most of them as quickly as I can, but one that came through the other day raised a problem that I imagine is more common than you'd think.
The writer wanted to know what the best graphics card would be for his motherboard, and proceeded to list all the bits and bobs inside his PC. Some of them were nearly ten years old. Two things were immediately obvious from his mail. Firstly, that a graphics upgrade alone wasn't going to get Crysis 2 running at full speed. Secondly, that he'd obviously made a mistake identifying his components. According to the email he was running an Athlon FX chip from the middle of the last decade with a Pentium 4 motherboard circa 2001.
Since he also reckoned he was using two GeForce graphics cards in SLI configuration, I surmise that the writer is probably right about the chip, wrong about the mobo (since that predates SLI technology). Or that it was someone deliberately being silly and trying to catch me out.
The serious question the story raises, though, is how do you know what motherboard is inside your machine, and what it's compatible with when you come to upgrade?
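If you'd rather not open the case, the firmware usually knows the answer already. On Windows, running `wmic baseboard get manufacturer,product` at a command prompt will report the board's maker and model; on Linux, the same DMI identifiers are exposed as plain text files under sysfs. A minimal sketch of the Linux approach (the sysfs path is standard, but the fallback value here is my own choice):

```python
# Read the motherboard vendor and model from the DMI identifiers
# that Linux exposes under sysfs. Falls back to "unknown" if the
# files aren't present (e.g. on a non-Linux system or a locked-down VM).
from pathlib import Path

def board_info():
    base = Path("/sys/devices/virtual/dmi/id")
    info = {}
    for key in ("board_vendor", "board_name"):
        path = base / key
        info[key] = path.read_text().strip() if path.exists() else "unknown"
    return info

if __name__ == "__main__":
    print(board_info())
```

Armed with the vendor and model string, a quick search of the manufacturer's site will tell you the socket, chipset, and expansion slots you have to work with before you commit to a new graphics card.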