Illustrations by Marsh Davies
All week long, we're peering ahead to what the future holds for the PC gaming industry. Not just the hardware and software in our rigs, but how and where we use them, and how they impact the games we play. Here's part four of our five-part series; stay tuned all week for more from the future of PC gaming.
We dream of futuristic graphics cards with chrome Hot Rod piping and names as cool as The Pixelator. In reality, future graphics cards won't be human-sized or be styled after 1950s automobiles, but they will be faster than what we're running today. More importantly, APIs like AMD's Mantle will let our computers talk directly to our graphics cards, delivering better performance through more efficient coding. And we're going to need that performance, since 4K monitors are already on the horizon. Here's our look at the 2014 GPU landscape and the future of (entirely too expensive) 4K displays.
By Josh Norem
As we head into 2014, the GPU landscape looks a bit like the beaches of Normandy post-invasion. Nvidia has launched all of its 700-series cards, including the reigning champion GPU, the GTX 780 Ti. AMD is seemingly finished launching its “Titan killer” Radeon R9 290 cards, which posed a serious challenge to Nvidia's supremacy. The company has also finally released its Mantle API update for its R9/R7 series cards in beta driver form, which promises better performance in select games like Star Swarm and Battlefield 4. Where do we go from here?
Nvidia has now launched its all-new line of Maxwell GPUs featuring a totally redesigned architecture. Though Maxwell uses the same 28nm lithography process as Kepler, Nvidia's goal was to increase performance by 25 percent while cutting power consumption in half, and it's largely achieved it. Oddly, Nvidia is launching Maxwell with entry-level GPUs priced between $120 and $150 instead of a flagship board, and both models, the GTX 750 Ti and the GTX 750, consume an unbelievably low amount of power at just 60W and 55W. For comparison, the $150 AMD R7 265 consumes 150W, so Nvidia has a major advantage over AMD at 28nm when it comes to power consumption. Nvidia has stated that it plans to bring a flagship Maxwell board to market at some point, but we have no idea what the time frame for it is.
Meanwhile, Nvidia is strengthening its grip on the high-end of the market with the recently released GTX Titan Black GPU. This is truly “Big Kepler” in that it is the ultimate Kepler card, and represents that architecture's final form. Compared to the OG Titan it has one more SMX unit, higher clock speeds, and faster memory, but the same $1,000 price tag.
AMD's upcoming plans are more mysterious. There are rumors that its R9 290X has a couple hundred unlocked streaming processors, so it's possible AMD will uncork the board's potential. It's also possible that it will release a Hawaii-based dual-GPU board, code-named Vesuvius. Such a card would likely throw down some serious firepower, and require cooling beyond what a reference design could handle. It might even be water-cooled right out of the box, like the Asus Ares II. It's also possible that AMD could abandon the high-end market altogether in favor of its GPU/CPU hybrid chips, dubbed APUs. These APUs power both the Xbox One and the PS4, and the market is clearly heading in the direction of low-power chips that can drive tablets, laptops, and smartphones.
Beyond the cards themselves, both companies will be pushing hard in 2014 for adoption of 4K resolution and variable refresh rate technologies. 4K requires a beefy setup (think dual GTX 780s or R9 290X boards), but with the recent introduction of sub-$1,000 4K panels, the market could be ready. Nvidia has announced its G-Sync technology, which eliminates tearing in games by syncing the monitor's refresh rate with the output of the GPU. AMD's version, FreeSync, works by telling the GPU to vary the refresh rate. Of course, existing monitors can't change their refresh rate, but if this technology catches on, perhaps display manufacturers will add the feature to their products. It's too early to tell which of these technologies will win, but they could have a huge impact on the future of video.
By Gordon Ung
Hardware vendors are ready to sell you the next big thing: 4K, or Ultra HD. If you haven't been keeping up, 4K monitors display around 8.3 million pixels. To put that in perspective, a typical 23-inch 1920x1080 panel has about 2 million pixels, and a 27-inch 2560x1440 panel has about 3.7 million.
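The pixel counts above are simple width-times-height arithmetic, and they're worth seeing side by side:

```python
# Total pixel counts for common monitor resolutions.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "4K/UHD (3840x2160)": (3840, 2160),
}

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels (~{pixels / 1e6:.1f} million)")

# 4K pushes exactly four times the pixels of 1080p.
ratio = (3840 * 2160) / (1920 * 1080)
print(f"4K vs. 1080p: {ratio:.1f}x the pixels")  # -> 4.0x
```

That 4x figure is why the GPU requirements discussed below climb so steeply at 4K.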
Increasing the resolution of a monitor offers some very real perks. The first is increased pixel density. Anyone who has picked up a 5-inch smartphone with a 1080p screen knows how wonderful a high dots-per-inch screen can look. With monitors, you get the same exquisite pixel-packed screen. These high PPI screens simply make high-resolution images sing like they can't on a low PPI panel. The higher resolution also gives you more desktop space to work with.
That's all great, but what about PC gamers? That's where it gets tricky. Certainly the increased PPI helps with visuals. There's an argument that 4K televisions are pointless because you sit so far away that the increased resolution doesn't buy you much; you'd have to sit right up on the screen for the 4K to matter. If you're eyeball-to-eyeball with a 32-inch 4K panel, you'll notice the higher PPI during gaming sessions, especially if you're coming off of a lower resolution, lower PPI 1080p panel. If, however, you're thinking about moving from a higher resolution 27-inch 2560x1440 panel, it won't rock your world the same way.
Unlike TV makers, PC gamers don't suffer from a dearth of content; the vast majority of today's games should support 4K. The bad news is that there's still a lot in the negative column. The most obvious item is that you'll need a bigger, badder graphics card. If you can comfortably play games with your current GPU on a 1080p monitor, you'll probably find your system gasping for air when you quadruple the number of pixels the graphics card has to push to hit reasonable frame rates.
Games less sensitive to framerate, such as driving or flying simulations, won't be as bad as twitch shooters that demand 60 fps or higher. Even with today's extremely powerful graphics cards, we'd recommend a dual-card setup to run 4K gaming at a solid 60 fps without turning down graphics settings. There's no reason to buy an expensive 4K panel, only to bottom out all the graphics settings to play.
There's also technical nitty gritty, such as how you hook up a 4K panel to your PC. You'll need DisplayPort 1.2 or HDMI 2.0 to support 60Hz gaming at 4K, but no GPU we know of supports HDMI 2.0 today.
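A back-of-the-envelope bandwidth check shows why the cable matters. This sketch assumes 8-bit RGB (24 bits per pixel) and a rough 20 percent blanking overhead; the link rates are the commonly cited effective data rates for DisplayPort 1.2 (HBR2, four lanes) and HDMI 1.4:

```python
# Rough bandwidth needed to drive 4K at 60Hz with 8-bit color.
width, height, refresh, bits_per_pixel = 3840, 2160, 60, 24
blanking_overhead = 1.2  # reduced-blanking timings still add roughly 20%

needed_gbps = width * height * refresh * bits_per_pixel * blanking_overhead / 1e9

dp12_gbps = 17.28   # DisplayPort 1.2 effective rate (HBR2, 4 lanes, after 8b/10b)
hdmi14_gbps = 8.16  # HDMI 1.4 effective rate

print(f"4K60 needs roughly {needed_gbps:.1f} Gbps")
print(f"DisplayPort 1.2: {dp12_gbps} Gbps -> {'enough' if dp12_gbps >= needed_gbps else 'too slow'}")
print(f"HDMI 1.4:        {hdmi14_gbps} Gbps -> {'enough' if hdmi14_gbps >= needed_gbps else 'too slow'}")
```

The roughly 14 Gbps requirement lands comfortably under DisplayPort 1.2's ceiling but well beyond HDMI 1.4's, which is why pre-HDMI-2.0 connections top out at 30Hz at 4K.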
And then there's the price of the glass. The exquisite Asus 31.5-inch PQ321Q pushes $3,000 and was $3,500 this summer. The eye-openers this year were the Asus PB287Q and Dell P2815Q: both slightly smaller monitors offer 4K resolution for an amazing $699 and $799, respectively. The catch is that neither panel is great for gaming, with refresh rates of 30Hz at native resolution. A 30Hz refresh rate is so low, it's noticeable just moving windows around the desktop. That's a deal breaker for any PC gamer.
The best compromise may be panels such as Asus' ROG Swift PG278Q, with its combination of relatively high resolution (2560x1440), a 27-inch panel big enough to fill your vision, and a wickedly high refresh rate of 120Hz. The panel also packs Nvidia's G-Sync, so even if your GPU can't push frame rates beyond 40 fps, you'll hardly notice.
4K is clearly still in the early adopter phase—too expensive, and with too many drawbacks, for us to recommend. But in two years, we may all have 30-inch 4K monstrosities sitting on our desks.