You've known it was coming, but you are not prepared. Until now you've known it only as "Kepler," but after today you'll know it as the GeForce GTX 680—the fastest and most efficient GPU that Nvidia has ever built, and one of the sexiest we've ever laid eyes upon. (There's also the matter of their new 600M series of mobile GPUs, which are capable of running games like Battlefield 3 smoothly off a slim, 9lb laptop with eight hours of battery life, but that's a whole other story to come shortly.)
So let's get right down to it.
First things first, here are the specs on the GeForce GTX 680. I hope you've got an extra set of pants available.
Yes, that's 1536 CUDA (Compute Unified Device Architecture) cores, up from 512 on the GTX 580. I believe that is what we can officially deem "massively multicore." With Nvidia's new Kepler architecture, each streaming multiprocessor (now called an SMX) contains 192 CUDA cores, six times as many as in Nvidia's previous "Fermi" architecture. What does that mean? You get twice the performance per watt. As you can see above, the 680 also achieves memory speeds of up to 6 Gbps, which, according to Nvidia, is officially the highest of any GPU in history. You'd think they'd be making a bigger deal about that.
Nvidia also claims it's the world's fastest GPU, one that trumps the AMD Radeon HD 7970 in performance, performance per watt and especially DX11 tessellation, where Nvidia claims to be 4x better than AMD.
However, while the specs and stats are indeed impressive, just how powerful is it in real-world terms? I'll give you an example. At GDC 2011, Epic Games unveiled their now-famous "Samaritan" tech demo. At the time, it was running on three GTX 580 GPUs. At GDC 2012, they ran the exact same demo on just ONE GTX 680, and I saw it with my own eyes.
Moreover, thanks to high-efficiency heat pipes, improved airflow and sound-dampening materials in the GPU fan, it's also the coolest and quietest high-end GPU Nvidia has ever produced. Nvidia claims that not only is it quieter, but the sound the card does make is more "pleasing" to the ears. I'm not sure how they came to that conclusion, but I'll allow it. We've heard it in action (or struggled to) and felt the heat (or lack thereof) ourselves. They're not lying. There's no mad science behind how they've made this thing so cool and quiet; it's simply better engineering. Impressive is an understatement.
So, it's as powerful as three GTX 580s, yet uses less power, creates less heat and, of course, makes less noise. Nvidia has provided us with this handy image to drive home these points.
Running at a 195W TDP (as opposed to the 580's 244W and the 7970's 250W), it also requires only two six-pin power connectors, which your PSU will thank you for.
Now let's get to the fun stuff, the new stuff, the cool stuff, shall we?
First off is Nvidia's new GPU Boost.
GPU Boost increases GPU performance by dynamically raising its clock speed. Since it's dynamic, there's no setup or tweaking involved; it's one of those things that just works... or so Nvidia says. Technically, it's a clocking technique that monitors per-application power draw in real time and raises clock speeds whenever there's headroom left under the card's power target. So, it watches your GPU's workload and will increase your clock speed whenever it can. How does it do that? I'm going to guess witchcraft.
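If witchcraft isn't a satisfying answer, the idea can be sketched as a simple feedback loop: measure power draw, and nudge the clock up while there's headroom under the power target. This toy sketch is purely illustrative — the step size and boost ceiling here are my own made-up numbers, not Nvidia's firmware logic (though 1006 MHz is the 680's actual base clock and 195W its power target):

```python
# Toy sketch of a GPU Boost-style feedback loop. Illustrative only:
# the step size and ceiling are invented, not Nvidia's actual algorithm.
BASE_CLOCK_MHZ = 1006   # GTX 680 base clock
MAX_BOOST_MHZ = 1058    # illustrative boost ceiling
POWER_TARGET_W = 195    # GTX 680 board power target
STEP_MHZ = 13           # illustrative clock increment

def boost_clock(current_mhz: int, measured_power_w: float) -> int:
    """Raise the clock while there's power headroom; back off when over target."""
    if measured_power_w < POWER_TARGET_W and current_mhz < MAX_BOOST_MHZ:
        return min(current_mhz + STEP_MHZ, MAX_BOOST_MHZ)
    if measured_power_w > POWER_TARGET_W and current_mhz > BASE_CLOCK_MHZ:
        return max(current_mhz - STEP_MHZ, BASE_CLOCK_MHZ)
    return current_mhz

# A light workload leaves power headroom, so the clock creeps upward
# until it hits the ceiling.
clock = BASE_CLOCK_MHZ
for _ in range(20):
    clock = boost_clock(clock, measured_power_w=170)
print(clock)  # 1058
```

The point of the sketch: because the loop reacts to measured power rather than a fixed profile, a game that doesn't saturate the card's power budget gets free extra clock speed with no user tweaking.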
Nvidia also claims that their FXAA (Fast Approximate Anti-Aliasing), which is becoming available in more and more games, is 60% faster than 4x MSAA and provides even smoother visuals.
However, the 680 also features a new type of anti-aliasing that Nvidia is calling TXAA.
TXAA is a combination of hardware anti-aliasing, a higher-quality ("film style") AA resolve and an optional temporal component for increased quality. It comes in two flavors: TXAA 1 and TXAA 2.
TXAA 1 comes at a cost similar to 2xMSAA but with edge quality better than 8xMSAA, while TXAA 2 comes at a cost similar to 4xMSAA but with quality that Nvidia says is "far beyond" 8xMSAA. We shall see.
Not every game will support TXAA right off the bat this coming year (Borderlands 2 and MechWarrior Online are among the few that already will), but you can force it via the control panel to see just how it affects your favorite games, such as Arkhamfield Infinite.
Jagged edges be damned, you're not welcome around the GTX 680!
However, while TXAA is certainly interesting, one of the coolest, if not THE coolest, features of the GTX 680 has to be Adaptive VSync. It lets your GPU dynamically enable or disable VSync based on your system's performance: when your frame rate drops below your monitor's refresh rate, the 680 switches VSync off to avoid stuttering, and once it climbs back above it, VSync kicks back in to prevent tearing. It's that easy!
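The decision rule itself is simple enough to sketch in a few lines. This is my own illustrative toy version of the idea, not Nvidia's driver internals — the function name and 60Hz default are assumptions for the example:

```python
# Toy sketch of Adaptive VSync's decision rule. Illustrative only:
# the names and the 60Hz default are mine, not Nvidia's driver code.
REFRESH_RATE_HZ = 60.0

def vsync_enabled(frame_rate: float, refresh_rate: float = REFRESH_RATE_HZ) -> bool:
    """VSync on when the GPU can keep up (prevents tearing);
    off when it can't (avoids the stutter of snapping down to 30fps)."""
    return frame_rate >= refresh_rate

print(vsync_enabled(75))  # True  -- cap at 60fps, no tearing
print(vsync_enabled(48))  # False -- let 48fps through instead of stuttering at 30
```

The win over traditional VSync: with VSync forced on, a 48fps scene gets quantized down to the next sync interval (30fps on a 60Hz panel), which is the jarring stutter this feature avoids.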
Nvidia says that this will prevent halting, disconcerting frame rate stutter that's nearly as bad as the screen tearing it's trying to avoid.
All of these new options are now available through the Nvidia control panel you've all come to know and love. (You know, the one that nags you with Windows popup messages every time new beta drivers are available.)
Last but not least, Nvidia has also proclaimed that the GTX 680 can power four monitors all by itself and has even shown off a single 680 doing so. To really rub it in, they gave the same demo with three of the four monitors running 3DSurround. Why anyone would put a fourth monitor on top of their already insane 3DSurround setup (or how for that matter) is beyond me, but it does make for this awesome image below. Fus Ro Dah indeed.
Oh, the price? Well, you can get the GTX 680 from a variety of Nvidia's partners starting today at $500+. Given that a 580 will run you around $400+ and a dual GPU 590 will run you around $700+, I'd say that's fair, if not downright reasonable.
So, should you go out and buy a GTX 680 today? The early adopter in me says yes, yes, oh hell yes, but the frugal young man in me stops and asks, "Do you need it?" Yes and no. If you've already got a great SLI setup that crushes anything and everything, there's really no need for you to upgrade, unless the thought of dynamic GPU clock boosting and adaptive VSync sounds just too good to be true. However, the fact that this card also runs cooler and quieter than the 580 and uses less power is a huge plus. I've currently got a GTX 590 (akin to two GTX 580s) and a GTX 560 Ti (dedicated to PhysX) in my rig, but after seeing for myself what the GTX 680 can do all by itself, and just how cool, quiet and dynamic it is? I think I might only have need of one GPU this year—the GTX 680.
Stay tuned for the full review and more.