Did you find yourself yawning at the idea of a new budget-priced GTX 750 Ti yesterday? If you're looking at the top end of the market, Nvidia's GeForce GTX Titan Black might suit. Their new premium card is designed to oust the Titan, and can be yours for the hefty asking price of £785.
The GeForce GTX 750 Ti is an Nvidia first, in many ways. It's built around the new Maxwell GPU architecture, and I reckon it’s also the first time Nvidia have released a new graphics design without launching a top-end iteration first. The GTX 750 Ti may still be rocking the same 700 series badge, but it's a new generation of graphics silicon.
The GTX 750 Ti is a reasonably priced graphics card - at £115 / $150 it's designed to sit in the volume end of the market and offer an upgrade to as wide an audience as possible. Thanks to its new design it actually casts the net far wider than previous cards at the same price.
There's a big showdown happening in the world of affordable graphics cards this week. AMD and Nvidia are releasing the latest editions in their £100 / $150 range, an important battleground, given that cards in that range easily outsell their flashy flagship $1000 tech. AMD are bringing some rebranded and boosted versions of their last-gen GPUs to compete with Nvidia's GTX 750 Ti and GTX 750, which will give us our first look at Nvidia's new Maxwell GPU architecture.
Nvidia is launching a couple of brand new graphics cards in the entry-level arena. Normally that wouldn't be a particularly exciting event, but this is going to be our first taste of Nvidia's new Maxwell GPU architecture. It'll be the first time Nvidia have launched a new graphics architecture without housing it in a top-end graphics card. You could argue that's because they simply don't need to, with the likes of the GTX 780 Ti delivering the goods against the hot and hungry Radeon 290X.
Asus are planning to expand their Republic of Gamers line-up with two new high-end Nvidia cards - the Poseidon GTX 780 and the GTX 780 Ti DirectCU II. The Poseidon will add a hybrid cooling solution to the GK110 GPU at the core of the standard GTX 780.
As the world and their virtual wives get all giddy about a couple of new AMD-based mini-PCs from Sony and Microsoft, Nvidia have set themselves up to compete with the new console generation for the Christmas holidays. In partnership with a bunch of system-building folk, Nvidia want to push small form factor gaming PCs, with serious graphics power, into the mainstream. It's called the Art of Gaming.
DinoPC's Mini Ultimate is one such system and, while the £1,500 sticker price is more than three new consoles combined, it's a mighty fine gaming rig for the money. This is a seriously high-end machine in a snug little chassis.
Nvidia set out to spoil the AMD party by dropping a bomb on the gathered press out in Montreal last week: the GeForce GTX 780 Ti. If the rumours are true and the incoming AMD Radeon R9-290X can beat a GTX Titan in a stand-up gaming fight, then Nvidia are going to need some sort of riposte. But what exactly?
AMD have repeatedly assured the public that the brand new Radeon R9-290X is going to be released this month, and there's not long left in October. I don't reckon the new GTX 780 Ti is going to be far behind it.
Nvidia is making big announcements in Montreal today. We've got G-Sync, which flips the V-sync idea on its head and synchronizes monitor refresh rates to GPU output; recording and Twitch streaming features coming to GeForce Experience; and finally, the hardware: the GeForce GTX 780 Ti.
The standard display refresh rate is 60Hz—that's 60 images per second—but fancy GPUs can render way more than 60 frames per second. We like more frames. More frames means more responsive input—and screw compromise!—but when the GPU swaps in a new frame partway through a refresh, the screen shows slices of two different frames at once, and the Horrible One emerges: screen tearing. The best we can do now is tame the beast with V-sync, but in Montreal today, Nvidia unsheathed a new weapon which it claims will put tearing and stuttering down for good.
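To put some rough numbers on that, here's a toy timing model (illustrative figures only, nothing to do with G-Sync's actual implementation) showing how a GPU churning out 90fps, with no V-sync to hold it back, ends up swapping buffers partway through nearly every 60Hz scan-out:

```python
# Toy timing model of why a fast GPU tears on a fixed-rate display.
# Hypothetical numbers: a 60Hz panel and a GPU rendering at 90fps.

REFRESH_HZ = 60
GPU_FPS = 90
DURATION_S = 1.0

refresh_interval = 1.0 / REFRESH_HZ   # ~16.7ms per scan-out
frame_interval = 1.0 / GPU_FPS        # ~11.1ms per rendered frame

# Times at which the GPU finishes a frame and swaps the buffer.
frame_times = [i * frame_interval for i in range(int(DURATION_S * GPU_FPS))]

torn_refreshes = 0
for r in range(int(DURATION_S * REFRESH_HZ)):
    start, end = r * refresh_interval, (r + 1) * refresh_interval
    # A buffer swap landing strictly mid-refresh splits the scan-out
    # across two different frames -- that's a tear.
    if any(start < t < end for t in frame_times):
        torn_refreshes += 1

print(f"{torn_refreshes} of {int(DURATION_S * REFRESH_HZ)} refreshes torn")
```

At these rates every single refresh catches a mid-scan buffer swap; V-sync fixes that by delaying each swap to the refresh boundary, at the cost of input latency and stutter when the GPU misses one.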
Update: Well, that didn't take long. Activision's support Twitter account has just confirmed that these specs are not official. Original story follows inside.
While it's not official, the likely PC requirements for Call of Duty: Ghosts have been posted on Nvidia's website. The minimum requirements are pretty friendly to those without giant rigs, but a slight step up from previous CoDs given the transition to new console hardware.
As it typically does for a major game launch, Nvidia has updated its GeForce card drivers to 314.22 for boosts in performance and stability. It claims recent titans BioShock Infinite and Tomb Raider both get a significant bump in frames-per-second, with the former increasing by 41 percent and the latter by an astonishing 71 percent.
Nvidia released a new beta version of their GeForce driver this week, once again squeezing more incremental improvements from a bunch of games, both new and old. But one prominent release was missing from the list of tweaks: Tomb Raider. Lara's latest outing may continue Square Enix's quality porting form, but, as Chris notes in his settings overview, GeForce cards attempting to use AMD's fancy new hair tech, TressFX, suffer a drastic performance hit.
When I first saw the Nvidia GeForce GTX Titan, a couple of weeks before launch, Nvidia's Tom Petersen explained that it was their "love story to gamers. It looks great, it sounds great and it has great performance." And while he is right on pretty much all fronts, the GTX Titan is likely to remain a matter of unrequited love for all but a tiny percentage of PC gamers. When a single graphics card costs as much as an entire performance gaming PC, most of your audience is immediately cut out.
So what is it? This is the fastest single-GPU graphics card on the planet and the very top-end of Nvidia's Kepler architecture. We knew when the GTX 680 launched that it wasn't home to the full-fat Kepler core - that was held over for the Tesla range of professional graphics cards, which need the double precision compute performance the top-end GK110 GPU affords.
Anyone remember Cray unveiling their new Titan supercomputer at the tail-end of last year? Y'know, that vast data-munching machine housing 18,688 of Nvidia's Tesla K20 graphics cards, each of which goes for around £2,500/$3,500. I remember looking down mournfully at the GTX 680 in my test rig and thinking "wouldn't it be nice to have just one of those graphics cards?"
Well, now you can.
Nvidia has re-engineered the GK110 GPU that sits at the heart of the professional Tesla cards and stuck it in a gaming-focused desktop board. Thus, the GTX Titan is born.
GeForce Experience is an application that recommends the optimal settings for any game in its database for your exact configuration of hardware. I’ve been chasing Nvidia for months trying to secure access and finally the closed beta has arrived.
It only works with Nvidia graphics drivers, and only then with cards from the Fermi-based 400 series onwards. But it does take your motherboard, CPU and memory settings into account too. It's an incredibly simple-to-use bit of kit and I think it could become an essential part of any GeForce gamer's software suite.
As is their wont, Nvidia have released a new set of beta graphics drivers, hot on the heels of the 310.33 release a couple of weeks back. This is being called an "essential upgrade for all GeForce GTX gamers". Compared with the current WHQL certified drivers, the 310.54 release boasts up to a 26% frame rate boost for anyone playing the just-released Call of Duty: Black Ops 2 or the still-awaited Assassin's Creed 3.
Ten years ago Nvidia released Dawn, a GeForce FX tech demo featuring a fairy gymnast as seen through the tunnel vision of someone who's been eating unidentified fungi. It was impressive, but in the past ten years extreme use of bloom and depth of field has become less novel. Today, Nvidia released "A New Dawn," an updated demo (announced last month) which shows off the power of Kepler-based GPUs.
Purveyor of graphics chips Nvidia has followed up on the launch of its GeForce GTX 680 with the second card in the 600 series, the GeForce GTX 670. On the face of it, the two cards are almost identical: both are based on the same Kepler-design GK104 chip, with 2GB of GDDR5 memory running on a 256-bit memory bus.
That GK104 is, in both cases, produced on the same 28nm process, and the only real difference is that the GTX 670 has 1344 CUDA cores activated compared to the GTX 680's 1536. Plus, of course, it's a bit cheaper. Since we decided the GTX 680 was good but overkill for most people's needs, is the 670 a better bet?
What's better than a brand new GeForce GTX 680 with which to upgrade your PC on a fine and sunny spring morning? Try two GeForce GTX 680s lashed together on one card. That's what Nvidia has built: a twin-chip monstrosity called the GeForce GTX 690, announced yesterday.
The GTX 690 will be on sale by Thursday, apparently, although the lack of online reviews and the apparent paucity of samples suggests that if you do want to buy one, there may be a bit of a queue.
Or will there?
If you have a GeForce card you might want to grab the latest batch of beta drivers from the Nvidia site. Nvidia say they'll deliver a performance boost in Skyrim of up to 20%, which is nice, but the Nvidia FXAA functionality is perhaps a more interesting addition. That'll allow us to force a faster form of anti-aliasing across hundreds of games from the Nvidia control panel. The new shader-based anti-aliasing function should help to smooth out edges at speeds "60% faster than 4xMSAA."