Because lots of people paid serious money to buy up all the GTX Titans Nvidia could make, the company has decided to push things further. The twin-GPU GTX Titan Z is a $3,000 graphics card announced at the GPU Technology Conference (GTC) in San Jose. According to Nvidia CEO Jen-Hsun Huang it exists simply because “the market just wanted so much more performance,” but is it really worth all that money?
The card is the latest in Nvidia's line of dual-GPU graphics cards and contains a pair of the GK110 GPUs that have previously featured in Nvidia's GTX Titan Black cards. That gives it a total of 5,760 CUDA cores and an aggregated 12GB of GDDR5 video memory.
“If you're in desperate need of a supercomputer,” said Huang, introducing it, “and you need one close by and handy - one that will sit next to your desk - we have just the card for you: Titan Z.”
While it's pretty impressive that this much performance and technology is being squeezed into a single graphics card, it also seems far too expensive for what it is. Essentially it's a pair of Titan Black GPUs on one slice of PCB. They're the quickest GPUs Nvidia have available right now, and a pair of those cards will set you back some $2,000. The real kicker is that an SLI setup with twin GTX Titan Blacks is also likely to perform a fair bit quicker than a single Titan Z card.
Why? Because a dual-GPU card generally has its chips clocked slower than an individual card's GPU in order to maintain suitable thermals and stay within a certain power envelope. Nvidia haven't released the full specs of the Titan Z, but if it follows any of the dual-GPU designs that have gone before it then those GK110 chips are likely to be sitting around 800-850MHz, compared to the 889MHz of the separate GTX Titan Black GPUs.
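A back-of-envelope calculation shows why those clocks matter. This sketch assumes 2,880 CUDA cores per GK110 (half of the Titan Z's 5,760 total), the standard two FP32 operations per core per cycle (one fused multiply-add), and the speculative 850MHz figure above - the real Titan Z clocks are unconfirmed.

```python
# Back-of-envelope estimate of how lower clocks eat into peak throughput.
# Assumed figures: 2,880 cores per GK110, 2 FP32 FLOPs per core per cycle,
# and a guessed 850MHz clock for the Titan Z (Nvidia haven't published specs).

def fp32_tflops(cores: int, clock_mhz: float) -> float:
    """Peak single-precision throughput: cores x 2 FLOPs/cycle x clock."""
    return cores * 2 * clock_mhz * 1e6 / 1e12

titan_black = fp32_tflops(2880, 889)  # per GPU, at the Titan Black's clock
titan_z_est = fp32_tflops(2880, 850)  # per GPU, at the guessed Titan Z clock

print(f"Titan Black GPU:    {titan_black:.2f} TFLOPS")
print(f"Titan Z GPU (est.): {titan_z_est:.2f} TFLOPS")
print(f"Deficit:            {(1 - titan_z_est / titan_black) * 100:.1f}%")
```

On those assumptions each Titan Z GPU would give up roughly 4-5% of peak throughput versus a standalone Titan Black - before any real-world throttling is taken into account.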
So that's a single card which costs $1,000 more than a pair of Titan Blacks and likely won't perform as quickly. If this thing used a pair of the new top-end 28nm Maxwell GPUs I could understand the price disparity, but the GK110 is getting on a bit now. We'll know more when Nvidia release the card's full specs and I have the chance to run some benchmarks.