Id Software's John Carmack picks a side in the Nvidia/AMD GPU war

We sat down with the legendary John Carmack and picked his brain on a few of our favorite topics. Along the way, we asked him which graphics card, AMD or Nvidia, he would buy right that second and why. His answer might surprise you.

PCG: If you were to buy a graphics card right now, what would you get?

John Carmack: Let me caveat this by saying that this is not necessarily a benchmarked result. We've had closer relationships with Nvidia over the years, and my systems have had Nvidia cards in them for generations. We have more personal ties with Nvidia. As I understand it, ATI/AMD cards are winning a lot of the benchmarks right now when you run straight synthetic benchmarks, but our games do get more hands-on polish time on the Nvidia side of things.

Nvidia does have a stronger dev-relations team. I can always drop an email for an obscure question. So it's more of a socio-cultural decision there rather than a raw “Which hardware is better.” Although that does feed back into it, when you've got a dev-relations team that is deeply intertwined with the development studio. That tends to make your hardware, in some cases, come out better than what it truly is, because it's got more of the software side behind it.

You almost can't make a bad decision with graphics cards nowadays. Any of the add-in cards from AMD or Nvidia are all insanely powerful. The only thing that's still lacking, and it's changing, is the integrated graphics parts. Rage runs on an Intel integrated graphics part, but it isn't something you'd want to play it on right now. But even that's going to be changing with the upcoming generations of things.

I mean, the latest integrated graphics parts probably are more powerful in many ways than the consoles. If they gave us the same low-level access, coupled with the much more powerful CPUs, we could do good stuff there. Of course, that's the worrisome large-scale industry dynamic, where, as integrated parts become “good enough,” it's got to make life really scary for Nvidia. If it went that way to its logical conclusion, where Intel parts were good enough and Nvidia was pinched enough not to be able to do the continuous R&D, that would be an unfortunate thing for the industry.

To some degree, it seems almost inevitable that the multi-hundred-dollar add-in cards will end up doing something that's already being done pretty well by an on-die chip. Not right now, maybe not next year, but it's hard to imagine a world five years from now where you don't have competent graphics on every CPU die.