You could fry an egg on AMD's Radeon R9 290X chip and it gobbles up power like a volt-starved Pikachu, but its speedy performance has forced Nvidia's hand - find out why in our 290X review. AnandTech report that Nvidia are dropping the price of the GTX 780 by $150 to $500, and the price of the 770 by $70 to $330. The former just undercuts AMD's new flagship GPU, and the latter puts the 770 in a competitive range with AMD's 280X.
Look. It’s new. Like actually new, not just old but with a new sticker. Not necessarily new technology, but y’know, a genuine new configuration. Yup, the AMD Radeon R9 290X is the first actually new graphics card they have released in an absolute age. Sure, we’ve seen the R9 280X (actually a HD 7970 GHz), the R9 270X (actually a HD 7870) and the R7 260X (actually a HD 7790), but this is a card with a bona fide new GPU.
The Radeon R9 290X is AMD's latest flagship graphics card, aimed squarely at taking on the top end of Nvidia's rival lineup. And the scary thing? It manages it.
Even as Valve is trying to ease access to PC gaming in the living room, its plans for the Steam Machine won't be held up by an adherence to a single manufacturer of graphics hardware. The proposed SteamOS-based systems will support a variety of graphics builds with GPUs from AMD, Intel, and Nvidia when they launch next year, according to a report at Maximum PC.
News and rumors are still buzzing around Valve’s battle for your living room. Developers from all walks of life have shared their thoughts on Valve’s flurry of announcements, and now Oculus Rift Chief Technology Officer and id Software co-founder John Carmack has entered the fray, discussing how SteamOS devices might benefit from AMD's new graphics technology.
AMD are in a strong position right now, thanks to the presence of their GPUs in both of the 'next-gen' consoles. Yesterday, they revealed the next step in 'Operation: Make All The Graphics', which I assume is their codename for the global graphical domination they're so clearly chasing. It's called Mantle, and it's a new low-level API that gives developers direct access to GPUs using AMD's Graphics Core Next architecture.
Good news for anyone who's experienced the micro-stutter you can get from multi-GPU set-ups: AMD claim to have killed it completely in CrossFire scenarios with their latest Catalyst 13.8 beta drivers.
Micro-stuttering - if you've never encountered this foul effect - is frame rendering delay that occurs despite high average frame rates. Each frame in a sequence is rendered alternately by the connected GPUs, and the juddering occurs because there can be variance in how long it takes for a given frame to be delivered. It occurs more frequently in CrossFire than SLI and, curiously, the effect is negated if you add a third GPU to the mix. Have the 13.8 drivers successfully banished micro-stuttering? If you're thinking "the answer had better involve some graphs" then prepare to not be disappointed.
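To see why average frame rates can hide the judder, here's a quick back-of-envelope sketch with made-up frame times: both runs below average the same fps, but the second alternates fast and slow frames the way badly-paced alternate-frame rendering does. The numbers are purely illustrative, not from any real benchmark.

```python
# Hypothetical frame times in milliseconds -- both sequences work out to
# roughly 60 fps on average, but the second alternates fast and slow
# frames, which is the classic AFR micro-stutter pattern.
smooth = [16.7] * 12
stutter = [8.0, 25.4] * 6  # same total time, wildly uneven delivery

def avg_fps(frame_times_ms):
    """Average frames per second over the whole sequence."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame_to_frame_delta(frame_times_ms):
    """Largest jump between consecutive frame times -- the judder you feel."""
    return max(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))

print(avg_fps(smooth), worst_frame_to_frame_delta(smooth))
print(avg_fps(stutter), worst_frame_to_frame_delta(stutter))
```

The averages come out near-identical (about 59.9 fps each), while the frame-to-frame delta tells the real story: 0 ms for the smooth run against over 17 ms for the stuttering one - which is why frame-time graphs, not fps counters, are the right tool for judging multi-GPU pacing.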
It’s been a good year for AMD. After becoming the sole video card manufacturer for next-gen consoles, the company is touting its hardware as the optimized platform for Battlefield 4, though IGN reported that statement as encompassing every Frostbite 3 game. EA was quick to soften the statement.
We’ve known AMD are the go-to guys for next-gen console silicon for a good while now. The tech press has been speculating since the consoles’ specs were first announced as to how the PC could benefit from Sony and Microsoft opting for the x86, and specifically AMD, architecture. After all, the Xbox 360 was running AMD graphics hardware and, from my perspective, the benefits to the PC from that relationship are pretty intangible at best. There are signs that things may be different this time around.
"The consoles are really the target for a lot of the game developers, if it’s a Radeon heart powering that console, like the PS4 or Xbox 360, that means these games devs are going to be designing their games, designing their features and really optimising for that Radeon heart" said AMD's Devon Nekechuk around the launch of the Radeon HD 7990. But why, specifically, will that be the case? I asked AMD's worldwide manager of ISV gaming engineering, Nicolas Thibieroz for the nitty gritty.
Nvidia did some well-deserved strutting at E3 yesterday, showing off the superiority of the PC as a gaming platform. With charts and graphs and maybe just a teeny tiny hint of bitterness that AMD processors are powering both the PS4 and the Xbox One, Nvidia’s Tony Tamasi told the room that “the PC is the most powerful gaming platform out there.”
Intel are heralding their new Haswell processor architecture as a game-changer for gaming ultrabooks and small form factor gaming machines. Their rivals at AMD, predictably, have serious doubts about Intel’s ability to compete when it comes to PC gaming.
I spoke with Intel’s Richard Huddy a few months back about the graphical technology behind their push for Haswell in the gaming market, and he was very excited about the progress they were making for PC gamers. I also put some questions to AMD’s Nicolas Thibieroz, Senior Manager of its Gaming Engineering division; I’m sure it’s no coincidence that I’ve only heard back now that Haswell has launched. Here's what he said about Intel's latest foray into the world of gaming hardware, and what the next generation of consoles, which run on AMD architecture, will mean for PC gamers.
Before Nvidia launched the GTX Titan wündercard, AMD held a bullish press briefing to bang the Radeon drum, claiming “performance leadership at every price point”. Not only that, but they were also promising additional silicon in the first half of this year, with a new range of products arriving by the end of 2013.
We’re now hearing rumours from various sources about what exactly that new silicon is going to be: it’s apparently called Bonaire XT and will be our first taste of AMD’s Graphics Core Next (GCN) 2.0 architecture.
Lara's locks are proving a problem for Nvidia customers, whose graphics cards are struggling to handle the AMD-developed hair-rendering technology. Given that Nvidia owns two thirds of the GPU market, that's an awful lot of Tomb Raiders out there suffering from shoddy performance - if they can even get into their game at all.
I’m one of these unlucky folk, the once-proud owner of a GTX 670, and I can’t even get into the options screen, let alone play the game. Of course, loads of games have had dreadful launches, marred by server problems and driver/graphics card issues; even the likes of Half-Life 2 and Diablo III had trouble getting out of the gate. But the current disadvantage experienced by Nvidia customers could go beyond Lara's bounteous bangs. With AMD components sitting in next-gen consoles, this may not be the only time Nvidia's driver team find themselves left behind at a major game launch.
After teasing us all with its TressFX tagline - Render. Rinse. Repeat - AMD have today revealed their (apparently painstaking) collaboration with Crystal Dynamics: the world’s first real-time hair rendering technology in a playable game. Tomb Raider is the first title to get the treatment, with its bedraggled heroine's bonce featuring the most advanced follicle tech ever.
Realistic hair is, according to AMD, one of the most complex and challenging materials to accurately produce in real-time. With so many different strands and physics computations needed to model their interaction with each other, it’s no wonder that we’ve been stuck with chunky polygon make-weight barnets in gaming. But no longer.
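TressFX itself is a GPU compute implementation, but the basic idea behind strand simulation - integrate each hair vertex forward in time, then repeatedly relax distance constraints so the strand keeps its length - can be sketched in a few lines of CPU-side code. Everything below (the function, the constants, the 2D simplification) is my own illustrative sketch of that general technique, not AMD's or Crystal Dynamics' code.

```python
# Illustrative sketch of one hair strand simulated with Verlet integration
# plus distance-constraint relaxation. A real system runs thousands of
# strands in 3D on the GPU, with collision and wind on top of this.

GRAVITY = -9.8      # m/s^2, pulling in -y
DT = 1.0 / 60.0     # one frame at 60 fps

def step_strand(pos, prev, rest_len, iterations=10):
    """Advance one strand by a single frame.
    pos/prev: lists of [x, y] vertex positions, current and previous frame.
    The root vertex (index 0) is pinned to the scalp and never moves."""
    n = len(pos)
    # 1. Verlet integration: velocity is implicit in (pos - prev).
    for i in range(1, n):  # skip the pinned root
        x, y = pos[i]
        px, py = prev[i]
        prev[i] = [x, y]
        pos[i] = [x + (x - px), y + (y - py) + GRAVITY * DT * DT]
    # 2. Constraint relaxation: pull neighbouring vertices back towards
    #    their rest separation so the strand doesn't stretch.
    for _ in range(iterations):
        for i in range(n - 1):
            ax, ay = pos[i]
            bx, by = pos[i + 1]
            dx, dy = bx - ax, by - ay
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            corr = (dist - rest_len) / dist
            if i == 0:  # root pinned: move only the free end
                pos[1] = [bx - dx * corr, by - dy * corr]
            else:       # split the correction between both vertices
                pos[i] = [ax + dx * corr * 0.5, ay + dy * corr * 0.5]
                pos[i + 1] = [bx - dx * corr * 0.5, by - dy * corr * 0.5]
    return pos
```

The interaction cost AMD mention falls out of step 2: every extra strand multiplies the constraint work, and strand-versus-strand collision (omitted here) makes it combinatorial - which is why this stayed out of real-time games for so long.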
The new version of Futuremark’s 3DMark has just been released, offering us all new ways to benchmark our graphics cards and PCs using Matrix-esque squid robots. And who wouldn’t want to do that? I’ve grabbed the current top single-GPU graphics cards from both AMD and Nvidia and put them head-to-head in a battle royale to see which one Futuremark's squid-bot believes is the quickest graphics card out there.
AMD have shown reluctance to release their own-brand HD 7000 range dual-GPU card, leaving affiliate manufacturers like Asus, Club3D and HIS to cobble together their own polygon-crunching beasts. As a result, we’ve seen a fair number of super-powered graphics cards over the last few months, including the freakishly potent Club3D Radeon HD 7990 - but with the unveiling of the Asus ARES II, things may just be getting a little silly.
Yes, that’s right: AMD’s HD 8000 series graphics cards are on their way as we speak - the embargo has lifted and I’m now allowed to tell you that the Graphics Core Next (GCN) 2.0 generation of GPUs is imminent.
But unfortunately they’re coming to a laptop probably not very near you first.
There have been rumours floating around for the last couple of days that Intel is going to end the traditional socketed CPU once the Haswell chip is out of the door. Based upon a supposedly leaked processor roadmap, Japanese site PC Watch is claiming that Intel will be calling time on the CPU upgrade market.
What they are saying is that the Broadwell CPU, the next-generation chip to follow Haswell, will be sold soldered onto the motherboard, doing away with the LGA socket altogether. As the Broadwell lineup will represent the die-shrink from Haswell's 22nm down to 14nm, there may be an architectural need for these CPUs to be permanently attached to the motherboard.
Now that AMD has finally released the rest of its new Piledriver line-up into the wild, I've been able to spend a little quality time with the six-core FX-6300 - a CPU that I think hits a sweet spot in terms of price and performance.
It’s a decent little chip at stock speeds, and in raw CPU computational terms its six cores comprehensively out-play the i3-3225, Intel's similarly-priced dual-core Ivy Bridge chip, thanks to the extra multi-threading performance on offer. Intel’s dominance in the gaming sphere is evident, however: the FX-6300 can't keep pace with the dual-core Intel chip in my Batman: Arkham City or Shogun 2 CPU tests.
Until you get busy with the overclocking that is. Then it's a very, very different story.
There’s an unconfirmed rumour that AMD’s next best-ever APU, code-named Kaveri, has been delayed again. Previously, it was touted for a release early next year, which then reportedly slipped to later in 2013 - now it's claimed that the launch of the next-gen chip is being put back to 2014.
Notoriously reliable (wink) SemiAccurate is reporting insider claims that AMD is rejigging the Kaveri chips so that they are more competitive with the upcoming Haswell chips from Intel. Haswell is the architectural successor to the phenomenally successful Ivy Bridge CPUs, with the big change coming in the graphics performance of the next-gen chip.
I promised earlier in the week to give you a rundown of exactly what driver improvements AMD is touting in its ‘Never Settle’ Catalyst 12.11 release. Not to be outdone on the driver front, Nvidia has also released a new beta driver for its graphics cards this week. So, in the battle of the latest graphics card optimisations, who wins?