There's a big showdown happening in the world of affordable graphics cards this week. AMD and Nvidia are releasing the latest entries in their £100 / $150 ranges, an important battleground given that cards at that price easily outsell their flashy flagship $1000 tech. AMD are bringing some rebranded and boosted versions of their last-gen GPUs to compete with Nvidia's GTX 750 Ti and GTX 750, which will give us our first look at Nvidia's new Maxwell GPU architecture.
AMD’s latest processor design is probably the most interesting new chip from the Texan silicon giant since they unleashed their Bulldozer FX chips on the world. And, at first glance at the performance metrics, it would be easy to dismiss the new APU as a bit of a failure.
But there is more to the A10-7850K - the APU formerly known as Kaveri - than meets the eye, though it might be a while before its promise is completely realised. Let’s talk about the actual processor performance first though. It’s pretty unspectacular.
It’s finally happened. AMD have launched their new graphics API, Mantle, to the public, and you can pick it up right now in the new Catalyst 14.1 beta driver update. Obviously there are caveats. The first is that it’s a beta driver, so don’t expect it to be rock-solid in every situation - I’ve already encountered some glitches in a CrossFireX rig that I didn’t see in a single-GPU setup.
The biggest caveat, though, is that only AMD GPU owners need apply, and then only those with Graphics Core Next architecture in their cards. That means the HD 7700 series and above, and the R7 and R9 200 series, will be able to support the new API.
If PC gaming is a romance, then DirectX represents the high-school era. It's the thing that's passing notes between your games and your graphics cards, possibly while getting a bit bashful and giggling. Cute as this image is, it's hardly the most efficient way to foster a relationship. Step in AMD's new low-level API, Mantle, which has been designed to allow games to directly access GPUs. That sounds like a good thing, although it's going to be awkward when Battlefield 4 realises that your graphics card has been seeing other games behind its back.
At AMD's CES conference, Battlefield 4 was demoed on-stage running the Mantle API. It was presented alongside the claim that it could run "up to 45% faster than the original version on this same hardware." Meaning, up to 45% faster than the DirectX equivalent.
At this year's Consumer Electronics Show (CES) AMD’s Lisa Su, Senior VP and General Manager, officially introduced their latest, groundbreaking APU, code-named Kaveri. You can pre-order it straight away or wait until it’s officially available to buy on the 14th January.
Sadly my review samples won't be around until after I’m back from the Las Vegas show, which is why I wouldn't recommend a pre-order, but AMD are convinced this piece of tech represents a new dawn for them, and they might just be right.
At Nvidia’s big press conference at this year's CES I was given a reason to go green in the ongoing battle between Nvidia and AMD - G-Sync. It enables the GPU and monitor to work together to ensure frames are delivered to the display consistently and smoothly. Your monitor only refreshes when the GPU has a finished frame ready, eliminating screen-tearing and reducing stutter.
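A toy timing model makes the difference clear. This is a hedged sketch under simple assumptions - the frame times, the fixed 60Hz refresh and the "wait for the next tick" rule are illustrative, not how Nvidia's hardware actually schedules scanout:

```python
# Illustrative sketch (not Nvidia's implementation): compare when frames
# reach the screen under fixed 60 Hz vsync versus a G-Sync-style scheme
# where the monitor refreshes as soon as the GPU finishes a frame.
import math

REFRESH = 1000 / 60  # fixed refresh interval in ms (~16.7 ms)

def vsync_display_times(render_times_ms):
    """Each frame waits for the next fixed refresh tick after it is done."""
    done, out = 0.0, []
    for r in render_times_ms:
        done += r
        out.append(math.ceil(done / REFRESH) * REFRESH)
    return out

def gsync_display_times(render_times_ms):
    """The monitor updates the moment the GPU delivers the frame."""
    done, out = 0.0, []
    for r in render_times_ms:
        done += r
        out.append(done)
    return out

# A GPU alternating between fast and slow frames (18 ms / 24 ms):
frames = [18, 24, 18, 24]
print(vsync_display_times(frames))  # frames snap unevenly to 16.7 ms ticks
print(gsync_display_times(frames))  # frames shown exactly when ready
```

With fixed vsync the 18/24ms frames land on 16.7ms boundaries with uneven gaps (some refreshes are skipped entirely), while the G-Sync-style timeline simply mirrors the GPU's own cadence.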
AMD have excitedly announced they’re going to be shipping the new Kaveri APU just after the Consumer Electronics Show (CES) in Las Vegas in early January. But what about their straight desktop FX line of processors?
According to the roadmap AMD released this month, 2014 is going to see its ‘Performance’ lineup of CPUs sticking with the 32nm Piledriver revision of its wildly unsuccessful Bulldozer architecture. Only the new Kaveri APUs will get the updated Steamroller design, starting with the AMD A10-7850K, and that’s a massive shame. One of the big problems with the original Bulldozer design was the lack of single-threaded performance from the new chips - weaker, in fact, than the processors they were meant to be replacing.
You could fry an egg on AMD's Radeon R9 290X chip and it gobbles up power like a volt-starved Pikachu, but its speedy performance has forced Nvidia's hand - find out why in our 290X review. AnandTech report that they're dropping the price of the GTX 780 by $150 to $500, and the price of the 770 by $70 to $330. The former just undercuts AMD's new flagship GPU, and the latter puts the 770 in a competitive range with the AMD 280X.
Look. It’s new. Like actually new, not just old but with a new sticker. Not necessarily new technology, but y’know, a genuine new configuration. Yup, the AMD Radeon R9 290X is the first actually new graphics card they have released in an absolute age. Sure, we’ve seen the R9 280X (actually a HD 7970 GHz), the R9 270X (actually a HD 7870) and the R7 260X (actually a HD 7790), but this is a card with a bona fide new GPU.
The Radeon R9 290X is AMD’s latest flagship graphics card aimed squarely at taking on the top-end of Nvidia’s rivalling graphics lineup. And the scary thing? It manages it.
Even as Valve is trying to ease access to PC gaming in the living room, its plans for the Steam Machine won't be held up by an adherence to a single manufacturer of graphics hardware. The proposed SteamOS-based systems will support a variety of graphics builds with GPUs from AMD, Intel, and Nvidia when they launch next year, according to a report at Maximum PC.
News and rumors are still buzzing around Valve’s battle for your living room. Developers from all walks of life have shared their thoughts on Valve’s flurry of announcements, and now Oculus Rift Chief Technology Officer and id Software co-founder John Carmack has entered the fray, discussing how SteamOS devices might benefit from AMD's new graphics technology.
AMD are in a strong position right now, thanks to the presence of their GPUs in both of the 'next-gen' consoles. Yesterday, they revealed the next step in 'Operation: Make All The Graphics', which I assume is their codename for the global graphical domination they're so clearly chasing. It's called Mantle, and it's a new low-level API that gives developers direct access to GPUs using AMD's Graphics Core Next architecture.
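The CPU-side win is easy to ballpark. This is a hedged back-of-envelope sketch - the per-call overhead figures are my illustrative assumptions, not AMD's published numbers:

```python
# Back-of-envelope sketch of why a low-level API helps: a thinner API
# shrinks the fixed CPU cost paid on every draw call, so draw-call-heavy
# scenes stop being CPU-bound. Overhead figures below are assumptions.
def cpu_cost_ms(draw_calls, overhead_us_per_call):
    """CPU time a frame spends just submitting draw calls to the driver."""
    return draw_calls * overhead_us_per_call / 1000

scene = 10_000  # draw calls in a busy frame

thick_api = cpu_cost_ms(scene, 20)  # high fixed cost per call
thin_api  = cpu_cost_ms(scene, 2)   # direct, Mantle-style submission

print(thick_api)  # 200.0 ms -> hopelessly CPU-bound
print(thin_api)   # 20.0 ms  -> the GPU becomes the bottleneck again
```

The exact figures don't matter; the point is that per-call overhead multiplies across thousands of calls per frame, which is why cutting out the driver middleman pays off most in CPU-limited scenes.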
Good news for anyone who's experienced the micro-stutter you can get from multi-GPU set-ups: AMD claim to have killed it completely in CrossFire scenarios with their latest Catalyst 13.8 beta drivers.
Micro-stuttering - if you've never encountered this foul effect - is uneven frame delivery that occurs despite high average frame rates. Each frame in a sequence is rendered alternately by the connected GPUs, and the juddering occurs because the time it takes to deliver each frame can vary. It occurs more frequently in CrossFire than SLI and, curiously, the effect is negated if you add a third GPU to the mix. Have the 13.8 drivers successfully banished micro-stuttering? If you're thinking "the answer had better involve some graphs" then prepare to not be disappointed.
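To see why average frame rates hide the problem, here's a minimal sketch - the frame-time sequences are invented for illustration:

```python
# Why average fps hides micro-stutter: two alternate-frame-rendering GPUs
# can deliver frames with wildly uneven gaps yet post the same average
# frame rate as a smooth single-GPU run. Frame times are made up.
from statistics import pstdev

def pacing_report(frame_times_ms):
    """Return (average fps, spread of frame-to-frame delivery times)."""
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    return round(avg_fps, 1), round(pstdev(frame_times_ms), 1)

smooth  = [16, 16, 16, 16, 16, 16]  # well-paced single GPU
stutter = [4, 28, 4, 28, 4, 28]     # AFR pair, uneven delivery

print(pacing_report(smooth))   # (62.5, 0.0)  -> no judder
print(pacing_report(stutter))  # (62.5, 12.0) -> same fps, visible stutter
```

Both sequences average 62.5fps, but the second alternates 4ms and 28ms gaps - exactly the frame-time variance that frame-pacing drivers aim to smooth out, and why benchmarks increasingly plot frame times rather than raw fps.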
It’s been a good year for AMD. After becoming the sole GPU supplier for the next-gen consoles, the company is touting its hardware as the optimized platform for Battlefield 4 - a claim IGN reported as encompassing every Frostbite 3 game. EA was quick to soften the statement.
We’ve known AMD are the go-to guys for next-gen console silicon for a good while now. The tech press has been speculating since the consoles’ specs were first announced as to how the PC could benefit from Sony and Microsoft opting for x86, and specifically AMD, architecture. After all, the Xbox 360 was running AMD graphics hardware and, from my perspective, the benefits to the PC from that relationship are pretty intangible at best. There are signs that things may be different this time around.
"The consoles are really the target for a lot of the game developers. If it’s a Radeon heart powering that console, like the PS4 or Xbox 360, that means these games devs are going to be designing their games, designing their features and really optimising for that Radeon heart," said AMD's Devon Nekechuk around the launch of the Radeon HD 7990. But why, specifically, will that be the case? I asked AMD's worldwide manager of ISV gaming engineering, Nicolas Thibieroz, for the nitty gritty.
Nvidia did some well-deserved strutting at E3 yesterday, showing off the superiority of the PC as a gaming platform. With charts and graphs and maybe just a teeny tiny hint of bitterness that AMD processors are powering both the PS4 and the Xbox One, Nvidia’s Tony Tamasi told the room that “the PC is the most powerful gaming platform out there.”
Intel are heralding their new Haswell processor architecture as a game-changer for gaming ultrabooks and small form factor gaming machines. Their competitors at AMD, predictably, have serious doubts about Intel’s ability to compete when it comes to PC gaming.
I spoke with Intel’s Richard Huddy a few months back about the graphical technology behind their push for Haswell in the gaming market, and he was very excited about the progress they were making for PC gamers. But I also put some questions to AMD’s Nicolas Thibieroz, Senior Manager of its Gaming Engineering division. I’m sure it’s no coincidence that I’ve only heard back just as Haswell launches. Here's what he said about Intel's latest foray into the world of gaming hardware and what the next generation of consoles, which run on AMD architecture, will mean for PC gamers.
Before Nvidia launched the GTX Titan wündercard, AMD held a bullish press briefing to bang the Radeon drum, claiming “performance leadership at every pricepoint”. Not only that, but they were also promising additional silicon in the first half of this year, with a new range of products coming by the end of 2013.
We’re now hearing rumours from various sources about what exactly that new silicon is going to be. It’s apparently called the Bonaire XT, and it will be our first taste of AMD’s Graphics Core Next (GCN) 2.0 architecture.
Lara's locks are proving a problem for Nvidia customers, whose graphics cards are struggling to handle the AMD-developed hair-rendering technology. Given that Nvidia owns two thirds of the GPU market, that's an awful lot of Tomb Raiders out there suffering from shoddy performance - if they can even get into their game at all.
I’m one of these unlucky folk, the once-proud owner of a GTX 670, and I can’t even get into the options screen, let alone play the game. Of course, loads of games have had dreadful launches, marred by server problems and driver/graphics card issues; even the likes of Half-Life 2 and Diablo III had trouble getting out of the gate. But the current disadvantage experienced by Nvidia customers could go beyond Lara's bounteous bangs. With AMD components sitting in next-gen consoles, this may not be the only time Nvidia's driver team find themselves left behind at a major game launch.
After teasing us all with its TressFX tagline - Render. Rinse. Repeat - AMD have today revealed their (apparently painstaking) collaboration with Crystal Dynamics: the world’s first real-time hair rendering technology in a playable game. Tomb Raider is the first title to get the treatment, with its bedraggled heroine's bonce featuring the most advanced follicle tech ever.
Realistic hair is, according to AMD, one of the most complex and challenging materials to accurately produce in real-time. With so many different strands and physics computations needed to model their interaction with each other, it’s no wonder that we’ve been stuck with chunky polygon make-weight barnets in gaming. But no longer.
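As a rough illustration of the kind of per-strand work involved - this is a generic Verlet-integration-and-constraints sketch, an assumption about the general technique class rather than Crystal Dynamics' or AMD's actual TressFX solver:

```python
# Minimal sketch of per-strand hair physics (illustrative, not TressFX):
# Verlet-integrate each point of a strand under gravity, pin the root to
# the scalp, then relax distance constraints between neighbouring points.
GRAVITY = -9.8  # m/s^2, pulling strand points downward
SEGMENT = 1.0   # rest length between adjacent strand points

def step(points, prev, dt=0.016):
    """Advance one strand by one ~16 ms timestep; returns (new, old)."""
    new = []
    for (x, y), (px, py) in zip(points, prev):
        # Verlet integration: position + velocity estimate + gravity
        new.append((2 * x - px, 2 * y - py + GRAVITY * dt * dt))
    new[0] = points[0]  # root is pinned to the scalp
    for i in range(1, len(new)):  # one relaxation pass per segment
        (ax, ay), (bx, by) = new[i - 1], new[i]
        dx, dy = bx - ax, by - ay
        dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
        scale = (dist - SEGMENT) / dist
        new[i] = (bx - dx * scale, by - dy * scale)
    return new, points

# A three-point strand hanging straight down, starting at rest:
strand = [(0.0, 0.0), (0.0, -1.0), (0.0, -2.0)]
moved, _ = step(strand, strand)
```

Multiply that by tens of thousands of strands, add strand-to-strand and strand-to-head collisions, and run it every frame - it's easy to see why real-time hair had to wait for GPU compute.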