If Intel's latest GPU drivers are delivering a 750% fps boost in Halo, imagine how bad it was before

Halo 2: Anniversary (Image credit: 343 Industries)

When Intel returned to the discrete graphics card market last year, it promised that its Arc GPU drivers would improve over time. True to its word, that's exactly what's happened, with regular updates arriving that massively improve frame rates in older games. But I don't think anything will top one of the claims in the latest set of beta drivers.

Released yesterday, the new version of the Alchemist drivers is said to provide up to 54% more performance in Returnal at 1080p with 'Epic' ray tracing enabled. That's a healthy boost, and it's not the only game that's been given a wave of the coding magic wand. Guild Wars 2 players might see 53% more fps at 1080p with Ultra settings, and if you're a fan of Yakuza 0, then an extra 154% performance (again, at 1080p with Ultra graphics) has to be welcome.

But the biggest speed jump goes to Halo: The Master Chief Collection. As with all the game performance improvements listed in the driver notes, the figure is for 1080p at maximum quality settings, and Intel suggests you could see up to 750% more frames per second. No, that's not a typo, at least not on my part.

Just read that figure again: a 750% uplift on average. The first thing that came to my mind was just how bloody awful the performance of earlier drivers must have been for there to be that much headroom. Unfortunately, we only have one Arc card in the office, and getting it into a suitable test machine for a thorough retest isn't something we can do overnight (many of us work remotely), so I couldn't check things out myself.
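For a sense of scale, a 750% uplift means the new frame rate is 8.5 times the old one, assuming Intel is using the usual (new - old) / old calculation. Here's a quick back-of-the-envelope sketch in Python; the fps figure in it is purely illustrative and not one of Intel's numbers:

```python
# Rough sketch: what a percentage uplift implies about the 'before' frame rate.
# Assumes the standard definition: uplift % = (new_fps - old_fps) / old_fps * 100.
# The example fps value below is illustrative only, not taken from Intel's driver notes.

def implied_old_fps(new_fps: float, uplift_percent: float) -> float:
    """Work backwards from the improved frame rate and the claimed uplift."""
    return new_fps / (1 + uplift_percent / 100)

# If the new drivers hit, say, 170 fps after a 750% uplift...
print(implied_old_fps(170, 750))  # ...the old drivers were managing only 20 fps.
```

In other words, whatever frame rate the new drivers produce, the old ones were delivering barely an eighth of it.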

However, a quick browse through YouTube showed me everything I needed to know, especially the testing done by Abe's Mission Control. They ran Halo at 1440p with the Enhanced graphics settings, so not exactly the same as what Intel has in its notes. Even so, the frame rate was utterly abysmal.

Not only did the game run at 15 fps on average, but GPU utilisation hovered around 13%. Before you think the testing was CPU limited, it definitely wasn't: on the Performance graphics settings, the average frame rate jumped to 50 fps and GPU utilisation rose to around 40%. If the CPU had been the bottleneck, lightening the GPU's workload wouldn't have lifted the frame rate at all.

If you want a highly technical breakdown of exactly why Intel's Arc GPUs have such wildly varying performance, then may I suggest you head over to Chips and Cheese. Their conclusion is that Alchemist is great when there's a high demand on the chip's cache and compute capability, but struggles when that demand isn't there.

And that's precisely what's going on in older DirectX 11 games, as well as modern games that don't go overboard with their rendering techniques. Intel's driver team will be able to mitigate some of these issues, as we've seen across all of the software updates since Arc launched, but the problem will only really be fixed by a new architecture.

For Arc graphics cards, that will be Battlemage, which is generally expected to appear sometime in 2024. It feels a little odd rooting for Intel as the underdog in the GPU market, but I'm looking forward to seeing what improvements come to light. In the meantime, if you have an Arc GPU and want to check out these performance uplifts for yourself, you can grab the new beta drivers from Intel's website.

Nick Evanson
Hardware Writer

Nick, gaming, and computers all first met in 1981, with the love affair starting on a Sinclair ZX81 in kit form and a book on ZX Basic. He ended up becoming a physics and IT teacher, but by the late 1990s decided it was time to cut his teeth writing for a long defunct UK tech site. He went on to do the same at MadOnion, helping to write the help files for 3DMark and PCMark. After a short stint working at Beyond3D.com, Nick joined Futuremark (MadOnion rebranded) full-time, as editor-in-chief for its gaming and hardware section, YouGamers. After the site shut down, he became an engineering and computing lecturer for many years, but missed the writing bug. Cue four years at TechSpot.com and over 100 long articles on anything and everything. He freely admits to being far too obsessed with GPUs and open world grindy RPGs, but who isn't these days?