Intel Xe leak suggests DG1 is not much of a gaming chip
Intel Xe leaks give us an idea of how DG1 compares to the best entry-level mobile GPUs from AMD and Nvidia.
We've not had high hopes for Intel's discrete DG1 GPU—not since the CES trade show when Intel demonstrated some rather lacklustre performance from its maiden chip. Since then it's also shot down any hope of a high-end Intel Xe gaming GPU or an add-in card option for desktop anytime soon—so just a mobile entry-level chip it is then. That could still make for a decent gaming laptop though, right?
I wouldn't set your expectations high for gaming performance. An Intel chip going by the name "Gen12 Desktop Graphics Controller"—almost certainly Intel's DG1 GPU by another name—has been spotted in the Geekbench benchmark database by serial leaker TUM_APISAK. It posted an OpenCL performance score of 55,373, making it a little faster than Nvidia's low-power MX250 dGPU, which scores around the 48,000 mark.
The more power-hungry GTX 1650 Mobile dGPU, rated at 50 watts of graphics power in its non-Max-Q variant, easily outperforms the leaked Intel chip with scores upwards of 110,000. As for AMD's lineup, the RX 560 outmanoeuvres Intel's dGPU with scores of roughly 65,000, while the RX 550 falls only a little behind Intel Xe with OpenCL compute scores of around 50,000.
Now, before we get too far into the comparison, we don't yet know how representative this benchmark will be of DG1's performance come launch day. The result could have come from the Software Development Vehicle (SDV) sent out to developers, which was never intended as an accurate measure of performance. Its drivers may be shoddy as heck, too, and Geekbench results can be highly variable.
| Spec | Intel Xe DG1 |
| --- | --- |
| Architecture | Xe-LP |
| Process node | 10nm |
| Execution Units (EUs) | 96 |
| Memory | 3GB |
| Clock speed (GHz) | 1.5 |
Even with a rather monumental uplift in performance between now and launch (which we don't expect), the DG1 GPU alone doesn't look likely to blow today's Nvidia and AMD mobile graphics cards out of the water.
DG1 will likely be included alongside a Tiger Lake CPU in some future designs, however. Those chips are also said to be equipped with an equivalent Gen 12 (Intel Xe) iGPU and have displayed performance on par with AMD's mobile Ryzen processors.
When combined, the twin Xe GPUs may bolster performance in some highly parallel tasks, or even allow for some clever power-sharing between the two. But we gamers shouldn't get our hopes up: multi-GPU support for gaming is almost non-existent, and while Intel demonstrated its approach to multi-GPU rendering at GDC, that approach relies heavily on the explicit multi-adapter implementation outlined in DX12, which few games support.

Jacob earned his first byline writing for his own tech blog, before graduating into breaking things professionally at PCGamesN. Now he's managing editor of the hardware team at PC Gamer, and you'll usually find him testing the latest components or building a gaming PC.

