I took a test drive with Intel's DG1 discrete graphics card and wasn't impressed

Intel Xe Graphics
(Image credit: Future)

One of the biggest demonstrations out of CES this year came when Intel showed off an early Tiger Lake laptop running Warframe, followed by Destiny 2 running on a dedicated Intel graphics card. Intel plans to enter the dedicated graphics card market later this year with Xe Graphics, but prior to CES we had little to go on beyond talk. After the live demonstration of Xe Graphics on stage, I was able to go hands-on with a development system using Intel's DG1 prototype. There were no frame counters, and I couldn't even check the settings, but I was able to play Warframe running at 1080p on the GPU. I wasn't impressed.

Let me start with the card itself. The dedicated graphics card looks a bit like what you'd get if you took an Intel Optane 905P add-in card SSD, made it a lot thicker (it's a dual-slot card), and then slapped a fan and video outputs onto it. It's a decent-looking graphics card, really—nothing too extreme, and it would fit in just fine with many PC builds. Heck, it even includes RGB lighting, even though it's supposedly only a development kit board and not necessarily a retail product. (I strongly suspect this is close to the final design for what Intel intends to sell at retail.)

Intel didn't reveal anything about the internal hardware configuration. We don't know the GPU core count, though rumors of 96 or 128 EUs have floated around. We don't know clockspeeds or compute performance. We don't know how much VRAM is on the board, what type it is, its bus width, or its clockspeed. GDDR5 or GDDR6 would seem most likely, but Intel isn't saying.

There's no information on power requirements either—all we know for sure is that there are no 6-pin or 8-pin power connectors, so all the power has to come over the PCIe x16 slot. Intel did confirm that it's a sub-75W card, but is it closer to 75W, 50W, or even 30W?

That's a lot of unknowns. But we do know a few things at least. Besides the sub-75W aspect, the heatsink on the card is relatively small. There's a large shroud with a single axial fan, but the heatsink is only about one centimeter thick and doesn't have any heatpipes. There's no need for anything really fancy for a sub-75W card, so the cooling isn't too surprising. The other major thing we know is that the card has a single HDMI port and three DisplayPort connections. Oh, and like I was saying earlier, I got to play a bit of Warframe on it.

Not an actual image of Warframe running on DG1. (Image credit: Digital Extremes)

Based on how the game looked, 1080p at medium or high settings seems likely. That's not a particularly strenuous target, especially for a dedicated graphics card, but this is Intel we're talking about. Try booting up most games at 1080p and medium or high settings using current Intel desktop CPUs with UHD Graphics 630 (aka Gen9.5 graphics, essentially the same architecture found in 6th through 9th Gen Intel desktop CPUs), and you're going to be sorely disappointed.

Even 720p at minimum quality is often too demanding. Lighter games like CS:GO and Overwatch might run okay, but Control, The Outer Worlds, Ghost Recon: Breakpoint, Gears 5, Borderlands 3, Wolfenstein: Youngblood, The Division 2, Rage 2, Anthem, and Metro: Exodus range from not running at all to barely playable. Running at 1080p means rendering 2.25X as many pixels as 720p, which typically translates to less than half the 720p performance on Intel integrated graphics.
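For anyone who wants to check the resolution math, here's a quick sketch. The pixel counts are exact; the framerate in the second half is a purely hypothetical example, not a measured result.

```python
# Resolution scaling sanity check. Pixel counts are exact; the example
# framerate below is hypothetical, not a benchmark result.
pixels_720p = 1280 * 720       # 921,600 pixels
pixels_1080p = 1920 * 1080     # 2,073,600 pixels

scale = pixels_1080p / pixels_720p
print(f"1080p renders {scale:.2f}x as many pixels as 720p")  # 2.25x

# If a GPU hit, say, 30 fps at 720p and performance scaled purely with
# pixel count, 1080p would land around 13 fps. Integrated graphics often
# does worse than that naive estimate because memory bandwidth runs out.
hypothetical_720p_fps = 30
print(f"Naive 1080p estimate: {hypothetical_720p_fps / scale:.1f} fps")
```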

So how does the DG1 "software development vehicle" board fare? No benchmarks were allowed and no framerate counter was displayed, but I've played enough games to have a reasonable sense of framerates, especially when we're talking about sub-30 fps results. Warframe on the DG1 felt like it was running a bit under 30 fps—playable, but only just. And it wasn't vsync capping the framerate, as tearing was clearly visible.

That's not a great result, and it wasn't a great gaming experience. The thing is, I hadn't played or benchmarked Warframe before, so I wasn't sure how demanding a game it is, though I suspected not very. Once back at the hotel, I tried running Warframe on an MSI GL63 laptop—the laptop I brought along to CES. It has a mobile RTX 2060, but more importantly it also has Intel UHD Graphics 630.

(Image credit: Future)

Here's where things aren't so great. The DG1 was undoubtedly faster than the UHD 630, which in my testing averaged a paltry 16-22 fps at 1080p high. At 1080p low, it got about 24-28 fps—not a massive improvement. This wasn't the same Warframe level I played on the DG1, but performance was consistently poor with the UHD 630. Basically, without hard numbers, it felt like the DG1 was maybe 50 percent faster than the now four-year-old HD/UHD 630. And the UHD 630 at 720p and medium settings averaged 33-38 fps, which felt clearly faster than what I experienced in the DG1 hands-on.

So yeah, Warframe is a super light game in terms of graphics demands. It's perhaps more intense than CS:GO, but not by much. Using it to demonstrate Intel Xe Graphics performance looks like an attempt to hide poor performance. I get that Warframe is popular, but choosing it to show DG1 running is a sign of weakness. On the same map, with the RTX 2060 enabled at 1080p high, the GL63 laptop averaged around 240 fps.

The good news is that DG1 and Xe Graphics are working. Keep in mind this is a development system for an unreleased product, and early hardware and drivers undoubtedly hold performance back. For all I know, the DG1 might be clocked at half the intended speed, and the drivers might be early enough that improvements between now and the retail launch could double performance or more. Let's hope so, because if not, the DG1 is going to fall behind even the budget cards from AMD and Nvidia.

(Image credit: Future)

It's hard to overstate how important it is for Intel to get more out of Xe Graphics than what I saw at CES, at least if it wants to be taken seriously in the dedicated GPU market. Xe Graphics is one architecture, designed to scale from ultra mobile solutions up through PC gaming and into the workstation, cloud, deep learning, and supercomputing markets. If Intel isn't competitive in discrete graphics cards for gaming, it will likely fall short in those other areas as well.

All indications are that Intel plans to have products featuring multiple Xe GPU chips—sort of like Nvidia's SLI, but better integrated, so that using four chips could in theory quadruple performance. (Yeah, I'll believe that when I see it.) Maybe the DG1 development test vehicle is the minimum configuration with 'only' 96 EUs (roughly equivalent to 1,000 Nvidia CUDA cores, give or take). There have been previous hints that we could see Xe Graphics with up to 512 EUs—enough to potentially match the compute power of an RTX 2080 Super, depending on clockspeeds.
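To put that 512 EU figure in perspective, here's a rough, back-of-the-envelope FP32 compute comparison. It assumes 16 FP32 operations per EU per clock (the usual figure for Intel's Gen architecture) and a guessed 1.4GHz clock for the Xe part; neither is a confirmed spec for DG1 or any retail Xe card.

```python
# Peak FP32 throughput estimates. The Xe EU count, per-EU throughput, and
# clockspeed are assumptions for illustration, not confirmed specifications.
def tflops(units, flops_per_unit_per_clock, clock_ghz):
    return units * flops_per_unit_per_clock * clock_ghz / 1000

# Hypothetical 512 EU Xe part: 16 FP32 ops per EU per clock, ~1.4GHz guess
xe_512eu = tflops(512, 16, 1.4)

# GeForce RTX 2080 Super: 3072 CUDA cores, 2 ops per core per clock (FMA),
# ~1.8GHz boost clock
rtx_2080_super = tflops(3072, 2, 1.815)

print(f"512 EU Xe (hypothetical): {xe_512eu:.1f} TFLOPS")        # ~11.5
print(f"RTX 2080 Super:           {rtx_2080_super:.1f} TFLOPS")  # ~11.2
```

Peak FP32 throughput is only a rough proxy for gaming performance, but it shows why a fully scaled-up Xe part is at least interesting on paper.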

I hope this early demonstration was of the slowest and weakest dedicated GPU variant of Xe Graphics we'll see this year, and that it's not remotely representative of the performance we'll get at launch. If that's the case, Intel could still end up competitive. But then why is Intel demonstrating such weak performance right now? Proving that Xe Graphics works at all might be useful, but I would have been much happier to see 1080p at 60 fps in a more demanding game instead of Warframe chugging along at 30 fps.

Jarred Walton

Jarred's love of computers dates back to the dark ages when his dad brought home a DOS 2.3 PC and he left his C-64 behind. He eventually built his first custom PC in 1990 with a 286 12MHz, only to discover it was already woefully outdated when Wing Commander was released a few months later. He holds a BS in Computer Science from Brigham Young University and has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.