Ashes of the Singularity Beta 2


Ashes and APIs

When last we looked at Ashes of the Singularity, it was one of the first DX12 titles with a benchmark that we could test, but it was also very early. Six months later, the second playable race is in place, and Oxide has had a lot more time to work on refining the game—and refining the DX12 rendering engine. And it's that latter aspect that we're going to be focused on today.

The idea of a low-level API isn't really new; the modern 3D graphics era was kicked off by 3dfx and their Glide API. All it really means is that developers get much closer access to the hardware and take full control over managing resources like memory, shaders, and buffers. In theory, that lets them improve performance and increase graphics quality, but there's a dark side to all of this: programming against a low-level API means being a lot more careful about what you do, because when things go wrong you can end up with anything from stutters and stalls to outright crashes.
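
To give a flavor of what "managing resources yourself" looks like, here's a minimal C++ sketch using real D3D12 calls. This is our illustration, not Oxide's code: the application, not the driver, creates the command queue, decides which heap a buffer lives in, and fences GPU work against the CPU—bookkeeping a DX11 driver would normally hide.

```cpp
// Minimal D3D12 bookkeeping sketch (illustrative only; error checks omitted).
// Link with d3d12.lib.
#include <d3d12.h>
#include <windows.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // 1) The app creates its own command queue (DX11 had an implicit one).
    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    // 2) The app chooses where a buffer lives (here: a CPU-visible upload heap).
    D3D12_HEAP_PROPERTIES heapProps = {};
    heapProps.Type = D3D12_HEAP_TYPE_UPLOAD;
    D3D12_RESOURCE_DESC bufDesc = {};
    bufDesc.Dimension = D3D12_RESOURCE_DIMENSION_BUFFER;
    bufDesc.Width = 64 * 1024;           // 64KB buffer
    bufDesc.Height = 1;
    bufDesc.DepthOrArraySize = 1;
    bufDesc.MipLevels = 1;
    bufDesc.Format = DXGI_FORMAT_UNKNOWN;
    bufDesc.SampleDesc.Count = 1;
    bufDesc.Layout = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;
    ComPtr<ID3D12Resource> buffer;
    device->CreateCommittedResource(&heapProps, D3D12_HEAP_FLAG_NONE, &bufDesc,
        D3D12_RESOURCE_STATE_GENERIC_READ, nullptr, IID_PPV_ARGS(&buffer));

    // 3) The app synchronizes CPU and GPU itself with a fence; reuse the
    //    buffer too early and you get exactly the problems mentioned above.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
    queue->Signal(fence.Get(), 1);
    fence->SetEventOnCompletion(1, done);
    WaitForSingleObject(done, INFINITE);
    CloseHandle(done);
    return 0;
}
```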

Oxide isn't new to the low-level API game; they were one of the first to have a public demo of AMD's Mantle API with their Star Swarm demo. Star Swarm was later ported to DX12, and the engine eventually grew into a full-blown game. That's an important step, because where Star Swarm was basically just throwing lots of draw calls at various APIs to show the superiority of a low-level API for certain tasks, an actual game has a different goal: it must be playable and enjoyable, or it's unlikely to be successful.

We'll leave the evaluation of the actual game to our cohorts at PC Gamer (with a review presumably coming once the game leaves beta), but the basic idea is a modern take on Total Annihilation and Supreme Commander, with DX12 offering the potential for even more units and detail during the battles. And of course, since DX12 requires Windows 10, which is currently sitting at somewhere around 10 percent of the total market, Oxide is also supporting DX11 for users of older versions of Windows. We're skipping all of the DX11 stuff this round, however, as the focus is clearly on DX12.
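
Since the game has to decide at launch whether DX12 is even an option, here's a hedged sketch of how a Windows title can probe for D3D12 support and fall back to D3D11 when it isn't there. This is not Oxide's startup code; IsD3D12Available and the renderer init functions in the usage comment are our own illustrative names.

```cpp
// Hypothetical API-selection check: probe for D3D12, fall back to D3D11 on
// older versions of Windows (or older drivers).
#include <d3d12.h>
#include <windows.h>

bool IsD3D12Available() {
    // d3d12.dll only ships with Windows 10, so load it at runtime instead of
    // linking against it directly.
    HMODULE d3d12 = LoadLibraryW(L"d3d12.dll");
    if (!d3d12) return false;

    auto createDevice = reinterpret_cast<PFN_D3D12_CREATE_DEVICE>(
        GetProcAddress(d3d12, "D3D12CreateDevice"));
    if (!createDevice) { FreeLibrary(d3d12); return false; }

    // Passing nullptr for the output interface asks "would this succeed?"
    // without actually creating a device.
    HRESULT hr = createDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                              __uuidof(ID3D12Device), nullptr);
    FreeLibrary(d3d12);
    return SUCCEEDED(hr);
}

// Usage (hypothetical): if (IsD3D12Available()) InitRendererDX12(); else InitRendererDX11();
```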

Let's talk adapters

So what does it actually mean to use a low-level API? This is where things get a bit convoluted. On the one hand, we have Oxide talking about optimizing for algorithms rather than hardware, and stating they're not receiving any funding from AMD (or any other hardware vendors). That strikes us as a bit odd, considering their previously mentioned involvement with Mantle, but whatever. Let's just go with the idea that Oxide is making the best game engine they know how to make for Ashes of the Singularity; that means they want it to run well on the broadest range of hardware possible, and frankly, ignoring Nvidia hardware would be practically suicidal considering Nvidia holds a much larger slice of the graphics hardware pie these days.

Actually working with a low-level API gives the developer control of all the core hardware resources, and this is where things can get a bit crazy. If you know that every single person has a specific set of hardware—for example, let's say they all have a Core i5-6500, R9 390, 16GB RAM, and a 512GB SSD—you can make use of every one of those resources. That means you have the CPU, discrete GPU, processor graphics, plenty of RAM, and fast storage. This is what it's like to create games for a console like the PS4 or Xbox One, and the things developers can do with a far more limited set of hardware are quite impressive.
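
That mix of resources is exactly what the "adapters" in this section's title refers to: under DXGI, the discrete card and the processor graphics both show up as adapters, and a DX12 application can enumerate and use any of them. A rough sketch of that enumeration (our illustration, using standard DXGI calls, not Oxide's code):

```cpp
// Enumerate every graphics adapter the OS exposes; on the hypothetical
// i5-6500 + R9 390 box above, that's both the Radeon and the Intel iGPU.
// Link with dxgi.lib.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;  // skip the WARP software rasterizer
        wprintf(L"Adapter %u: %ls (%u MB dedicated VRAM)\n", i, desc.Description,
                static_cast<unsigned>(desc.DedicatedVideoMemory >> 20));
    }
    return 0;
}
```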

The Xbox One, for instance, has eight of AMD's Jaguar cores running at 1.75GHz, 8GB of shared system/graphics memory, 32MB of high-speed ESRAM, and 768 graphics cores running at 853MHz. In the world of PCs, that would be about one-third the CPU performance of an AMD FX-8320 (Jaguar cores aren't nearly as fast as Piledriver cores at the same clock speed), with something approaching the performance of an R7 260 graphics card. If developers can make impressive-looking games that work reasonably well on an Xbox One, thanks to the low-level hardware access, just imagine what they could do with a low-level API on a modern PC!

That's the basic promise of low-level APIs like DirectX 12, Mantle, and Vulkan, and at first it all sounds rather compelling—who wouldn't want vastly superior graphics quality and performance? Well, let us tell you who doesn't want that: a lot of developers are more interested in making games that tell fun or amazing stories, or take you to a different place, and they don't need photorealistic graphics to get you there. Case in point: Fallout 4 runs on an engine whose roots stretch back to the DX9 era, and Valve likewise hasn't pushed its Source games much beyond DX9 to date (more or less). We could name others as well, like the Mass Effect series (all DX9), StarCraft II (DX9), and tons of indie titles.

Talos Principle

Which isn't to say that there aren't developers clamoring for low-level APIs, but this is a great time to link to the Croteam developer's post on the Vulkan port of The Talos Principle. Here's a key quote:

Yes, [Vulkan] has downsides. For one, it's quite hard to program for. You have to do a lot of things manually, instead of relying on drivers to do the work for you. This is both good and bad at the same time. Good for performance reasons, because the driver doesn't assume what the game wants to render (I won't go into any more details here, sorry). Bad because there's a lot more coding and, in general, it's a more complex approach. You better know what you're doing, because you won't get any help from the driver. You're on your own. It's really great to have that much control. If you know what you're doing!
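
To make "do a lot of things manually" concrete, here is a rough sketch (ours, not Croteam's; CreateUniformBuffer is an illustrative helper) of what Vulkan asks of an engine just to back a single buffer with memory: choosing a memory type and binding it yourself, work that an OpenGL or D3D11 driver handles behind the scenes.

```cpp
// Manual buffer allocation in Vulkan (illustrative only; error checks omitted).
#include <vulkan/vulkan.h>

VkBuffer CreateUniformBuffer(VkPhysicalDevice gpu, VkDevice device,
                             VkDeviceSize size, VkDeviceMemory* outMemory) {
    // 1) Create the buffer object itself (no memory behind it yet).
    VkBufferCreateInfo bufInfo = {};
    bufInfo.sType = VK_STRUCTURE_TYPE_BUFFER_CREATE_INFO;
    bufInfo.size = size;
    bufInfo.usage = VK_BUFFER_USAGE_UNIFORM_BUFFER_BIT;
    bufInfo.sharingMode = VK_SHARING_MODE_EXCLUSIVE;
    VkBuffer buffer = VK_NULL_HANDLE;
    vkCreateBuffer(device, &bufInfo, nullptr, &buffer);

    // 2) Ask what kind of memory the buffer needs...
    VkMemoryRequirements reqs;
    vkGetBufferMemoryRequirements(device, buffer, &reqs);

    // 3) ...then walk the GPU's memory types yourself to find a
    //    host-visible one that satisfies those requirements.
    VkPhysicalDeviceMemoryProperties props;
    vkGetPhysicalDeviceMemoryProperties(gpu, &props);
    uint32_t typeIndex = 0;
    for (uint32_t i = 0; i < props.memoryTypeCount; ++i) {
        bool allowed = (reqs.memoryTypeBits & (1u << i)) != 0;
        bool hostVisible = (props.memoryTypes[i].propertyFlags &
                            VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT) != 0;
        if (allowed && hostVisible) { typeIndex = i; break; }
    }

    // 4) Allocate that memory and bind it to the buffer by hand.
    VkMemoryAllocateInfo allocInfo = {};
    allocInfo.sType = VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_INFO;
    allocInfo.allocationSize = reqs.size;
    allocInfo.memoryTypeIndex = typeIndex;
    vkAllocateMemory(device, &allocInfo, nullptr, outMemory);
    vkBindBufferMemory(device, buffer, *outMemory, 0);
    return buffer;
}
```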

For the Vulkan port of Talos, performance right now is 20-30 percent lower than under D3D11. This is a brand-new API running on an engine that was only just ported to Vulkan, though, so give it some time; long-term, Vulkan performance should eventually surpass DX11. But that's probably only going to happen on some subset of hardware—you can't possibly plan for every piece of hardware you might encounter, unfortunately. Ashes, meanwhile, has been running publicly on DX12 hardware for at least six months, and the earlier work on the Star Swarm demo is over a year old now. So Oxide has had a lot of time to improve things, and they're using some advanced techniques as well.

Basically, you have to be a rock star developer to get good results with low-level APIs. If you're not pushing the boundaries of computer graphics with your games (see that earlier list of DX9 titles for examples), you probably don't want to bother with low-level APIs. Let the GPU driver teams do the heavy lifting, and focus instead on making a good game. Because even if you only want to cover a subset of the potential hardware combinations, the rabbit hole runs deep. Think of the Xbox One hardware, and now consider the modern PC space: dozens of potential CPUs, dozens more GPUs, and memory ranging from 4GB to 32GB and beyond. Now throw in the potential to mix and match GPUs, and that brings us to today's topic.
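
Under DX12's explicit multi-adapter model, "mix and match" literally means the application creates a separate device on each GPU it wants to use and schedules work across them itself. A bare-bones sketch, building on the enumeration loop shown earlier (again ours, not Oxide's; CreateDeviceOnEachAdapter is a hypothetical helper):

```cpp
// Create an independent ID3D12Device on every hardware adapter in the system,
// whether it's a GeForce, a Radeon, or Intel processor graphics.
// Link with d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>
using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDeviceOnEachAdapter() {
    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;  // ignore the software rasterizer

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);  // the app now shuttles work between these
    }
    return devices;
}
```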

Jarred Walton

Jarred's love of computers dates back to the dark ages when his dad brought home a DOS 2.3 PC and he left his C-64 behind. He eventually built his first custom PC in 1990 with a 286 12MHz, only to discover it was already woefully outdated when Wing Commander was released a few months later. He holds a BS in Computer Science from Brigham Young University and has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.