A Valve engineer fixed 3D lighting so hard he had to tell all the graphics card manufacturers their math was wrong, and the reaction was: 'I hate you'

A custom wrapped Nvidia RTX 4080 graphics card with Half-Life 2 logo and Gordon Freeman.
(Image credit: Nvidia)

The PC gaming icon that is Half-Life 2 recently celebrated its 20th anniversary, and Valve pulled out all the stops with a major new update integrating the game with its episodes and adding a commentary track. The studio also released a two-hour documentary about the making of the game and what was going on at Valve during its development, which is absolutely crammed with fascinating digressions about the challenge it set itself. And one of them was lighting.

The development of Half-Life 2 was rooted in what multiple staff describe as "the tech wishlist", which would take six years to fully realise and was absolutely foundational to what Valve wanted to achieve with the game.

"The math that we were using was wrong," says veteran Valve developer Ken Birdwell. "And not only that, the math that everybody was using was wrong. And then as I started to correct it I realised just how bad it was… and then I fixed it and suddenly everything looked great!

"I had to go tell the hardware guys, the people who made hardware accelerators, that fundamentally the math was wrong on their cards. That took about two-and-a-half years. I could not convince the guys; finally we hired Gary McTaggart [from 3DFX] and Charlie Brown, and those guys had enough pull and enough… I have a fine arts major, nobody's gonna listen to me."

Half-Life 2: 20th Anniversary Documentary - YouTube

Let's just pause on that aside. Birdwell smiles while delivering the last line, which we'll allow because this guy fixed the math being used for lighting so hard that the manufacturers of graphics cards had to change their math. I found this thought too fascinating to leave alone, and sought out Birdwell to ask if he could expand a little.

"It's a bit technical," begins Birdwell, "but the simple version is that graphics cards at the time always stored RGB textures and even displayed everything as non-linear intensities, meaning that an 8-bit RGB value of 128 encodes a pixel that's about 22% as bright as a value of 255, but the graphics hardware was doing lighting calculations as though everything was linear.
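Birdwell's 22% figure can be sanity-checked with the common power-law gamma model (gamma 2.2 is our assumption here as a stand-in for the display curve of the era; he doesn't specify a value):

```python
# Decode an 8-bit gamma-encoded value to a linear light intensity,
# using the common power-law approximation with gamma = 2.2.
GAMMA = 2.2

def decode_to_linear(value_8bit: int) -> float:
    """Map a stored 0-255 value to linear intensity in [0, 1]."""
    return (value_8bit / 255.0) ** GAMMA

# A stored value of 128 is half of 255 numerically...
print(128 / 255)              # ~0.50 of the encoded range
# ...but it only emits about 22% of the light that 255 does:
print(decode_to_linear(128))  # ~0.22
```

That gap between "half the stored number" and "a fifth of the light" is exactly the mismatch the hardware was ignoring.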

"The net result was that lighting always looked off. If you were trying to shade something that was curved, the dimming due to the surface angle aiming away from the light source would get darker way too quickly. Just like the example above, something that was supposed to end up looking 50% as bright as full intensity ended up looking only 22% as bright on the display. It looked very unnatural: instead of a nice curve, everything was shaded way too extreme, rounded shapes looked oddly exaggerated and there wasn't any way to get things to work in the general case."
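The failure mode Birdwell describes can be sketched with the same gamma-2.2 assumption. A diffuse surface angled 60° from the light should receive 50% intensity; writing that 0.5 straight into the framebuffer displays it at roughly 22% brightness. The fix is to do the lighting math in linear space and gamma-encode only the final result:

```python
import math

GAMMA = 2.2

def encode_from_linear(intensity: float) -> int:
    """Gamma-encode a linear intensity in [0, 1] to a stored 0-255 value."""
    return round(255.0 * intensity ** (1.0 / GAMMA))

def displayed_intensity(value_8bit: int) -> float:
    """Linear light the monitor actually emits for a stored value."""
    return (value_8bit / 255.0) ** GAMMA

# Lambertian shading: a surface 60 degrees off the light direction
# receives cos(60 deg) = 0.5 of full intensity.
n_dot_l = math.cos(math.radians(60))          # 0.5 in linear terms

# Wrong (era-typical): treat the linear result as a displayable value.
naive_stored = round(255 * n_dot_l)           # 128
print(displayed_intensity(naive_stored))      # ~0.22 -- far too dark

# Right: gamma-encode the linear result before display.
correct_stored = encode_from_linear(n_dot_l)  # 186
print(displayed_intensity(correct_stored))    # ~0.50 -- as intended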

Birdwell says this remains "a super common graphics mistake" and even today certain areas of programming require the coder "to keep in mind that all the bitmap values are probably nonlinear, you can’t just add them together or blend them or mix them with linear calculations without considering what 'gamma space' you're working in."
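A concrete instance of that mistake, sketched with the same gamma-2.2 assumption: averaging two stored pixel values directly, as when blurring or blending textures, produces a result that is too dark. A 50/50 mix of black and white should display at half brightness, but averaging the encoded bytes lands at roughly 22%:

```python
GAMMA = 2.2

def to_linear(v: int) -> float:
    """Stored 0-255 value -> linear intensity in [0, 1]."""
    return (v / 255.0) ** GAMMA

def to_encoded(i: float) -> int:
    """Linear intensity in [0, 1] -> stored 0-255 value."""
    return round(255.0 * i ** (1.0 / GAMMA))

black, white = 0, 255

# Wrong: average in gamma space, as if stored bytes were light intensities.
naive = (black + white) // 2              # 127
print(to_linear(naive))                   # ~0.22 -- too dark a grey

# Right: decode to linear, average, re-encode for storage/display.
blended = to_encoded((to_linear(black) + to_linear(white)) / 2)
print(blended, to_linear(blended))        # 186, ~0.50 -- a true mid grey
```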

The good news is that "modern graphics cards know all this now," and at runtime automatically convert any non-linear formats "into a nicely behaved linear floating point value inside the graphics card before the math happens, so it's way easier. But that's now."
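What "modern graphics cards know" corresponds to the piecewise sRGB transfer function from the sRGB standard (IEC 61966-2-1), which hardware applies automatically when sampling sRGB-tagged textures or writing to sRGB render targets; here's a CPU-side sketch of the conversion:

```python
def srgb_to_linear(c: float) -> float:
    """sRGB-encoded channel in [0, 1] -> linear intensity (IEC 61966-2-1)."""
    if c <= 0.04045:
        return c / 12.92              # linear toe near black
    return ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l: float) -> float:
    """Linear intensity in [0, 1] -> sRGB-encoded channel."""
    if l <= 0.0031308:
        return l * 12.92
    return 1.055 * l ** (1 / 2.4) - 0.055

# The GPU does the equivalent of srgb_to_linear on texture reads and
# linear_to_srgb on writes to an sRGB target, so shader math happens
# in well-behaved linear floating point.
print(srgb_to_linear(128 / 255))      # ~0.22 -- matches Birdwell's figure
```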

The reason it is that way now is probably a fine arts major.

"All through the '90s up to maybe the early 2010s it wasn't the case," says Birdwell. "You had to be super aware of what 'gamma space' you were in at each step of the process or things would look super weird.

"The problem was, when I pointed this out to the graphics hardware manufacturers in '99 and early 2000s, I hit the 'you've just pointed out that my chips are fundamentally broken until we design brand new silicon, I hate you' reaction. That wasn't a fun conversation. It went through the stages of denial, anger, bargaining, etcetera, all in rapid succession with each new manufacturer.

"I was very happy to pass off those conversations to the newly hired HL2 graphics programmers Gary McTaggart and Charlie Brown, who worked through it all step by painful step over the years."

While trying to get in touch with Birdwell I found an unrelated patent granted to Valve in 2007 on which he's listed as the inventor. Patent #20070195090 is for "Determining Illumination of Models Using an Ambient Framing Abstractions" and summarises itself as "a system and method for determining light illumination on a model in a virtual environment." It goes into great detail about Birdwell's exact innovation and, even if this isn't anything to do with what he's describing above, it's clear the man is some sort of god of videogame lighting.

Producer Bill Van Buren says elsewhere in the documentary that there were three key principles to Half-Life 2 and the lighting was part of the first. "To make something that was immersive," says Van Buren, "visually really rich and appealing, something more like you would see in a film, and the art direction and the tech to make that happen."

Valve certainly achieved that: Half-Life 2 still looks amazing today. And there are many, many people responsible for that achievement. But Ken Birdwell is why the lighting looks so good, and he did such an outstanding job that you can probably argue every game since has benefitted from it. Or, to put it another way, our graphics cards certainly have.

Rich Stanton
Senior Editor

Rich is a games journalist with 15 years' experience, beginning his career on Edge magazine before working for a wide range of outlets, including Ars Technica, Eurogamer, GamesRadar+, Gamespot, the Guardian, IGN, the New Statesman, Polygon, and Vice. He was the editor of Kotaku UK, the UK arm of Kotaku, for three years before joining PC Gamer. He is the author of a Brief History of Video Games, a full history of the medium, which the Midwest Book Review described as "[a] must-read for serious minded game historians and curious video game connoisseurs alike."