Advances in cooling could change PCs beyond all recognition

(Image credit: Corsair)

This article first appeared in PC Gamer magazine issue 354 in March 2021, as part of our 'Tech Report' series. Every month we explore and explain the latest technological advances in computing, from the wonderful to the truly weird, with help from the scientists, researchers, and engineers making it all happen.

PCs are ten, maybe even fifteen, years behind where they should be. That sounds like a long time, but it makes sense if you remember what PCs were like in 2005 (spoiler: the same as they are now). Bell's Law, a companion to Moore's Law, states that every decade a new and lower-priced computer class forms that leads to the establishment of a new industry.

This hasn't happened. The desktop PC wasn't out-competed by the laptop, and the laptop wasn't bullied out of its evolutionary niche by phones and tablets. We've got all three, at the same time, and the one you want always costs £1,000. 

This month's Tech Report was meant to be about the future of chip cooling, and it kind of still is, but it also touches on the implications of Bell's Law, and how hot new computer architectures, supplied with new cooling systems, could be about to overturn the hierarchy of our PCs and devices. 

We've been actively cooling our CPUs since the days of the 486 (introduced in 1989), and today's multi-core monsters may have fans all over: intakes and outflows on their cases, AIO or hard-piped liquid-cooling radiators cooled by three fans, three more on the graphics card, and even a few on sensitive parts of the motherboard. The number and size of the fans have increased, but we're still basically in the same place we were with Pentium II machines.

"We haven't reinvented the computer, and that is a problem," says Dr Bruno Michel of IBM's Zurich Research Laboratory, Switzerland. "We're using an overaged technology, when what we should use is the newest technology which we have available." 

And that technology? "Mobile phone technology." Michel studies smart system integration, and is inspired by nature. He's trying to make computers more efficient through miniaturisation, in order to reduce the carbon footprint of our processing. His end goal is to reach the kind of efficiency and computational density displayed by the human brain. 

And to that end, he's built a whole data centre out of mobile phones. "It's packed as densely as possible, uses water and microchannel cooling, and uses improved power delivery. The system is about 1,000 times denser than an air-cooled data centre from ten years ago, and about ten times more efficient than if we used this exact same technology today in an air-cooled version." 

To be fair, it's not going to make our hardware team look up from their RGB underpants, but that microchannel water cooling is fascinating, and possibly the future of cooling inside our PCs. It's already being used in supercomputers, such as the SuperMUC at the Leibniz Supercomputing Centre of the Bavarian Academy of Sciences, near Munich. This beast, with over 19,000 Intel Xeons and 340TB of RAM, is cooled by hot water for a 40% saving in energy over an air-cooled system. And that's not a typo: the coolant running through the system is at about 60°C, allowing the processors to run at around 85°C.

(Image credit: Intel)

Diffusing heat 

The microchannels, with a diameter of less than 1mm, are directly attached to the processors. Water can carry around 4,000 times as much heat as air, and the closed-loop nature of the system means the water is drawn through a heat exchanger before being sent back around. The excess heat is then fed into the heating system of the building—the SuperMUC is said to save the campus $1.25 million a year in heating costs.
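That 4,000x figure is roughly what you get if you compare volumetric heat capacity: how much heat a given volume of fluid can soak up for each degree it warms. Here's a minimal back-of-the-envelope sketch, using textbook values we've plugged in ourselves rather than any figures from IBM or the Leibniz Supercomputing Centre:

```python
# Rough back-of-the-envelope check (textbook values, not figures from IBM or LRZ):
# heat a cubic metre of coolant can absorb per degree of warming,
# i.e. density * specific heat capacity, for water versus air.

water = 1000.0 * 4180.0   # kg/m^3 * J/(kg*K) -> ~4.2 million J per m^3 per K
air   = 1.2 * 1005.0      # kg/m^3 * J/(kg*K) -> ~1,200 J per m^3 per K

print(f"Water carries roughly {water / air:,.0f}x more heat per unit volume than air")
# prints a ratio of roughly 3,500 -- the same order of magnitude as the figure quoted above
```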

The same is true of the phone data centre invented by Michel. His microserver boards can use either ARMv8 chips or IBM's own PowerPC processors (which must have moved on greatly since our PowerPC G5 Macs doubled up as room heaters many years ago) and run Fedora Linux. Each board is only 133mm x 63mm, making them up to ten times smaller than traditional server blades. The microchannels run from left to right across the boards, but this is only an early design. "In the next generation, we would also have microchannels going from the left to the right and the fluid would then flow in and out and then in and out on the other side," says Michel, hypothesising about a system in which different boards can be plugged in and out to give it flexibility.

IBM is also working on a similar, but far cooler, idea: electronic blood. First unveiled in 2013, the 'blood' acts just as the microchannel coolant does, carrying heat away from the computer system, but it does one other thing: it supplies power too. While the coolant in Michel's data centre is normal water with some anti-corrosion additives mixed in, the electronic blood is an electrolyte in a kind of cell called a flow battery, which can reversibly convert chemical energy to electricity. The trick is to get a chip to take its power from this bath of sparky chemicals rather than up through the pin grid array we're used to on the back of CPUs, and for this reason "we will not build that directly into a chip stack," says Michel, "but we will build that into a larger system".

(Image credit: Leibniz Supercomputing Centre)

Luckily, current chips seem happy to take their power from whatever source they're given, meaning there's no need to muck about with exotic materials or nanotubes to get this kind of efficiency. "The chips are made from silicon, so these are normal processors, we don't really require any deviation from CMOS technology, that still works nicely," says Michel.

Whether we'll see this technology in our homes any time soon is another matter, however. "There is no clear road map," says Michel. "The problem is that the industry has an innovator's dilemma, they're sticking to the current state and don't dare to go to the next generation." What's more likely, he thinks, is that as cloud and edge computing become more common (distributed systems that connect our computers to hugely powerful data centres, with edge computing bringing those data centres closer to where they're needed to save on bandwidth and latency), our devices will finally leave the desktop PC behind and crystallise around that newest available technology: phone tech. Chromebooks are already part of this, but as even phone technology is ten years old, there should be something new along to take its place.

Over our video chat, Michel removes his ring, showing the sensors on the inside to the camera. He talks about sensor packages that fit in your ear canal. The age of smart wearables backed by hugely efficient data centres might just be about to begin.