Elon Musk suggested a novel use for 'bored' Tesla cars during a recent earnings call: combining their processing power to create a huge, 100-gigawatt distributed AI inference fleet
'If they're not actively driving', of course.
 
Tesla CEO Elon Musk seems to have been thinking of new and interesting ways to use Tesla's products, as the tech billionaire used the Q&A session of the company's recent earnings call to posit the idea of using 100 million Tesla vehicles to create a "giant distributed inference fleet."
Well, it's nothing if not innovative. The idea seemed to spring out of discussion of the capabilities of the upcoming Tesla AI5 chip, which is said to be up to 40x faster than the current Tesla AI4 chip fitted to many of its cars (via Tom's Hardware). Musk openly wondered if the upcoming chip "might almost be too much intelligence for a car", which led him to pitch a concept he seems to have been rolling around for a while:
"One of the things I thought," said Musk, "If we've got all of these cars that maybe are bored... we could actually have a giant distributed inference fleet... if they're not actively driving"
"At some point, if you've got 100 million cars in the fleet... and let's say they had, at the point, a kilowatt of inference capability... that's 100 gigawatts of inference distributed with cooling and power conversion taken care of. So, that seems like a pretty significant asset."
On the surface, it doesn't seem like such a bad idea. After all, the processing power on offer from a modern car, particularly one fitted with a cutting-edge chip at its heart, is likely wasted for the most part—beyond being used for self-driving systems that Tesla still appears to be struggling to get right.
  
Linking vast numbers of them together, in a similar fashion to, say, distributed computing projects like SETI@home, might well prove to be a good use for their talents. On the other hand, I'm presuming that Musk means privately-owned customer cars linked to a central server by their respective internet connections, not warehouses full of Tesla-owned vehicles humming away powering Grok, or similar.
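If you want to picture how that might work, here's a deliberately hand-wavy sketch of the sort of client loop an idle car could run. Every endpoint, check and function name here is invented for illustration; none of it is real Tesla software or a confirmed design.

```python
# Purely hypothetical sketch of an idle-car inference worker.
import time
import requests  # assumed available; any HTTP client would do

JOB_SERVER = "https://example.com/inference-jobs"  # placeholder URL

def car_is_idle_and_plugged_in() -> bool:
    """Stand-in for the real checks: parked, charging (or above some
    battery threshold), and the owner has actually opted in."""
    return False  # default to 'no' so the sketch does nothing by accident

def run_inference(job: dict) -> dict:
    """Stand-in for running the model on the car's AI4/AI5 hardware."""
    return {"job_id": job.get("id"), "result": "..."}

while True:
    if car_is_idle_and_plugged_in():
        # Fetch a job from the central server, crunch it, send the answer back
        job = requests.get(JOB_SERVER, timeout=10).json()
        requests.post(JOB_SERVER, json=run_inference(job), timeout=10)
    else:
        time.sleep(60)  # the owner might be driving; check again in a minute
```

The hard parts—scheduling, security, and who pays for the electricity—are, of course, exactly what a loop like that glosses over.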
In which case, I think it's a fairly tough sell to suggest that private vehicle owners use their own battery power (or wall sockets) to spin up the chips inside their cars for the benefit of... well, presumably Musk's other projects, like xAI. Never mind the potential shortening of the lifespan of said chips, if they're frequently crunching away on complicated AI tasks while you've gone to lunch.
Perhaps some sort of profit-sharing scheme? I'm just spitballing while we're talking about big ideas. Which, it must be said, appears to be the same thing Musk was doing here.
Still, if we're going all in on the idea that everything should have a surprisingly powerful chip inside, from smart fridges to AI pillows, perhaps spinning them up en masse for AI-crunching purposes while we still have clean air left to breathe might lead to some net benefit. All that compute would definitely be used for solving the world's greatest problems with the power of AI, right? Answers on the back of a postcard, please.


Andy built his first gaming PC at the tender age of 12, when IDE cables were a thing and high resolution wasn't—and he hasn't stopped since. Now working as a hardware writer for PC Gamer, Andy spends his time jumping around the world attending product launches and trade shows, all the while reviewing every bit of PC gaming hardware he can get his hands on. You name it, if it's interesting hardware he'll write words about it, with opinions and everything.

