DeepMind's Chinchilla AI toasts FLAC and PNG at lossless data compression despite essentially being just a large language model


If you think FLAC is the audiophile's friend when it comes to lossless music files, a large language model (LLM) has news for you: compression is now part of AI's growing realm of influence, too.

A study titled "Language Modeling Is Compression" (via Ars Technica) finds that Chinchilla 70B, an LLM from DeepMind, can perform lossless data compression better than FLAC does for audio and PNG does for images.

Chinchilla 70B could significantly shrink image patches from the ImageNet database, reducing them to just 43.4% of their original size without losing any detail. That beats the PNG algorithm, which only gets the same images down to 58.5%.

Additionally, Chinchilla compresses audio data from the LibriSpeech dataset to just 16.4% of its original size. That's especially impressive next to FLAC, which only reduces the audio to 30.3%.
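For a sense of what those percentages mean, here's a minimal sketch of how a compression ratio is computed, using Python's built-in zlib as a stand-in compressor (not Chinchilla, FLAC, or PNG) and a hypothetical sample.wav file:

```python
import zlib

# Hypothetical input file; any raw bytes will do for the illustration.
raw = open("sample.wav", "rb").read()
compressed = zlib.compress(raw, 9)

# A ratio of 0.164 would mean the file shrank to 16.4% of its original size.
ratio = len(compressed) / len(raw)
print(f"compressed to {ratio:.1%} of original size")
```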

Lossless compression means nothing is lost or left out when data is squeezed into smaller packages. This differs from lossy compression, which is what the JPEG image format uses: it throws away some of the data to make the file that much smaller, and reconstructs an approximation of the original when you open the file again.
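Using the same stand-in compressor as above, the lossless property boils down to a round trip that gives back exactly the bytes you put in:

```python
import zlib

data = b"the same bytes, in and out"  # any payload

packed = zlib.compress(data)
unpacked = zlib.decompress(packed)

# Lossless: decompression reproduces the input exactly, byte for byte.
assert unpacked == data
```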

The study's findings show that even though Chinchilla 70B was built primarily to work with text, it is also surprisingly adept at shrinking other types of data, and is often better at it than programs specifically made to do so.

The researchers suggest that prediction and compression are two sides of the same coin. That means a good compression tool, like gzip, can also be used to generate new data based on what it learned during the compression process.
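The link they lean on is the classic one from information theory: a model that predicts the next symbol well can, via entropy coding, store it in roughly -log2(probability) bits, so better prediction means smaller files. A toy sketch of that idea (the probabilities below are made up for illustration, not Chinchilla's):

```python
import math

# Hypothetical probabilities a predictive model assigns to each symbol
# that actually occurs next. A good predictor puts high probability on
# the true continuation, which translates into fewer bits when encoded.
predicted_probs = [0.9, 0.8, 0.05, 0.95]

ideal_bits = sum(-math.log2(p) for p in predicted_probs)
print(f"ideal code length: {ideal_bits:.2f} bits for {len(predicted_probs)} symbols")
```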


In one part of their research, they tested this idea by asking gzip and Chinchilla to generate new text, image, and audio data after feeding each a sample. As expected, gzip didn't do great and produced mostly nonsense.

This shows that, while gzip can technically generate data, that data isn't particularly meaningful. Chinchilla, on the other hand, being built specifically for processing language, did much better at producing new, meaningful results.
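To picture how you coax new data out of a compressor at all, here's a rough sketch of the trick (our own illustration, not the paper's code): treat "which continuation compresses best alongside the prompt" as the compressor's prediction, and keep appending whatever it favours.

```python
import zlib

def next_char(prompt: bytes, alphabet: bytes = b"abcdefghijklmnopqrstuvwxyz ") -> bytes:
    """Pick the continuation zlib 'predicts': the single byte that adds the
    least to the compressed size of the prompt. A crude stand-in for sampling."""
    def cost(candidate: bytes) -> int:
        return len(zlib.compress(prompt + candidate, 9))
    return min((bytes([c]) for c in alphabet), key=cost)

prompt = b"the quick brown fox jumps over the lazy "
for _ in range(10):
    prompt += next_char(prompt)
print(prompt.decode())
```

Run it and the output tends to be repetitive filler, which is roughly the "mostly nonsense" result the researchers saw from gzip.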

Almost 20 years ago, researchers argued that compression was a form of general intelligence, saying that "ideal text compression, if it were possible, would be equivalent to passing the Turing test for artificial intelligence."

However, as Ars Technica points out, this paper has yet to be peer-reviewed. Still, the idea that compression and intelligence are linked is one we'll probably keep hearing about, because we're only scratching the surface of what these LLMs can do.

Jorge Jimenez