
AI generated images face Getty ban as privacy and ownership concerns grow

AI microprocessor on motherboard computer circuit
(Image credit: Black_Kira via Getty Images)

Getty Images has banned the upload and sale of any images generated by an AI, a bid to keep itself safe from any legal issues that may arise from what is effectively today's Wild West of art generation.

"There are real concerns with respect to the copyright of outputs from these models and unaddressed rights issues with respect to the imagery, the image metadata and those individuals contained within the imagery," Getty Images CEO Craig Peters told The Verge (opens in new tab).

With the rise of AI art tools such as DALL-E, Stable Diffusion, and Midjourney, among others, there has been a sudden influx of AI-generated images on the web. For the most part, we've seen these images come and go as entertaining gags on Twitter and other social media platforms, but as these AI algorithms become more complex and effective at image creation, we'll see these images used for a whole lot more.

And that's a business that Getty, one of the leading curated image library providers, wants to stay well clear of.

Getty's CEO refused to say whether the company had already received legal challenges regarding AI-generated images, though he did assert that it had "extremely limited" AI-generated content in its library.

All AI image generation algorithms require training, and massive image sets are required to do this effectively. As The Verge reports, Stable Diffusion is trained on images scraped from the web via a dataset from German charity LAION. This data set was created in compliance with German law, the Stable Diffusion website states, though it admits that the exact legality regarding copyright for images created using its tool "will vary from jurisdiction to jurisdiction."

As such, it's likely to become increasingly difficult to tell whether artwork is derived from another copyrighted image.


These two images were created in the AI application Stable Diffusion. (Image credit: Stability AI)

There are other concerns regarding image datasets and scraping techniques: a California-based artist discovered private medical record photographs, taken by their doctor, within the LAION-5B image set. The artist, Lapine, found that their images had been used via 'Have I Been Trained?', a website specifically designed to tell artists whether their work appears in these sorts of sets.

Ars Technica confirmed the images in an interview with Lapine, who has kept their identity confidential for privacy reasons. That privacy was clearly not afforded to the supposedly confidential medical records held by the artist's doctor, who died in 2018, and it's worrying to think how these images have since ended up in a very public dataset without permission.

Lapine is not the only person affected, either: Ars also says that, while searching for Lapine's photos, it discovered other images that may have been obtained through similar means.


When asked about the image set, the CEO of Stability AI, the company behind Stable Diffusion, said that he couldn't speak for LAION. He did say it might be possible to un-train Stable Diffusion to remove certain images from its model, but that the end result as it stands today is not an exact copy of any information from a given image set.

There are burgeoning privacy and legal concerns that will undoubtedly rise to the surface in the coming months and years regarding the production and distribution of AI-generated images. What is a fun tool, and perhaps even a handy one at times, is very likely to become a sticky topic for lawmakers, rights holders, and private citizens.

I don't blame age-old image libraries for taking a step back from the technology in the meantime.

Jacob Ridley
Senior Hardware Editor

Jacob earned his first byline writing for his own tech blog from his hometown in Wales in 2017. From there, he graduated to professionally breaking things as hardware writer at PCGamesN, where he would later win command of the kit cupboard as hardware editor. Nowadays, as senior hardware editor at PC Gamer, he spends his days reporting on the latest developments in the technology and gaming industry. When he's not writing about GPUs and CPUs, however, you'll find him trying to get as far away from the modern world as possible by wild camping.