YouTube's FTC compliance system for 'kids content' angers and worries creators

Baby Shark Dance has over three billion views on YouTube. (Image credit: Pinkfong! Kids' Songs & Stories)

YouTube is changing significantly in January, and video creators are afraid they may lose income and even be fined by the US government for making videos about, among other things, videogames. 

The Children's Online Privacy Protection Act is a federal law in the US which forbids the collection of data about children under 13 without parental consent. Generally, that's simply meant that social media sites like Twitter ask for your date of birth when you sign up, and boot anyone who says they're under 13. A kid can lie, of course, but the Federal Trade Commission allows for that reality.

Starting in January, however, it won't allow "content made for kids" on YouTube to include targeted advertising or employ YouTube's social features. The Verge reported last week that the new rule is a direct result of Google's $170 million settlement with the FTC for allegedly collecting data on children. The settlement was announced in September, but as 2020 approaches and the new 'for kids' and 'not for kids' options have appeared in YouTube's menus, video creators are getting nervous.

It's not Google and YouTube that will bear the bulk of the responsibility for labeling videos. Google will use "machine learning" to auto-flag videos it thinks are kids content, but where its systems don't, uploaders themselves will be required to determine whether their videos meet the FTC's definition of 'for kids.'

Videos designated as kids content will no longer serve personalized ads, and will lose likes, dislikes, comments, and the other features that involve data collection and also help videos spread. The videos can still serve non-personalized ads, but "this may result in a decrease in revenue for some creators," says Google.

That's an understatement, as YouTube warns channel owners in the settings that turning off ad targeting can "significantly reduce your channel's revenue."

In many cases, it's obvious when a video is targeted at kids, and Google lists a few examples, including "child protagonists engaging in common natural play patterns such as play-acting and/or imaginative play" and "popular children’s songs, stories or poems." You know, Baby Shark, that sort of thing.

But what about a retrospective about a '90s kids' game? What about a Fortnite video? These are videos that might contain "children or children's characters," and that kids might be drawn to, but are also meant for teenagers and adults.

Google can't help you much there. "We provide some guidance on what is considered 'made for kids' below, but we cannot provide legal advice," the company writes. "If you are unsure whether your videos meet this standard, we suggest you seek legal counsel."

In other words: Sorry, your beef is with the FTC.

And the FTC did tell Google it had to develop a system like this. "In addition to the monetary penalty, the proposed settlement requires Google and YouTube to develop, implement, and maintain a system that permits channel owners to identify their child-directed content on the YouTube platform so that YouTube can ensure it is complying with COPPA," wrote the agency in September.


Creators of kid-targeted videos are obviously unhappy about the FTC's decision, but other video creators are also worried and irritated by the blanket rule, which might apply to their channels even if they don't think their videos are targeted at kids. If they flag their videos as kids content, they'll lose revenue and promotion. If they don't, and YouTube or the FTC disagrees with their judgment, they could lose their channels and even be fined. In September, CNET reported that creators who fail to mark kids videos as 'for kids' could be subject to "aggressive" FTC fines.

(It's likely that the FTC will only go after major channels, though, as it doesn't have the resources to catch every violator, a point Commissioner Rebecca Kelly Slaughter made in her dissenting remarks.)

The new regulation "will destroy the general and family friendly content on YouTube and make children no safer on the internet," wrote YouTuber and disability activist GoodTimesWithScar (opens in new tab), who makes Minecraft videos. He added that "the people who will lose their livelihoods are not the large corporations."

It seems clear that the FTC is primarily talking about a specific kind of YouTube video—the videos that five-year-olds watch on repeat, with singing cartoon characters, such as the insufferable Johny Johny Yes, Papa. Google knew that these videos were directly targeting children under 13, and knew that it was collecting their viewing history to target advertisements. That's what the FTC takes issue with.

But YouTubers who make Minecraft videos are not singing at eight-year-olds, and even if their audience is primarily teenagers and adults, they could get caught up in this sweeping rule—one they may not even know they're breaking, because 'child-targeted' leaves a great deal of room for interpretation.

The FTC is accepting public comments regarding COPPA, with a deadline of December 9.

Tyler Wilde
Executive Editor

Tyler grew up in Silicon Valley alongside Apple and Microsoft, playing games like Zork and Arkanoid on the early personal computers his parents brought home. He was later captivated by Myst, SimCity, Civilization, Command & Conquer, Bushido Blade (yeah, he had Bleem!), and all the shooters they call "boomer shooters" now. In 2006, Tyler wrote his first professional review of a videogame: Super Dragon Ball Z for the PS2. He thought it was OK. In 2011, he joined PC Gamer, and today he's focused on the site's news coverage. After work, he practices boxing and adds to his 1,200 hours in Rocket League.