Following a report last week about videos that were exploiting children and "facilitating pedophiles," YouTube announced today that it is implementing new, more aggressive policies to disable comments on a wide range of videos featuring minors "that could be at risk of attracting predatory behavior."
The initial report, from YouTuber Matt Watson, said that YouTube's recommendation algorithm makes it possible for pedophiles to "connect with each other, trade contact info, and link to actual child pornography" via time-stamped comments on videos featuring children in legal but "compromising" positions.
"These comments are often the most upvoted posts on the video. Knowing this, we can deduce that YouTube is aware these videos exist and that pedophiles are watching them," Watson wrote on Reddit (opens in new tab). He reached that deduction because YouTube established a policy in 2017 to turn off comments in videos where such comments were being made.
"I still see countless users time-stamping and sharing social media info. A fair number of the videos in the wormhole have their comments disabled, which means Youtube’s algorithm is detecting unusual behavior," he explained. "But that begs the question as to why YouTube, if it is detecting exploitative behavior on a particular video, isn’t having the video manually reviewed by a human and deleting the video outright."
The issue attracted widespread attention after major advertisers, including McDonald's, Disney, and Fortnite studio Epic Games, began pulling pre-roll ads from the platform. Epic said at the time that it had reached out to YouTube and parent company Google to determine what actions the tech company would take to eliminate that type of abusive content.
Today, YouTube revealed its plan in a blog post saying that it has disabled comments on "tens of millions of videos that could be subject to predatory behavior" since Watson's report, and promising to continue working to identify potentially exploitative videos in the future.
"These efforts are focused on videos featuring young minors and we will continue to identify videos at risk over the next few months," it said. "Over the next few months, we will be broadening this action to suspend comments on videos featuring young minors and videos featuring older minors that could be at risk of attracting predatory behavior."
Some creators will be allowed to maintain comments on their videos, but their channels will have to be "actively moderated" and "demonstrate a low risk of predatory behavior." Over the long term, YouTube hopes to be able to enable comments on more such channels, "as our ability to catch violative comments continues to improve."
YouTube is also working on a new "comments classifier" that it says will detect and remove twice as many comments as the previous system. In response to a separate report of videos aimed at children containing "suicide tips," the company reiterated that such videos violate YouTube policy.
"No form of content that endangers minors is acceptable on YouTube, which is why we have terminated certain channels that attempt to endanger children in any way," YouTube wrote. "Videos encouraging harmful and dangerous challenges targeting any audience are also clearly against our policies. We will continue to take action when creators violate our policies in ways that blatantly harm the broader user and creator community."