Disney, Nestlé, and Epic Games (makers of Fortnite) have temporarily suspended their ads on YouTube, following a recent video report by YouTuber Matt Watson.
In the 20-minute video, which has been viewed nearly 2 million times and sent to news outlets and other organizations, Watson details how easy it is to find seemingly innocuous videos featuring teens and children whose comment sections are full of child predators, who time-stamp specific scenes, make suggestive comments, and even share child pornography with one another. Watson says: “YouTube’s recommended algorithm is facilitating pedophiles’ ability to connect with each other, trade contact info, and link to actual child pornography in the comments.”
Because the videos themselves aren’t explicitly pornographic, YouTube’s algorithm hasn’t flagged them, and predators continue to flourish. Worse still, YouTube’s recommendation algorithm suggests similar child-starring videos, making it easier for these activities and communities to thrive.
Many of these videos are monetized, running pre-roll ads for products like Fortnite. In response, several companies, including Epic Games, have paused their ad spending, hoping to send a clear message to YouTube.
“We have paused all pre-roll advertising,” said a spokesperson from Epic. “Through our advertising agency, we have reached out to Google/YouTube to determine actions they’ll take to eliminate this type of content from their service.”
The Verge, which reported on Epic Games pulling its advertising, received a statement from YouTube: “Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments. There’s more to be done, and we continue to work to improve and catch abuse more quickly.”