March 1, 2019
After major brands including AT&T, Disney, Epic Games and Nestlé suspended their regular ad spending on YouTube, the popular video platform moved to temporarily disable comment sections on most channels featuring children 13 and younger, as well as teenagers who may risk “attracting predatory behavior.” The concern was that advertising was sometimes positioned alongside videos of minors whose comment sections included predatory remarks. A select few channels will keep their comment sections enabled but will be required to monitor them for safety.
“These channels will be required to actively moderate their comments, beyond just using our moderation tools, and demonstrate a low risk of predatory behavior,” explained YouTube in a blog post.
“We recognize that comments are a core part of the YouTube experience and how you connect with and grow your audience,” notes the post. “At the same time, the important steps we’re sharing today are critical for keeping young people safe.”
In addition to “disabling comments on videos featuring minors,” YouTube’s plans include “launching a new comments classifier” and “taking action on creators who cause egregious harm to the community.”
“As part of an initial response, YouTube said last week that it had deleted tens of millions of comments and removed more than 400 channels associated with writing predatory comments on videos starring minors,” reports The Verge. “YouTube also sought to clear up confusion around how these changes will impact creators’ ability to run ads. In its blog post, YouTube says none of this will affect creators’ monetization.”
According to The New York Times, “YouTube said it would use a new machine-learning system to identify and remove predators’ comments. The company, which has said it removes hundreds of millions of comments from videos every quarter for violating its rules, estimated that the new approach would double the number of comments flagged for removal.”