YouTube Kids Finds Right Formula to Improve Video Content

Children’s programming has always been some of the most popular content on YouTube, generating billions of views since the platform launched in 2005. But the accompanying advertising and algorithm-driven recommendations proved problematic, sometimes serving material that parents deemed inappropriate. YouTube has taken various steps to address this, becoming in 2015 the first social platform to launch a children’s version of its main product. It later opted to have humans, not algorithms, make the content recommendations for kids, a costly trade-off that seems to have produced positive results.

But the transition took time, with algorithms still mostly calling the shots on YouTube Kids through 2018. In those intervening years, “controversy and negative publicity followed,” notes The Wall Street Journal, explaining that “watchdog groups and parents identified disturbing videos in which popular cartoon characters from ‘Mickey Mouse’ and ‘Paw Patrol’ to ‘Peppa Pig’ were put in obscene or violent situations.” Experts also questioned the educational value of videos showing children (and adults) “unboxing” new toys.

By late 2018, YouTube had stepped up editorial intervention amid rumblings of a Federal Trade Commission investigation, which commenced in early 2019 and dealt primarily with YouTube’s data-collection practices regarding minors. “The trust-and-safety team expanded from a handful of people to hundreds,” according to WSJ, which says the number of people hired to review content for the children’s platform “increased fivefold to about 25 people.”

Noted researcher Alicia Blum-Ross, who specializes in children’s use of technology, was hired to raise program quality while preserving the wide range of user-generated content that many consider YouTube’s strong suit. The company also formed an advisory board of academic researchers specializing in child development and media literacy.

As 2019 wound to a close, “YouTube took its most drastic step yet,” WSJ reports: “It removed millions of what it deemed low-quality videos from YouTube Kids, a decision that ultimately cut the library by about 80 percent, according to a former executive.” Numerous cartoon parodies and unboxing videos that violated the company’s commercial content policies were excised from the service.

“The children’s platform is now mostly made up of videos that are created by preapproved content partners, according to a former executive and people who work closely with the company,” WSJ reports. “YouTube’s increased scrutiny of its children’s site coincided with a $170 million fine from the FTC for violating children’s privacy.”

Some problems remain. Wired last month reported that auto-generated closed captioning on YouTube Kids sometimes mistakes innocent words for inappropriate language, rendering “‘corn’ as ‘porn,’ ‘beach’ as ‘bitch,’ and ‘brave’ as ‘rape.’” But the service has mostly gotten high marks. As one mother tells WSJ: “They’ve worked on the safety issue, and I feel like they’ve added more things that are of educational value.”
