Social Platforms Under Scrutiny For Rules Related to Kids

YouTube, founded in 2005, has operated outside the advertising rules that regulate television broadcasting. But due to its significant reach and influence, the site is now under scrutiny for potential regulation — which will likely start with children’s programming. A digital influencer like 15-year-old JoJo Siwa is a case in point: she draws millions of young female viewers to her quirky videos. But she also inks endorsement deals and sells branded fashion lines with Target, blurring the line between content and advertising.

Bloomberg reports that, “kids are particularly vulnerable to being manipulated by paid clips that masquerade as legitimate content.” At the Children’s Advertising Review Unit, which is funded by companies including Google, director Dona Fraser noted that, “the uptick in sponsored content and child influencers is very overwhelming.”

“This has exploded in front of our eyes,” she said. “How do you now wrangle every child influencer out there?”

YouTube states that its content creators are “responsible for ensuring their content complies with local laws, regulations and YouTube Community Guidelines, including paid product placements.” Bloomberg said its inquiry prompted YouTube to remove one video showing Siwa shopping at Target — although this reporter found half a dozen Siwa videos showing her shopping at Target and Walmart and encouraging viewers to shop there, none of them tagged as ads.

Target spokesman Joe Poulos said the company “did not pay directly for either of Siwa’s videos shot at the retailer’s stores” and that the retailer has never paid the teenager for “creating or distributing any content for Target.” Walmart didn’t respond to a request for comment.

Campaign for a Commercial-Free Childhood executive director Josh Golin said that Siwa’s shopping videos “would never clear regulatory hurdles to appear on TV, even without sponsorship from the retailer.” One expert said the FCC would consider the entire video an ad if it aired on children’s TV.

Common Sense Media vice president Colby Zintl, whose organization is pushing Congress to “strengthen oversight” on how children use Google and Facebook services, pointed out that if YouTube “really were honest brokers about whether kids were allowed on the platform, they wouldn’t have so much kids’ content.” YouTube’s current rules do not allow users under 13 to “create or own accounts on YouTube,” and if it finds such an account, it terminates it.

Still, when the YouTube Kids mobile app — which currently has 18 million monthly visitors — rolled out in 2015, “child and consumer advocacy groups” contacted the FTC, saying it contained “inappropriate content, including explicit sexual language and jokes about pedophilia.” Although YouTube requires creators to disclose sponsors, the Campaign for a Commercial-Free Childhood has “found prominent influencers with sponsored videos on YouTube Kids.” More than 80 percent of U.S. children aged 6 to 12 watch YouTube.

Elsewhere, Bloomberg reports on how Snapchat is “working with British lawmakers on ways to stop underage users” from signing up to the platform. Snapchat executive Stephen Collins said the solution was “some kind of central verification system.” Snapchat’s minimum age is 13, and users must input their birth date, although “there is no system to verify users’ age.”