Advocacy Groups Seek to Enact Online Rules to Protect Kids

A coalition of more than 20 advocacy groups focused on child safety is petitioning the Federal Trade Commission to prohibit social media platforms such as TikTok, as well as online games and other services, from bombarding kids with ads and using other tactics that may hook children online. The groups are lobbying regulators to prevent online services from offering minors “low-friction rewards” — unpredictable bursts of positive reinforcement for scrolling, tapping or logging on that encourage prolonged use. They say it is the same technique slot machine makers use to keep gamblers engaged.

“Minors cannot go online without encountering countless engagement-optimizing design practices,” the group says in an FTC petition for rulemaking that alleges the tactics “are implicated in concrete and serious harms.”

In addition to the unpredictable rewards, the petitioners take exception to manipulative navigation design and features that intentionally apply social pressure.

Some popular games “offer children virtual rewards like extra points in exchange for watching ads,” The New York Times reports, noting the petition warns that “such practices might foster or exacerbate anxiety, depression, eating disorders or self-harm among children and teenagers.”

“The FTC can and must establish rules of the road to clarify when these design practices cross the line into unlawful unfairness, thus protecting vulnerable users from unfair harms,” wrote the petitioners, led by nonprofit children’s advocacy group Fairplay and the Center for Digital Democracy.

Other participants include the American Academy of Pediatrics, the Network for Public Education, Public Citizen and the U.S. Public Interest Research Group.

The activists are taking their stand at a time when regulators, lawmakers and health experts are examining how technology companies may be exploiting children online and the potential harms that result.

“These activists are challenging the business model of apps and sites whose main revenue comes from digital advertising,” NYT writes, explaining that “services like TikTok, Instagram and YouTube routinely employ data-harvesting techniques and compelling design elements — like content-recommendation algorithms, smartphone notices or videos that automatically play one after another — to drive user engagement.”

Other potential harms include “autoplaying videos and endless scrolling to keep children and teens online,” according to Bloomberg, which calls the tactics “screen-time boosters” designed to addict kids, and says the advocates are demanding “rules overseeing digital design practices.”

Governor Gavin Newsom recently signed the California Age-Appropriate Design Code Act into law, and earlier this year the bipartisan Kids Online Safety Act was introduced in Congress.