Online privacy protections for consumers are in focus on Capitol Hill, with the Kids Online Safety Act (KOSA) getting particular attention. A coalition of more than 100 organizations, including Fairplay and the American Psychological Association, is calling on senators to advance KOSA this month. Co-sponsored by senators Richard Blumenthal (D-Connecticut) and Marsha Blackburn (R-Tennessee), the legislation would require social media platforms to conduct annual audits to identify risks to minors, as well as take more concrete steps, such as letting minors opt out of algorithmic recommendations and disable “addictive” features.
States including Minnesota and California also have their own child safety bills. California has multiple bills pending, including AB 2408, the Social Media Platform Duty to Children Act, which would allow parents to sue companies like Facebook and TikTok if their child has been harmed by addiction to social media. Another, the California Age-Appropriate Design Code Act, AB 2273, requires that minors be defaulted to “a high level of privacy protection.”
The Minnesota bill, HF-3724, would require that algorithmic functions be turned off by default for accounts held by minors on platforms with more than one million users.
Wired quotes California Assembly member Jordan Cunningham, co-sponsor of AB 2408, as likening the mental harm Big Tech poses to kids to that of cigarettes, in terms of being an addictive health threat. Although the issue has concerned parents, lawmakers and researchers for many years, the potentially negative impact of social media on children and teens has been amplified by COVID-19, as parents sheltering at home witnessed first-hand how their children’s social and educational lives revolved around technology.
Of particular concern is what The New York Times reports as “an upswing in social media use among children ages 8 to 12, on platforms such as Instagram, Snapchat and Facebook, even though such platforms require users to be at least 13 because of a law that prohibits companies from collecting data from children.”
Wrestling with “what defines a digital behavioral ‘addiction’ versus other terms like problematic media use,” University of Michigan C. S. Mott Children’s Hospital researcher Jenny Radesky underscores the inherent equity issues: homes where parents work multiple jobs may see more digital overuse among children than affluent households. “Radesky says this is where legislation plays a key role,” Wired writes.
Some of the bills include exceptions for platforms that offer a high degree of parental control. On that front, Wired notes that YouTube has already “turned off autoplay for kids by default, and TikTok no longer sends late-night push notifications to teens.”
State attempts to regulate social media must contend with superseding federal law, namely Section 230, which “protects online platforms (including social media companies) from being held responsible for its users’ posts,” Wired reports, explaining some proposals face resistance from groups like the Electronic Frontier Foundation, which argues that algorithms and notifications are a means of “distributing speech, inextricable from user-generated content — and protected.”