Florida Enacts the Nation’s Most Restrictive Social Media Law

Florida Governor Ron DeSantis has signed into law a bill preventing children under 14 from creating new social media accounts and requiring platforms to delete their existing accounts, with no opportunity for parental consent. For children 14 to 15 years of age, the consent of a parent or guardian is required to create or maintain accounts. Without it, or upon request, the accounts and personal data must be deleted, with fines of up to $50,000 per incident per platform. The law, set to take effect in January 2025, is being called the most restrictive passed by any state and is sure to face First Amendment scrutiny in the courts.

Florida Pushes Forward a Social Media Ban for Kids Under 16

Florida’s legislature has passed a bill banning children younger than 16 from having social media accounts despite some pushback from Governor Ron DeSantis, who said he will be wrestling with whether to sign the measure into law. Due to a procedural requirement, DeSantis will have to sign or veto the proposed legislation before lawmakers conclude the current session in a matter of weeks. He has expressed dissatisfaction with the lack of a provision to let parents override the restriction, which would curtail access to the most popular sites, potentially impacting TikTok, Instagram, Facebook, Snapchat and YouTube.

OpenAI Partners with Common Sense Media on AI Guidelines

As parents and educators grapple with figuring out how AI will fit into education, OpenAI is preemptively acting to help answer that question, teaming with learning and child safety group Common Sense Media on informational material and recommended guidelines. The two will also work together to curate “family-friendly GPTs” for the GPT Store that are “based on Common Sense ratings and standards,” the organization said. The partnership aims “to help realize the full potential of AI for teens and families and minimize the risks,” according to Common Sense.

New York City Classifies Social Media a ‘Public Health Threat’

New York has become the first city in the nation to designate social media use by young children a public health crisis. In a State of the City address, Mayor Eric Adams name-checked TikTok, YouTube and Facebook, calling them (and “companies like” them) “addictive and dangerous.” Adams referenced last week’s advisory from the city’s Department of Health as “officially designating social media as a public health hazard in New York City.” The advisory urges adults to establish “tech free times” for kids and to delay smartphone access until age 14.

FTC Seeks to Bolster COPPA So Firms Can’t Surveil Children

The Federal Trade Commission has proposed new rules to strengthen the Children’s Online Privacy Protection Act (COPPA), further limiting the collection of children’s data, particularly by companies that seek to monetize the information through targeted advertising. FTC Chair Lina Khan says the proposed changes aim to prevent tech firms “from outsourcing their responsibilities to parents” when it comes to ensuring the privacy of children’s data. The FTC says it has issued fines totaling hundreds of millions of dollars to Google’s YouTube and, to a lesser extent, ByteDance’s TikTok for mishandling the data of children 13 and younger.

Second Meta Whistleblower Testifies to Potential Child Harm

A second Meta Platforms whistleblower has come forward, testifying this week before a Senate subcommittee that the company’s social networks were potentially harming teens and that his warnings to that effect were ignored by top leadership. Arturo Bejar, a Facebook engineering director from 2009 to 2015 and an Instagram consultant from 2019 to 2021, told the Senate Judiciary Subcommittee on Privacy, Technology and the Law that Meta officials failed to take steps to protect underage users on the platforms. Bejar follows former Facebook whistleblower Frances Haugen, who provided explosive Senate testimony in 2021.

Dozens of States Sue Meta for Social Media Addiction in Kids

Meta Platforms has been sued in federal court by 33 states, including California and New York, that claim its Instagram and Facebook platforms addict and harm children. The suit is the most sweeping state effort to date to contend with the impact of social media on children’s mental health. Filed Tuesday in U.S. District Court for the Northern District of California, it alleges Meta violates consumer protection laws by targeting children and deceiving users about platform safety. The same day, the District of Columbia and eight states filed separate complaints addressing the same issues.

Illinois Law Protecting Child Vloggers Will Take Effect in 2024

Illinois has become the first state in the nation to pass legislation protecting children who are social media influencers. Beginning in July 2024, children under 16 who appear in monetized video content online will have a legal right to compensation for their work, even if that means litigating against their parents. “The rise of social media has given children new opportunities to earn a profit,” Illinois Senator David Koehler said about the bill he sponsored. “Many parents have taken this opportunity to pocket the money, while making their children continue to work in these digital environments.”

Government Advances Online Safety Legislation for Children

The Senate Commerce Committee has cleared two children’s online safety bills despite pushback from civil liberties groups, which say the digital surveillance used to monitor behavior will result in an Internet that is less safe for kids. The Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) are intended to address a mental health crisis that experts blame in large part on social media, but critics say the bills could cause more harm than good by forcing social media firms to collect more user data as part of enforcement. The bills, which cleared the committee by unanimous vote, are also said to reduce access to encrypted services.

Snapchat+ Introduces ‘My AI Snaps’ for Chatbot Snap Backs

Snapchat is rolling out a new feature for its premium Snapchat+ tier called My AI Snaps, which enables users to send Snaps to My AI, letting the artificial intelligence know what they’re up to, and “receive a unique generative Snap back that keeps the conversation going.” The feature was previewed at the Snap Partner Summit in April as part of a larger push on AI updates, including the ability to invite the My AI chatbot to participate in group chats with friends and the ability to get AI Lens suggestions and place recommendations. In addition, the My AI chatbot, made free to all users this year, was updated to reply to users’ Snaps with a text-based response.

Utah’s Social Media Law Requires Age Verification for Minors

Utah has become the first state to pass laws requiring social media platforms to obtain age verification before users can register. The law is designed to force social networks to enforce parental consent provisions. As of March 2024, companies including Facebook, Instagram, Snap, TikTok and Twitter will be required to secure proof of age for Utah users via a valid ID instead of just letting people type in their birth date at sign-up. While Utah is out front on the issue, nine other states have proposed legislation that includes age checks, most recently Arkansas.

Meta’s Penalty Reforms Designed to Be More Effective, Fair

Meta Platforms is reforming its penalty system for Facebook policy violations. Based on recommendations from its Oversight Board, the company will focus more on educating users and less on punitive measures like suspending accounts or limiting posts. “While we are still removing violating content just as before,” explains Meta VP of content policy Monika Bickert, “under our new system we will focus more on helping people understand why we have removed their content, which is shown to help prevent re-offending, rather than so quickly restricting their ability to post.” The goal is fairer and more effective content moderation on Facebook.

Biden Challenges Big Tech, Calls for Children’s Online Safety

President Biden’s second State of the Union speech Tuesday night included calls for stronger consumer privacy protections and tougher antitrust laws in direct challenge to what many perceive as the unchecked power of Big Tech. “Pass bipartisan legislation to strengthen antitrust enforcement and prevent big online platforms from giving their own products an unfair advantage,” Biden stated, urging Congress to “stop Big Tech from collecting personal data on kids and teenagers online, ban targeted advertising to children, and impose stricter limits on the personal data these companies collect on all of us.”

UK Online Safety Bill to Exert Pressure on Social Media Execs

British legislators seem ready to make good on a threat to add criminal liability and jail time for high-level social media executives who fail to protect children from online harm as part of the Online Safety Bill. While the bill also aims to protect adults from fraud and malfeasance, its strictest provisions are geared toward child protection. The current proposal could win approval by the House of Commons within the week, and would then move to the upper chamber, the House of Lords, later in the quarter for further revision. Enactment is anticipated by year’s end.

Advocacy Groups Seek to Enact Online Rules to Protect Kids

A coalition of more than 20 advocacy groups with an interest in child safety is petitioning the Federal Trade Commission to prohibit social media platforms including TikTok, as well as online games and other services, from bombarding kids with ads and using other tactics that may hook children online. Regulators are being lobbied to prevent online services from offering minors “low-friction rewards” that unpredictably grant positive reinforcement for scrolling, tapping or logging on in order to prolong use. The groups say the technique is the same one slot machine makers use to keep gamblers engaged.