By Paula Parisi, September 3, 2024
In an effort to create a safer environment for teens, social platform Snapchat is providing educators with resources to familiarize them with the app and help them understand how students use it. The company has launched a website called “An Educator’s Guide to Snapchat.” The announcement, timed to the start of the new school year, comes as lawmakers have been pressuring social networks to do more to protect children, with Florida and Indiana going so far as to enact school cell phone bans. Legislators in California and New York have been exploring similar prohibitions. Continue reading Snapchat Puts Focus on Teen Safety Resources for Teachers
By Paula Parisi, August 6, 2024
The U.S. Department of Justice has filed suit against TikTok and its parent company, ByteDance, charging they’ve violated the Children’s Online Privacy Protection Act (COPPA) by allowing children to create TikTok accounts without parental consent and by collecting their data. The suit also alleges TikTok retained the personal data of underage users even after parents demanded it be deleted, a right under COPPA. This latest move in the ongoing legal battle with ByteDance follows the Chinese company’s own lawsuit against the U.S. government. Continue reading U.S. Raises Stakes in TikTok Legal Battle, Suing Under COPPA
By Paula Parisi, July 11, 2024
Federal regulators have taken the unprecedented step of banning the NGL messaging platform from providing service to users under 18. The action is part of a legal settlement between NGL Labs, the Federal Trade Commission and the Los Angeles District Attorney’s Office. NGL, whose niche is “anonymous” communication and features the tagline “Ask me anything,” has also agreed to pay $5 million in fines. An FTC investigation found that in addition to fraudulent business claims about divulging the identities of message senders for a fee, NGL also falsely claimed it used artificial intelligence to filter out cyberbullying and harmful messages. Continue reading Popular Messaging App Banned from Servicing Young Users
By Paula Parisi, June 19, 2024
United States Surgeon General Dr. Vivek Murthy has renewed his push for Congress to require a warning label on social media advising of potential mental health harm to adolescents. Murthy also called on tech companies to be more transparent with internal data on the impact of their products on American youth, requesting independent safety audits and restrictions on features that may be addictive, including autoplay, push notifications and infinite scroll, which he suggests “prey on developing brains and contribute to excessive use.” His federal campaign joins a groundswell of local laws restricting minors’ access to social media. Continue reading U.S. Surgeon General Calls for Social Media Warning Labels
By ETCentric Staff, March 27, 2024
Florida Governor Ron DeSantis has signed a bill into law preventing children under 14 from creating new social media accounts, and requiring platforms to delete existing accounts, with no opportunity for parental consent. For children 14 to 15 years of age, the consent of a parent or guardian is required to create or maintain accounts. Without it, or upon request, the accounts and personal data must be deleted, with fines of up to $50,000 per incident per platform. The law, set to take effect in January 2025, is being called the most restrictive passed by any state and is sure to face First Amendment scrutiny in the courts. Continue reading Florida Enacts the Nation’s Most Restrictive Social Media Law
By ETCentric Staff, February 27, 2024
Florida’s legislature has passed a bill banning children younger than 16 from having social media accounts despite some pushback from Governor Ron DeSantis, who said he will be wrestling with whether to sign the measure into law. Due to a procedural requirement, DeSantis will have to sign or veto the proposed legislation before lawmakers conclude the current session in a matter of weeks. He has expressed dissatisfaction with the lack of a provision to let parents override the restriction, which would curtail access to the most popular sites, potentially impacting TikTok, Instagram, Facebook, Snapchat and YouTube. Continue reading Florida Pushes Forward a Social Media Ban for Kids Under 16
By Paula Parisi, January 31, 2024
As parents and educators grapple with figuring out how AI will fit into education, OpenAI is preemptively acting to help answer that question, teaming with learning and child safety group Common Sense Media on informational material and recommended guidelines. The two will also work together to curate “family-friendly GPTs” for the GPT Store that are “based on Common Sense ratings and standards,” the organization said. The partnership aims “to help realize the full potential of AI for teens and families and minimize the risks,” according to Common Sense. Continue reading OpenAI Partners with Common Sense Media on AI Guidelines
By Paula Parisi, January 29, 2024
New York City has become the first city in the nation to designate the use of social media by young children a public health crisis. In a State of the City address, Mayor Eric Adams name-checked TikTok, YouTube and Facebook, calling them (and “companies like” them) “addictive and dangerous.” Adams referenced last week’s advisory from the city’s Department of Health as “officially designating social media as a public health crisis hazard in New York City.” The advisory urges adults to establish “tech free times” for kids and to delay smartphone access until age 14. Continue reading New York City Classifies Social Media a ‘Public Health Threat’
By Paula Parisi, December 22, 2023
The Federal Trade Commission has proposed new rules to strengthen the Children’s Online Privacy Protection Act (COPPA), further limiting the collection of children’s data, particularly by companies that seek to monetize the information through targeted advertising. FTC Chair Lina Khan says the proposed changes aim to prevent tech firms “from outsourcing their responsibilities to parents” when it comes to ensuring privacy for children’s data. The FTC says it has issued fines totaling hundreds of millions of dollars to Google’s YouTube, and to a lesser extent, ByteDance’s TikTok, for mishandling the data of children 13 years old and younger. Continue reading FTC Seeks to Bolster COPPA So Firms Can’t Surveil Children
By Paula Parisi, November 9, 2023
A second Meta Platforms whistleblower has come forward, testifying this week before a Senate subcommittee that the company’s social networks were potentially harming teens and that his warnings to that effect were ignored by top leadership. Arturo Bejar, a Facebook engineering director from 2009 to 2015 and an Instagram consultant from 2019 to 2021, told the Senate Judiciary Subcommittee on Privacy, Technology and the Law that Meta officials failed to take steps to protect underage users on the platforms. Bejar follows former Facebook whistleblower Frances Haugen, who provided explosive Senate testimony in 2021. Continue reading Second Meta Whistleblower Testifies to Potential Child Harm
By Paula Parisi, October 26, 2023
Meta Platforms has been sued in federal court by 33 states, including California and New York, that claim its Instagram and Facebook platforms addict and harm children. It is to date the most sweeping state effort to contend with the impact of social media on the mental health of children. The suit, filed Tuesday in U.S. District Court for the Northern District of California, alleges Meta violates consumer protection laws by targeting children and deceiving users about platform safety. The same day, the District of Columbia and eight states filed separate complaints addressing the same issues. Continue reading Dozens of States Sue Meta for Social Media Addiction in Kids
By Paula Parisi, August 21, 2023
Illinois has become the first state in the nation to pass legislation protecting children who are social media influencers. Beginning in July 2024, children under 16 who appear in monetized video content online will have a legal right to compensation for their work, even if that means litigating against their parents. “The rise of social media has given children new opportunities to earn a profit,” Illinois Senator David Koehler said of the bill he sponsored. “Many parents have taken this opportunity to pocket the money, while making their children continue to work in these digital environments.” Continue reading Illinois Law Protecting Child Vloggers Will Take Effect in 2024
By Paula Parisi, July 31, 2023
The Senate Commerce Committee has cleared two children’s online safety bills despite pushback from civil liberties groups that say the digital surveillance used to monitor behavior will result in an Internet less safe for kids. The Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) are intended to address a mental health crisis experts blame in large part on social media, but critics say the bills could cause more harm than good by forcing social media firms to collect more user data as part of enforcement. The bills — which cleared the committee by unanimous vote — are also said to reduce access to encrypted services. Continue reading Government Advances Online Safety Legislation for Children
By Paula Parisi, June 2, 2023
Snapchat is rolling out My AI Snaps, a new feature for its premium Snapchat+ tier: users who send Snaps to My AI to let the artificial intelligence know what they’re up to will “receive a unique generative Snap back that keeps the conversation going.” The feature was previewed at the Snap Partner Summit in April as part of a larger push on AI updates, including the ability to invite the My AI chatbot to participate in group chats with friends and the ability to get AI Lens suggestions and place recommendations. In addition, the My AI chatbot — made free to all users this year — was updated to reply to users’ Snaps with a text-based response. Continue reading Snapchat+ Introduces ‘My AI Snaps’ for Chatbot Snap Backs
By Paula Parisi, April 10, 2023
Utah has become the first state to pass laws requiring social media platforms to obtain age verification before users can register. The law is designed to force social networks to enforce parental consent provisions. As of March 2024, companies including Facebook, Instagram, Snap, TikTok and Twitter will be required to secure proof of age for Utah users via a valid ID instead of just letting people type in their birth date at sign-up. While Utah is out front on the issue, nine other states have proposed legislation that includes age checks, most recently Arkansas. Continue reading Utah’s Social Media Law Requires Age Verification for Minors