By Paula Parisi, October 1, 2025
OpenAI has added parental controls for ChatGPT’s Web interface, with mobile controls coming soon. The controls give parents the ability to reduce or remove certain content and dial down personalization by turning off ChatGPT’s transcript memories. At the same time, OpenAI has added the ability to restrict image generation with the launch of Sora parental controls for ChatGPT-connected teen accounts. There are also controls for sending and receiving direct messages through the app. OpenAI says the changes aim “to give families tools to support their teens’ use of AI.” To activate control access, parents must have their own accounts and teens will need to opt in. Continue reading OpenAI Rolls Out New Parental Controls to Help Protect Kids
By Paula Parisi, September 17, 2025
New York State Attorney General Letitia James has released proposed rules that would restrict minors from exposure to addictive features on social media pursuant to the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, signed into state law last year to “protect the mental health of children.” The law, which is expected to take effect sometime next year, requires social media companies that supply addictive content to use an age verification system with results “certified by a verified third-party.” Absent parental consent, platform operators must restrict users under 18 from receiving addictive or algorithmically personalized feeds and nighttime notifications. Continue reading SAFE for Kids Act: NY State Attorney Opens Comment Period
By Paula Parisi, September 5, 2025
Following a California teen’s suicide after months of conversations with ChatGPT about it, and a wrongful death lawsuit filed by his parents against OpenAI, the company says it will introduce parental controls “within the next month.” New safeguards include parents being able to “control how ChatGPT responds to their teen” and “receive notifications when the system detects their teen is in a moment of acute distress.” OpenAI says it has recently introduced a real-time router that can redirect “sensitive conversations” to its GPT-5 thinking and o3 reasoning models, engineered to respond with greater contextual awareness than efficiency-focused chat models. Continue reading OpenAI Announces Plans for New ChatGPT Parental Controls
By Paula Parisi, June 6, 2025
ByteDance-owned social video platform TikTok is improving the ways users can customize their For You feed, letting them increase emphasis on subjects of interest with Manage Topics and adding AI-powered Smart Keyword Filters to suppress content they don’t want to see. While the social platform has featured keyword filters for some time, the new filters are smart enough to catch a broader range of unwanted content. Users can now include up to 200 filtering keywords and let AI do the rest “to capture additional videos featuring similar words, synonyms and slang variations” that reduce unwanted content in the For You feed. Continue reading TikTok Adds Features Letting Users Fine-Tune ‘For You’ Feed
By Paula Parisi, September 6, 2024
YouTube is adding a Family Center hub along with a feature that allows parents to link their accounts to those of their teen children for insight on child use patterns. Linked parents will receive alerts with aggregated information about things like the number of new uploads, subscriptions and comments, or when a teen starts a live stream. What they won’t get are details about the content itself. YouTube calls it “a collaborative approach to teen supervision on YouTube.” The move comes as federal and state legislators get more aggressive about regulating online safety for minors. Continue reading YouTube Adds Family Center, Parent Insights on Teen Viewing
By Paula Parisi, September 3, 2024
In an effort to create a safer environment for teens, social platform Snapchat is providing educators with resources to familiarize them with the app and help them understand how students use it. The company has launched a website called “An Educator’s Guide to Snapchat.” The announcement, timed to the start of the new school year, comes as lawmakers have been pressuring social networks to do more to protect children, with Florida and Indiana going so far as to enact school cell phone bans. Legislators in California and New York have been exploring similar prohibitions. Continue reading Snapchat Puts Focus on Teen Safety Resources for Teachers
By Paula Parisi, June 19, 2024
United States Surgeon General Dr. Vivek Murthy has renewed his push for Congress to enact a social media warning label advising of potential mental health damage to adolescents. Murthy also called on tech companies to be more transparent with internal data on the impact of their products on American youth, requesting independent safety audits and restrictions on features that may be addictive, including autoplay, push notifications and infinite scroll, which he suggests “prey on developing brains and contribute to excessive use.” His federal campaign joins a groundswell of local laws restricting minors’ access to social media. Continue reading U.S. Surgeon General Calls for Social Media Warning Labels
By ETCentric Staff, March 27, 2024
Florida Governor Ron DeSantis has signed a bill into law preventing children under 14 from creating new social media accounts, and requiring platforms to delete existing accounts, with no opportunity for parental consent. For children 14 to 15 years of age, consent of a parent or guardian is required to create or maintain accounts. Without it, or upon request, the accounts and personal data must be deleted, with fines of up to $50,000 per incident per platform. The law, set to take effect in January 2025, is being called the most restrictive passed by any state and is sure to face First Amendment scrutiny by the courts. Continue reading Florida Enacts the Nation’s Most Restrictive Social Media Law
By Paula Parisi, January 29, 2024
New York has become the first city in the nation to designate a public health crisis with regard to use of social media by young children. In a State of the City address, Mayor Eric Adams name-checked TikTok, YouTube and Facebook, calling them (and “companies like” them) “addictive and dangerous.” Adams referenced last week’s advisory from the city’s Department of Health as “officially designating social media as a public health crisis hazard in New York City.” The advisory urges adults to establish “tech free times” for kids, and delay smartphone access until age 14. Continue reading New York City Classifies Social Media a ‘Public Health Threat’
By Paula Parisi, October 26, 2023
Meta Platforms has been sued in federal court by 33 states, including California and New York, that claim its Instagram and Facebook platforms addict and harm children. The suit is to date the most sweeping state action to contend with the impact of social media on the mental health of children. Filed Tuesday in U.S. District Court for the Northern District of California, it alleges Meta violates consumer protection laws by targeting children and deceiving users about platform safety. Also that day, the District of Columbia and eight states filed separate complaints addressing the same issues. Continue reading Dozens of States Sue Meta for Social Media Addiction in Kids
By Paula Parisi, July 31, 2023
The Senate has cleared two children’s online safety bills despite pushback from civil liberties groups that say the digital surveillance used to monitor behavior will result in an Internet less safe for kids. The Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) are intended to address a mental health crisis experts blame in large part on social media, but critics say the bills could cause more harm than good by forcing social media firms to collect more user data as part of enforcement. The bills — which cleared the Senate Commerce Committee by unanimous vote — are also said to reduce access to encrypted services. Continue reading Government Advances Online Safety Legislation for Children
By Paula Parisi, June 14, 2023
A bill passed by the Louisiana State Legislature that bans minors from creating social media accounts without parental consent is the latest in a string of legal measures that take aim at the online world to combat a perceived mental health crisis among America’s youth. Utah also recently passed a law requiring consent of a parent or guardian when anyone under 18 wants to create a social account. And California now mandates some sites default to the highest privacy for minor accounts. The Louisiana legislation stands out as extremely restrictive, encompassing multiplayer games and video-sharing apps. Continue reading Louisiana Approves Parental Consent Bill for Online Accounts
By Paula Parisi, May 1, 2023
A bipartisan bill introduced in the Senate last week seeks to establish a federal age limit for using social media that would prohibit children 12 and under from creating their own accounts as a way to prevent them from independently logging on to social platforms. The Protecting Kids on Social Media Act takes issue with the engagement algorithms Big Tech uses to keep kids glued to their sites and would limit the type of coding that could be deployed to target young users between the ages of 13 and 17. If not logged into an account, users under 13 could still access other online content. Continue reading New Federal Bill Would Restrict Social Media Use for Minors
By Paula Parisi, September 20, 2022
Governor Gavin Newsom signed the California Age-Appropriate Design Code Act into law last week, making his state the first in the nation to adopt online child safety measures. The bipartisan legislation requires online platforms to default to privacy and safety settings that protect children’s mental and physical health. The new law, cosponsored by Assemblymembers Buffy Wicks (D-15th District) and Jordan Cunningham (R-35th District), prohibits companies that provide online services and products in California from using a child’s personal information and forbids collecting, selling, or retaining a child’s geolocation, among other things. Continue reading California Governor Signs Online Child Protection Bill into Law
By Paula Parisi, September 1, 2022
A first-of-its-kind U.S. proposal to protect children online cleared the California Legislature Tuesday and was sent to the desk of Governor Gavin Newsom. The California Age-Appropriate Design Code Act will require social media platforms to implement guardrails for users under 18. The new rules will curb risks — such as allowing strangers to message children — and require changes to recommendation algorithms and ad targeting where minors are concerned. The bill was drafted following Facebook whistleblower Frances Haugen’s 2021 congressional testimony about the negative effects of social media on children’s mental health. Continue reading California’s Online Child Safety Bill Could Set New Standards