Instagram Taps AI to Identify Young Users for Teen Accounts

Instagram is using AI to find teens who have falsified their age and is reclassifying them into “Teen Accounts” settings. The Meta social platform is reaching out to parents to tell them about “the importance of their teens providing the correct ages online, and tips to check and confirm their teens’ ages together,” while letting them know they “don’t have to go it alone — we’re using AI to help.” Instagram launched Teen Accounts last year as a way to enroll young users and provide built-in protections. Those under 16 need a parent or guardian’s permission to change the settings. The safeguards are on by default for young users, limiting who can make contact and filtering viewable content.

Discord Testing Facial Scans to Verify Age in UK and Australia

Instant messaging and VoIP social platform Discord is experimenting with a feature that requires some users to verify their age by scanning their face or a photo ID. The technique is being implemented in Australia and the United Kingdom, where recently passed laws seek to crack down on minors accessing potentially harmful online content. The safeguard applies only to users who haven’t previously verified their age on the chat platform. Discord says age verification aims “to help users manage content filter settings and sensitive content visibility,” explaining it is a one-time process that can be completed when users first adjust their settings.

Utah Law Is First in Nation Making App Stores Verify User Age

Utah has become the first state to make app stores responsible for verifying users’ ages. The Utah App Store Accountability Act shifts the burden of proving one’s age from social platforms like Snapchat, Instagram and X to digital storefronts, namely Google Play and Apple’s App Store. Those who create accounts in the state will have to prove they’re over 18 or, if underage, link their account to a parent or guardian’s. Utah Governor Spencer Cox signed the bill into law on Wednesday and it begins taking effect May 7. Google opposed the legislation and lobbied the governor to veto it. Meta, X and Snap applauded the measure and are encouraging other states to follow suit.

TikTok Adds Security Checkup to Help Users Secure Accounts

TikTok has rolled out a Security Checkup tool designed to help users secure their accounts. Security settings can now be reviewed and updated from a single screen, similar to security dashboards used by Google and Instagram. A step-by-step guide to the new feature encourages users to make their accounts safer by enabling more security features. The social media platform owned by China-based ByteDance is in the final days of a 75-day extension allowing it to continue U.S. operations after Congress deemed it a national security threat and enacted legislation requiring it to be sold or banned by January 15.

Roblox Tightens Child Safety Guidelines Amidst Media Outcry

Capitulating to outside pressure after a barrage of media reports citing unsafe conditions for minors, Roblox is implementing new safeguards. Parents can now access parental controls from their own devices in addition to their child’s device and monitor their child’s screen time. New content labels and improvements to how users under age 13 can communicate on Roblox are additional protections now baked into the platform. “We’ve spent nearly two decades building strong safety systems, but we are always evolving our systems as new technology becomes available,” the company explained.

Roblox’s New Child Safety Measures Target Hangout Spaces

Online gaming platform Roblox has banned kids from “social hangout” spaces — areas that feature communication through voice chat or text and offer “free-form 2D user creation” experiences where users can do things like share drawings. Roblox has also added safeguards to prevent those under the age of 13 from playing, searching for or discovering unrated games. Roblox imposed the restrictions following allegations that it has failed to protect its younger users. This is the latest in a string of updates by social platforms imposing guardrails designed to protect young users as lawmakers turn up the heat on child online safety.

Instagram Sets Its New ‘Teen Accounts’ to Private by Default

Nine months after lawmakers grilled social networks for exposing children to harm, Meta Platforms has announced that Instagram’s teen accounts will be set to “private” by default. Instagram Teen Accounts have built-in protections, limiting who can contact the underage account holders as well as the content they see. “We’ll automatically place teens into Teen Accounts, and teens under 16 will need a parent’s permission to change any of these settings to be less strict,” Meta revealed in a blog post. To avoid leaving teens feeling their wings were clipped, Meta says there will also be new features designed expressly for them.

YouTube Adds Family Center, Parent Insights on Teen Viewing

YouTube is adding a Family Center hub along with a feature that allows parents to link their accounts to those of their teen children for insight on child use patterns. Linked parents will receive alerts with aggregated information about things like the number of new uploads, subscriptions and comments, or when a teen starts a live stream. What they won’t get are details about the content itself. YouTube calls it “a collaborative approach to teen supervision on YouTube.” The move comes as federal and state legislators get more aggressive about regulating online safety for minors.

Judge Blocks Sections of a Texas Law Meant to Protect Minors

A federal judge has partially blocked a new Texas law, disallowing requirements that social platforms identify minors and filter content for their safety. The court ruled that the “monitoring and filtering” requirements of the Securing Children Online Through Parental Empowerment (SCOPE) Act, signed last year, threaten free speech, and granted a temporary injunction on that basis. Under the law, registered users under 18 would be subject to limited data collection, targeted advertising bans and parental consent requirements for financial transactions. SCOPE would affect a range of online services, with large social platforms a focus.

Snapchat Puts Focus on Teen Safety Resources for Teachers

In an effort to create a safer environment for teens, social platform Snapchat is providing educators with resources to familiarize them with the app and help them understand how students use it. The company has launched a website called “An Educator’s Guide to Snapchat.” The announcement, timed to the start of the new school year, comes as lawmakers have been pressuring social networks to do more to protect children, with Florida and Indiana going so far as to enact school cell phone bans. Legislators in California and New York have been exploring similar prohibitions.

U.S. Raises Stakes in TikTok Legal Battle, Suing Under COPPA

The U.S. Department of Justice has filed suit against TikTok and its parent company, ByteDance, charging they’ve violated the Children’s Online Privacy Protection Act (COPPA) by allowing children to create TikTok accounts without parental consent, and collecting their data. The suit also alleges TikTok retained the personal data of minors who joined prior to COPPA going into effect in 2000, even after parents demanded it be deleted, a right under COPPA. This latest move in the ongoing legal battle with ByteDance follows the Chinese company’s own lawsuit against the U.S. government.

Popular Messaging App Banned from Servicing Young Users

Federal regulators have taken the unprecedented step of banning the NGL messaging platform from providing service to users under 18. The action is part of a legal settlement between NGL Labs, the Federal Trade Commission and the Los Angeles District Attorney’s Office. NGL, whose niche is “anonymous” communication and whose tagline is “Ask me anything,” has also agreed to pay $5 million in fines. An FTC investigation found that in addition to making fraudulent business claims about divulging the identities of message senders for a fee, NGL falsely claimed it used artificial intelligence to filter out cyberbullying and harmful messages.

U.S. Surgeon General Calls for Social Media Warning Labels

United States Surgeon General Dr. Vivek Murthy has renewed his push for Congress to enact social media warning labels advising of potential mental health damage to adolescents. Murthy also called on tech companies to be more transparent with internal data on the impact of their products on American youth, requesting independent safety audits and restrictions on features that may be addictive, including autoplay, push notifications and infinite scroll, which he suggests “prey on developing brains and contribute to excessive use.” His federal campaign joins a groundswell of local laws restricting minors’ access to social media.

Florida Enacts the Nation’s Most Restrictive Social Media Law

Florida Governor Ron DeSantis has signed a bill into law preventing children under 14 from creating new social media accounts, and requiring platforms to delete existing accounts, with no opportunity for parental consent. For children 14 to 15 years of age, consent of a parent or guardian is required to create or maintain accounts. Without it, or upon request, the accounts and personal data must be deleted, with fines of up to $50,000 per incident per platform. The law, set to take effect in January 2025, is being called the most restrictive passed by any state and is sure to face First Amendment scrutiny in the courts.

Florida Pushes Forward a Social Media Ban for Kids Under 16

Florida’s legislature has passed a bill banning children younger than 16 from having social media accounts despite some pushback from Governor Ron DeSantis, who said he will be wrestling with whether to sign the measure into law. Due to a procedural requirement, DeSantis will have to sign or veto the proposed legislation before lawmakers conclude the current session in a matter of weeks. He has expressed dissatisfaction with the lack of a provision to let parents override the restriction, which would curtail access to the most popular sites, potentially impacting TikTok, Instagram, Facebook, Snapchat and YouTube.