Facebook and Instagram Roll Out New Safety Tools for Teens

Meta Platforms is introducing updates to further protect teens on Facebook and Instagram. Starting this week, users under the age of 16 (or under 18 in certain countries) will be defaulted into more private settings when they join Facebook. A similar default took effect on Instagram last year. Meta is also restricting “potentially suspicious adults.” For example, such adults will be barred from messaging teens they aren’t connected to and from seeing teens in their People You May Know recommendations. A “suspicious adult” is one who has recently been blocked or reported by a young person.

As an extra layer of protection, Meta is also testing removing the message button from teens’ Instagram accounts entirely when those accounts are viewed by suspicious adults.

Additionally, “Meta is working with the National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens who are worried their intimate images might be shared online without their consent,” TechCrunch reports, noting that the goal is to help “prevent a teen’s intimate images from being posted online.”

The platform will feel familiar to users of Meta’s current systems, and once built it will be made available to other social media companies interested in protecting minors.

“We’ve developed a number of tools so teens can let us know if something makes them feel uncomfortable while using our apps, and we’re introducing new notifications that encourage them to use these tools,” Meta writes in a blog post. “For example, we’re prompting teens to report accounts to us after they block someone and sending them safety notices with information on how to navigate inappropriate messages from adults.”

In one month in 2021, more than 100 million people saw safety notices on Messenger, Meta reports, noting the ease-of-use efforts have resulted in “more than a 70 percent increase in reports sent to us by minors in Q1 2022 versus the previous quarter on Messenger and Instagram DMs.”

Meta says it’s “working closely with NCMEC, experts, academics, parents and victim advocates globally to help develop the platform and ensure it responds to the needs of teens so they can regain control of their content in these horrific situations.” Meta is also working with Thorn and its NoFiltr brand to develop educational materials that “will aim to empower teens to seek help and take control,” TechCrunch explains.

Meanwhile, UK consumer rights activist Tanya O’Carroll is suing Facebook for collecting her personal data for ad targeting in a case The Guardian reports “could set precedent for millions.”

Related:
Amazon’s Twitch Makes Changes to Address Child Predation on Platform, Bloomberg, 11/22/22