By Paula Parisi, February 24, 2023
Meta Platforms is reforming its penalty system for Facebook policy violations. Based on recommendations from its Oversight Board, the company will focus more on educating users and less on punitive measures like suspending accounts or limiting posts. “While we are still removing violating content just as before,” explains Meta VP of content policy Monika Bickert, “under our new system we will focus more on helping people understand why we have removed their content, which is shown to help prevent re-offending, rather than so quickly restricting their ability to post.” The goal is fairer and more effective content moderation on Facebook.
Continue reading Meta’s Penalty Reforms Designed to Be More Effective, Fair
By Paula Parisi, February 9, 2023
President Biden’s second State of the Union speech Tuesday night included calls for stronger consumer privacy protections and tougher antitrust laws in a direct challenge to what many perceive as the unchecked power of Big Tech. “Pass bipartisan legislation to strengthen antitrust enforcement and prevent big online platforms from giving their own products an unfair advantage,” Biden stated, urging Congress to “stop Big Tech from collecting personal data on kids and teenagers online, ban targeted advertising to children, and impose stricter limits on the personal data these companies collect on all of us.”
Continue reading Biden Challenges Big Tech, Calls for Children’s Online Safety
By Paula Parisi, January 19, 2023
British legislators seem ready to make good on a threat to add criminal liability and jail time for high-level social media executives who fail to protect children from online harm as part of the Online Safety Bill. While the bill also aims to protect adults from fraud and malfeasance, its strictest provisions are geared toward child protection. The current proposal could win approval by the House of Commons within the week, and would then move to the upper chamber, the House of Lords, later in the quarter for further revision. Enactment is anticipated by year’s end.
Continue reading UK Online Safety Bill to Exert Pressure on Social Media Execs
By Paula Parisi, November 21, 2022
A coalition of more than 20 advocacy groups with an interest in child safety is petitioning the Federal Trade Commission to prohibit social media platforms including TikTok, as well as online games and other services, from bombarding kids with ads and using other tactics that may hook children online. Regulators are being lobbied to prevent online services from offering minors “low-friction rewards” — unpredictably granting positive reinforcement for scrolling, tapping or logging on in order to prolong use. The groups say the technique is the same one slot machine makers use to keep gamblers engaged.
Continue reading Advocacy Groups Seek to Enact Online Rules to Protect Kids
By Paula Parisi, October 25, 2022
UK watchdog Ofcom has proposed loosening the nation’s net neutrality rules so as not to unduly restrict innovation and development. While it is up to government and Parliament to change the law, recommendations from Ofcom — which was created to monitor compliance with net neutrality laws — are influential. “Since the current rules were put in place in 2016, there have been significant developments in the online world, including a surge in demand for capacity,” the regulator said, as well as the rollout of 5G and the emergence of large players like Netflix and Amazon Prime.
Continue reading Online Safety Act Paused as Ofcom Reports on Net Neutrality
By Paula Parisi, September 1, 2022
A first-of-its-kind U.S. proposal to protect children online cleared the California Legislature Tuesday and was sent to the desk of Governor Gavin Newsom. The California Age-Appropriate Design Code Act will require social media platforms to implement guardrails for users under 18. The new rules will curb risks — such as allowing strangers to message children — and require changes to recommendation algorithms and ad targeting where minors are concerned. The bill was drafted following Facebook whistleblower Frances Haugen’s 2021 congressional testimony about the negative effects of social media on children’s mental health.
Continue reading California’s Online Child Safety Bill Could Set New Standards
By Paula Parisi, July 14, 2022
TikTok is facing blowback for lax advertising disclosures. While the platform offers various ways to identify paid promotion, its marketing policies appear to operate on an honor system: some creators label their posts as advertising or partnerships, but many do not. Where a financial relationship exists with regard to products mentioned, the truth-in-advertising rules enforced by the Federal Trade Commission and state attorneys general require media partners to disclose that funds will change hands. As part of a renewed national interest in digital consumer protections, particularly related to child safety, the area is getting increased scrutiny.
Continue reading TikTok Draws Criticism for Undisclosed Sponsored Content
By Paula Parisi, July 7, 2022
The European Parliament has adopted two digital acts, one focused on leveling the competitive playing field, the other on protecting consumer rights online. The Digital Markets Act and the Digital Services Act are both expected to take effect this fall, after the European Commission signs off. “We are finally building a single digital market, the most important one in the ‘free world,’” EU commissioner for the internal market Thierry Breton said Tuesday. “The same predictable rules will apply, everywhere in the EU, for our 450 million citizens, bringing everyone a safer and fairer digital space.”
Continue reading EU Checks Power of Big Tech with Digital Services Regulation
By Paula Parisi, March 21, 2022
Meta Platforms is beginning to implement parental controls on Instagram and Quest. Last week, Instagram added a Family Center that will eventually expand to allow parents and guardians to “help teens manage experiences across Meta technologies from one central place.” Meta says parental controls will be added to Quest VR in May, and hinted that others, like Facebook, are queued up to join. The Family Center will allow parents to monitor how much time their teens spend on Instagram, setting limits if they choose. Additionally, accounts teens follow and accounts following them will be trackable.
Continue reading Meta Adding Parent Controls for Instagram and Virtual Reality
By Paula Parisi, December 13, 2021
Instagram CEO Adam Mosseri spent more than two hours in the Senate hot seat last week, answering questions about the platform’s safety policies and impact on teens’ mental health. A bipartisan phalanx grilled the executive on topics ranging from algorithms to eating disorders. Mosseri, who was appearing before Congress for the first time, defended his social platform, a division of Meta Platforms, which also owns Facebook. He resisted pressure to throw in the towel on launching an Instagram for kids, telling lawmakers only that no child would have access to such a platform “without their explicit parental consent.”
Continue reading Senate Tells Instagram CEO the ‘Time for Self-Policing is Over’
By Paula Parisi, November 19, 2021
TikTok has added a Safety Center to its platform, simultaneously releasing a 38-page summary of a months-long global research project on the impact its challenges and hoaxes have on adolescent users. The study — which queried more than 10,000 teens, their parents, and teachers across Asia, Europe and the Americas — was written by independent agency Praesidio Safeguarding. The move is a response to negative attention TikTok has received from media and lawmakers involving allegations of “blackout challenges” and slap-a-teacher dares. Critics say the social video platform’s new safety features do not go far enough.
Continue reading TikTok Debuts Safety Center Following Survey on Teen Users
By Paula Parisi, October 28, 2021
Executives from Snap, TikTok and YouTube tried to distance themselves from Facebook and one another in a Tuesday Senate hearing about online safety for young users. In a combative exchange lasting nearly four hours, the participating social platforms tried to make the case that they are already taking steps to protect minors, while lawmakers countered that their staffs were able to find posts featuring inappropriate content on the platforms, sometimes while logged in as teens. “Being different from Facebook is not a defense,” said Senator Richard Blumenthal (D-Connecticut).
Continue reading Social Platforms Face Government Questions on Teen Safety