Social Platforms Face Government Questions on Teen Safety

Executives from Snap, TikTok and YouTube tried to distance themselves from Facebook and from one another in a Tuesday Senate hearing about online safety for young users. In a combative exchange lasting nearly four hours, the participating platforms argued they are already taking steps to protect minors, while lawmakers countered that their staffs were able to find inappropriate posts on the sites, sometimes while logged in as teens. “Being different from Facebook is not a defense,” said Senator Richard Blumenthal (D-Connecticut).

“The problem is clear: Big Tech preys on children and teens to make more money,” Senator Ed Markey (D-Massachusetts) said. “Now is the time for the legislative solutions to these problems.”

“Everything that you do is to add users, especially kids, and keep them on your apps for longer,” said Blumenthal, chairman of the Senate Commerce subcommittee on Consumer Protection, Product Safety and Data Security, which held the Tuesday hearing on how large social media companies can shield children from inappropriate content, including material that is violent or damages self-worth.

Incendiary issues raised at the Tuesday hearing included Snapchat’s now-disabled “speed filter,” alleged to have encouraged teens to drive at excessive speeds and linked to “a number of deadly or near-fatal car crashes,” NPR reports.

Senator Amy Klobuchar (D-Minnesota) noted “cases where young people obtained drugs through Snapchat, including one young man who died after purchasing the painkiller Percocet laced with fentanyl,” according to the NPR coverage. “We are absolutely determined to remove drug dealers from Snapchat,” company vice president for global public policy Jennifer Stout responded.

Questions were also raised about content related to self-harm and body-image issues. “We prohibit content that promotes or glorifies such things as eating disorders, but we also realize that users come and share their stories about these experiences,” YouTube VP for government affairs and public policy Leslie Miller said, adding that more than 90 percent of non-compliant content is flagged by AI filters.

The October 26 hearing followed the subcommittee’s September 30 hearing, titled “Protecting Kids Online: Facebook, Instagram, and Mental Health Harms,” at which the Mark Zuckerberg-controlled companies took a beating in the wake of damaging allegations by whistleblower and former Facebook employee Frances Haugen. That hearing and the preceding “Facebook Files” series in The Wall Street Journal heightened scrutiny by U.S. legislators, part of a wave of Big Tech regulatory reevaluations taking place globally.

The technology platforms have made clear, including at the Tuesday hearing, that they are more comfortable with content-control laws than with antitrust action that would force divestiture.

Tuesday’s Senate hearing coincided with new allegations that Facebook’s News Feed algorithm rewarded use of the “angry” reaction by weighting it five times more heavily than a “like” when determining which posts to surface.
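To illustrate the arithmetic behind that allegation, here is a minimal, purely hypothetical sketch of a reaction-weighted engagement score. The weights, field names and function are assumptions for illustration only, not Facebook’s actual ranking code, which is far more complex and not public:

```python
# Hypothetical illustration of the alleged weighting, not Facebook's code.
REACTION_WEIGHTS = {
    "like": 1,
    "angry": 5,  # allegedly counted five times as heavily as a "like"
}

def engagement_score(reactions: dict) -> int:
    """Sum reaction counts scaled by their (assumed) ranking weights."""
    return sum(REACTION_WEIGHTS.get(name, 1) * count
               for name, count in reactions.items())

# Under such a scheme, a post drawing anger can outrank one with more
# total engagement: 100 likes score 100, while 20 likes plus 30 angry
# reactions score 20 + 150 = 170.
print(engagement_score({"like": 100}))              # 100
print(engagement_score({"like": 20, "angry": 30}))  # 170
```

The point of the sketch is simply that a 5x multiplier lets anger-provoking posts outscore better-liked ones, which is the crux of the allegation.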
