Congress Grills Big Tech Executives on Accountability Issues

Prior to a House hearing on social media’s role in extremism and disinformation, Facebook chief executive Mark Zuckerberg submitted written testimony on Section 230, suggesting that “platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it.” Section 230 of the 1996 Communications Decency Act holds that platforms are not liable for content posted by their users. In a bipartisan effort, lawmakers are pushing for change. “Our nation is drowning in disinformation driven by social media,” said Rep. Mike Doyle (D-Pennsylvania). “We will legislate to stop this.”

However, according to Yahoo! Finance, “That seems unlikely, no matter how virulent the problem is, given sharp divisions on the issue.” Beacon Policy Advisors wrote in a recent analysis: “Legislation to reform Section 230 is not going to pass anytime soon. Democrats and Republicans agree on little when it comes to what the real problem is.”

“The principles of Section 230 are as relevant today as they were in 1996, but the Internet has changed dramatically,” Zuckerberg said yesterday in his remarks to a subpanel of the House Energy and Commerce Committee.

Zuckerberg appeared before the subpanel with Twitter chief executive Jack Dorsey and Alphabet CEO Sundar Pichai. “Straight out of the gate, lawmakers expressed their anger at the social media leaders for failing to rein in misinformation on their platforms,” reports ZDNet. “Specifically, they called out content that spread misinformation about COVID-19 vaccines, as well as content that fomented anger and spread misinformation ahead of the attempted insurrection on the U.S. Capitol in January.”

“Instead of being granted immunity, platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it,” said Zuckerberg. “Platforms should not be held liable if a particular piece of content evades its detection — that would be impractical for platforms with billions of posts per day — but they should be required to have adequate systems in place to address unlawful content.”

The Wall Street Journal reports that, at Fight for the Future, which opposes limiting Section 230 protections because it “could devastate” smaller platforms, deputy director Evan Greer said Facebook wants changes “because they know it will simply serve to solidify their monopoly power and crush competition from smaller and more decentralized platforms.” The House hearing looked at “social media’s role in promoting extremism and misinformation” and changes to Section 230 “are expected to be a key focus.”

Although the effort is bipartisan, the solution is not, as “many Republicans think social media platforms are removing too much content, while Democrats believe they remove too little, allowing harmful content to spread.” Zuckerberg stated that Section 230 protections should be “conditional on companies’ ability to meet best practices to combat the spread of this [unlawful] content,” but that his company should not be “held liable” if “a particular piece” of such content evades detection.

Sundar Pichai’s advance testimony stated that, “regulation has an important role to play in ensuring that we protect what is great about the open web,” adding that calls to repeal Section 230 “would not serve that objective well.” Jack Dorsey’s testimony noted that, “technology companies have work to do to earn trust from those who use our services,” but did not mention Section 230.

Engadget reports that activist network Avaaz revealed that “Facebook missed billions of opportunities to tamp down misinformation ahead of the 2020 presidential election.” The advocacy group analyzed “100 of the most popular Facebook pages that have repeatedly spread false claims,” finding that the pages were “viewed more than 10 billion times between March and October.”

It added that, “the top 100 false or misleading stories related to the 2020 elections” were viewed 162 million times in three months even when “Facebook’s fact-checkers debunked the claims.” The report noted that, “although Facebook claims to slow down the dissemination of fake news once fact-checked and labeled, this finding clearly shows that its current policies are not sufficient to prevent false and misleading content from going viral and racking up millions of views.”

A Facebook spokesperson disagreed with the Avaaz findings, calling the methodology “flawed” and pointing to its efforts to ban QAnon and other “militarized social movements.” The report added, however, that “Facebook waited too long to implement many of its most important changes, including the ‘emergency’ features that slowed down sharing immediately after the election” and that its crackdown on QAnon and others “came too late,” since they had already gained “significant traction.”

Avaaz stated that, “Facebook again prioritized piece-meal and whack-a-mole approaches — on individual content for example — over structural changes in its recommendation algorithm and organizational priorities.”

Related:
Facebook Wants Washington’s Help Running Facebook, Recode, 3/24/21
Facebook’s Pitch to Congress: Section 230 for Me, But Not for Thee, Electronic Frontier Foundation, 3/24/21
Lawmakers Hammer Tech CEOs for Online Disinformation, Lack of Accountability, The Wall Street Journal, 3/25/21
Lawmakers Grill Tech CEOs on Capitol Riot, Getting Few Direct Answers, The New York Times, 3/25/21