Facebook Struggles to Contain Health Misinformation, QAnon

According to the global civic movement Avaaz, over the past year Facebook enabled 3.8 billion views of health-related misinformation, almost four times the views generated by sites such as the World Health Organization (WHO) and the Centers for Disease Control and Prevention (CDC). This has occurred despite Facebook’s partnerships with those organizations to expose users to reliable information. In a separate effort to squelch misinformation, Facebook removed 790 QAnon groups and restricted another 1,950 groups, 440 pages and more than 10,000 Instagram accounts.

VentureBeat reports that, “the data in the new study is the latest evidence that Facebook officials are failing to control rampant disinformation and propaganda on the platform, which has 2.7 billion users.” Avaaz researcher Luca Nicotra noted that, “this is a kind of a pattern in Facebook … kind of going in the right direction, but kind of falling short.”

Earlier this year, when Avaaz “revealed gaps in Facebook’s efforts to fight COVID-19 disinformation … [the Silicon Valley company] announced it would retroactively send alerts to any users who had interacted with content subsequently labeled misleading.” Nicotra responded that its “correction effort is promising but has remained too small and sporadic to be effective.”

Avaaz also urged Facebook to “detox” its algorithm so that it will “downgrade posts by misinformation actors and lower their reach by 80 percent.” Nicotra argued that downgrading is more important than removing misinformation outright, since removal allows its authors to claim censorship.

“Stop giving these pages free promotion,” he said. “You know that your algorithm loves divisive content, and misinformation is in that category … there is an issue at the DNA of the platform, and they need to have the courage to tackle it.”

Avaaz’s report “drew on data compiled by NewsGuard, a news-trust company that identifies websites and publishers that create misleading content” and focused on activity in the U.S., United Kingdom, France, Germany and Italy.

Avaaz also found that, “only 16 percent of health misinformation identified by Facebook had received a warning label … the other 84 percent had no label and was still circulating widely.” It also identified “42 Facebook accounts that spread health misinformation and have 28 million total followers.”

The New York Times reports that Facebook made its “most sweeping action” against the right-wing conspiracy group QAnon, which has experienced “record growth” on its site since March. According to data gathered by NYT, “activity on some of the largest QAnon groups on the social network, including likes, comments and shares of posts, rose 200 to 300 percent in the last six months.”

Facebook stated that it has “seen growing movements that, while not directly organizing violence, have celebrated violent acts, shown that they have weapons and suggest they will use them, or have individual followers with patterns of violent behavior,” adding that it will also block QAnon hashtags.

The fringe movement was founded four years ago, “but in recent months, the movement has become mainstream … [and] some have committed violence in the name of the movement [while other] members of the group are rising in politics,” including Marjorie Taylor Greene, who won a Republican primary in Georgia this month. President Trump has “shared information from QAnon accounts on Twitter and Facebook.”

Related:
Facebook Is Reportedly Testing a ‘Virality Circuit Breaker’ to Stop Misinformation, Engadget, 8/21/20
Facebook Is Quietly Pressuring Its Independent Fact-Checkers to Change Their Rulings, Fast Company, 8/20/20
TikTok Has Removed Hundreds of Thousands of Videos for Hate Speech, Engadget, 8/20/20
Reddit Has Banned Nearly 7,000 Hateful Subreddits Since June 29th, Engadget, 8/20/20