As part of a range of efforts to show that it has taken regulatory and governmental concerns seriously, Facebook has set up an operations center in its European headquarters in Dublin, Ireland, ahead of the upcoming European Union parliamentary election, which is scheduled for May 23-26 across 28 countries. Employees will monitor Facebook for and clear it of misinformation, fake accounts, and any signs of foreign meddling aimed at swaying election results. Facebook recently set up a similar post in Singapore for elections in India.
The social network also set up a similar post in Silicon Valley prior to the 2018 midterms in the U.S. — and with good reason. As has been well-documented, during the 2016 presidential election, Russia used the social network to influence voters.
“We are fundamentally dealing with a security challenge. There are a set of actors that want to manipulate public debate,” said Facebook’s head of cybersecurity policy Nathaniel Gleicher.
The upcoming European Union election will have widespread effects. It will “determine who controls the European Parliament and sets the agenda of the European Union for the next five years. It will influence how the region grapples with issues like Britain’s exit from the European Union, immigration, income inequality and the rise of extremist ideologies. European leaders have warned that foreign groups will use social media to manipulate public opinion,” reports The New York Times.
Initially, Facebook tried to avoid large-scale, aggressive action that would entangle the company in issues related to free speech. But that is changing, likely due to pressure from regulators and governments around the world, including European leaders who are currently considering new policies that would force tech giants like Facebook to clear all misinformation, hate speech, and extremist content from their platforms.
To date, Facebook has “taken down several networks of accounts linked to foreign-influence campaigns, including some targeting users in Europe,” according to The New York Times, and just last week, it “barred the conspiracy theorist Alex Jones and several other divisive figures from its platforms.”
In Dublin, the command post will remain open until after the election, staffed by “data analysts, content moderators, engineers and attorneys from across Facebook” who were “flown in from around the world.” The team is first alerted to potentially harmful content via an automated system. Members then review the content and recommend whether or not it should be removed.
Even with these efforts, however, the platform is still vulnerable.
“Researchers recently highlighted the use of WhatsApp to spread misinformation ahead of elections in Spain last week. Another persistent problem is material about news events or politics that don’t technically violate Facebook’s policies but is used by far-right and other groups to exaggerate divisions in countries,” reports The New York Times.