January 23, 2018
As the midterm elections approach, some tech companies are making changes to minimize harm and build credibility. Facebook plans to let users rank news sources they see as most trustworthy, as a means of prioritizing high-quality news. Twitter, which is still cleaning house from the presidential election, reports it has discovered 1,062 more accounts linked to an official Russian propaganda unit. Google and YouTube chief executives have promised to examine videos and other content more closely to ferret out misleading news.
The New York Times reports on Facebook’s new strategy to put news from “lesser-known and less-trusted outlets” under a microscope. “There’s too much sensationalism, misinformation and polarization in the world today,” said chief executive Mark Zuckerberg. “We decided that having the community determine which sources are broadly trusted would be most objective.”
Zuckerberg is responding in part to the charge that the social media platform didn’t do enough “to stamp out fake news and disinformation on its platform” during the 2016 presidential election. The company also admitted that “Russian agents had used the site to spread divisive and polarizing ads and posts.”
One concern over Facebook’s new move is that “crowdsourcing users’ opinions on trustworthiness might be open to manipulation,” with Digital Content Next chief executive Jason Kint noting that such a system could be hacked or gamed. It might also “potentially favor publishers who are partisan.”
The Wall Street Journal reports that Twitter “identified 1,062 more accounts tied to a Russian government-backed propaganda outfit.” Previously, it “told congressional investigators it discovered more than 2,700 accounts linked to the Kremlin-backed Internet Research Agency, a shadowy so-called troll farm.” The total now is “3,814, posting 175,993 tweets during the 10-week period around the election, of which 8.4 percent were election-related.”
Twitter suspended those accounts and plans to notify the 677,775 people in the U.S. “exposed to content from these accounts” and “share the relevant information with Congress.” Twitter also “identified another 13,512 bot accounts” with ties to Russia, bringing the total number of Russian-backed automated accounts to more than 50,000.
Bloomberg reports that Google chief executive Sundar Pichai and YouTube chief executive Susan Wojcicki both acknowledge their responsibility to “scour videos and other content more closely for misleading news and inappropriate messages on their web services ahead of elections in the U.S. later this year.”
Wojcicki says her company is hiring “as many employees as possible to scrutinize videos in tandem with computers running artificial-intelligence software to identify and quickly remove offensive and inaccurate material.” Pichai warned that, although “drawing the line” between what is true or false is “increasingly hard,” the efforts to do so are important.
“We don’t want people to reject technology,” he said. “Technology is the source of progress.”