Following criticism over how quickly it removed videos of the mass shooting in Buffalo, New York, Meta Platforms has released metrics showing it removed 21.7 million pieces of violent or incitement content from Facebook in Q1 2022, nearly double the previous quarter's total. The Buffalo gunman used a helmet-mounted camera to live-stream his killing spree to Twitch, owned by Amazon, and recordings circulated on platforms including Facebook, Twitter, Reddit and Google’s YouTube. Even after the footage was removed, platforms struggled with the speed at which copies were downloaded and reposted.
“Twitch said it reacted swiftly to take down the video of the Buffalo shooting, removing the stream within two minutes of the start of the violence. But two minutes was enough time for the video to be shared elsewhere,” writes The New York Times.
CNN reports that Facebook said it removed 1.5 million copies of the video in the 24 hours following the attack, which left 10 dead and three wounded.
One Facebook link to the video “was shared more than 46,000 times on Facebook, and the company didn’t remove it for more than 10 hours,” according to The Washington Post.
“As with prior mass shootings like Christchurch, the ability for people to quickly download and make new copies of live recordings has tested Meta’s ability to enforce its policies,” Engadget writes. The March 2019 murder of 51 people at two mosques in Christchurch, New Zealand, was broadcast on Facebook.
“Mass shootings — and live broadcasts — raise questions about the role and responsibility of social media sites in allowing violent and hateful content to proliferate,” says NYT, noting that many gunmen have documented the fact that their violence was inspired “by watching other gunmen stream their attacks live.”
A clip from the original video, which NYT says was watermarked, got posted on a site called Streamable where it was “viewed more than three million times before it was removed. And a link to that video was shared hundreds of times across Facebook and Twitter hours after the shooting,” according to NYT.
Meta, which shared the Q1 takedown metrics as part of its quarterly Community Standards Enforcement Report, said Instagram removals were also up slightly: 2.7 million posts were removed for violating the company’s rules on violence, an increase of 100,000 over Q4. Meta also reported taking action on 1.8 billion pieces of Facebook spam content (up from 1.2 billion in Q4 2021). The spam was attributed to “a small number of users making a large volume of violating posts.”
On Instagram, the company took action on “1.8 million pieces of drug content, which was an increase from 1.2 million from Q4 2021.” Meta also reports increased action against “bullying and harassment content” on Instagram, at 67 percent in Q1 versus 58.8 percent in Q4, and says it saw “a slight decrease in prevalence on Facebook.”