Facebook Video Raises Offensive Content, Piracy Concerns

Now that Facebook has become a major player in video, the social media company finds itself tackling two new issues: piracy and the policing of offensive content. The latter became an issue within minutes of a gunman killing two journalists on live TV; the gunman posted his own video of the shooting to Facebook (and Twitter), where it went viral. Content owners, meanwhile, are irate that Facebook has been slow to prevent copyrighted videos from being reposted by third parties. Now that Facebook admits it has a problem, the work to fix it begins.

According to Re/code, Facebook’s piracy problem is immense, “by one estimate account[ing] for more than 70 percent of Facebook’s most popular videos.” Particularly vocal have been Jukin Media, a video licensing agency, and Fullscreen.

Now, both of these companies and ZEFR, a service company that helps content owners track their clips on YouTube, are partnering with Facebook to introduce a “video matching technology” that will let content owners alert Facebook that a video clip belongs to them and have it taken down. Re/code calls the system “the first step to creating the equivalent of YouTube’s Content ID system.”

Partners are just beginning to test the system, which requires content owners to upload the clips to be protected. But Re/code notes that, “to fully replicate YouTube’s Content ID system, it will have to create a way for content owners to leave their stuff up on the site, and share ad revenue the clips generate.”
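To make the general idea concrete, here is a minimal, purely illustrative sketch of how a matching system of this kind can work: reference clips registered by rights holders are reduced to compact per-frame fingerprints, and new uploads are fingerprinted and compared against them. Nothing here reflects Facebook’s actual technology; the difference-hash scheme, the thresholds and all names below are hypothetical.

```python
# Illustrative sketch of a "Content ID"-style video matcher.
# NOT Facebook's implementation; fingerprinting scheme and thresholds are hypothetical.

from typing import List

Frame = List[List[int]]  # grayscale frame as rows of 0-255 pixel values


def dhash(frame: Frame) -> int:
    """Difference hash: one bit per horizontally adjacent pixel pair."""
    bits = 0
    for row in frame:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left < right else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def fingerprint(frames: List[Frame]) -> List[int]:
    """A video's fingerprint is the sequence of its frame hashes."""
    return [dhash(f) for f in frames]


def matches(reference: List[int], candidate: List[int], max_distance: int = 4) -> bool:
    """Flag the candidate if most of its frames closely match the reference clip."""
    if not reference or not candidate:
        return False
    close = sum(1 for r, c in zip(reference, candidate) if hamming(r, c) <= max_distance)
    return close / min(len(reference), len(candidate)) > 0.8


if __name__ == "__main__":
    # A rights holder registers a reference clip; a near-identical re-upload is flagged.
    reference_clip = [[[10, 20, 30, 40], [50, 60, 70, 80]]] * 5
    reuploaded_clip = [[[11, 21, 31, 41], [51, 61, 71, 79]]] * 5
    print("flag for review:", matches(fingerprint(reference_clip), fingerprint(reuploaded_clip)))
```

In a sketch like this, flagged matches would be queued for takedown review; the harder product question, as Re/code notes, is letting owners instead leave matched clips up and share in the ad revenue.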

Fullscreen chief executive George Strompolos, a former YouTube executive, says that Facebook has been helpful, listening to advice and requests. “I think Facebook is a product and engineering company. And I think companies like that tend to want to create the experience first and then figure out the rules later.”

Facebook also faces the dilemma of what to do when users post offensive or disturbing videos. Although Facebook deleted the Virginia gunman’s video of the shooting, copies of it are “still floating around on social networks,” says Wired. One copy on Facebook, which garnered 39,000 views, was up for five hours.

Should Facebook, Twitter and other similar services do more to police this kind of content? A Twitter spokesperson referred to the company’s media policy, which states, “We do not mediate content. All content should be marked appropriately as per our guidelines.”

“Companies don’t want to be censors,” notes Wired. These companies “will remove certain content if users complain… but they’re wary of going too far.” So perhaps it’s no surprise, says Wired, that Facebook executive Monika Bickert told NYT that it has no plans to use its existing algorithms to find and remove offensive content.
