The Future

Facebook will make more people watch violent videos so you don't have to

Mark Zuckerberg says that adding more moderators will curb illicit content on Facebook. Is he right?

Over the past several months, Facebook has come under increasing fire over a string of violent videos that have appeared on the platform. Just last month, a Cleveland man named Steve Stephens uploaded a video of himself gunning down 74-year-old Robert Godwin Sr. on Easter Sunday, before broadcasting a live video in which he admitted to the crime.

Today, Facebook CEO Mark Zuckerberg announced a number of changes to the company's community operations team in an attempt to address violent content appearing on the platform, particularly on Facebook Live. In a note posted to Facebook, Zuckerberg outlined plans to expand the 4,500-person team already in charge of reviewing user-submitted reports of explicit content on the site. He wrote:

Over the next year, we'll be adding 3,000 people to our community operations team around the world -- on top of the 4,500 we have today -- to review the millions of reports we get every week, and improve the process for doing it quickly.

But as TechCrunch points out, “What is not clear is whether these are full-time employees or contractors, and how screeners will, in effect, be screened.” According to Zuckerberg, the bolstered community operations team will be able to root out offensive content on Facebook more effectively, before too many people see it. He goes on to say:

These reviewers will also help us get better at removing things we don't allow on Facebook like hate speech and child exploitation. And we'll keep working with local community groups and law enforcement who are in the best position to help someone if they need it -- either because they're about to harm themselves, or because they're in danger from someone else.

Since launching Facebook Live last spring, the company has been mired in controversy over the feature. Everything from sexual assault to suicide has been broadcast on it, and many critics say the company rushed the product to market without adequately preparing for these challenges.

The solution may require more than increased moderation. When Facebook faced allegations of bias on its trending stories team last year, the company responded by eliminating the department's entire human staff. Similarly, the Pulitzer Prize-winning "Napalm Girl" photo was briefly blocked by Facebook's moderators, causing an uproar. When faced with a public relations problem, Facebook's leadership has a tendency to make large gestures that signal toward solutions. But when it comes to understanding the content that actually appears on the platform, things may be more complicated than the company lets on.

Correction: This article has been updated to correctly attribute news sources.