It’s no secret that Facebook has a content monitoring and response issue.
A recent spate of violent episodes that played out live on the platform has drawn attention to the company’s struggle to take down harmful, harassing or violent content quickly enough while leaving alone posts that shouldn’t be removed.
To address these problems, Facebook’s founder and chief executive, Mark Zuckerberg, announced that the company will hire 3,000 new employees.
“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook — either live or in video posted later. It’s heartbreaking, and I’ve been reflecting on how we can do better for our community,” Zuckerberg wrote in a Facebook post on Wednesday.
The announcement comes on the heels of earlier efforts to improve this process. On Tuesday, a teen in Macon, Georgia, was saved after attempting suicide while streaming on Facebook Live; authorities responded within 30 minutes of calls from viewers—and from Facebook itself.
Still, Zuckerberg is determined to continue improvements. He wrote:
In addition to investing in more people, we’re also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer.
The company also hopes to end the over-enforcement of its community rules against content that doesn’t actually violate them.
In one example, the platform removed the Pulitzer Prize-winning photograph “Napalm Girl” after it was reported as offensive. Facebook eventually restored the image, and the additional staff is intended to help reviewers distinguish valid complaints from errant ones.