Facebook reveals 25 pages of takedown rules for hate speech and more

Until now, Facebook had never made public the guidelines its moderators use to decide whether to remove violence, spam, harassment, self-harm, terrorism, intellectual property theft, and hate speech from the social network. The company hoped to avoid making it easy to game these rules, but that worry has been overridden by the public’s constant calls for clarity and protests about its decisions. Today Facebook published 25 pages of detailed criteria and examples for what is and isn’t allowed.

Nothing is technically changing about Facebook’s policies. But previously, only leaks, like a copy of an internal rulebook obtained by the Guardian, had given the outside world a look at when Facebook actually enforces those policies. The newly published rules will be translated into over 40 languages. Facebook currently has 7,500 content reviewers, up more than 40% from a year ago.

Facebook writes in its Community Standards announcement that user reports of violating content “are reviewed by our Community Operations team, who work 24/7 in over 40 languages. Right now, we have 7,500 content reviewers, over 40% more than the number at this time last year.”

Facebook also plans to expand its content removal appeals process. It already lets users request a review of a decision to remove their profile, Page, or Group. Now Facebook will notify users when their nudity, sexual activity, hate speech, or graphic violence content is removed, and will let them check a box to request an appeal, which will usually be reviewed within 24 hours. Finally, Facebook will hold Facebook Forums: Community Standards events in Germany, France, the UK, India, Singapore, and the US.


