5 Takeaways From Facebook’s Leaked Moderation Documents

Sometimes an emoji is just an emoji. Sometimes it may be a threat.

And with only a few seconds to decide, Facebook moderators have to make the call — even if the text that accompanies the laughing yellow face is in an unfamiliar language.

To help with those decisions, Facebook has created a list of guidelines for what its two billion users should be allowed to say. The rules, which are regularly updated, are then given to its moderators.

For Facebook, the goal is clarity. But for the thousands of moderators across the world, faced with navigating this byzantine maze of rules as they monitor billions of posts per day in over 100 languages, clarity is hard to come by.

Facebook keeps its rulebooks, and even their existence, largely secret. But The New York Times obtained 1,400 pages of these guidelines and found problems not just in how the rules are drafted but in how the moderation itself is carried out.


Here are five takeaways from our story:

Facebook is experimenting on the fly.

The rules are discussed over breakfast every other Tuesday in a conference room in Menlo Park, Calif. — far from the social unrest that Facebook has been accused of accelerating.

Though the company does consult outside groups, the rules are set largely by young lawyers and engineers, most of whom have no experience in the regions of the world they are making decisions about.

The rules they create appear to be written for English speakers who at times rely on Google Translate, which suggests a lack of moderators who speak local languages.
