TECHCRUNCH

Facebook doesn’t want to be the arbiter of decency when it comes to content policy decisions, similar to how it looked to third-party fact checkers rather than becoming an arbiter of truth. Today on a press call with journalists, Mark Zuckerberg announced that a new external oversight committee would be created in 2019 to handle some of Facebook’s content policy decisions. The body will take appeals and make final decisions. The hope is that, beyond the influence of Facebook’s business imperatives or the public’s skepticism about the company’s internal choices, the oversight body can come to the proper conclusions about how to handle false information, calls to violence, hate speech, harassment, and other problems that flow through Facebook’s user-generated content network.

Users will be able to appeal decisions about content they report or when their content is reported, and Facebook will direct these appeals to the independent body. Zuckerberg said Facebook will be working to get the oversight body up and running over the next year. For now, there are plenty of unanswered questions about who will be on the committee, which of the many appeals it will review, and what ensures it’s truly independent from Facebook’s power.

“I believe the world is better off when more people have a voice to share their experiences . . . at the same time we have a responsibility to keep people safe,” Zuckerberg said. “When you connect 2 billion people, you’re going to see all the good and bad of humanity. Different cultures have different norms, not only about what content is okay, but also about who should be making those decisions in the first place.” Zuckerberg explained that over the past year he’s come to believe that so much power over free expression should not be concentrated solely in Facebook’s hands.

[Update: Since we published this report, Zuckerberg has published a 5,000-word letter describing his thoughts on Facebook policy and the oversight body.]

The past year has seen Facebook criticized for how it handled calls for violence in Myanmar, harassment and fake news spread by conspiracy theorists like Alex Jones, election interference by Russian, Iranian, and other state actors, and more. Most recently, The New York Times published a scathing report about how Facebook tried to distract from or deflect criticism of its myriad problems, including its failure to prevent election interference ahead of the 2016 presidential race.

The oversight committee could both help Facebook make smarter decisions that the world can agree with, and give Facebook a stronger defense against this criticism because it’s no longer the one making the final policy calls. The approach could be seen as Facebook shirking its responsibility, or as an acknowledgment that the gravity of that responsibility exceeds its own capabilities.