Activist and writer Ijeoma Oluo is the latest to suffer for Facebook’s inability — or perhaps unwillingness — to improve its reporting and moderation infrastructure. After receiving hundreds of racist and threatening messages in response to a joke she made on Twitter, Oluo began posting screenshots once it was clear that days of reporting had done nothing. Facebook’s response was to suspend her account.
You can read Oluo’s account of things here, including some screenshots of the type of abuse she was receiving. Twitter, she said, was responsive. Facebook, not so much.
Facebook later reinstated her account, calling the suspension a “mistake.” I’ve asked the company for the rationale behind the suspension.
We talked with another activist recently, Leslie Mac, who, like Oluo, spoke out against racism on the platform and, like Oluo, was suspended from it. It happened to Shaun King, too, after he posted a racist email he had received.
The pattern isn’t hard to figure out: when a person (often a person of color, often a woman, often both) is singled out by popular accounts and pages for something they’ve said or done, the mob descends. Abusive messages, comments, and tweets arrive by the truckload — and while the target can only block and report so fast, groups of hundreds or thousands can flag a post or account so voluminously that it is taken offline.
Sure, that’s a “mistake” — in the same sense that the entire system Facebook has established for moderating the global conversation is a mistake. It is, at the very least, fundamentally flawed and inadequate.
Facebook and other platforms love to talk about the empowering nature of a one-to-many platform. They fail to address the problem that arises when the platform is inverted and becomes many-to-one. There isn’t really a solution for the constant dogpiles that occur when someone incurs the wrath of an entire contingent of highly vocal abusers. So the platforms just let it happen, then sweep the “mistake” under the rug with the rest of them.
Maybe the real mistake is people thinking these platforms are able to protect them at all. If so, the echo chamber will only intensify as Facebook and other services find themselves hosting chilled speech under the de facto sway of countless angry mobs.
I’ve asked Facebook and Oluo for more information and will update the post if I hear back.