To keep offensive content out of our newsfeeds, social networking sites can employ one of two strategies: they can "actively moderate" (screening every single post as it is uploaded), or they can rely on their users to report anything suspicious or unsavory and pass those reports on to content moderators.
Larger sites like Twitter and Facebook tend to use the latter strategy, and given the sheer number of posts reported each day, it's understandable that they'd decide to outsource the moderation of reported content.
Many of the people who spend their days looking through reported content are horrendously underpaid international contractors, making as little as one dollar per hour plus commissions (estimated to bring their average rate of pay up to four dollars an hour). They're often highly educated and must pass a stringent English test to gain the role. Most content moderators end up leaving due to the psychological damage caused by hours spent looking through incredibly disturbing content, from beheadings to animal torture. Onshore workers are better paid and can have very good physical working conditions, but still end up suffering greatly from what they have to look through each day: in an interview with Wired, a US-based former content moderator described developing depression and problems with alcohol as a result of the videos he was moderating for YouTube.
While Facebook's public documentation keeps its content guidelines relatively vague, they're laid out in explicit detail for its content moderators. A Moroccan contractor recently released his copy to Gawker, and its seventeen pages are divided into sections like "sex and nudity," "hate content" and "graphic content." Cartoon urine is okay; real urine is not. Deep flesh wounds and blood are okay; a mother breastfeeding is not. Some posts are judged on their context rather than their content (e.g., videos of animal abuse are okay as long as the person who posted them clearly thinks animal abuse is wrong). Strangely, all photoshopped content, whether positive, negative or neutral, is marked for deletion.
When you think about it, it's concerning how little most social media users know about the rules they're expected to follow, or about the people and processes involved in enforcing those rules. One of the major benefits of starting your own social network is that you're playing by your own rules – and you know exactly what those rules are. You decide what is acceptable and what is not, both in terms of common decency and of keeping your community on-message.