The Horrifying Truth of Facebook Moderators

4 November 2014


Almost everyone on Facebook is aware of the “Report Abuse” button for when you see something disturbing posted. From nude photos to other inappropriate content, the button allows users to alert Facebook that something has been posted that shouldn’t be seen. But who actually handles these reports? Many of us simply assume that a Facebook employee or team is in charge of deleting inappropriate content. “Pretty much any social media site you can think of uses some sort of moderation to keep abusive content off its page.”

What’s interesting is that many people don’t realize just how disturbing content on the internet can get. Most people think of videos involving violence or nudity, but everything from child pornography to beheadings and brutal violence gets posted to social media sites. This sort of content takes a toll on content-moderating workers, of whom there are an estimated 100,000 worldwide.

In the November issue of Wired, Adrian Chen offers a peek into one of the darkest aspects of the social media industry. One sobering fact: the average length of employment for content moderators is between three and six months, and many don’t even last that long, quitting much sooner.

With so much negative content online and an estimated 100,000 moderators worldwide, you would think most of it would be strictly regulated. However, one of the defining features of the web is the freedom to post whatever you want under whatever account you like. That freedom brings some positive and, most definitely, some negative aspects to social media.

Source: http://gizmodo.com/the-horrifying-lives-of-facebook-content-moderators-1649825388

