
Facebook Moderators Are Suing the Company for Trauma

This sounds fucking awful and disturbing but would you not assume the worst? The job description is literally, "you're the line of defense against fucked up shit on the internet"

********

Sean Burke spent years working for big tech like Cisco and SAP. But nothing could prepare him for the work he did for Facebook.

"My first day on the job, I witnessed someone being beaten to death with a plank of wood with nails in it and repeatedly stabbed. Day two was the first time for me seeing bestiality on video — and it all escalated from there."

It took only two weeks before he "started seeing actual child porn."

Burke, a native of New Jersey, is one of the thousands of low-paid content moderators worldwide who don’t work directly for Facebook but are employed by third-party companies who provide content moderation services to the tech giant — in this case, CPL Resources in Dublin.

Burke, along with other CPL employees, is gearing up to sue Facebook and CPL in Ireland’s High Court, saying they suffered “psychological trauma” as a result of poor working conditions and a lack of proper training to prepare employees for viewing some of the most horrific content seen anywhere online.

"I've had to go on antidepressants because of working on the job," Burke told VICE News. "At times I was solving my problems with alcohol to get to sleep because at least I wasn't dreaming when I slept after having a few drinks on me."

by Anonymous, reply 12, December 4, 2019 4:11 PM

Facebook has some fucked up people trying to post this shit.

by Anonymous, reply 1, December 4, 2019 9:00 AM

Horrifying. Remember the early 2000s when mods would have meltdowns on fandom messageboards? That was child's play.

by Anonymous, reply 2, December 4, 2019 9:06 AM

Good. Here’s hoping they launch a class action and bankrupt the orange haired aspie sack of shit.

by Anonymous, reply 3, December 4, 2019 9:27 AM

I was about to roll my eyes after reading the title. Oh, Zuckerberg was kind of a dick to you or something? But then I read the article... lots of sick people in the world and most of them are probably on facebook.

by Anonymous, reply 4, December 4, 2019 9:39 AM

It’s the moderators who should be sued for trying to censor LGB voices for speaking up against the transhet menace.

by Anonymous, reply 5, December 4, 2019 10:46 AM

I hope these people get compensated and it (somehow) hurts the Reptile God, Zuckerberg.

by Anonymous, reply 6, December 4, 2019 11:50 AM

I did moderation work through Amazon's Mechanical Turk for small forums and a few porn websites, and even the smaller websites had filters to catch lots of the worst stuff before human eyes even saw it, and many had zero-tolerance policies that would immediately permaban anyone who posted something like CP or murder videos.

Facebook doesn't. There are tons of examples of people who have posted horrible things and they get a temporary ban and then let back on. Happens on IG and Twitter, too, probably on all big social media, because they want to maximize profits and that means keeping as many users as possible, even the psychopaths.

Think about all the people you see on FB or Twitter who use obvious racist slurs and nothing ever happens. That's because they won't use basic filtering technology to weed out the worst stuff. So people working as moderators end up seeing it and getting traumatized.

by Anonymous, reply 7, December 4, 2019 11:59 AM
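The pre-screening workflow R7 describes — match known-bad content automatically, permaban on a zero-tolerance hit, and only queue the ambiguous remainder for human review — can be sketched roughly like this. Everything here is illustrative: the function names are made up, and a plain SHA-256 digest stands in for the perceptual hashing real systems use; this is not Facebook's actual pipeline.

```python
import hashlib

# Hypothetical blocklist of digests of known-bad content.
# Real systems use perceptual hashes (e.g. PhotoDNA-style) so near-duplicates
# also match; sha256 here is a simplified stand-in.
BANNED_HASHES = {
    hashlib.sha256(b"known-bad-payload").hexdigest(),
}

banned_users: set[str] = set()


def prescreen(user: str, payload: bytes) -> str:
    """Triage an upload before any human moderator sees it."""
    if user in banned_users:
        return "blocked"  # permabanned users cannot post at all
    digest = hashlib.sha256(payload).hexdigest()
    if digest in BANNED_HASHES:
        banned_users.add(user)  # zero-tolerance: immediate permanent ban
        return "removed+banned"
    return "queue_for_human"  # only unmatched content reaches moderators
```

The point of the filter-first design is exactly what R7 says: the hash match removes the known worst material (and its posters) mechanically, so the human queue only contains content no automated check could classify.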

That seems like the worst job possible. Being exposed to the worst of humanity all day, every day. Of course they're traumatized. They should all get regular, mandatory counseling.

R7 I was just thinking of that when I read the article, that maybe they could use some filter to spare the mods, but nope. This is fucked up.

by Anonymous, reply 8, December 4, 2019 12:33 PM

They’re shit moderators. I saw a picture of two dead kids from a murder scene on someone’s FB page, reported it and was told it didn’t violate guidelines and it stayed on the page.

TWO DEAD KIDS.

by Anonymous, reply 9, December 4, 2019 12:48 PM

...

by Anonymous, reply 10, December 4, 2019 12:50 PM

Eat shit and die spammer cunt fuck.

by Anonymous, reply 11, December 4, 2019 12:51 PM

Twitter seems to be better at zapping shitty accounts than FB.

Facebook is handling this all wrong imho. The more people leave FB (I’ve more or less gone off, don’t use Insta or WhatsApp) the more it will get taken over by Putin and his troll farms. Then people will really leave in droves and it will lose more and more money. Fox News is about to implode and FB is not far behind. Oh well, that little aspie shit can’t say he wasn’t warned.

by Anonymous, reply 12, December 4, 2019 4:11 PM