International Skeptics Forum

Catastrophic effects of working as a Facebook moderator (http://www.internationalskeptics.com/forums/showthread.php?t=338914)

arthwollipot 17th September 2019 08:26 PM

Catastrophic effects of working as a Facebook moderator
 
From The Guardian:

Revealed: catastrophic effects of working as a Facebook moderator

Quote:

Exclusive: Job has left some ‘addicted’ to extreme material and pushed others to far right

The task of moderating Facebook continues to leave psychological scars on the company’s employees, months after efforts to improve conditions for the company’s thousands of contractors, the Guardian has learned.

A group of current and former contractors who worked for years at the social network’s Berlin-based moderation centres has reported witnessing colleagues become “addicted” to graphic content and hoarding ever more extreme examples for a personal collection. They also said others were pushed towards the far right by the amount of hate speech and fake news they read every day.

They describe being ground down by the volume of the work, numbed by the graphic violence, nudity and bullying they have to view for eight hours a day, working nights and weekends, for “practically minimum pay”.

A little-discussed aspect of Facebook’s moderation was particularly distressing to the contractors: vetting private conversations between adults and minors that have been flagged by algorithms as likely sexual exploitation.

Such private chats, of which “90% are sexual”, were “violating and creepy”, one moderator said. “You understand something more about this sort of dystopic society we are building every day,” he added. “We have rich white men from Europe, from the US, writing to children from the Philippines … they try to get sexual photos in exchange for $10 or $20.”

Gina, a contractor, said: “I think it’s a breach of human rights. You cannot ask someone to work fast, to work well and to see graphic content. The things that we saw are just not right.”

The workers, whose names have been changed, were speaking on condition of anonymity because they had signed non-disclosure agreements with Facebook. Daniel, a former moderator, said: “We are a sort of vanguard in this field … It’s a completely new job, and everything about it is basically an experiment.”

John, his former colleague, said: “I’m here today because I would like to avoid other people falling into this hole. As a contemporary society, we are running into this new thing – the internet – and we have to find some rules to deal with it.

“It’s important to create a team, for example in a social network, aiming to protect users from abusers, hate speech, racial prejudice, better pornographic software, etc. But I think it’s important to open a debate about this job. We need to share our stories, because people don’t know anything about us, about our job, about what we do to earn a living.”

Worth clicking through for the full article. It's pretty disturbing.

uke2se 18th September 2019 12:59 AM

Maybe this whole social media thing was a bad idea.

rjh01 18th September 2019 01:00 AM

Maybe moderators should be doing that job for only a few hours per week. They should do other jobs for the rest of the time.

arthwollipot 18th September 2019 01:34 AM

Quote:

Originally Posted by rjh01 (Post 12824050)
Maybe moderators should be doing that job for only a few hours per week. They should do other jobs for the rest of the time.

They'd need a LOT more moderators in order to do that, and I can't imagine they're getting their doors beaten down by applicants, given how stressful the job is.

cullennz 18th September 2019 01:36 AM

Sounds like Facebook aren't giving these workers enough wrap-around support.

arthwollipot 18th September 2019 01:51 AM

Quote:

Originally Posted by cullennz (Post 12824080)
Sounds like Facebook aren't giving these workers enough wrap-around support.

I think Facebook didn't anticipate the impact that the job would have on those who did it.

It's a job that should be done by an AI, really. Train a deep-learning algorithm to recognise moderatable content and allow it to take immediate action to prevent such content from being posted. People can appeal if they feel they've been unfairly moderated, and those appeals can be considered by human moderators. That's how I'd do it.
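
A minimal sketch of what I mean, in Python (purely illustrative - the classifier, the threshold and the queue are all invented here, not anything Facebook actually runs):

Code:

from dataclasses import dataclass, field
from typing import List

@dataclass
class Post:
    post_id: int
    text: str

# Stand-in for a trained deep-learning classifier. A real model would
# return a probability from text/image features; this one just counts
# placeholder keywords.
def score(post: Post) -> float:
    flagged = {"gore", "abuse"}
    words = post.text.lower().split()
    return sum(w in flagged for w in words) / max(len(words), 1)

BLOCK_THRESHOLD = 0.5  # invented value; a real system would tune this

@dataclass
class Moderator:
    appeal_queue: List[Post] = field(default_factory=list)

    def submit(self, post: Post) -> bool:
        # The AI takes immediate action: confident hits never go live.
        return score(post) < BLOCK_THRESHOLD

    def appeal(self, post: Post) -> None:
        # Only contested decisions ever reach a human moderator.
        self.appeal_queue.append(post)

post = Post(1, "gore abuse gore")
moderator = Moderator()
if not moderator.submit(post):
    moderator.appeal(post)
print(len(moderator.appeal_queue))  # -> 1 post awaiting human review

The point being that humans would only ever see the appeal queue, which ought to be a tiny fraction of the total volume.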

uke2se 18th September 2019 01:54 AM

Quote:

Originally Posted by arthwollipot (Post 12824099)
I think Facebook didn't anticipate the impact that the job would have on those who did it.

It's a job that should be done by an AI, really. Train a deep-learning algorithm to recognise moderatable content and allow it to take immediate action to prevent such content from being posted. People can appeal if they feel they've been unfairly moderated, and those appeals can be considered by human moderators. That's how I'd do it.

I disagree. I think Facebook has demonstrated that it isn't capable of having an AI perform the task. Human moderation is sorely needed, not just on Facebook but on other forms of social media as well. That exposing human beings to the absolute worst of humanity - posts on social media - has an adverse effect on the psyche is just another drawback of social media.

cullennz 18th September 2019 01:57 AM

Quote:

Originally Posted by arthwollipot (Post 12824099)
I think Facebook didn't anticipate the impact that the job would have on those who did it.

It's a job that should be done by an AI, really. Train a deep-learning algorithm to recognise moderatable content and allow it to take immediate action to prevent such content from being posted. People can appeal if they feel they've been unfairly moderated, and those appeals can be considered by human moderators. That's how I'd do it.

Tend to agree

Heard an interview with NZ's chief censor a while ago. Long serving dude.

A lot of the stuff they see is horrific apparently, but they have psychologist/counselling support, evaluations and forced breaks if necessary.

Was quite interesting. Doubt I would do it.

Suppose it is a bit like ambulance workers, whose job is rocking up to car crashes involving kids and babies, etc.

Roboramma 18th September 2019 03:03 AM

Quote:

Originally Posted by arthwollipot (Post 12824099)
I think Facebook didn't anticipate the impact that the job would have on those who did it.

It's a job that should be done by an AI, really. Train a deep-learning algorithm to recognise moderatable content and allow it to take immediate action to prevent such content from being posted. People can appeal if they feel they've been unfairly moderated, and those appeals can be considered by human moderators. That's how I'd do it.

I think that's what they are doing already. But they have 2.4 billion users. Even if the majority of moderation is done by AI, there is still an immense amount of oversight from actual humans necessary.

Belgian thought 18th September 2019 03:40 AM

BBC Radio 4's The Digital Human series had a programme about the problems faced by content moderators nearly two years ago - "Sin-eaters". Well worth a listen, if rather depressing.

https://www.bbc.co.uk/programmes/b096h775

Darat 18th September 2019 05:45 AM

Thought I'd made a post about this this morning, probably in the middle of a thread about sheep.

There is no excuse for this: it is well known that working in a stressful environment has negative impacts on people.

Facebook should have been ensuring proper enforced breaks, sensible scheduling and downtime, and providing counselling. But of course that costs money, so they'd rather use contractors as disposable resource units. They aren't your staff, so you can pretend that what happens to them has nothing to do with Facebook.

Puppycow 18th September 2019 06:19 AM

When they see illegal activity, what do they do? Do they report it to the police? Or just delete it and move on?

Puppycow 18th September 2019 06:21 AM

What would stress me is if I saw something terrible and there was nothing I could do to stop it. Nothing besides deleting it or, at most, banning an account.

rjh01 18th September 2019 03:58 PM

Quote:

Originally Posted by Puppycow (Post 12824316)
When they see illegal activity, what do they do? Do they report it to the police? Or just delete it and move on?

It is not up to them to decide if it is illegal. Only a judge can say it is illegal. Is there any law that says you have to report crimes to the authorities anyway? Apart from child abuse.

Puppycow 18th September 2019 04:11 PM

Quote:

Originally Posted by rjh01 (Post 12824980)
It is not up to them to decide if it is illegal. Only a judge can say it is illegal. Is there any law that says you have to report crimes to the authorities anyway? Apart from child abuse.

And even that's only certain people in particular jobs.

pgwenthold 18th September 2019 04:54 PM

They don't have to decide if it is illegal, but they can certainly decide if it is not allowed at Facebook.

And Facebook can say screw this "fair treatment and free speech" nonsense. It's their platform, they can say no. They can absolutely ban pictures of breastfeeding just as they do topless beach bimbos, on the grounds that they aren't going to have any of it.

I see nothing wrong with a hardline policy.

Let Facebook be a place where people share pictures of their kids for grandparents to see. It works great for that. I like to look at pictures of doggies (and not read the comments). It works great for that, too.

They don't have to worry about where the line is for creepy old guys paying Filipino teenagers for nude selfies. There is no reason to be anywhere near it. And don't bother listening to the whining of people complaining that "I didn't really offer money for those nude selfies." Who gives a ******? Go away.

wasapi 18th September 2019 05:26 PM

Not long ago, a man kept sending me porn films. I don't have an issue with porn (unless children are involved), but these showed brutality and were graphic and disgusting.

Where were these guys? It took some time after I reported the guy to FB, but I know he wasn't banned, and he actually sent me a friendship request afterwards.

mgidm86 19th September 2019 08:22 PM

Reading hate speech and fake news pushes one to the far right? Was anyone pushed to the far left? <shrug>

BobTheCoward 19th September 2019 09:54 PM

As a libertarian, I always say taxes are theft. Then some people reply that taxes are the price of society and we should share in it.

But I have always been fascinated by the emotional detritus created as the price of society. And people like these moderators are the ones having to catalogue it. If we care about everyone contributing to society, it seems we should evenly distribute the burden of exposure to human suffering.

Chanakya 19th September 2019 10:05 PM

Why just the exposure? Following that reasoning, why not think of ways to equally share in the actual suffering itself?

And incidentally: Do you think of society in terms of your country? If not, then there'll be lots of suffering to share in, if you've a mind to take it on.

Doghouse Reilly 22nd September 2019 04:43 PM

Quote:

Originally Posted by wasapi (Post 12825076)
Not long ago, a man kept sending me porn films. I don't have an issue with porn (unless children are involved), but these showed brutality and were graphic and disgusting.

Where were these guys? It took some time after I reported the guy to FB, but I know he wasn't banned, and he actually sent me a friendship request afterwards.

Why didn't you just block him? Why do people think it's the job of Facebook to do all the moderating? It's extremely simple to block people who are doing things you find objectionable.

BobTheCoward 22nd September 2019 05:31 PM

Quote:

Originally Posted by Chanakya (Post 12826478)
Why just the exposure? Following that reasoning, why not think of ways to equally share in the actual suffering itself?

And incidentally: Do you think of society in terms of your country? If not, then there'll be lots of suffering to share in, if you've a mind to take it on.

I'm open to the idea.

wasapi 22nd September 2019 05:50 PM

Quote:

Originally Posted by Doghouse Reilly (Post 12828987)
Why didn't you just block him? Why do people think it's the job of Facebook to do all the moderating? It's extremely simple to block people who are doing things you find objectionable.


I did block him, and continued to block him five more times; he kept finding a way to send me more pornography right after I blocked him.

CaptainHowdy 22nd September 2019 09:23 PM

Quote:

Originally Posted by arthwollipot (Post 12823902)
From The Guardian:

Revealed: catastrophic effects of working as a Facebook moderator



Worth clicking through for the full article. It's pretty disturbing.

One approach Facebook could take is remove itself from the business of moderating content. Make it easy for users to block other users and websites or even broad categories of content they find offensive and let individuals sort it out among themselves. Facebook is not held responsible if a terrorist attack was organized on its platform any more than AT&T is held responsible if the terrorists used the telephone. There's really no reason for Facebook to decide what is OK for you to see.

If Facebook wants to assume responsibility for users by moderating content, then they shouldn't hire people who have weak constitutions. Any law enforcement officer who volunteers to join the child sex crimes unit knows they might be exposed to disturbing material. Similarly, a Facebook moderator shouldn't be shocked--shocked I tell you--by being exposed to Black men sending dick pics to underage girls.

Elagabalus 22nd September 2019 09:28 PM

Quote:

Originally Posted by CaptainHowdy (Post 12829121)
One approach Facebook could take is remove itself from the business of moderating content. Make it easy for users to block other users and websites or even broad categories of content they find offensive and let individuals sort it out among themselves. Facebook is not held responsible if a terrorist attack was organized on its platform any more than AT&T is held responsible if the terrorists used the telephone. There's really no reason for Facebook to decide what is OK for you to see.

If Facebook wants to assume responsibility for users by moderating content, then they shouldn't hire people who have weak constitutions. Any law enforcement officer who volunteers to join the child sex crimes unit knows they might be exposed to disturbing material. Similarly, a Facebook moderator shouldn't be shocked--shocked I tell you--by being exposed to Black men sending dick pics to underage girls.

But White men are totally OK?

Information Analyst 23rd September 2019 02:23 AM

Quote:

Originally Posted by arthwollipot (Post 12823902)
From The Guardian:

Revealed: catastrophic effects of working as a Facebook moderator

Worth clicking through for the full article. It's pretty disturbing.

Hardly surprising, given that trained police officers have fallen into a similar trap. In the UK there have been moves to use AI to assess and cross-reference suspect videos, rather than having officers trawl through them.

