The Worst Job in Technology: Staring at Human Depravity to Keep It Off Facebook

138 points by mef over 7 years ago

20 comments

Waterluvian over 7 years ago
I found this juxtaposition unsettling:

"I was watching the content of deranged psychos in the woods somewhere who don't have a conscience for the texture or feel of human connection,"

"...If the managers noticed a few minutes of inactivity, they would ping him on workplace messaging tool Slack to ask why he wasn't working."

The texture of human connection is severely diminished when you are managing what is essentially a drip-fed trauma survivor remotely using a metric of trauma exposure per minute.
quantummkv over 7 years ago
The real problem is not that such content exists; the problem is that Facebook makes it horrifically easy to distribute. Back when Facebook was more about interacting with people than about liking and subscribing to whatever these content and joke pages shoved at you, these concerns were far less present. Clickbait and the like is just the natural progression of like-and-subscribe culture.

My Facebook feed about a year ago, when I last opened Facebook, was filled with the same memes and "inspirational messages" pushed out by a handful of pages and shared by dumb idiots. I had to search and go directly to a person's profile to see what they were up to.

The real problem is that Facebook started behaving like a TV channel rather than a social network.

If Facebook went back to keeping only profiles of real people and removed any and all of these pages, blogs, news agencies and the lot, a lot of its content woes would be solved.

These pages give a sense of anonymity to the people behind them. Take that anonymity away. Once you know that your personal image will be directly tied to whatever you post, and that everyone on your friend list will hold you responsible for it, you will begin to curb your tendencies in public.
mbrumlow over 7 years ago
This sort of stuff was at one point handled by hosting providers. Now the internet seems to be "facebook/google", so it is no wonder they are getting the brunt of this sort of work.

I can tell you that back in the day, when I worked for Rackshack, I never envied the abuse department. They always looked stressed out.

8k posts in a day is just too many for one person. It's not about how much work they are doing, it's about the content they are subjected to and have to review. To do that you have to actually think about it and make a decision -- and that takes a toll on people -- you don't get to forget.

I am not one for regulation, but if there is one place in tech that should be considered for regulation, then I think this is a good place to start.

These workers need to be paid more, given access to therapy, and given much more time off. I also think there are technical solutions that could ease the work, but that costs money, and nobody seems to want to pay, so human workers are left holding the bag. This would also include stricter rules to make filtering easier.

You can put a dancing hotdog on the screen; why not use that talent to build back-end systems that automate this sort of horrid work away? I know it will be hard, but if NNs and deep learning are everything they keep being preached to be, then it should be within the realm of the possible.

Also: when it comes to the law, I hold the notion that it is okay if 10 bad people get away if it means not falsely convicting 1 good person. But on the internet, where what gets posted usually amounts to pointless shit anyway, I am okay with 100 good posts being automatically removed if it means 1 bad post is also removed.
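That last trade-off lends itself to a concrete illustration. A minimal sketch, assuming posts arrive with a "bad content" score from some upstream model (the scores, threshold, and field names below are made up for illustration):

    # Sketch of the trade-off above: pick a very low removal threshold so that
    # almost every bad post is caught, accepting that many good posts go with it.
    def auto_moderate(posts, threshold=0.1):
        """Split posts into kept/removed by their estimated 'bad content' score."""
        kept, removed = [], []
        for post in posts:
            if post["score"] >= threshold:
                removed.append(post)   # err heavily on the side of removal
            else:
                kept.append(post)
        return kept, removed

    # Hypothetical scores: higher means more likely to be bad.
    posts = [
        {"id": 1, "score": 0.02},  # clearly fine
        {"id": 2, "score": 0.15},  # borderline -- removed at this strict threshold
        {"id": 3, "score": 0.95},  # clearly bad
    ]
    kept, removed = auto_moderate(posts, threshold=0.1)
    print(f"kept {len(kept)}, removed {len(removed)}")  # kept 1, removed 2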
rpmcmurphy over 7 years ago
The internet really is an incredible cesspool. It would be interesting to see how the public reacted if YouTube and Facebook turned off their content moderation for a week. It would make the goatse meme look like a Sunday school picnic.
Santosh83 over 7 years ago
Why are we surprised or shocked? Hasn't society always used servants, police, soldiers, miners, loggers, garbage men, wardens, and such to do tasks that the rest of us are loath to do, and to keep certain stuff away from 'civilised' society?

Why would online be different suddenly? Analogues of all the above are needed online too. And somebody who has no other option will be unfortunate enough to fill these roles.
gonzo41 over 7 years ago
What they don't tell you is that you can get PTSD just from being a witness to trauma. It happens to cops and lawyers investigating child abuse all the time.
radmarshallb over 7 years ago
I would expect that social media websites whose content is largely decided democratically (via votes, shares, or the like) would relegate the majority of this content to a place where few people ever see it. I would argue that the best way to handle this issue is to let each site's own mechanisms deal with the content accordingly, and to focus effort on developing processes that can detect and remove it automatically.

The article implies that they are forcing moderators to view the content at a high clip. Why -- so as to get false positives back online as quickly as possible? Maybe moderators should only review content that reaches a certain threshold of complaints, and other content should be left as is.
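The complaint-threshold idea in that last paragraph is easy to sketch; the threshold value and data model below are assumptions, not anything from the article:

    from collections import defaultdict

    REPORT_THRESHOLD = 5   # assumed value: nothing reaches a human below this
    report_counts = defaultdict(int)
    review_queue = []      # only content past the threshold lands here

    def report(content_id):
        """Record one user complaint; enqueue for human review at the threshold."""
        report_counts[content_id] += 1
        if report_counts[content_id] == REPORT_THRESHOLD:
            review_queue.append(content_id)

    for _ in range(5):
        report("post-123")
    print(review_queue)  # ['post-123'] -- queued only after the 5th report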
tyingq over 7 years ago
Can't help being reminded of this Silicon Valley episode: https://youtu.be/dvn-hpZdElo
wglb over 7 years ago
In the very early days of the internet being used at a very large corporation, I had the task of reviewing proxy logs to monitor what was euphemistically called "non-business use of the internet". I started by scanning for URLs with "XXX" in them, then pivoted to a more extensive list.

I never looked at the content itself. Just seeing the URLs was corrosive enough.
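A toy version of that kind of scan, assuming a simple space-separated log format with the URL as the last field (both the format and the watch list are made up here):

    # Flag proxy-log lines whose URL contains a watch-listed substring,
    # without ever fetching or viewing the content itself.
    WATCH_LIST = ["xxx"]  # started small, extended over time per the comment

    def flag_urls(log_lines):
        flagged = []
        for line in log_lines:
            url = line.split()[-1]  # assumption: the URL is the last field
            if any(term in url.lower() for term in WATCH_LIST):
                flagged.append(url)
        return flagged

    sample = [
        "10:01 user1 http://example.com/news",
        "10:02 user2 http://xxx.example.com/",
    ]
    print(flag_urls(sample))  # ['http://xxx.example.com/']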
versteegen over 7 years ago
A Kaggle competition just started a week ago, hosted by Jigsaw (part of Alphabet), for classifying toxic content (insults, threats, vulgarity, etc.) in online comments. $35,000 prize pool. https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge

Also interesting: they already have an API for doing this sort of classification: https://perspectiveapi.com/
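For a sense of what that kind of classification task looks like, here is a bare-bones sketch using scikit-learn on a made-up toy dataset; it is a simplified binary version, not the competition's actual multi-label setup or the Perspective API:

    # TF-IDF features + logistic regression on placeholder comments.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    comments = ["have a nice day", "thanks for the help",
                "you are an idiot", "i will hurt you"]
    labels = [0, 0, 1, 1]  # 1 = toxic (labels invented for illustration)

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(comments, labels)

    # Estimated probability that a new comment is toxic.
    print(model.predict_proba(["you idiot"])[:, 1])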
mef over 7 years ago
Non-paywall link: http://archive.is/tmFps
guuz over 7 years ago
It's a good space for regulation. The treatment these workers are subjected to is ignominious.
lyra_comms over 7 years ago
One of our team members used to do this job; luckily, she managed to do it with deep learning, so she didn't have to spend too much time looking at unpleasant images.

This experience is one of the main drivers pushing our team to develop an open, nonprofit conversation platform on which harassment is difficult by design.

www.hellolyra.com/introduction
mikehines over 7 years ago
I wonder what an AI will think of us, and then do to us, once we have one.
aglavine over 7 years ago
The problem is more the job conditions than the job itself.
meritt over 7 years ago
It's not a popular opinion, but remove the anonymity/fake accounts and you eliminate a very significant portion of these issues.
megamindbrian2 over 7 years ago
I want to do this job.
katastic over 7 years ago
South Park did an episode on this, where everyone was so afraid of "reality" that they elected someone to censor it all and keep their tweets only positive.

Butters had to see every depraved thing. And he ended up trying to kill himself.

And in the end, when he almost died, they blamed *him* for failing to be the perfect filter.
Pica_soO over 7 years ago
If there ever was an ISIS recruiting pool, this is it.
jorgec over 7 years ago
It starts with censoring topics that are illegal, then topics that aren't family friendly, and finally it will end up censoring points of view and free speech.

It's 1984 all over again.

I'm not an anarchist; however, if something is illegal then go ahead and cut it. But censoring without a legal cause is a crime.