The young people sifting through the internet's worst horrors

155 points by quick_brown_fox over 1 year ago

24 comments

john-doe over 1 year ago
https://archive.is/AT6JK
extheat over 1 year ago
A lot of attention gets paid to sex abuse content, as it should, but I feel much less is paid to all sorts of other abhorrent content: gore, violence (physical/mental), terrorism, death, destruction, shock content, etc. It's not just NSFW but also NSFL content that I think is under-discussed at times. Seeing the worst output of humanity non-stop can (and _will_) completely break you mentally if consumed in sufficient amounts, as we are at the end of the day humans. It's unfortunate that the world is a messed-up place, and maybe it will stay that way as long as humans are around.

This is exactly the type of content that I think AI is so crucial for detecting, and it's a bit sad that we so often hear about all the bad things you can generate with AI and so little about all the good. If we need to train models on this type of content to make it easier to detect and remove, to prevent mentally scarring people, then personally I'm all for it -- regardless of the "it can also generate that bad content" cost.
iamflimflam1 over 1 year ago
I worked for a dating company for a while. The moderation team was mostly young people, predominantly women.

Fortunately, compared with what the people in the article deal with, it was relatively tame. But it still took its mental toll - having to look at dick pics all day is not healthy.
akudha over 1 year ago
While reading stories like these, one thing that depresses me the most is how little humanity/empathy these suits at corporations have. Sure, their primary (probably only) goal is to make money, but that doesn't mean they have to treat their employees/contractors like trash.

Costco has shown repeatedly that it is possible to build a good business while treating employees, suppliers, etc. decently. If Costco can do it in the retail business (where the margins aren't as lucrative as in software), why can't Meta?

It feels as though they go out of their way to be as nasty as possible. Or they live in some kind of weird bubble - like this arsehole, for example: https://www.businessinsider.com/panera-founder-workers-not-motivated-making-money-shareholders-ceo-therapy-2023-12
justinl33 over 1 year ago
Am I the only one wondering how any half-capable AI classifier would be unable to identify a literal beheading video, or a man having s*x with a fish? And especially at a cost that would justify using manual labellers?

I get that these human agents are most likely providing labelled training data rather than doing the actual filtering, but still: surely they should only see edge-case footage where the model is not confident, not things that could very easily and confidently be classified as inappropriate.
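To make that triage idea concrete, here is a minimal sketch of confidence-threshold routing, where only the uncertain middle band of classifier scores is sent to human reviewers; the `Item` type, the threshold values, and the queue labels are illustrative assumptions, not any platform's actual pipeline.

```python
# Hypothetical sketch of confidence-threshold triage for moderation.
# The thresholds and labels are assumptions for illustration only.
from dataclasses import dataclass

REMOVE_THRESHOLD = 0.95   # confident enough to auto-remove
ALLOW_THRESHOLD = 0.05    # confident enough to auto-allow

@dataclass
class Item:
    id: str
    violation_prob: float  # policy-violation probability from some upstream classifier

def triage(item: Item) -> str:
    """Route an item based on classifier confidence."""
    if item.violation_prob >= REMOVE_THRESHOLD:
        return "auto_remove"      # clear-cut violations never reach a person
    if item.violation_prob <= ALLOW_THRESHOLD:
        return "auto_allow"       # clearly benign content is published
    return "human_review"         # only the uncertain middle goes to moderators

# Example: only the ambiguous item lands in the human queue.
items = [Item("a", 0.99), Item("b", 0.01), Item("c", 0.60)]
print([(i.id, triage(i)) for i in items])
# [('a', 'auto_remove'), ('b', 'auto_allow'), ('c', 'human_review')]
```

Raising the auto-remove threshold or lowering the auto-allow threshold trades moderator workload against the risk of mistakes slipping through; the point is simply that obvious cases need not reach a person at all.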
hsuduebc2 over 1 year ago
This sounds like early industrialization in England, in the way they approach their workers. It's awful and ridiculous that Facebook doesn't want to pay them $2.20. Corporations really are necessarily evil. The depersonalization of people that came with these big structures is what creates most of the problems. People usually are not psychos when dealing with people. Usually.
mschuster91 over 1 year ago
Personally, as someone active in Twitter's Birdwatch / Community Notes, I don't even want to imagine what the *actual* paid moderators get to see. What CN gets is a ton of propaganda (mostly from Russian-backed troll farms and their Western accomplices) and only a few bits of gore (most of it purported to be from the current I/P conflict, but actually from other, sometimes many-years-old wars and conflicts). And that's bad enough.
lukas099 over 1 year ago
I imagine that moderating content online can take a real toll on one's mental health. Sort of like police who have to look through evidence of child porn day after day.
NoZebra120vClip over 1 year ago
I am no longer permitted to post images to my Facebook feed. I am also experiencing glitches with Messenger. I am not sure why, but I can't expect Support to be of any help here.

Back in 2015, I participated in a genuine knighting ceremony, and it was re-enacted for photo opportunities. So there I was, on my knees, receiving a sword to my shoulders from a knight in full regalia. Of course I eagerly posted the photo to Facebook, and it was immediately rejected, I assume by AI only, because it was "male pornography". Yeah, I suppose I can see how the confusion arises there.

Unfortunately, the false positives of Trust & Safety can be a real hindrance to those of us who have genuine moments to share with others.
camillomiller over 1 year ago
I would like to point out that this kind of high-profile reportage, covering exactly the same problem, has been recurring for at least a decade. ABSOLUTELY NOTHING has changed. Not even in Europe, where the story blew up in Germany in 2016.

You can translate this piece from the SZ from 2016 to get an idea: https://sz-magazin.sueddeutsche.de/internet/im-netz-des-boesen-85994

As someone who reports on media and tech, I have to say: the way nothing changes with Meta and social networks, even when their unethical and evil practices are exposed, is one of the most frustrating aspects of my career.
jackfoxy over 1 year ago
Damn, long-form journalism takes forever to get to the point, so my skimming sometimes misses something crucial.

With that disclaimer out of the way, (I think) it is distressing and damaging for young people to be doing this job. (And maybe the rest of the article goes on to say something like that.)

Still, in today's world it is a job that has to be done. If it is to be done, it should be done by older, more mature people with more life experience under their belts. And it should be revealed up front what the job is and how damaging it can be. I think you would have to be very spiritually grounded and take many breaks to refresh your spirit (mental health) to do this.
wslh over 1 year ago
Only a brief comment: I imagine many social media companies (and especially the top ones) will be sued in the near future, like tobacco companies were in the past [1]. It is just that regulations, laws, and politics are slow to adapt to an extremely dynamic world. Here I am not pointing to specific politics, politicians, or a point on the ideological spectrum, but to the field of politics itself.

[1] https://en.wikipedia.org/wiki/Tobacco_Master_Settlement_Agreement?wprov=sfti1
hsuduebc2 over 1 year ago
https://archive.is/AT6JK
bertil over 1 year ago
This article, every comment on here, and every discourse about this problem treats this tsunami of shocking images as a natural, exogenous phenomenon.

No one ever asks: Who shares these? Why? Do they ever stop? Can we discourage them from doing that?

People uploading this have reasons to do so. You can talk to them. I did.
max_ over 1 year ago
In high school, a friend of mine's dad worked for the traffic police.

He opened his dad's computer and found many pictures of dead road-accident victims.

He described some to me, and I found it very disturbing even through a verbal description. But he didn't seem disturbed at all.
nuc1e0n over 1 year ago
It seems to me that the way these moderators are introduced to this horrible content maximises the anguish it causes. There's no proper training in advance to fortify them against it, and they're thrown straight into the most horrible things from the first day on the job, which is inhumane. Separating the wheat from the chaff of content is difficult and admirable work, and these people should be properly supported and rewarded for the risks they take in moderating it on our behalf. A lot of these people are also quite naive about the horrors of the world. Are they really the best ones to do this?
Denote6737 over 1 year ago
To everyone saying AI is the solution to this: you are right, but only kind of. Much of the work behind modern AI comes from backend labelling and tagging of datasets by humans on services like Mechanical Turk.

So to train the model you have to gather hundreds of thousands, or even millions, of images of the kind you wish to filter, then get thousands of humans to look at them and label them. These people will suffer just the same as those interviewed in this article.
Havoc over 1 year ago
One of the few jobs I'd just never do, full stop, no matter how desperate I was.

Going to war seems to mess up a good percentage of people. This stuff seems to have an even higher rate (near 100%?).

> Like other sacked moderators, she confessed to a feeling of withdrawal at being deprived of the graphic content she had grown accustomed to.

Quite surprised by that part - I hadn't heard that before in other articles about this topic. Maybe that's what drives the people who post all this crap in the first place.
ogurechny over 1 year ago
Young people suffering "the Internet's worst horrors"? You mean the social network services?

These exploitative weepy articles about content moderation are just multi-layered hypocrisy.

First of all, it is assumed that people are somehow entitled to "good experiences", and that the arbitrary (and moving) cultural boundary between "suitable" and "unsuitable" is as unquestionable as if it were God-given, when in fact it's just a feature of an entertainment direct-to-screen service available to those from the luckier parts of the world and of society. Others deal with the feces for them, as shown here.

Then there's the hypocrisy of the reader, who enjoys the thrills articles like this give, but also enjoys the service too much to stop and leave the system of exploitation. Like, "It's so awful, so awful, but I *need* my daily dose of filtered cat pictures, so you're gonna get that sad dick-pic spam in my stead. *It's just the world we live in!* *The algorithm* makes me keep doing it!", etc.

Then there's the talk about values, correctness, and so on, but those decisions are not even personal in the first place. The "SFW" facade is not upheld by some die-hard conservatives in power; it is just a business requirement. Say, breastfeeding is considered "problematic" not because of some "clash of cultures", or "gender conflict", or "religious opposition", but because you can't use someone's body in that context to hold advertisements, as stated in contracts. Now go hide yourself in a ditch somewhere, don't spoil our pretty picture. Money happens to be the religion here (what an original thought).

Some time ago, people naively believed in a future that eliminates travails and death; now we pretend really hard that the "future" is now, and build all kinds of media and social contraptions to make someone else do the "dirty" work (real or "emotional").
giantg2 over 1 year ago
I've wondered about a career in digital forensics, but I assume it's at least 50% child porn shit, and I don't want to deal with or see that.
zoklet-enjoyer over 1 year ago
Goatse, tubgirl, rotten, and ogrish were viewed by most of my friend group in middle and high school. Sometimes on school computers.
coldblues over 1 year ago
What companies are hiring for this kind of stuff? Do they pay well? Serious question. Email is on my profile.
geraldhh over 1 year ago
Could we, like, pin archive links for paywalled content?
rapidaneurism over 1 year ago
I would consider any opportunity that pays per hour as much as one normally makes per day to be highly paid.

Perhaps the higher pay reflects the horrible parts of the job.