
Behind TikTok's boom: A legion of traumatized, $10-a-day content moderators

237 points by nixcraft over 2 years ago

33 comments

club_tropical over 2 years ago
Content moderators and their pay should be a footnote - the real story is Tiktok's treasure trove of criminal evidence, presumably with a lot of PII. It is telling that there isn't any effort to refer evidence automatically to local authorities; the crimefighting would be like shooting fish in a barrel. It would even help the content moderators by reducing caseload after a few highly publicized cases of Tiktok criminals automatically getting caught and prosecuted. That this is not done indicates Tiktok is a giant kompromat honeypot.
z9znz over 2 years ago
What's most disturbing is the realization that a measurable percentage of humans are just f*cked up.

Considering how many reports there are of Facebook, YouTube, TikTok, and other sites where content moderators are suffering from having to perform their jobs, it suggests that there's a great deal of really terrible stuff going on - and worse, that the people involved are filming and attempting to share it.

This is a serious thing, and it paints humanity as being far darker than it would seem from the surface. It also suggests that the apocalyptic movies may not be so far off when they suggest that humanity will revert to open barbarism if we're faced with a catastrophe large enough.
Zealotux over 2 years ago
Just watching regular TikTok for a few days was enough to make me feel depressed. I can't imagine seeing the dark side of it.
thenerdhead over 2 years ago
The sad reality is that these $10-a-day content moderators make more money than medium-sized creators who don't know how to monetize and are hoping the TikTok creator fund will do them justice.

It's a very manipulative platform on all sides: creator, consumer, and moderator.
hodgesrm over 2 years ago
The scenes from this article exceed any dystopian novel I&#x27;ve ever read. Humans have spent centuries building up social structures to eliminate our worst excesses. Social media is tearing those structures down again.
arthurofbabylon over 2 years ago
What a horror. We have an underclass citizenry subject to abuse or at best terribly life-changing work conditions, without the resources (personal, cultural, societal) to protect themselves or change position. It's horrific.

It is a shared problem. Colombia is not so far away, neither geographically nor as a gig-heavy economic model.

We should be able to solve this – not from a "content moderation" perspective but from a human work conditions perspective. Counterintuitively, solving content moderation probably requires solving the human work conditions. If the workers were paid considerably better and well-supported, the costs might rise to the point of successfully incentivizing the ideal automated solutions.
yalogin over 2 years ago
Wow, I was expecting nasty stuff to be mentioned there, but what is mentioned I would not have imagined in my wildest dreams. Humans are capable of truly nasty things, and even more so will put them online for everyone to see?

I would love to know from people working in this space how they handle illegal videos. Are they responsible for reporting it to the authorities in the respective countries? The company will not have a presence in many of the countries the app is used in, so even if they wanted to report to the authorities it's impossible to keep up. Do they just delete and move on? The legal and ethical implications here are staggering.
more_corn over 2 years ago
At YouTube the content moderation team (site quality) had mandatory group therapy. The therapist recommended not bottling up seeing traumatic things, so the moderators would sometimes share between each other. One moderator mentioned that he would never open a link sent by a particular other moderator. I asked why and he said “he's brown listed”. People suck and the internet brings out the worst in us.

They were also on one-month contracts. If you made it through your initial period you could choose to convert to full time. Most didn't.

I shudder to think what the next generation of moderators is facing.
bpicolo over 2 years ago
How will services like TikTok ever move past this without strict identity verification and a supporting global legal system? This seems like a common issue for every photo/video content provider. We've seen this exact same article about Facebook content moderation.

They have talented AI developers. They have a massive amount of known harmful content from moderation. They still need to surface harmful content to humans to check.
georgebarnett over 2 years ago
I find the generic “workers are important” quotes remarkable, in the sense that they do nothing but contribute to the sense that these companies really don't care.

It would almost be better to say nothing.
therusskiy over 2 years ago
Looks like it more or less corresponds to the minimum wage in Colombia, which is roughly $10/day [1].

[1] https://biz30.timedoctor.com/average-salary-in-colombia/
MisterTea over 2 years ago
This is all unnecessary. Social media is a soulless advertising platform that destroys people's sense of self and being. I don't use them and I live a nice life devoid of vapid self-promotion and other nonsense.

Resist. Delete your social media accounts.
twoeruuoi234 over 2 years ago
Erm, don't think many of Facebook's "moderators" in India are paid any better. I think most BPO/call-centre jobs pay as much (it's about the GDP per capita of "rich" states like Karnataka).
raspyberr over 2 years ago
I watched some TikToks recently by scrolling on the site and it was easily some of the lowest-quality stuff I've ever seen in my life. They were literally just someone's face, a caption, and some random music for about 15 seconds. I've tried a few times now to search for tags of things I like and it is just the lowest-denominator stuff I've ever seen on the internet. It's one step away from AI-generated (and even that has started doing pretty great things now). I'm sure its algorithm is amazing, but is using TikTok really any different from reading Reddit or watching YouTube? Does anyone actually transfer the stuff they read/watch from short-term to long-term memory?
Havoc over 2 years ago
Every time content moderation comes up I'm horrified anew.

So sad that this sort of job is necessary.

I'm curious, though, why YouTube never seems to feature. Most of the stories I've seen seem to be FB.
carrolldunham over 2 years ago
Is it too trite to say that if you only need to check whether a video is deletion-worthy, you can do it without watching the video? I click around a paused video if in doubt, and/or make it really small. This has saved me several ruined days. Couldn't they do the job effectively with a preview strip, playing the video for clarification only if unclear?
arthurofbabylon over 2 years ago
Question – at what point does content moderation become necessary?

Small publishing tools successfully use passive mechanisms to prevent illegal/horrific content (for example: community enforcement and platform culture, identity verification, cost/pricing). At what point do bad actors and unwanted content arrive? At what point do the passive mechanisms fail?
nojvek over 2 years ago
> "If you're looking at this from a monetary perspective, then content moderation AI can't compete with $1.80 an hour," Carthy said, referring to a typical wage for content moderators based in the global south. "If that's the only dimension you're looking at, then no content moderation AI company can compete with that."

I didn't realize human labor was that cheap. I assume Teleperformance is taking a decent chunk of that $1.80/hr. At 15 seconds to view each video, that's 240 videos/hr.

On AWS, a p4d.24xlarge is $32/hr with 8 A100 GPUs, or $4 per A100 GPU-hour.

> "The human brain is the most effective tool to identify toxic material," said Roi Carthy, the chief marketing officer of L1ght, a content moderation AI company.

The learning on my end is that better, cheaper computer vision would be able to alleviate human traumatization. The flip side is we'd also be able to generate thousands of human-traumatizing videos.
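The arithmetic above can be laid out as a quick break-even calculation; a rough sketch using only the figures quoted in the comment (the 15-second review time is the commenter's assumption, and real GPU throughput depends entirely on the model):

```python
# Back-of-envelope comparison built from the figures quoted in the comment above.
human_wage_per_hour = 1.80                      # USD/hr, quoted wage for global-south moderators
seconds_per_video = 15                          # assumed review time per video
videos_per_human_hour = 3600 / seconds_per_video          # = 240 videos/hr
human_cost_per_video = human_wage_per_hour / videos_per_human_hour  # ≈ $0.0075

gpu_cost_per_hour = 32.0 / 8                    # p4d.24xlarge: $32/hr for 8 A100s → $4 per GPU-hour
breakeven_videos_per_gpu_hour = gpu_cost_per_hour / human_cost_per_video  # ≈ 533

print(f"Human review: ${human_cost_per_video:.4f} per video")
print(f"A $4/hr GPU must classify at least {breakeven_videos_per_gpu_hour:.0f} videos/hour "
      f"just to match that on raw cost, before model development or accuracy enter the picture.")
```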
ck2 over 2 years ago
One step up from China and North Korea's prisoners forced to "mine gold" online:

https://www.theguardian.com/world/2011/may/25/china-prisoners-internet-gaming-scam
DrNosferatu over 2 years ago
This "Teleperformance" company keeps showing up at the mention of sweatshops. Remember that name.
dirtyid over 2 years ago
AKA the Chinese/PRC model: you need a shitload of actual mk1 eyeballs to filter the internet into a hugbox for political/domestic serenity. This is why western platforms had to pull out of the PRC (again, not banned) after the minority riots: they couldn't stomach the onerous moderation costs required to operate in the PRC that other PRC platforms had to endure. A few years later, radicalization on western platforms compelled the same measures; the CCP has consistently been prescient in domains of mass communication. Fringe voices are loudest in an unrestricted environment, and it takes a lot of resources to put up walls and prune weeds for popular yet harmless mainstream stuff to stand a chance.
jmyeet over 2 years ago
It doesn't surprise me there's really disturbing content. This is the case for any platform.

But how much time do moderators spend dealing with BS reports instead? If you spend any time on TikTok you'll quickly notice that reporting content is heavily brigaded and weaponized. The general process is:

1. Mass reports, probably from automated fake accounts, target one or more videos and/or the creator themselves;

2. As with almost all such systems, a certain number of reports triggers an automated response, such as taking down the video;

3. The creator then has to manually appeal the takedown. Sometimes they win, sometimes they lose. It seems to be really inconsistent;

4. Regardless of the outcome of the appeal, a certain number of reports will trigger a community guidelines violation, possibly locking or even completely nuking the account. The fact that every appeal is won seems to be irrelevant.

This problem is so bad that you can duet a video, say nothing, and get reported for hate speech or harassment and lose your account.

What TikTok doesn't do (like pretty much everyone else) is identify false reports and the people who make them. It tends to be pretty easy too. Defining characteristics tend to be:

1. A default username ("user123239842");

2. Only follows 1 person;

3. No PFP.

Other sites (e.g. Twitch) make efforts to at least identify ban evaders. If you don't like someone's content, TikTok should make it hard for you to find it again, and when you do, they should just shadowban you: your comments don't appear to anyone else (which reduces harassment) and your reports are essentially ignored (while appearing to have gone through).

I'm actually amazed at just how easily and how often these systems, which supposedly exist to take down offensive content and protect users, are weaponized for simply saying something someone just doesn't like. Fixing this would surely reduce the time wasted by moderators (who often rule inconsistently on these issues) so the truly abhorrent content gets immediately nuked.
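The "defining characteristics" listed in the comment above amount to a trivial scoring rule; a minimal sketch of that heuristic in Python (the field names, threshold, and weights are illustrative assumptions, not any platform's real signals):

```python
import re
from dataclasses import dataclass

@dataclass
class Reporter:
    username: str
    following_count: int
    has_profile_picture: bool

# Matches auto-generated default handles like "user123239842".
DEFAULT_NAME = re.compile(r"^user\d+$")

def looks_like_brigade_account(r: Reporter) -> bool:
    """Heuristic from the comment above: default username, follows ~1 person, no PFP."""
    signals = [
        bool(DEFAULT_NAME.match(r.username)),
        r.following_count <= 1,
        not r.has_profile_picture,
    ]
    return sum(signals) >= 2   # assumed threshold: any two of the three signals

def weight_report(r: Reporter) -> float:
    """Down-weight reports from likely brigade accounts instead of counting them fully."""
    return 0.1 if looks_like_brigade_account(r) else 1.0
```

The design point is the weighting: reports from flagged accounts still appear to go through, but an automated takedown would only trigger on aggregate weight rather than raw report count.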
calme_toi over 2 years ago
Similarly, behind every pair of cheap jeans/t-shirts/any other "fast fashion" stuff, there are many more workers paid even less than 10 USD per day.
pupppet over 2 years ago
Whenever I browse TikTok it's like flipping through the channels during the 80's, except every channel is only ever showing commercials.
dvngnt_ over 2 years ago
I did a paper in undergrad years ago about Facebook moderation; it was pretty much the same, with underpaid foreign contractors with PTSD.
prvit over 2 years ago
$10/day is probably a lot for this. I wonder how much they could cut their moderation costs by just launching a public platform where anyone anywhere in the world can get paid to do moderation.
JadoJodo over 2 years ago
I'll say first that I think it's horrible that we have this problem at all.

I wonder if a way to avoid traumatizing workers with entire videos would be to capture single frames from videos, and divide them across all the workers with them still rating the frame for content violations.

Obviously, some frames would be worse than others to see, but I would have to imagine it would be drastically lower in terms of impact for each employee than watching video.
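The frame-splitting idea above is straightforward to prototype; a rough sketch with OpenCV, where the sampling interval, round-robin assignment, and storage layout are all arbitrary illustrative choices:

```python
import cv2  # pip install opencv-python

def sample_frames(video_path: str, every_n_seconds: float = 2.0):
    """Yield (timestamp_seconds, frame) pairs sampled from the video."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0   # fall back if FPS metadata is missing
    step = max(1, int(fps * every_n_seconds))
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            yield index / fps, frame
        index += 1
    cap.release()

def distribute(video_path: str, workers: list[str]):
    """Round-robin single frames across workers so no one reviews a whole video."""
    assignments = {w: [] for w in workers}
    for i, (ts, frame) in enumerate(sample_frames(video_path)):
        worker = workers[i % len(workers)]
        out_path = f"frames/{worker}/{ts:.1f}.jpg"   # hypothetical storage layout
        cv2.imwrite(out_path, frame)
        assignments[worker].append(out_path)
    return assignments
```

The comment's own caveat still applies: individual frames of the worst material can be traumatic on their own, so this spreads exposure rather than eliminating it.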
_trampeltier over 2 years ago
How do they make sure they don't have just gore fans and other strange people as moderators?
bvoq over 2 years ago
Meanwhile 4chan moderators do this for free, haha
whywhywhywhy over 2 years ago
Disingenuous to just name TikTok for this as if it's some new, extra-evil social network.

FB/IG/Twitter all do this and have for over a decade.
groestl over 2 years ago
How about:

* Identity verification for all uploaders

* Suing for punitive damages for every upload that traumatizes a worker
BrainVirus over 2 years ago
It's almost as if building a hypercentralized faux-communication system that tries to encourage mindless behavior and doesn't have any notion of locality is an inherently bad idea that will always lead to misery and suffering. Hm. Nah, that can't be true. That would mean a lot of things I've read on the internet recently are lies, which is completely impossible. Yeah, I think this is just a sign that people suck and we need more centralization to control them. If we tied all TikTok accounts to some kind of social credit score, that would probably fix everything.
powerapple over 2 years ago
TikTok is the best thing that has happened to the internet. Why do I say so? Remember how Facebook allowed you to reconnect with your lost friends? TikTok is doing this by connecting the world. I have seen a lot of content I would not have seen on any other platform: interesting food, interesting places, interesting personalities. Unfortunately, the same side effect that happened to Facebook is also happening on TikTok. But I have to say, TikTok has made me see more things around the world. It is great in this aspect.