TechEcho
A tech news platform built with Next.js, providing global tech news and discussions.

© 2025 TechEcho. All rights reserved.

Facebook to add 3k people to community operations team to improve moderation

159 points by tinodotim about 8 years ago

33 comments

hartator about 8 years ago
I wonder if they will push some political agenda or manage to stay neutral.

Even if I have my own opinions on the subject, the whole "fake news" thing seems to be politically motivated. That doesn't mean they are wrong to do this, but it's hard to claim they are still somehow neutral.
(12 replies not loaded)
blfr about 8 years ago
What's weird is that Facebook cannot rely on its users to report blatantly criminal acts witnessed by thousands of people. That probably says more about Facebook users than about the platform, and it makes me doubt that doubling this or that team's size can make a meaningful difference.

Especially with this approach of manual monitoring, which will probably just result in more of the questionable deletions Facebook is already known for.
(8 replies not loaded)
kensai about 8 years ago
3,000 people on top of the current 4,500 is a big addition. If all of them are dedicated to promptly handling complaints and TOS violations, it might indeed make a difference.

I don't ask whether it is economically viable; I guess he knows what he is doing. Facebook is not losing money anytime soon.
(5 replies not loaded)
malandrew about 8 years ago
Since social media use itself contributes to lower self-image and depression, how much will they examine their own product as a factor making the problem worse for an individual? The work being done to drive engagement would seem most problematic for those at risk.
(3 replies not loaded)
unklefolk about 8 years ago
I suspect the long-term plan is to create a training dataset labelled by the 3,000 people and, when they have sufficient training data, let machine learning / AI take over.
(8 replies not loaded)
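The "humans label first, ML takes over" idea unklefolk describes can be sketched with a toy text classifier: every human moderation decision becomes a training example, and once enough accumulate, the model can score new content. This is purely illustrative — a stdlib naive Bayes with invented class and label names, not anything Facebook has described.

```python
import math
from collections import Counter

class LabeledModerationModel:
    """Toy naive-Bayes classifier trained on human moderation labels.

    Illustrative only; real systems use far richer features and models.
    """

    def __init__(self):
        self.word_counts = {"ok": Counter(), "remove": Counter()}
        self.label_counts = Counter()

    def add_human_label(self, text, label):
        # Each human moderation decision becomes one training example.
        self.label_counts[label] += 1
        self.word_counts[label].update(text.lower().split())

    def predict(self, text):
        vocab = set()
        for counts in self.word_counts.values():
            vocab.update(counts)
        scores = {}
        for label, counts in self.word_counts.items():
            total = sum(counts.values())
            # log prior + add-one-smoothed log likelihoods
            score = math.log(self.label_counts[label] / sum(self.label_counts.values()))
            for word in text.lower().split():
                score += math.log((counts[word] + 1) / (total + len(vocab)))
            scores[label] = score
        return max(scores, key=scores.get)

model = LabeledModerationModel()
model.add_human_label("free money click here now", "remove")
model.add_human_label("great article thanks for sharing", "ok")
print(model.predict("free money now"))  # → remove
```

With two examples this is trivially overfit; the point is only the shape of the pipeline — labels flow in from humans, predictions flow out.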
olivermarks about 8 years ago
https://www.buzzfeed.com/reyhan/tech-confessional-the-googler-who-looks-at-the-wo?utm_term=.voVYAdXJ7#.fjplEpV02

The horrible reality of trying to keep offensive material from appearing online.

1.86 billion FB active users divided by 3,000 thought police equals 620k accounts per clean-up team operative...
(1 reply not loaded)
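The back-of-the-envelope ratio in the comment above checks out, and even counting the existing 4,500-person team the per-moderator load stays enormous. (The 1.86 billion figure is the commenter's, not an official count.)

```python
active_users = 1_860_000_000   # commenter's figure for FB active users
new_moderators = 3_000
existing_moderators = 4_500

# New hires only, as in the comment:
print(active_users // new_moderators)                          # 620,000 accounts each
# Counting the whole Community Operations team:
print(active_users // (new_moderators + existing_moderators))  # 248,000 accounts each
```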
bluetwo about 8 years ago
I was just saying last week that the last thing any of these large internet companies wants to do is hire a large room full of low-paid workers, especially here in the US.

If they are making this move, they must see some large liability looming on the horizon.
(5 replies not loaded)
blauditore about 8 years ago
I wonder if this will reduce the problem of fake accounts. I regularly get such friend requests, and it's starting to get annoying.

Those accounts also seriously affect the attractiveness of ad campaigns. I dipped my toes in once, but it looks like a large percentage of the "users" gained are just fake...
(1 reply not loaded)
socrates1998 about 8 years ago
Long overdue. I sort of get these online social media companies skimping on moderation while they are growing and don't have cash.

Facebook is rolling in cash, and this has clearly hurt their brand. Hiring moderators to take out the worst of Facebook could help a lot in dealing with the utter bullshit that goes on.

Reddit and Twitter have similar problems: they want to either farm out moderation to volunteer users (Reddit) or automate everything and only step in when the NY Times gets hold of something (Twitter).

Either way, their moderation leaves a lot to be desired.

Reddit is particularly strange. I don't know how this could be true, but they claim that without the volunteer mods they couldn't exist. Either they are lying or they are just awful at running a business; neither would surprise me.

Does anyone know if Reddit actually makes money? And if they do, how? Ads seem sparse, and selling "gold" just doesn't seem like much.
zoul about 8 years ago
Slashdot's moderation and meta-moderation system (https://slashdot.org/moderation.shtml) always comes to mind. Could something like that work for Facebook self-moderation?
(1 reply not loaded)
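For readers unfamiliar with the Slashdot system zoul links to, the idea is two-tiered: moderators rate comments, and meta-moderators later rate those moderations as fair or unfair, which feeds back into who keeps the moderation privilege. The sketch below is a loose approximation with invented names and thresholds, not Slashdot's actual algorithm.

```python
class MetaModeration:
    """Loose sketch of Slashdot-style two-tier moderation.

    Assumed rules (invented for illustration): fair moderations raise a
    moderator's reputation, unfair ones lower it faster, and moderators
    with negative reputation lose the ability to moderate.
    """

    def __init__(self):
        self.comment_scores = {}   # comment_id -> accumulated score
        self.reputation = {}       # moderator -> reputation points
        self.mod_log = []          # (moderator, comment_id, delta)

    def moderate(self, moderator, comment_id, delta):
        if self.reputation.get(moderator, 0) < 0:
            return False           # privilege revoked for low reputation
        self.comment_scores[comment_id] = self.comment_scores.get(comment_id, 0) + delta
        self.mod_log.append((moderator, comment_id, delta))
        return True

    def meta_moderate(self, log_index, fair):
        # A second tier of users reviews individual moderation decisions.
        moderator, _, _ = self.mod_log[log_index]
        self.reputation[moderator] = self.reputation.get(moderator, 0) + (1 if fair else -2)

m = MetaModeration()
m.moderate("alice", "c1", +1)
m.meta_moderate(0, fair=False)        # alice's reputation drops to -2
print(m.moderate("alice", "c2", +1))  # → False: alice lost moderation rights
```

The appeal for a Facebook-scale platform would be that the reviewers themselves get reviewed, so bad moderators are filtered out without central staff inspecting every decision.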
flexie about 8 years ago
In parts of Eastern Europe, Asia, South America, and Africa, Facebook could hire 3,000 college-educated employees for less than $3M monthly, including taxes. So while this is, from a human point of view, a very decent thing to do, it's not necessarily as costly as one might think, especially when considering the possible liability or regulatory backlash they may run into if they do nothing and Facebook becomes the place for suicides and violence.

Now, if they add all those jobs in SF, it's another thing entirely.
problems about 8 years ago
This is sort of ridiculous; they can't seriously be expected to be held responsible for every video served on their platform. Nor should there be anything inherently worse about live-streaming violence than about that violence occurring in the first place.

I can only see this as a good thing if they manage to catch people before the act and intervene. Is that a primary goal of the program?
(2 replies not loaded)
GCA10 about 8 years ago
Can't help but think of Deming's famous advice on how to deal with quality issues: "Eliminate the need for inspection on a mass basis by building quality into the product in the first place."

If Mark Zuckerberg and his product teams could travel back in time a decade or so, would they still have built everything the way they did?
(1 reply not loaded)
devdoomari about 8 years ago
I don't expect much from this...

I've reported countless spammy comments like "free ladies!" and "free $50 per click!", but FB's reply is always: "...but our community deemed that comment to be OK with FB guidelines."

FB should definitely see what Korean FB has been like recently...
avar about 8 years ago
I predict that Facebook's current PR problem related to this is going to be replaced by a swatting problem before long. I.e., someone reports someone on Live as suicidal, SWAT/police show up, shoot their dog, etc., and public outrage against Facebook ensues.
paradite about 8 years ago
Western social media seems to be one step behind its Chinese counterparts in terms of moderation strategy.

The live-streaming scene in China already went through this entire discussion, and various moderation schemes for offensive and inappropriate content were implemented sometime last year while it was growing rapidly.

Then again, with government intervention it is much faster and easier to enforce moderation standards for private companies to follow.
(2 replies not loaded)
fortyniners about 8 years ago
Machine Learning and AI are dropping the ball here?
6stringmerc about 8 years ago
I get the feeling the job itself might have some side effects, if this article from quite a while ago has any relevance to the subject at hand:

https://www.buzzfeed.com/reyhan/tech-confessional-the-googler-who-looks-at-the-wo
user5994461 about 8 years ago
What is "CO"?
(2 replies not loaded)
narrator about 8 years ago
China has 30,000 people who work full time moderating the Internet: http://certmag.com/the-great-firewall-how-china-polices-internet-traffic/
avivo about 8 years ago
They already had 4,500 people, and use of the moderated features is likely still growing rapidly.

It's great that Facebook is increasing its moderation numbers, but it's unclear whether this was already planned and simply used (successfully) as a PR response to recent events.
deegles about 8 years ago
I wonder if Facebook made much money on the ads displayed alongside this type of content (or perhaps pre- or mid-roll ads). Do they have a responsibility to treat this income differently?
genkaos about 8 years ago
Recommended mini-doc, "Field of Vision - The Moderators": https://vimeo.com/213152344
zepto about 8 years ago
Does anyone else think it is dangerously Orwellian to describe speech as people 'hurting themselves and others'?

We seem to have a serious problem resulting from people living in bubbles of information sources that only confirm their own viewpoint.

How can the solution to that problem be to have a single corporation design the bubble for everyone?

(Note: I know he's talking about actual videos of violence taking place. However, my point is that the violence is already happening, and hiding it from public view is 'out of sight, out of mind.')
(7 replies not loaded)
davexunit about 8 years ago
Not optimistic, but hopefully things will improve. As of now, Facebook can't even remove an account for a hate blog local to my area.
chinathrow about 8 years ago
We laughed when China added X thousand people to their Great Firewall content moderation team.
zoew about 8 years ago
Hopefully things will improve and we will see better things in our feeds.
anigbrowl about 8 years ago
Why doesn't Facebook give people a way to have input into their 'community standards'? Basically it's a black box that's presumably stuffed with lawyers, marketers, and some analytics people. I see zero evidence that there is any actual input from the people who use FB. It's essentially a dictatorship dressed in a costume of democracy, and I would far prefer it if the 'community standards' were called what they are: 'Rules of Mark's Club.'

This is a sore point for me as an artist. It's tedious when posts are removed because they depict or seem to depict nudity and you have to go through and assure some anonymous and wholly unaccountable person that they don't. One of my friends teaches art history at UCLA and, surprise, he posts lots of fine art on his wall. He has to have 8 or 10 accounts because he is constantly getting temp-banned for posting famous paintings of people with no clothes.

It also bothers me on a more general level. E.g., it's fine if I take a picture of myself with my shirt off, but if one of my female friends does the same thing she risks being restricted from posting or having her account terminated, because her breasts are apparently a worse thing than extreme gory graphic violence that comes with a warning but is nevertheless acceptable to post.

That's sexist bullshit that turns women into second-class citizens. I utterly fail to understand how it's OK to share pictures of just about any violent subject matter, but any kind of nudity, sexual or not, is grounds for having your account terminated.

Here's a list of some of the things I've seen on FB over the last year, some with an automatic click-through content warning (which is a good idea and mostly well implemented) and some not. As far as I'm aware, none of these have resulted in account terminations for the people who posted them:

Beheadings (video, multiple examples); hangings; people being shot/having been shot; serial killers and their refrigerators stuffed with human meat; disembowelments; autopsy photos. In each of these cases I don't mean grimy thumbnails where you can sort of imagine what was going on, but photos and video of sufficient clarity to be used in a news broadcast if not for the disturbing nature of their subject matter.

I'm leaving out other stuff that I found sufficiently disturbing that I prefer not to even describe it. I'm not into gore, beyond watching a few horror movies in a given year. But I'm pretty open with my friends list and allow people to add me to groups, so I'm exposed to a certain amount from trolls, and of course there are episodes of violence in the real world that are newsworthy, and I prefer my news without censorship of any kind.

You'll notice that I'm not calling for this stuff to be removed or banned from FB. I think the 'graphic content, are you sure?' warning strikes a sensible balance between protecting people's sensibilities and allowing free discussion and information. We live in a world that is often violent, and I believe that concealing the ugliness of violence often allows it to proceed unchecked. It's also true that some people become obsessed with or celebrate violence, and that admitting it as cultural currency risks desensitization or normalization of violence. Those are tricky questions to which I do not believe any one person, firm, or society has a perfect answer, but given that the instinct of criminal persons and regimes is generally to conceal rather than reveal transgressions, exposure and condemnation are probably a more effective response than obscurity and censorship.

After that unpleasant detour into the pits of human awfulness, I really want to hear from someone at Facebook:

a) why it's OK to engage with the reality of people inflicting horrible violence on others, but it's not OK to let people engage with the reality of sexual or aesthetic expression, and

b) why the 51% female majority of the population is subject to tighter restrictions than the male minority, and

c) why the 'community standards' don't offer any formal mechanism for community input and decision-making.

Think about it, folks. A picture of a healthy naked body is grounds for account suspension or a ban, but it's totally OK to show that same body hacked to pieces? That's some grade-A bullshit, and platitudes about how 'we try to reflect the prevailing standards of society' aren't going to cut it.

Automation intensifies whatever process you choose to automate, and if you automate a standard whereby erotic desire and self-expression are constrained but extreme violence and interpersonal aggression are less constrained, guess which you'll end up with more of? Likewise, if men are allowed freedoms that are systematically withheld from women, guess whose freedoms are going to be expanded and whose are going to be reduced?

I demand answers on this. Facebook is one of the most powerful political entities on the planet, and those who own it need to explain why, within Facebook, there is greater tolerance for violence than for nudity or sexuality, and why one half of the population is subject to greater restrictions than the other half.
mmahemoff about 8 years ago
Dear mods, I enjoy TLAs as much as the next geek, but can we please change CO to Community Operations?

(edit: that was quick, thanks!)
d1ffuz0r about 8 years ago
Censure is coming
gm-conspiracy about 8 years ago
Are they hiring pre-cogs from Mars?
OedipusRex about 8 years ago
The issue with work like this is that it is very distressing. Having to look at videos and pictures of horrific acts (suicides, child porn, etc.) is not a pleasant job. Many of those who do this in law enforcement (mainly child porn cases) have high rates of PTSD and other issues.
losteverything about 8 years ago
Mr. Zuckerberg's involvement signals that this is a huge problem, one they are unable to spin away.

People always figure out a way around the latest attempt at control, and these measures will be overcome by bad actors too.

So if FB admits it has a community-safety and fake-news problem by hiring 3,000 additional enforcement agents, why would one stay with FB if there were an alternative? Which there isn't.

I would like a FB Lite: photo sharing, comments, and discovery of old friends. No news. No menu of features.

I wish there could be an alternative. ("It's just like FB, but without the news and crap.")
(3 replies not loaded)