Having worked with journalists, I'd say these sound like typical entitlement complaints. Frankly, a lot of writers have an attitude that they're "artists" who shouldn't be rushed and don't have performance requirements.<p>Honestly, it does not sound very hard at all. They have to write 20 posts a day, but each post is only a headline and a brief summary. A focused writer can finish that in 15 minutes.<p>> We had to write in the most passive tense possible. That’s why you’d see headlines that appear in an alien-esque, passive language.<p>Oh no! How dare Facebook strive to be neutral and passive.<p>> After doing a tour in Facebook’s news trenches, almost all of them came to believe that they were there not to work, but to serve as training modules for Facebook’s algorithm<p>Of course they were. Facebook is quite explicitly trying to apply ML throughout its site. Why should it permanently be in the business of employing writers to do something which computers could do reliably and effectively?
"After doing a tour in Facebook’s news trenches, almost all of them came to believe that they were there not to work, but to serve as training modules for Facebook’s algorithm." ha! aren't we all.
'Mark Zuckerberg has been transparent about his goal to monopolize digital news distribution. “When news is as fast as everything else on Facebook, people will naturally read a lot more news,”'<p>Oh lord. That's not news. Those are sensationalist, click-baity headlines making us all collectively dumber. It's anecdotal, but I'm quite sure the 'news' my girlfriend has gotten from FB has made her less informed.<p>I actually wrote an FF plugin to hide that right bar, since I would now and then get sucked in by that crap and waste 15 minutes reading about Lindsay Lohan or whatever.
When the employees of Facebook collectively ask "What responsibility do we have to stop the election of Donald Trump" at an internal meeting, it gives me Orwellian chills. Today, it's not governments we need to fear the most; it's data-hungry, fascist internet corporations.
> “It was degrading as a human being,” said another. “We weren’t treated as individuals. We were treated in this robot way.”<p>Strange that people at Facebook would feel apathetic toward individuals that can't code.
<i>almost all of them came to believe that they were there not to work, but to serve as training modules for Facebook’s algorithm</i><p>If Facebook is doing its job and utilizing its biggest talent acquisition, Yann LeCun, then this is exactly what it should be doing.<p>A critical part of the path towards AGI is using humans to teach it.
So, I got interested in this article's description of horrid working conditions and decided to read it carefully. But I noticed that it gives emotional descriptions long before the actual facts. I wouldn't go so far as to call this manipulation, but it's certainly a disturbing writing style.<p>3rd paragraph:<p>> grueling work conditions, humiliating treatment, and a secretive, imperious culture in which they were treated as disposable outsiders.<p>6th paragraph:<p>> “It was degrading as a human being,” said another. “We weren’t treated as individuals. We were treated in this robot way.”<p>And then, finally, in the 10th paragraph, we get a glimpse of the facts:<p>> they received benefits including limited medical insurance, paid time off after 6 months and transit reimbursement<p>(BTW, is it usual for contractors to receive such perks?)<p>> A company happy hour would happen at 8 p.m., and we’d be working<p>Horrible, inhumane treatment indeed.<p>> Over time, the work became increasingly demanding, and Facebook’s trending news team started to look more and more like the worst stereotypes of a digital media content farm. Managers gave curators aggressive quotas for how many summaries and headlines to write, and timed how long it took curators to write a post. The general standard was 20 posts a day.<p>20 posts during an 8-hour work day is almost half an hour per post. Is that supposed to be too little time? Seriously?<p>So — apart from all the pretty words, I didn't really see any especially bad treatment. Hell, I'm pretty sure that your average newspaper's employees have worse nightmare stories.
“It was degrading as a human being,” said another. “We weren’t treated as individuals." hmmm. Trying hard to feel sympathy/empathy for these victims but my algorithm is throwing an exception.
Reducing the number of journalists implies that supervised learning[0] was effective enough to begin to replace some human effort.<p>[0] <a href="https://en.wikipedia.org/wiki/Supervised_learning" rel="nofollow">https://en.wikipedia.org/wiki/Supervised_learning</a>
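To make the point concrete: a minimal sketch of the kind of supervised learning setup the comment alludes to, where curators' past decisions become labeled training data for a classifier. Everything here is hypothetical (the toy headlines, the "trend"/"skip" labels) — it just illustrates the linked technique with a tiny naive Bayes model, not anything Facebook actually ran.

```python
import math
from collections import Counter, defaultdict

# Hypothetical labeled data: headlines a human curator either promoted
# to the trending module ("trend") or rejected ("skip").
TRAINING_DATA = [
    ("celebrity spotted at airport", "skip"),
    ("earthquake strikes coastal city", "trend"),
    ("you won't believe this trick", "skip"),
    ("election results announced tonight", "trend"),
    ("shocking photos of pop star", "skip"),
    ("central bank raises interest rates", "trend"),
]

def train(examples):
    """Fit a tiny multinomial naive Bayes model: per-label word counts."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    vocab = set()
    for text, label in examples:
        words = text.split()
        word_counts[label].update(words)
        label_counts[label] += 1
        vocab.update(words)
    return word_counts, label_counts, vocab

def classify(text, model):
    """Return the label with the highest log posterior (add-one smoothing)."""
    word_counts, label_counts, vocab = model
    total = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        score = math.log(label_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in text.split():
            score += math.log((word_counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

model = train(TRAINING_DATA)
print(classify("interest rates decision announced", model))  # → trend
```

Once enough curator decisions are logged, a model like this (in practice, something far more capable) starts agreeing with the humans often enough that fewer of them are needed — which is exactly the dynamic the comment describes.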
I don't think the assessment that they were part of a future algorithm is too far off. Given enough expert input, picking a fitting image/video or headline should be doable. Worst case, you reduce the number of humans needed to one quality-assurance person. Algorithm says "this headline, this image"... yes/no; fix.
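The "one QA person" workflow described above can be sketched as a simple human-in-the-loop: the model proposes a headline/image pair, the reviewer approves or corrects it, and every verdict is logged as fresh training data. All names and data below are hypothetical; `propose()` is a stand-in for a real model.

```python
# Sketch of an "algorithm proposes, one human approves" loop.

def propose(story):
    """Stand-in model: pick the first candidate headline and image."""
    return story["candidate_headlines"][0], story["candidate_images"][0]

def review_loop(stories, human_review):
    """Run each proposal past a single QA reviewer; log every decision."""
    training_log = []  # (story_id, proposal, verdict) tuples for retraining
    published = []
    for story in stories:
        proposal = propose(story)
        verdict = human_review(story, proposal)  # "yes" or ("fix", corrected)
        if verdict == "yes":
            published.append((story["id"], proposal))
        else:
            _, corrected = verdict
            published.append((story["id"], corrected))
        training_log.append((story["id"], proposal, verdict))
    return published, training_log

# Toy run: the reviewer rubber-stamps everything except story 2.
stories = [
    {"id": 1, "candidate_headlines": ["Quake hits coast"],
     "candidate_images": ["map.jpg"]},
    {"id": 2, "candidate_headlines": ["Shock pics!!"],
     "candidate_images": ["blurry.jpg"]},
]

def reviewer(story, proposal):
    if story["id"] == 2:
        return ("fix", ("Coastal quake photos released", "official.jpg"))
    return "yes"

published, log = review_loop(stories, reviewer)
print(published[1][1][0])  # → Coastal quake photos released
```

The design point is that the reviewer's "fix" responses are the most valuable part of the log: they are exactly the supervised signal needed to make the proposer good enough that the yes/no step eventually becomes a formality.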