None of this would be important if social media gave us what they originally sold us: updates from your friends, family, and the people you choose to follow, in chronological order, rather than ordered by weird engagement algorithms and privacy-destroying ad networks.
I am surprised this segment (admittedly picked from Ars's secondary writeup) hasn't made a splash:<p><i>"It's why I've seen priorities of escalations shoot up when others start threatening to go to the press, and why I was informed by a leader in my organization that my civic work was not impactful under the rationale that if the problems were meaningful they would have attracted attention, became a press fire, and convinced the company to devote more attention to the space," Zhang wrote.</i><p>That is a damage control role. Perhaps more tellingly, it highlights the entire organisation's priorities: if it isn't drawing press attention, ignore it. Of course that's not the phrase FB would use in a press release. They'd deploy a convenient euphemism, such as "dedicate the resources elsewhere".
The pattern of hiring young, passionate, ambitious workers, then telling them their job is of critical importance to the company (and, in this case, society at large) while simultaneously underfunding their team and providing them with completely inadequate leadership is REALLY common in Silicon Valley companies. These same companies will actively stigmatize saying "it's not my job," and so you have very green employees who end up doing work that's wildly outside their zones of competence and comfort, internalizing all the stress that builds up along with being put in that position and not even understanding that speaking up is an option.<p>Many of these people lack the experience required to see the forest for the trees and they draw similar conclusions to the ones in this memo. "There's no bad intent, we're just overworked and underresourced" (paraphrased) is something I've heard time and time again from people working on supposedly important problems at companies making money hand over fist.
This is pretty frustrating. She clearly said she wanted her privacy respected, and they even acknowledge that in the article, so why did they publish her full name and a short description of her LinkedIn, making it even easier to find her? What motivation did they have to do this?<p>Yet they hid the name of the software engineer who spoke to her credibility? Something seems a little off, either on the source's side or on the distributor's side.
> Still, she did not believe that the failures she observed during her two and a half years at the company were the result of bad intent by Facebook’s employees or leadership. It was a lack of resources, Zhang wrote, and the company’s tendency to focus on global activity that posed public relations risks, as opposed to electoral or civic harm.<p>> “Facebook projects an image of strength and competence to the outside world that can lend itself to such theories, but the reality is that many of our actions are slapdash and haphazard accidents,” she wrote.<p>> “We simply didn’t care enough to stop them”<p>This is the key takeaway, IMO. Not as an excuse for Facebook, but as an indictment of "slapdash" information technology in general, particularly social media. It's becoming more and more clear that "bringing the world closer together" is a Pandora's box whose consequences Facebook is not equipped (or motivated?) to deal with. Maybe no company ever could be. Maybe this is simply a thing that shouldn't exist.
"“I have personally made decisions that affected national presidents without oversight, and taken action to enforce against so many prominent politicians globally that I’ve lost count,” she wrote."<p>The scale of how the platform's being used for political manipulation in every country is enormous, and it's clear that if a junior data scientist is having to independently make these decisions, that there's little interest in proactively dealing with this.
What is the public interest in publishing her name after she has expressed concerns about her safety? Shame on BuzzFeed.<p>"In her post, Zhang said she did not want it to go public for fear of disrupting Facebook’s efforts to prevent problems around the upcoming 2020 US presidential election, and due to concerns about her own safety. BuzzFeed News is publishing parts of her memo that are clearly in the public interest."
I get frustrated over these pairs.<p>What are Facebook supposed to do? They could spend billions moderating every comment and like, but they'd piss off every politician worldwide and the users would all cry censorship (and that's if they got it perfectly correct). They could pick a side, but the same would apply with slightly fewer pissed-off people. They could do nothing, save billions, and piss off fewer people.<p>And in the background, a small number of people continue to manipulate everything you see in legacy media, and no one really cares because we're used to it. Seriously. What the fuck?
I can see it in my least tech savvy, least educated friends.
As Facebook users, they seem to be radicalising the longer I leave them to their own devices.<p>But what’s the alternative?<p>If people want family-and-friends social media, where should they go?<p>Aren’t the open/alternative platforms just as open to abuse, if not more so, since no one like this whistleblower is even hired to police them?
What's scary is that with all the resources that FB has, it still has to prioritize enforcement, which means that platforms like reddit or even HN have no chance of catching this.
Missing from the article is any causal link between Facebook bot farms and real-world effects, election outcomes, or deaths. It just says: oh, there were a million fake likes on a post in this country... months later, some political unrest. As if this never happened before Facebook?
Something I have always failed to understand is why people still work for this company. She states “I know that I have blood on my hands by now”; doesn't everyone who works there? At this point, it is well known that this is a product flawed to the core, maintained by a company that insists it is not a media company so as to evade all social responsibility, and that insists its AI will solve the unsolvable problem of moderation at scale. Ethical alternatives in the form of federated social networks already exist. Why do people still work there? Do they not care?
<i>“One of the big tools of authoritarian regimes is to humiliate the opposition in the mind of the public so that they're not viewed as a credible or legitimate alternative,” she told BuzzFeed News. “There's a chilling effect. Why would I post something if I know that I'm going to deal with thousands or hundreds of these comments, that I'm going to be targeted?”</i><p>That's not just a tool for authoritarian regimes; it's pretty much the most-used tool in any form of political conflict, in any country.
This sounds really bad. Searching the web for this title, it seems only small news services are running it so far. To be fair to Facebook, I am willing to wait a day and see what they and other organizations/news outlets/people say about this disclosure.
Is it just me, or does it seem like, with both social media and “tech” in general, the ‘regulation axe’ is grinding, and that it is only a matter of time before the algorithms at the core of these companies’ business models ‘suffer’ from blunt, harsh regulatory instruments that broadly stop this kind of influence and manipulation?<p>Doing so will also significantly harm these business models (and valuations) as we know them today.
The article claims that this kind of manipulation got reported by international news, but this is the first time I've ever heard of any of the examples it lists, which leads me to believe these manipulations don't really have that much power.
The only way anything will happen to Facebook is if these three things actually happen in sequence, within a short period of the first event occurring.<p>1) Facebook wittingly or unwittingly ignores political manipulation on its platform within the United States of America that demonstrably affects US political outcomes.<p>2) All necessary parts of the US government required to hold a corporation like Facebook accountable for 1) act in concert to do so.<p>3) The US mainstream media extensively reports on 1) and 2).
>Zhang said she turned down a $64,000 severance package from the company to avoid signing a nondisparagement agreement.<p>You really have to ask yourself what kind of place you're working for and what you're building if a totally regular employee is basically paid hush money not to speak about their job.<p>This isn't a private business any more; it's the mafia. People talk a lot about the culture of free speech and the rights of end users, but we live in a world where a private company that builds a social media website (this isn't the NSA or anything) can stop an employee from <i>speaking the truth</i>.<p>It's time policy makers threw all of this out of the window, together with the anti-competitive non-competes that, IIRC, at this point affect almost a fifth of the American workforce.
Cognizant employees (contract Facebook moderators) are on camera admitting that they censor specific people based on their political leanings.<p>Facebook sent them specific memos about which violent images targeting which politicians were not to be removed.<p>That particular set of facts escapes most coverage of this topic, for obvious reasons.<p>Conversations about this topic will never be seen as sincere, since they themselves are biased.
Facebook is trying to get attention worldwide by playing this master political role, among other things, but it's just a decoy: every day it loses real-world relevance as fewer and fewer people use it.
Do any of the major platforms really have a handle on how to deal with these challenges? I'm not excusing the lack of oversight. But most companies that grow this quickly are a complete cluster inside. Imagine having to battle well funded state actors on top of trying to build a business.<p>Again, not saying Facebook shouldn't be held accountable. But it's always easy from the outside looking in.
~~Can we get the actual report Zhang published, rather than a BuzzFeed link? I mean, is BuzzFeed really considered news?~~<p>EDIT:<p>I retract my comment; I was unaware that BuzzFeed News is distinct from BuzzFeed proper.