This is a case of government-authorized genocide. I just don't understand how shifting blame toward Facebook as a data-sharing platform solves the problem. If someone sends hate-filled letters to the public via USPS, is USPS complicit? Yes, FB could probably have done better by monitoring the content and muting hostile posts.
I hail from a small industrial town in eastern India, where very recently communal tensions erupted between pockets of Hindu and Muslim communities. One of the first things the local government did was to turn off the internet (mobile data and ISPs) to control the proliferation of rumors and anything that could incite further violence.
"A couple of hours outside Yangon, the country’s largest city, U Aye Swe, an administrator for Sin Ma Kaw village, said he was proud to oversee one of Myanmar’s 'Muslim-free' villages, which bar Muslims from spending the night, among other restrictions.<p>'Kalar are not welcome here because they are violent and they multiply like crazy, with so many wives and children,' he said.<p>Mr. Aye Swe admitted he had never met a Muslim before, adding, 'I have to thank Facebook because it is giving me the true information in Myanmar.'"<p><a href="https://www.nytimes.com/2017/10/24/world/asia/myanmar-rohingya-ethnic-cleansing.html?smid=tw-nytimesworld&smtyp=cur" rel="nofollow">https://www.nytimes.com/2017/10/24/world/asia/myanmar-rohing...</a>
<a href="https://docs.google.com/document/u/1/d/e/2PACX-1vQgMzRBc6P2mmlbqkF70gz2dIwK3ucj5JQTx6ygxjPqveHmT4bR41N_txC38X1ZW1pZ51DgdrEgwbkT/pub" rel="nofollow">https://docs.google.com/document/u/1/d/e/2PACX-1vQgMzRBc6P2m...</a><p>As someone who spent a lot of time in Myanmar in the past 6 years, I saw the country go from near zero internet to 50-60% penetration in a couple of years. The strange thing is that most people don't really use the web - it's all Facebook. The internet, for a vast majority, means Facebook. The way news spreads in Myanmar is gossip and rumours - probably because of very limited press freedom in the past. This combined with some ugly undercurrents of nationalism, extreme poverty (for all ethnicities) in the conflict areas, and decades of poor education, has sometimes turned Facebook into an amplified medium of hate speech.<p>I don't think Facebook is to be blamed for the violence (it has existed, and still does, in many parts of Myanmar against other ethnic minorities - Karen, Kachin, etc - before Facebook, and without the same amount of press given to Rohingya) but it has most likely amplified the hate speech.
I think a big problem is that modern communication platforms provide the power to assemble a public anytime, anonymously, and as a society we have yet to realise that. At a real-world gathering, you have media and government observers, and anybody making incorrect statements is usually scrutinized and exposed.<p>However, now that anyone can publish information quickly and to a large number of people, falsehoods circulate fast on any communication platform.<p>Here in India, in my village, people forward each other long false statements and rumours, and people believe them blindly and propagate them to their contacts. As a society, I don't think we know how to handle this.
It's incredible how fast sentiment has changed regarding Facebook. Back in 2010/2011, Facebook was being praised for fueling the Arab Spring, which toppled dictators and spread democracy (and also led to the horrific war in Syria).<p>Today, stories bounce from fake news, to Russian propaganda, to user-data mishandling, to fueling genocide in Myanmar.
From the article:<p>"He recalled one incident where Facebook detected that people were trying to spread “sensational messages” through Facebook Messenger to incite violence on both sides of the conflict. He acknowledged that in such instances, it’s clear that people are using Facebook “to incite real-world harm.” But in this case, at least, the messages were detected and stopped from going through."<p>Notable: "through Facebook Messenger"<p>So one extrapolates that FB actively monitors private conversations carried on Messenger.
> "there’s about to be an uprising of the Buddhists, so make sure that you are armed and go to this place"<p>> Facebook's systems detected what was going on and stopped the messages from going through<p>Wwwwhaaat?! Some people may have seen that message and interpreted it as "shit hit the fan, let's hide my family in a safe place until this cools down", even if it was intended as a "call to violence". Censoring messages like those could just as well have <i>caused deaths</i>, because innocent people didn't get the heads-up.<p>Corporations should clearly define themselves as either <i>"medium companies"</i>, staying completely neutral toward whatever flows through their platform as long as it is not "explicit content" (yes, this includes allowing "hate speech" as long as it's toned down, because that "hate speech" can also contain useful information, and it's not something clearly identifiable), or <i>"message companies"</i>, in which case they can clearly take sides in conflicts but are also legally responsible for their actions.<p>This muddy "middle ground" position that some companies take is "the root of all evil". Either <i>let anything happen</i> (including bad things), or <i>pick a side,</i> so that you can later be judged according to the side you picked. It's condescending to imagine that you're actually smart enough to "properly filter" information. You're not, or you're a tyrant imposing his value system on others.<p>I have more sympathy for a corporation that does evil deeds in the service of profit than for one that interferes in "muddy" ways in social issues and prevents clarity and the free flow of information. Sometimes this flow of information causes blood to be spilled, but sometimes problems get solved this way, if a society is not evolved enough to solve them in more peaceful ways. Toning down discussions and letting tensions accumulate is worse.
I once reported a group created solely to harass and defame my FB friend. The creator of the group clearly violated multiple Facebook policies: 3.3, "You will not bully, intimidate, or harass any user"; 3.9, "You will not use Facebook to do anything unlawful, misleading, malicious, or discriminatory"; 5.1, "You will not post content or take any action on Facebook that infringes or violates someone else's rights or otherwise violates the law."<p>Facebook's reaction? They found it doesn’t violate their community standards: <a href="http://const.me/tmp/fb-policy.jpg" rel="nofollow">http://const.me/tmp/fb-policy.jpg</a><p>Right, “no place for hate speech”.
This FB Myanmar issue reminded me of the looming crisis at Telegram:<p>'Telegram has told Russian regulators that it is technically unable to hand the encryption keys to user accounts to the country’s secret services, just weeks after the messaging platform was ordered to do so or risk being banned in the country
Roskomnadzor, Russia’s communications watchdog, told the company last month that it had two weeks to give the FSB, successor to the KGB security agency, access to the company’s encrypted messages or face the possibility of being blocked'.
<a href="https://www.ft.com/content/84a878da-3664-11e8-8b98-2f31af407cc8" rel="nofollow">https://www.ft.com/content/84a878da-3664-11e8-8b98-2f31af407...</a><p>Iran (where Telegram have 40 m users) is on the verge of banning it too as it was 'used to organise mass protests last year'.<p>We seem to have gone from FB being an enabler and hero of 'arab spring' to now being accused of being a tool of darker forces against states. Telegram have raised over 2 billion USD (of probably dodgy money given terms of ICO)and may now be crippled by state interference...
A slightly alternative analysis here.<p>These hatreds (if you will) are not new, at all. They have been "mismanaged" and/or conveniently exploited for as long as any of us can remember.<p>Certainly, the communications tool (aka FB) can play an enabling role. That is, nonetheless, a symptom; a symptom of a disease that predates the tool by eons.<p>The UN is confused and distracted, and it seems willing to let - once again - the true guilty parties off the hook. Yes, FB played a role. But to ignore the historical context is silly and dangerous.<p>The disease will persist. Because it can. Because it's easier to blame a symptom.
It seems more likely that the people of Myanmar turned into a beast, and Facebook reflects that. It's not some fringe ideology; everyone from politicians to monks has voiced support for this.
> "This work includes a dedicated Safety Page for Myanmar, a locally illustrated version of our Community Standards, and regular training sessions for civil society and local community groups across the country." [1]<p>Obviously I don't speak Burmese, but given Facebook has been weaponized for hatred and ethnic cleansing, a one-page cartoon PDF seems more than a little inadequate.<p>[1] <a href="https://www.facebook.com/safety/resources/myanmar" rel="nofollow">https://www.facebook.com/safety/resources/myanmar</a><p>[2] <a href="https://scontent-lht6-1.xx.fbcdn.net/v/t39.2365-6/15516483_387974314883053_3511041979574124544_n.pdf?_nc_cat=0&oh=b7540e97779373df37dcc3a60d7122d4&oe=5B6BC88C" rel="nofollow">https://scontent-lht6-1.xx.fbcdn.net/v/t39.2365-6/15516483_3...</a>
Social-news platforms like Facebook and YouTube are automated media machines, without the checks and balances of a people-heavy news media company. In traditional news, employees maintain stronger consciences because they deliver the news manually day after day; they are not separated from that news by automation, unlike the engineers working on social-news platforms.<p>Because of this, social-news platforms are freer to recommend stories that promote anything - as long as the user clicks, it's a win. With this anything-goes approach to news, we get sensational stories, conspiracy stories and hate stories, because people click on them. Obviously the consumers are also responsible, but being perhaps accustomed to journalistic standards, maybe they are culturally unprepared for the level of bullshit-dressed-as-news we are seeing online. The problem is compounded by the aggressive evolution of head-faking in media: it's becoming really hard to know what is real news versus marketing-dressed-as-news. How many commenters are real people? Basically, I think algorithm-driven news and marketing is outpacing traditional society and creating something new; time will tell what, but it might be a monster...
Zuckerberg today:<p>"I remember, one Saturday morning, I got a phone call and we detected that people were trying to spread sensational messages through — it was Facebook Messenger in this case — to each side of the conflict, basically telling the Muslims, “Hey, there’s about to be an uprising of the Buddhists, so make sure that you are armed and go to this place.” And then the same thing on the other side."
'In Myanmar today, Facebook is the internet'<p><a href="http://foreignpolicy.com/2017/11/07/facebook-cant-cope-with-the-world-its-created/" rel="nofollow">http://foreignpolicy.com/2017/11/07/facebook-cant-cope-with-...</a>
There's no better example than this of technology's unintended consequences. Tech has two sides but we only like to focus on the good one. We don't like to face the hidden beast.<p>We all need to keep that in mind.
I'm no fan of Facebook. I quit using it in September last year. But on this particular issue, the incitement of hate, they are no more or less guilty than the 'traditional' tabloid press.<p>Look at the comment section of your local 'populist' newspaper (over here it is <a href="https://www.hln.be/" rel="nofollow">https://www.hln.be/</a>) and see how hate speech rules supreme and how sensationalist articles are milking for those comments and likes.
A few times I reported hate speech on Facebook, but every time the report was rejected. It seems hatred towards certain groups of people is allowed on Facebook.
As long as Facebook still aggressively and effectively polices its platforms for violations of its reactionary puritanical agenda, nothing they say about the difficulties of policing hate speech and other hateful propaganda is in any way credible.
Does anyone here actually know how it is possible to contact Facebook's legal department? Who else could someone contact to whistle-blow something?
I honestly wonder how Mark and SS sleep at night. Can’t they afford to do better than this? Do their greed and shame know no bounds? I don’t get it, and it seems like something that will bite them in the long run and open them up to competition, whereas there might never have been a reason for a natural competitor to emerge if FB had just treated its ecosystem with a bit more respect and stewardship. Why don’t they want Facebook to be a curated garden rather than a landfill?
This is too witch-hunty; Facebook didn't outwardly do anything to encourage this. By the same logic, we could also blame the internet as a whole. There were similar messages on Twitter and chat apps. Facebook is big, so it gets the attention, but that doesn't mean it had outsized influence per capita. We need to be careful about putting full blame on the platforms that enable communication.
A deeply connected world is a more volatile world [1].<p>We're on the cusp of understanding that - if we're lucky.<p>[1] <a href="http://www.niallferguson.com/journalism/miscellany/why-twitter-facebook-and-google-are-the-antisocial-networks" rel="nofollow">http://www.niallferguson.com/journalism/miscellany/why-twitt...</a>
<a href="https://www.bloomberg.com/news/articles/2018-04-02/missouri-attorney-general-opens-probe-into-facebook-s-data-usage" rel="nofollow">https://www.bloomberg.com/news/articles/2018-04-02/missouri-...</a><p>"I influenced three senators for $477.85"<p>"The goal of the ad campaign was to convince people to call their Senate offices and tell them to vote No on a confirmation. I registered the domain dumpdevos.com anonymously, set up a Facebook page, and we were off."<p>Source:<p><a href="https://medium.com/@colinsholes/i-influenced-three-senators-for-477-85-c0256e8ba66c" rel="nofollow">https://medium.com/@colinsholes/i-influenced-three-senators-...</a>
"Indeed, when I asked the company whether it would permit an external audit of its News Feed workflow and algorithms to prove that there are no hidden or inadvertent biases against stories critical of itself, a company spokesperson repeated its statement that it believed there were no biases, but did not respond to two separate requests asking whether it would permit an external audit to prove it.<p>...<p>Machine learning approaches are especially troubling, as the company continues to <i>refuse to release any information</i> about the functioning and accuracy of its models, even as they play an ever-greater role in shaping what two billion people can see and talk about in its walled garden.<p>Most recently, when asked about its efforts to train machine learning models to autonomously decide what is "fake news," the company responded that it was using a large number of signals (though it declined to elaborate on the full list of signals used) to train computerized models to fully autonomously scan what is being posted and discussed on Facebook and identify new stories the algorithms believe are false - <i>all without any human intervention</i>.<p>...<p>Despite controlling what nearly a quarter of the earth's population sees and says online in its walled garden, the company has survived nearly a decade and a half of privacy outcries <i>without ever having to open up and give its users even the slightest insight</i> into how they are being manipulated, moderated and commercialized.<p>...<p>Putting this all together, Facebook's utopian vision has devolved into a surveillance dystopia in which even its programmer creators can't be certain how or why it makes the decisions it does.<p>In the end, the telescreens of Orwell's 1984 only surveilled the citizenry at random, while Facebook's unblinking algorithms never let us out of their sight, silently shaping what we are able to see and say without us having any right to understand the rules they quietly enforce, while 
even their engineer creators are not fully aware of the ramifications of the myriad inadvertent decisions that went into their programming."<p>Source:<p><a href="https://www.forbes.com/sites/kalevleetaru/2018/04/02/facebook-keeps-saying-trust-us-is-it-finally-time-to-say-no-more/" rel="nofollow">https://www.forbes.com/sites/kalevleetaru/2018/04/02/faceboo...</a>
"Facebook says "authenticity" is key to the social network and rigorously policed, and that false information violates the terms of service agreement.<p>...<p>Computer engineer Ryan Barrett fills in online forms with 0000s whenever a number is required and uses dashes for words. He says it is mostly out of principle: he wants to be in control of his information. Also, it's fun to try to fool the marketers. He has used a dozen different spellings for John Doe rather than entering his name. He even misspells his name when reserving airplane tickets and says it has never created a problem going through security.<p>...<p>He says he has friends who work at companies that look at multiple services to link up and cross-reference data on individuals-data gleaned from mobile phones, social media, grocery store loyalty cards and more. When those friends searched for him in their systems, they found little to no information. "There's a small feeling of satisfaction," he says.<p>...<p>All the lying does seem to foil advertisers. It is "a much bigger problem than people are aware of," says Nick Baker, director of research and consulting of U.K. market research company Verve, which conducted a 2015 survey showing a large amount of fake information on website registrations and the like.<p>Incorrect birth years, he says, are particularly nefarious because advertisers are often <i>trying to match up habits or buying patterns with a specific age group</i>.<p>...<p>Preethy Vaidyanathan, the chief product officer of New York-based marketing technology company Tapad, says they track much more valuable information from <i>phone and web browser use</i>.<p>Still, Ms. Vaidyanathan <i>sees the value in hiding identity online</i>. 
She says she <i>uses a second email address</i> with a fake name that she <i>gives out to companies she doesn't want to bombard her inbox</i>.<p>Source:<p><a href="https://www.wsj.com/amp/articles/you-werent-born-in-1910-why-people-lie-to-facebook-1522682361?tesla=y" rel="nofollow">https://www.wsj.com/amp/articles/you-werent-born-in-1910-why...</a>
"Facebook Allows Advertisers to Target Users on the Basis of Their Interest in Illegal Firearms"<p>"And it doesn't seem interested in closing this loophole any time soon."<p><a href="https://slate.com/technology/2018/04/facebook-lets-advertisers-target-users-on-the-basis-of-their-interest-in-illegal-firearms.html" rel="nofollow">https://slate.com/technology/2018/04/facebook-lets-advertise...</a>
We live in a world where knowledge of Islam by non-Muslims can be detrimental to Muslims who follow only certain verses (beliefs) of their book (aka peaceful Muslims); unfortunately for them, the holy book is immutable (and can also be used to justify ISIS).<p>Also, any form of attack on Muslims permits Muslims to retaliate, causing a vicious circle [1][2]<p>1. <a href="https://www.quran.com/9/36" rel="nofollow">https://www.quran.com/9/36</a>
2. <a href="https://www.quora.com/Where-does-it-say-in-the-Quran-to-kill-infidels" rel="nofollow">https://www.quora.com/Where-does-it-say-in-the-Quran-to-kill...</a>