This looks like it's going to be an increasingly big problem for the platform giants as their speech restrictions grow more aggressive. As they ramp those restrictions up, I'd expect their need for security to increase accordingly.<p>A mentally unwell person is likely to feel targeted, oppressed, threatened, or harmed by being silenced. They'll feel isolated, and it will very likely feel like a personal attack by the tech company. The platforms today are so large that being banned can surely seem like being cut off from society in general, like a human right being revoked.<p>The people the platforms are looking to restrict based on expressed views or behavior are, I suspect, going to have higher than normal rates of mental illness (to be clear, I think the rate is likely higher, not that it's universal).
She's a disgruntled YouTuber.
This is her website: <a href="http://nasimabc.com/" rel="nofollow">http://nasimabc.com/</a><p>From her page showing a screenshot of her YouTube ad revenue dashboard:<p>> Analytics Last 28 days<p>> Views 366,591<p>> Revenue $0.10<p>Highlighted in red: Revenue $0.10?<p>There's also another bit where she shows her historical vs. current traffic and makes a reasonable case that she was being down-ranked.<p>Interesting times.
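For scale, that works out to $0.10 / 366,591 views ≈ $0.27 per million views, i.e. an RPM of roughly $0.0003. Monetized channels typically report RPMs on the order of dollars per thousand views, so if the screenshot is genuine, that channel was effectively demonetized outright, not just earning poorly.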
One of her complaints was that YouTube age-restricted her yoga videos (which were modest by Western standards; she was just wearing shorts, a shirt, and no socks) while not age-restricting far more explicit Nicki Minaj and Miley Cyrus videos. Is there any explanation for why YouTube does this?
What a terrible disaster this afternoon. It should never have ended like this.<p>The only thing I hope comes out of this is that it reignites the debate about megaplatforms and censorship (Facebook, YouTube, Google, ...). The bigger they become, the more they censor.<p>More people need to join the movement to host their videos independently. PeerTube is a good start, for example: <a href="https://joinpeertube.org/en/home/" rel="nofollow">https://joinpeertube.org/en/home/</a>
Looks like this is the culprit's website:<p><a href="http://www.nasimesabz.com/index.html" rel="nofollow">http://www.nasimesabz.com/index.html</a><p>There are links to her (4?) YouTube accounts, which have all been terminated. There's a link to her Instagram, which has also been terminated; however, there is a cached copy here: <a href="http://www.pictame.com/tag/yesilnasim" rel="nofollow">http://www.pictame.com/tag/yesilnasim</a>
As I see it: she posted on her site that she recently made 10 cents for 366,000 views. She must have been earning quite a bit before the algorithm decided she wasn't making advertiser-friendly videos and the spigot got shut off. She must have appealed, and we all know how those go.
So looking at her videos on Telegram (<a href="https://t.me/nasimesabz1" rel="nofollow">https://t.me/nasimesabz1</a>), she was anti-gun. She was a very weird individual with clear mental health issues.<p>I don't even know what to make of most of her content on that channel.
So if this person was not an employee, I'm amazed she got into the building. Usually these large tech campuses are pretty locked down, with at least one badged entrance and security or front-desk staff watching who comes and goes.<p>That said, I've never visited the YouTube campus, so maybe it bucks the norm.
Firstly, my sympathies are with the people at YouTube HQ. I have many, many friends in the Bay Area, so this kind of thing hits close to home.<p>Secondly, I struggle with whether we should give airtime to a shooter's grievances or reasons, or even mention who they are. I tend to be on the side of: don't give them any airtime. And certainly don't acquiesce to this type of behavior, because it is almost by definition "terrorism" (using violence to cause change in policies).<p>On the other hand, I have been hearing a lot about YouTube demonetization, censorship, etc. Should this shooting be a part of the discussion about censorship? Should YouTube and the tech community let it affect the discourse around censorship?<p>I don't know.
Her videos are still up on Dailymotion <a href="http://www.dailymotion.com/yesilnasim/videos" rel="nofollow">http://www.dailymotion.com/yesilnasim/videos</a>
No justification for her actions, but Google does act as a serf master over the entire internet. It can single-handedly decide who succeeds and who fails, and there is no recourse, no appeal. Some 20-year-old dude in Menlo Park decides, and your website, your life, your work, your future is changed forever. Here's our account: <a href="https://www.medgadget.com/google" rel="nofollow">https://www.medgadget.com/google</a>
She's complaining about being censored. She would have been helped immensely by somebody showing her how to host her own videos on her own website.
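For anyone who doubts how low the barrier actually is, here's a minimal sketch of a self-hosted video page, assuming an H.264-encoded MP4 (the file path and title are illustrative, not from her site):

    <!DOCTYPE html>
    <html>
      <head><title>My Videos</title></head>
      <body>
        <!-- The browser's built-in player handles playback; no platform needed. -->
        <video controls width="640" preload="metadata">
          <source src="videos/example-video.mp4" type="video/mp4">
          Your browser does not support HTML5 video.
        </video>
      </body>
    </html>

Any static file server can serve that. The hard parts were never the hosting; they're discovery and bandwidth, which is exactly the leverage the platforms hold.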
A lot of people in this thread are attempting to describe Nasim as a 'weird' or 'unstable' individual. I wonder if there's some narrative-building going on here? Many want to explain it away with a cliche.<p>It's a shame what happened. Too bad our world isn't more like K-PAX.
Serious question here: what are the liability laws like in the US for producers of propaganda? Can you hold, say, Infowars or Breitbart (used as proxies here) liable if they incite or inflame this sort of person into going on a rampage?<p>You could easily prove in a court of law that what they peddle is not factual, so I don't see how they can avoid some liability. Freedom of speech is freedom to speak, not freedom from consequences.<p>Possible precedent here:
<a href="https://www.nytimes.com/2017/06/16/us/suicide-texting-trial-michelle-carter-conrad-roy.html" rel="nofollow">https://www.nytimes.com/2017/06/16/us/suicide-texting-trial-...</a>
It seems like the revenue share may have pushed her over the edge.<p>Honest question: does the revenue share program do anything positive? It seems to me like it mostly increases noise.
The amount of victim blaming in these comments is astounding. Three people are in the hospital because of an armed maniac, and a bunch of the comments are about how YouTube brought this on themselves with their policies about videos.<p>I’m not religious, but y’all need Jesus.
The first two comments are good for a world-weary chuckle:<p>1. A xenophobic, hateful comment about Muslims<p>2. A complaint that her suicide was a waste because "she's hot"<p>Still not sure why news websites bother with comment sections. I guess because it counts as "engagement"?
If you search her name on Twitter, around 40% of what you see is right-wing conspiracy theorists who assert, without evidence, that she was a member of ISIS. I also noticed a smaller percentage of people asserting, again without evidence, that she was an NRA member. Neither appears to be true.<p>How do these get spread on Twitter? Are partisans mindlessly voting them up? Are bots behind this?<p>Man, do I feel stupid for thinking the internet would be a force for good. That it would promote democracy, free speech, and critical thinking.<p>I look at Facebook and Twitter, and the bad greatly outweighs the good. I chose a career in web dev thinking I would be doing something beneficial, but everything we've done amounts to nothing.
I'm sure this submission will get flagged off the front page too because "Iranian woman bears arms against YouTube for censorship" doesn't really fit the narrative around here, but here's her page before it's wiped out too. Her YT and Insta have already been nuked.<p><a href="http://www.nasimesabz.com/" rel="nofollow">http://www.nasimesabz.com/</a>
Please consider not giving this person the online attention she sought as a result of her actions, and join me in flagging stories that dwell on her motives.
Content platforms have no place policing content. All the algorithms currently used are utterly flawed, and they always will be, both technically and morally. Technically, they'll always have false positives, especially at scale. Morally, they inherently turn the platform owners into political figures, siding with or against certain thoughts and sentiments.<p>The only solution is for platforms to be completely content-agnostic and allow absolutely everything except outright graphic violence and adult content.<p>No removing any kind of speech whatsoever, including so-called "hate speech", no policing political topics, no identifying "fake news", and so forth. Words cannot hurt anyone, objectively, and everyone has their own personal agency to decide what they want to watch and to make their own decisions based on what they see, even if it's state-funded propaganda.<p>If the police or FBI come to them with a warrant or a takedown notice, then sure. Otherwise, allow absolutely everything, and let the free market of popularity reign supreme.<p>In fact, all sides of all political and social issues should be outraged that content platforms think they have a place deciding what we can and cannot see.<p>And advertisers should stop letting online outrage dictate where they advertise, by recognizing two simple truths: a) if I'm watching a video with your advertisement on it, it's because I made the conscious choice to watch that video, and even if the video is about apartheid or waterboarding or guns, it's content I want to consume, so they have no place stripping me of my personal agency; and b) the old model of advertisers being seen as endorsers of TV shows doesn't apply to on-demand online content.
I have a theory about workplace shootings. American companies have a very pronounced dog-eat-dog culture. So much so that whole books have been written about the workplace asshole [1]. I'm wondering if some of the shootings can be explained as the workplace asshole going one step further and bringing his guns to work.<p>[1] <a href="https://en.wikipedia.org/wiki/The_No_Asshole_Rule" rel="nofollow">https://en.wikipedia.org/wiki/The_No_Asshole_Rule</a>