Youtube's response regarding one of these videos documenting abuses (emphasis mine):<p>> "we've determined that your video does violate our Community Guidelines and have upheld our original decision. <i>We appreciate your understanding</i>."<p>Can someone explain to me why corporations, when interacting with customers over complaints/appeals, so often seem to have "don't forget to add insult to injury" as one of their mottos? Does that kind of patronizing tone sound polite to the ears of a PR drone?
Also see this Twitter thread: <a href="https://twitter.com/EliotHiggins/status/896358097320636416" rel="nofollow">https://twitter.com/EliotHiggins/status/896358097320636416</a><p>> Ironically, by deleting years old opposition channels YouTube is doing more damage to Syrian history than ISIS could ever hope to achieve<p>> Also gone are the dozens of playlists of videos from Syria I created, including dozens of chemical attacks in playlists by date<p>> Keep in mind in many cases these are the only copies of the videos, and in some the channel owner will have died, so nothing can stop it
It was folly to think that YouTube would be a safe place to document war crimes. YouTube is a distribution channel, not a preservation channel. Its ease of use certainly makes it an attractive option to upload things quickly, but anything of historical significance should have the video raws immediately turned over to a human rights organization for preservation.
I have (had) a channel with videos about missing people, their last sightings on CCTV, etc. The parents of one missing person even embedded one of the CCTV clips on their own site. They emailed me asking if I still have the video because they need it.<p>YouTube banned the whole channel for extremist/hateful content. Probably some of the videos/titles led the AI to classify the footage as extreme or some sort of glorification.<p>I appealed through some form but don't even bother anymore.<p>I hope YouTube as a video platform (not streaming) gets a serious competitor.
During the Arab Spring I suspected many police-violence videos would be deleted from YouTube. I had downloaded them to my server and posted the links everywhere for people to mirror them. Not a single person has yet.<p>I have been amazed at how little importance people put on this kind of video. You have video evidence of crimes with faces appearing clearly. It can take 5 to 10 years for such events to calm down enough to reach a point where the crimes can be prosecuted.<p>And it is hard to blame YouTube for that. They are considered the channel for Lady Gaga and silly cat videos. Hell, I know three-year-old toddlers who browse YouTube unsupervised.<p>In many places YouTube is criticized for promoting violence and extremism by leaving these videos up. I feel bad for them; they are between a rock and a hard place.<p>I just hope that the censored videos are not totally deleted from their servers. They should have someone reviewing criminal videos and keeping them at the disposal of judicial authorities, but even that opens a whole can of worms: do you obey only US authorities (who do not care about war crimes in other countries)? Do you obey all world authorities, including the Saudi and Chinese ones?<p>Anyway, that's YouTube's problem, not ours. Simply put, helping prosecute war crimes is not part of YouTube's mission, so do not trust them with it. To anyone who feels this content is important: use youtube-dl and keep backups (a minimal sketch below). Make torrents of it, share it around, make sure it does not disappear.<p>And when some NGO finally realizes that this content is precious, pump up your upload bandwidth and fill their servers.
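To make the youtube-dl route concrete, here is a minimal archiving sketch using its Python API (assumes youtube-dl is pip-installed; the playlist URL and paths are placeholders, not a real channel):

    import youtube_dl

    opts = {
        # keep uploader, date and video id in the filename so provenance survives
        'outtmpl': 'archive/%(uploader)s/%(upload_date)s - %(title)s - %(id)s.%(ext)s',
        'writeinfojson': True,                 # save metadata next to each video
        'ignoreerrors': True,                  # skip dead videos instead of aborting
        'download_archive': 'downloaded.txt',  # re-runs only fetch new videos
    }

    with youtube_dl.YoutubeDL(opts) as ydl:
        ydl.download(['https://www.youtube.com/playlist?list=EXAMPLE'])

Run it on a cron job against the playlists you care about; making torrents of the resulting archive directory covers the sharing half.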
Such AI, coupled with the inflexible policies of companies like Google and Amazon, is already starting to be a problem and will only get worse as it's deployed more broadly. Accounts are closed without recourse for invalid reasons and their owners treated like violators. Short of a law requiring explanations and an appeal process, I don't see this situation ever getting better. Yet another reason not to trust these companies or use their services that require creating accounts and agreeing to their bullshit TOS.
Maybe people should get their shit together and realize that <i>true free speech</i> includes <i>allowing</i> videos that seek to recruit people into despicable organizations <i>to remain available!</i> Yeah, even Hitler had <i>a right to say what he thought</i> and it's a <i>good thing</i> he had it, <i>despite the consequences that ensued.</i><p>The <i>problem that needs to be solved</i> is how to <i>educate people into not being lured into those organizations DESPITE having access to those materials...</i> This kind of censorship is just as STUPID as banning drugs like heroin and cocaine (instead of just making them unavailable to children, or without a "license") or the "war on drugs".<p>Imho the problem comes from the fact that corporations try too hard to be "democratic" about things and "please the majority". But this is not a good idea: sometimes a 99% majority is against freedom, and <i>they are wrong</i>, despite being the 99%. And the majority should be opposed and <i>freedom protected</i> even when the cost is someone's blood. For me personally, there are these words from my native country's national anthem: <i>"life in freedom or death [for all]"</i>... and I will sure as hell fight, die or kill for them.
To me, if you want to regulate controversial opinions, you have to err strongly on the side of too-open.<p>Remember, before the Declaration of Independence our founding fathers were terrorists/rebels. I don't mean this as a snappy, hollow comparison. I'm saying that, fundamentally, you can't distinguish between a US soldier recruitment video and an ISIS soldier recruitment video without applying a moral context. How would an AI ever do this? And even if it could, whose moral retelling is the right one?<p>Better in my mind to stay out of the censorship game altogether and promote a forum that is inherently structured in a format that incentivizes accuracy over emotion.
YouTube is buckling under its own size. They're discovering what should have been obvious to anyone: the sheer amount of content entering their centralized system is impossible to moderate in any fair way. The only ways they can manage are to (A) prioritize quality moderation for the more popular channels, and (B) enforce the most bland, vanilla experience possible.<p>They need to moderate because they are centralized, and their revenue demands it. We, as a society, need to create a better option. Not just another YouTube, but a seamless decentralized solution.
Why not create a setting that lets users see YouTube either as sanitized by their AI or with all content?<p>Allow people to choose a content level just like they choose a security level in browser settings.<p>1. Legal content. May include content that violates YouTube's content policy but is legal in the USA or the country of the viewer. Maximum freedom of speech and maximum exposure to content you may find offensive.<p>2. YouTube content policy met. Content that is legal and meets the YouTube content policy.<p>3. Legal, meets the YouTube content policy, and meets a certain org's taste. Like picking a charity to donate to when you shop on smile.amazon.com, you select the org whose bubble you want to live in: ADL, Focus on the Family, Skeptics, etc. The org bans content and it is only banned for people who opt into that blacklist on YouTube.<p>4. When a user is not logged in they get the AI-filtered list, but can still select the "all legal" or "all that meets content policy" filters. All other bubbles are available to logged-in users only.<p>Advertisers can opt into certain bubbles if they want, or opt out of certain content, e.g. content deemed inappropriate by the AI.<p>How does that sound, YouTube?<p>Don't government security agencies want to know who is watching extremist content and who is not interested in it? How would we know who the extremists are if they fall back to person-to-person, in-person communication?
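To make the proposal concrete, a toy sketch of how those levels could compose (every name here is hypothetical; nothing is a real YouTube API):

    from dataclasses import dataclass, field
    from enum import IntEnum

    class FilterLevel(IntEnum):
        LEGAL_ONLY = 1       # level 1: anything legal in the viewer's country
        CONTENT_POLICY = 2   # level 2: also meets YouTube's content policy
        ORG_CURATED = 3      # level 3: also passes a chosen org's blacklist

    @dataclass
    class Video:
        id: str
        banned_countries: set = field(default_factory=set)
        meets_policy: bool = True

    @dataclass
    class Viewer:
        country: str
        filter_level: FilterLevel = FilterLevel.CONTENT_POLICY  # logged-out default
        org_blacklist: set = field(default_factory=set)

    def visible(video: Video, viewer: Viewer) -> bool:
        if viewer.country in video.banned_countries:
            return False  # legality is the floor at every level
        if viewer.filter_level >= FilterLevel.CONTENT_POLICY and not video.meets_policy:
            return False
        if viewer.filter_level >= FilterLevel.ORG_CURATED and video.id in viewer.org_blacklist:
            return False
        return True

The point is that each level only narrows the previous one; nothing legal is ever deleted, just hidden from people who opted out.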
Yes, I could see how that classifies as "extremist material", but that's no reason to delete them...<p>IMHO the gradual increase of (self-)censorship in the popular Internet is worrying --- one of the most compelling things about the Internet as it existed was that, from the safety of your own home, you could see and experience things that would otherwise be impossible to access. Now it seems it's turned into a massively commercialised effort of "curating" content so that it doesn't offend anyone, and only results in more profits for advertisers.
Since my understanding is that covering up a war crime is itself a war crime under the complicity doctrine, could Google executives be charged for this in The Hague?
I remember when I used to like - no, <i>love</i> - almost anything Google did.<p>That seems like such a long time ago. Since then my attitude has changed to being mostly <i>hostile</i> towards Google, with every such event.<p>Google should never have entered the "content game" and should have remained a neutral search and distribution (YouTube) platform. Once it went down the path of being a content company, it started "compromising" in all sorts of ways that were terrible for its users.<p>I wonder if the higher-ups have even noticed this change in attitude towards them; if they have, they've probably decided that making money is more important, even if they become the Comcast of the internet (the most hated company).
Have they checked with YouTube to see if the files are actually deleted?<p>Like just because their gateway won't give you access to it doesn't necessarily mean that the bits have been scrubbed on the back end.<p>Also: here's a project to archive this information.<p><a href="https://media.ccc.de/v/33c3-7909-syrian_archive" rel="nofollow">https://media.ccc.de/v/33c3-7909-syrian_archive</a>
Once again, the only hope for customer service seems to be a (social) media shitstorm.<p>Seriously, Google, Twitter and FB massively need to ramp up their customer service and stop externalizing the costs of a lack of support onto society. And there are many "costs": people being actively harassed and intimidated, sometimes to the point that they are afraid to leave their house, due to hate speech or doxxing; a loss of historically relevant information, as in this case; people locked out of vital emails or their businesses (e.g. when their Gmail account gets closed due to copyright violations on YouTube)...
If you use YouTube, you are subject to the whims of that private corporation, regardless of whether it's right or wrong.<p>They should find a way to host the content somewhere else.
Why are people storing evidence on YouTube again?<p>Not blaming the victim, but at this point most Google services have not proven to be reliable, especially if you require some kind of thinking human behind a decision.
I feel like YouTube uses its monopoly to create a walled garden focused on (in their own words) advertiser-friendly content.<p>The thing is, it makes perfect sense from their side - they will make people angry, but why would they care if those people can't go anywhere else?<p>I'm starting to feel that a competitor providing the same quality of service while allowing all kinds of videos has a chance to succeed. It's OK to have child videos, porn and Syrian documentation on the same platform, as long as you can filter - maybe have some sort of "curiosity" slider with child-friendly content at one end, YouTube-level content in the middle and all content at the other end, plus some category toggles, etc. If you're unhappy with the current selection, just take a few minutes of your time and change your preferences.
Given that all of the videos happen to be anti-ISIS... and YouTube happens to be owned by an evil empire in bed with American military industry which created ISIS... the AI must have figured out that the videos could be a threat to its masters.
What did they train the AI on to deem something 'extremist'?<p>Should we get to see the training data and labels used?<p>Or is this the modern-day equivalent of a credit-score algorithm: something that can have a huge impact on lives, but whose workings you are not allowed to know.<p>This is bad.
YouTube is a really horrible service for content creators. For this type of content, you're probably best off with LiveLeak (which, incidentally, seems to be a much better source of breaking news than YouTube these days). Ideally, we'd all switch to LBRY or some sort of IPFS video distribution, but that will take time.
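For the IPFS direction, a rough sketch with the ipfshttpclient library (assumes a local IPFS daemon running on the default port; the filename is a placeholder):

    import ipfshttpclient

    client = ipfshttpclient.connect()   # talks to the local IPFS daemon
    res = client.add('footage.mp4')     # content-addressed: the hash identifies the file
    print('share this hash; anyone can pin a copy:', res['Hash'])
    client.pin.add(res['Hash'])         # pinning keeps the local copy alive

Once the hash is shared, a takedown would require every pinner to comply rather than one central host.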
War crime evidence can also be extremist material. It is often repackaged as propaganda to rile up new troops.<p>Give evidence to the courts or police. Don't upload it to a video entertainment site, skirt their rules, and expect it to stay up.
As I understand it, this is the result of Google itself having quite strong political opinions, at least recently. They have profiled themselves as leftist/progressive... their software just enforces this.
Very related to this article about Facebook [0].<p>Corporations control what info is passed to people and create their own version of reality by blocking what they don't agree with.<p>I know it's AI, but it seems that Google's appeal process just agrees with the AI's decision.<p>People should read Noam Chomsky's Manufacturing Consent; here's an interview about it from 1992 [1].<p>[0] <a href="https://news.ycombinator.com/item?id=14998081" rel="nofollow">https://news.ycombinator.com/item?id=14998081</a><p>[1] <a href="https://www.youtube.com/watch?v=AnrBQEAM3rE" rel="nofollow">https://www.youtube.com/watch?v=AnrBQEAM3rE</a>
YouTube does not seem to me to be an appropriate medium for "war crimes evidence". Evidence needs documented provenance, chain-of-custody, storage integrity, affidavits, etc etc. Why does this evidence need a high-bandwidth publicly accessible and searchable interface? For what purpose?<p>To be honest, if you have evidence of a war crime, I hope your plan to seek justice doesn't depend on Youtube.
In case it's not already apparent, there's a business opportunity here for someone to automate "set up an S3 bucket and host videos in it" as an app that uses an API key: you simply provide the key to the app and it manages your video collection, gives you a front-end for it, and charges you a fee per month.
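A minimal sketch of what such an app would do under the hood with boto3 (bucket name, file names and credentials are placeholders):

    import boto3

    s3 = boto3.client(
        's3',
        aws_access_key_id='USER_PROVIDED_KEY',        # the key the user hands the app
        aws_secret_access_key='USER_PROVIDED_SECRET',
    )

    # outside us-east-1 you'd also pass CreateBucketConfiguration
    s3.create_bucket(Bucket='example-video-archive')
    s3.upload_file(
        'interview.mp4', 'example-video-archive', 'videos/interview.mp4',
        ExtraArgs={'ContentType': 'video/mp4'},
    )

    # a presigned URL lets the app play the video back without a public bucket
    url = s3.generate_presigned_url(
        'get_object',
        Params={'Bucket': 'example-video-archive', 'Key': 'videos/interview.mp4'},
        ExpiresIn=3600,
    )
    print(url)

The app's whole value-add would be wrapping this in a nice library UI and handling the lifecycle/billing details.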
Often there is no difference between war crime evidence and war crime glorification that machine learning could discern. Exactly the same content could be interpreted as "look at us do great things in defense of our noble ideals!" and "look at these monsters do horrific things for no justifiable reason!".<p>The difference is in the audience's mindset - which is only partially influenced by the uploader's intentions, and partially by how other pages and channels link to the video and present it, and partially by historical context (the same content can acquire a different interpretation five years down the road). Machine learning cannot be expected to emulate that.
I am very concerned about Google using AI to filter hoaxes from search results. The government testing syphilis on the black population, or selling drugs to fund terrorism? Those must clearly be hoaxes, right?
One of the most interesting developments in AI will be watching how we respond to human rationality detached from human morality. Programs that optimize for practical outcomes are going to come up with a whole host of solutions that we consider abhorrent, not least because the mere notion that that solution is a practical one riles our sensibilities.
I find this interesting in comparison with the Google API that detects toxic comments. I suppose we'll be seeing the same sort of situation in comment sections (though less irritating).
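The comment API in question is Perspective; roughly what a call against its v1alpha1 endpoint looks like (the API key and sample text are placeholders):

    import requests

    resp = requests.post(
        'https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze',
        params={'key': 'YOUR_API_KEY'},
        json={
            'comment': {'text': 'example comment to score'},
            'requestedAttributes': {'TOXICITY': {}},
        },
    )
    # summary score runs from 0.0 (benign) to 1.0 (toxic)
    score = resp.json()['attributeScores']['TOXICITY']['summaryScore']['value']
    print('toxicity:', score)

One can imagine the same opaque thresholding deciding which comments quietly disappear.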
To be fair, YouTube is under no obligation to some greater good; it's just a video hosting service. Expecting it to "preserve footage", and any footage at that, is a strange expectation.
Torrent-based YouTube alternative when? I think the technology is ready to move all of the content to a distributed system where it cannot be censored.
And to think they had me convinced that this was not going to happen for a few decades.<p>I think YouTube went down pretty fast and without a fight. The ideological takeover of Facebook and Twitter raged on for a few years; YouTube was taken over literally overnight. I remember being appreciative of YouTube just a few days back.<p>Guess it's time to cancel my $15 YouTube Red family membership. Ugh, I really hate ads on YouTube, and I was happy to give my $15 month over month. But I can't fund YouTube anymore given what they are doing. $15 to YouTube, $10 to Netflix, $10 to Amazon: with $35 a month I can sponsor a ton of content that I like on Patreon. My subscription list on YouTube is not 35 people long; I think it would work out.<p>Never did I think I would type these words... break up Google and Facebook and Amazon.
<i>should be required by law</i><p>If your videos don't pass the algorithm, post them somewhere else rather than reaching for the government hammer.<p>Youtube/Google has every right to run their business of posting or denying video content the way they see fit without justifying it to you, free user of their service.<p>If you think they're making a bad business decision and that there's a need for a video service that gives great explanations when they deny your videos, start such a service.
This was to be expected. All history books are written this way: they document government propaganda, not the truth. History at school is nothing but learning government propaganda.
Thank you, WSJ, NYTimes and the traditional media, for pressuring YouTube, Facebook, Reddit and social media to censor.<p>People aren't aware that for the past few years, traditional media and social media have been battling behind the scenes over content, narrative and censorship. A major war went on that the public was simply unaware of. Suffice it to say, traditional media won.<p>It is amazing how a select group of news organizations and their editors and journalists can use their bully pulpit to intimidate certain industries.
YouTube is not a reliable video host, but that's okay. It's a company.
Fortunately, these videos are merely evidence; they don't really rely on people finding them through algorithmic recommendations.
I don't see a problem, and I completely understand why YouTube (especially as it's trying to be as inoffensive as it can) doesn't want to show war crimes.