> <i>It's a time when we need to rely on trusted news sources</i><p>That's the rub, isn't it? Who's a "trusted" news source? All the major players have shown biases, corruption, and manipulation. In response, lots of people consider some fly-by-night Facebook feed that says what the majors don't to be more "trusted."<p>We shouldn't "rely" on single points of perceived trust, especially in this massive for-profit, for-power media industry. We need to be broadly informed from various angles to have a better chance of piecing together a reasonable sense of what's going on.
Here's what I'm worried about with fake video. If you think fake Russian bot news in US politics is a problem, consider the following.<p>Facebook is increasingly populated with a large number of new users from developing nations who are basically one or two generations removed from subsistence farming. The level of naivete and rumor spreading that you can see on Pakistani Facebook is alarming. Now combine nation-state-funded agitprop campaigns with naive users and fake video. Read the following article and imagine how much worse it could be if a malicious organization with funding decided to further weaponize content on Myanmar-oriented Facebook groups and pages.<p><a href="https://www.theguardian.com/world/2018/apr/03/revealed-facebook-hate-speech-exploded-in-myanmar-during-rohingya-crisis" rel="nofollow">https://www.theguardian.com/world/2018/apr/03/revealed-faceb...</a>
I don't know, for this type of fake video you don't even need face mapping or voice mapping implemented by deep learning.<p>Just take any random Barack Obama video (we have plenty of those), use a similar male voice, or splice together convenient clips of Obama's voice out of context, and some people will believe it.<p>It won't fool everybody, but that's not important. Some people will still believe it and act accordingly (see the Pizzagate conspiracy theory), and that's more than enough to achieve the malicious intent.
BuzzFeed's article with the backstory behind this video: <a href="https://www.buzzfeed.com/davidmack/obama-fake-news-jordan-peele-psa-video-buzzfeed" rel="nofollow">https://www.buzzfeed.com/davidmack/obama-fake-news-jordan-pe...</a><p>(Disclosure: I work for BuzzFeed)
People are seeing the risks of fake videos being created, but there's another risk that I think is being overlooked. As convincing fake videos become a reality, and as articles like this inform the general public of that reality, it becomes incredibly easy to dismiss any inconvenient video as a deep fake. If Mitt Romney had made his 47% comment in the present climate, he likely would have been able to skate by claiming it was a deep fake created by political adversaries.
Many people already fool themselves by simply refusing to fact-check anything that doesn't support their biases. If you think "alternative facts" are bad now, just wait for this tech to hit 100% believability.
Can we get the link changed to the non-amp version? <a href="https://www.theverge.com/tldr/2018/4/17/17247334/ai-fake-news-video-barack-obama-jordan-peele-buzzfeed" rel="nofollow">https://www.theverge.com/tldr/2018/4/17/17247334/ai-fake-new...</a>
That sounds like a really bad Obama impersonator. The video is OK but still obviously fake.<p>If you look at the progression of film CGI, with its persistent flaws despite massive budgets, there's no reason to suspect that indistinguishable fakes are coming any time soon.
I'm a little disappointed that it took this long for people to get over the 'OMG celebrity porno' thing about deepfakes and actually start dealing with the real potential consequences. Deepfakes open up an amazingly large list of questions, from the mundane to the truly challenging. That someone will see imagery that fulfills a sexual fantasy is, by far, one of the most trite and harmless possible uses of this technology.<p>Realtime video synthesis has been around for a while. I recall reading in 1999 that if you watched the ball drop on New Year's Eve, the advertisements on the buildings you saw weren't the advertisements actually present in real life. They were being replaced on the fly with different ads. Now it's basically possible to replace people, everywhere that people appear. And this isn't something we can deal with simply by going off gut reaction and intuition; it raises genuinely significant questions.<p>Who will be the first politician to use a younger version of themselves to promote themselves? Which celebrity will be the first to replace not their face, but their body, in a film to make themselves more attractive? Which movie studio will be the first to film a movie using cheap performers and then exercise the rights they hold to a major celebrity's likeness by just faking them into it? Or maybe they'll only bring in the actual celebrity for close shots?<p>This can be done on commodity hardware, and will only get easier. It will be used for schoolyard bullying, for amazingly uplifting and important artistic works, for debasement and aggrandizement. It's one of those things that we've got to do some thinking about as a society.