It's also worth noting that there are far stupider methods that can (right now) produce much more convincing fakes than deepfakes. The simplest is taking an existing, real picture out of context and lying about what it shows. This is actively being done in current conflicts, e.g. taking real footage from the Syrian civil war or from previous wars in Gaza and presenting it as imagery from the current war. It works so well because there <i>are</i> enough absolutely horrible photos from the current war that the false attribution appears plausible.<p>If AI-generated imagery should ever become completely artefact-free and absolutely indistinguishable from real photos, then deepfakes will merely be <i>about on par</i>, in propaganda effect, with today's real photos.