It will be quite weird to have this brief blip in the anthropological record of like 1980-2010 where cameras were cheap enough that lots of real images were captured, and then 2010-2019 where everyone (in rich western countries at least) had cameras *all the time* and we got tons of real images of everyday life (filters are popular of course but using them is an intentional choice).

Then everyone goes inside for like 2 years (pandemic), there isn't much to take pictures of, and then by the time we start going back outside everything is actually paintings by AI artists.
Reading further into that thread, someone provides the OP with a blurry image of the moon but with craters in the wrong places: the "AI enhanced" photo is indeed much sharper and more detailed than the original, but it still retains the craters and land features in the wrong places. This suggests that the processing isn't simply replacing the moon with real, high-quality moon photos, but rather that it knows how the moon, or a moon-like object, should look and creates detail on the original image.

This looks a lot like what DLSS does in GPUs.

I'm torn on this: it's fake, but it's a good way of getting around optics limitations in certain scenarios. Is fake detail on a photo worse than no detail for non-scientific, consumer devices?
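One way to probe this at home is to compare the phone's output against both the doctored input and a genuine moon photo: if the output's crater layout tracks the doctored input, the pipeline is synthesizing texture on whatever geometry it was given rather than pasting in a stock image. A minimal sketch in Python, assuming the three images are pre-aligned and at the same scale; all file names are placeholders:

```python
# Hypothetical verification: which geometry does the "enhanced" photo follow?
import numpy as np
from PIL import Image

def load_gray(path, size=(256, 256)):
    """Load an image as a zero-mean, unit-variance grayscale array."""
    arr = np.asarray(Image.open(path).convert("L").resize(size), dtype=np.float64)
    return (arr - arr.mean()) / (arr.std() + 1e-9)

def ncc(a, b):
    """Normalized cross-correlation of two same-sized arrays (1.0 = identical)."""
    return float((a * b).mean())

enhanced  = load_gray("samsung_output.png")       # the phone's "enhanced" shot
doctored  = load_gray("doctored_input.png")       # craters deliberately misplaced
reference = load_gray("real_moon_reference.png")  # a genuine moon photo

# If the output correlates more with the doctored input, the pipeline is
# texturing the geometry it was given, not pasting in a stock moon.
print("vs doctored input: ", ncc(enhanced, doctored))
print("vs real reference: ", ncc(enhanced, reference))
```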
I'm amazed at this New York Times article from 1984 describing a "not-too-distant future" that is actually happening now:

https://www.nytimes.com/1984/11/04/magazine/photography-s-new-bag-of-tricks.html

> In the not-too-distant future, realistic-looking images will probably have to be labeled, like words, as either fiction or nonfiction, because it may be impossible to tell them apart. We may have to rely on the image maker, and not the image, to tell us into which category certain pictures fall.
Simplified summary for those scared off by a link to Reddit:

He made an image of an intentionally blurry moon, and then used the camera to take a picture of it while it was displayed on the monitor from across the room. The resulting picture was of a sharply detailed moon!
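If you want to reproduce the test image yourself, something like this works (the 170x170 downscale plus Gaussian blur roughly follows the recipe described in the Reddit post; "moon_sharp.jpg" is a placeholder for whatever sharp source photo you start from):

```python
# Making the test image: downscale a sharp moon photo until no real detail
# survives, blur it, then upscale it for full-screen display on a monitor.
from PIL import Image, ImageFilter

img = Image.open("moon_sharp.jpg").convert("L")
img = img.resize((170, 170))                          # destroy fine detail
img = img.filter(ImageFilter.GaussianBlur(radius=4))  # smear what's left
img = img.resize((1024, 1024))                        # blow up for the screen
img.save("moon_test.png")
```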
Next gen: you go to a music concert and take some crappy photos - low light, far from the stage. AI recognizes the artist, downloads the artist's most recent high-res stock pictures from GettyImages, has a general model of how a scene is set up (lights, lasers), and re-synthesizes your photo into something with the production quality of a movie concert scene.

For video, it also resamples the clipped audio with the actual song run through a "live concert" audio filter, and adds some nice crowd noises too.

Next-next-gen: using deepfake technology, the artist voices a song dedication just for you.
The original discussion unfolded here 2 days ago:

*Samsung "space zoom" moon shots are fake, and here is the proof*

https://news.ycombinator.com/item?id=35107601 (375 comments)

And to potentially save you a click, here's the original post:

https://old.reddit.com/r/Android/comments/11nzrb0/samsung_space_zoom_moon_shots_are_fake_and_here/
Yeah, I can just about get a decent-looking crop of the moon on my Nikon Z50 with the 250mm lens on it, if it's on a tripod. There was no way there wasn't some other trickery going on here.

What is worrying is: why would you buy something with a camera that lies at the source? When you need it to tell the truth, how do you know it will?
It happens that Samsung is actually doing a brilliant job of faking it, even accounting for obstructions like tree branches etc.: https://twitter.com/David4252579/status/1634919880217731075

All this could have been OK if Samsung weren't misleading people into believing that the picture is a result of computational photography and excellent optics.

I would love to have this in my phone's camera, but clearly identified as detail added after the fact that might not represent the actual scene. It's done so well that practically no one noticed, and people like MKBHD reviewed it as a photography achievement.

The truth is, the non-AI implementations also fail to capture how we see the Moon, and fixing that with fake details is a welcome improvement as long as it's clear that it's not an actual capture.
Scott Hanselman posted today that this is no secret. Samsung have been talking openly about their "Moon Recognition Engine". It's on their website, in Korean of course, but there's an automated translation of that too.

https://www.tiktok.com/@shanselman/video/7209770920301399342?_r=1&_t=8aarTAPfxUp

https://hachyderm.io/@shanselman/110012495335125530
I'm not really "nail in the coffin" convinced.

Is there a process that does something akin to identifying the moon and selecting a special filter for that? I'm pretty convinced of this (both by the evidence and also because it's apparently advertised as such).

What is that filter? I don't know, and the answer would inform "how fake" I think these images are.

It does seem to be a moon-specific treatment. It doesn't seem to be pasting a JPG of the moon, and it doesn't seem to be generating the moon.

How fake it is, I think, is a measure of how dependent the result is on the specific sensor data, and how localized the changes are. If there's a moon-specific filter that's excellent at filling in data "between" sensed pixels without looking at any pixels far away (beyond, perhaps, the initial categorization as a photo of the moon, for which this filter would be good), that's not very fake to me.

If there was suddenly a new structure on the moon visible to the sensors, I'd want it reflected in the photo. I would consider the process very fake if it couldn't do that at all, and not very fake if it could do it reliably.
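That locality question is testable with the monitor setup from the original post: shoot the same blurred moon twice from a tripod, once with a small artificial mark painted onto the source image, and diff the two outputs. A rough sketch, assuming the two shots are aligned and the file names are placeholders:

```python
# Hypothetical locality test: diff two phone outputs of the same blurred
# moon, one shot with a small artificial mark painted on the source image.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("output_plain.png").convert("L"), dtype=np.int16)
b = np.asarray(Image.open("output_marked.png").convert("L"), dtype=np.int16)

diff = np.abs(a - b)
changed = diff > 10  # ignore differences below the noise floor

# A tight blob of changed pixels around the mark suggests local sharpening;
# changes smeared across the whole disc suggest heavier synthesis.
print(f"{changed.mean():.1%} of pixels differ beyond noise")
```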
I'm just frustrated that there isn't even basic thought from Samsung's end. It could at the very least pull location data, check whether the moon is above the horizon for that location, check whether the phase of the moon matches what it sees, and THEN apply the filter.
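For what it's worth, that check is only a few lines with an ephemeris library. A sketch using skyfield (my choice of library, nothing to do with Samsung's actual pipeline; the coordinates and timestamp below are made-up examples):

```python
# Sanity check: is the moon even above the horizon here, and what phase
# should it show? Uses the skyfield library and a JPL ephemeris file.
from skyfield import almanac
from skyfield.api import load, wgs84

ts = load.timescale()
eph = load("de421.bsp")  # JPL ephemeris, downloaded on first use

def moon_check(lat_deg, lon_deg, when_utc):
    """Return (moon above horizon?, phase angle in degrees; 0=new, 180=full)."""
    t = ts.utc(*when_utc)
    observer = eph["earth"] + wgs84.latlon(lat_deg, lon_deg)
    alt, _az, _dist = observer.at(t).observe(eph["moon"]).apparent().altaz()
    phase = almanac.moon_phase(eph, t)
    return alt.degrees > 0.0, phase.degrees

up, phase = moon_check(37.57, 126.98, (2023, 3, 13, 12, 0))
print(f"moon above horizon: {up}, phase angle: {phase:.0f} deg")
```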
I don't know why this saga is still ongoing. It is a known fact that Samsung enhances the quality of known objects in images with AI. They literally advertise this.
There are aftermarket photography apps. Has anybody tried taking their Samsung S23 and photographing the moon with one of those? I wonder if the "AI assistant" is limited to Samsung's own bundled camera app, or if it's always on and modifies pictures taken via other apps.
If you didn't go through the comments, this one is interesting imo:

Someone took a handheld picture of the moon with an S21 using GCAM instead of the default app, and the results are also quite awesome:

https://imgur.com/a/jf1Cbqu
I am not convinced this means they are fake. I'd expect any image enhancement using good AI to depend on the surrounding areas. This could plausibly cause a blank square to be filled in when the surrounding areas have data, while a blank square on its own is not.
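To illustrate the point: even a crude, non-AI diffusion fill recovers plausible values inside an isolated blank square from its surroundings, so that behavior alone doesn't prove a stock photo was pasted in. A toy sketch with numpy/scipy:

```python
# Toy demo: iteratively diffuse the surrounding pixels into a blanked-out
# square. No AI, no reference photo, yet the hole fills with plausible values.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
img = gaussian_filter(rng.random((64, 64)), sigma=1)  # stand-in "texture"
hole = np.zeros(img.shape, dtype=bool)
hole[24:40, 24:40] = True
img[hole] = 0.0  # the blank square from the test

filled = img.copy()
for _ in range(100):
    smoothed = gaussian_filter(filled, sigma=2)
    filled[hole] = smoothed[hole]  # update only inside the hole

print("hole mean before:", 0.0, "after:", round(float(filled[hole].mean()), 3))
```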
I wonder what kind of implications this will have for court cases.

There will have to be expert witnesses to assess, for any image that is brought in, whether it's real or not.
I'm still skeptical. He blurred it and then cropped it. The sharp edge from the cropping is going to mess up the deconvolution optimization by causing any and all deconvolution to strongly reduce sharpness along that edge. If I'm right, the Samsung would restore both halves equally well if the blur was applied after the cropping rather than before.<p>I still want to see this whole thing done without Gaussian blur, which is readily reversible by even primitive algorithms.
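On that last point, here's how easily a known Gaussian blur comes off. A sketch using Richardson-Lucy deconvolution from scikit-image (assuming a recent version, where the parameter is named num_iter; scikit-image even ships a moon test photo), with a PSF matched to the applied blur, which is the easy case:

```python
# How easily a known Gaussian blur comes off: Richardson-Lucy deconvolution
# from scikit-image, run on scikit-image's bundled moon photo.
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import data
from skimage.restoration import richardson_lucy

image = data.moon().astype(np.float64) / 255.0
blurred = gaussian_filter(image, sigma=3)

# An explicit Gaussian point-spread function matching the blur we applied.
r = 8
y, x = np.mgrid[-r:r + 1, -r:r + 1]
psf = np.exp(-(x**2 + y**2) / (2 * 3.0**2))
psf /= psf.sum()

restored = richardson_lucy(blurred, psf, num_iter=30)
print("error before:", float(np.abs(blurred - image).mean()))
print("error after: ", float(np.abs(restored - image).mean()))
```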