Last night, my partner found a new Cinematic Memory in Google Photos, called "Over the Years."<p>We watched it together, and in it, Google animated static photos of her from the last 5 years. It included the typical 3D field effect, but what was disturbing was that it also animated her body and her face. We watched video after video of Google making her likeness smile and turn her head towards the camera.<p>They have to be using facial data from her other photos to make her smile in photos where she was not originally smiling, tilt her head seductively in photos where she never did that, etc.<p>I assume this is a new feature that they're beta testing, because I can't find any information on it whatsoever.<p>They essentially created a deep fake of her, without her permission, and then made this deep fake do things that she never did.<p>I can't imagine what PM at Google thought this was a good idea for a feature. If you're productizing AI, you need to be extremely careful that you're not violating your users' personal boundaries.<p>Has anyone else seen this yet?
They don't need to train on your photos to make your photos do these things. It's a diffusion technique / model that can operate on any photo it hasn't seen before. I've seen many "make these people smile in the photo" apps. There are also multiple papers on turning photos into videos.<p>They demoed the photo history / storyboard features during Google I/O a week or two ago. The "showing the progress of your child learning to swim" demo was pretty impressive because it included photos of swimming certificates alongside the actual swimming photos, correctly ordered, all without needing to organize the photos manually, so there is a query/embedding aspect to it.
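The query/embedding aspect mentioned above can be sketched roughly like this: embed every photo into a shared vector space, embed the text query into the same space, and rank photos by similarity. This is a minimal toy sketch, not Google's actual pipeline; the vectors here are hand-made stand-ins for real image/text embeddings, and the filenames are hypothetical.

```python
import numpy as np

def cosine_sim(a, b):
    # Standard cosine similarity between two vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Pretend each photo has already been embedded into a shared vector space
# by some image encoder (toy 3-dimensional vectors for illustration).
photo_embeddings = {
    "pool_lesson.jpg":      np.array([0.9, 0.1, 0.0]),
    "swim_certificate.jpg": np.array([0.8, 0.2, 0.1]),
    "birthday_cake.jpg":    np.array([0.0, 0.1, 0.9]),
}

def search(query_vec, k=2):
    # Rank photos by similarity to the query embedding and return the top k.
    ranked = sorted(photo_embeddings.items(),
                    key=lambda kv: cosine_sim(query_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]

# A text query like "learning to swim" would be embedded by a text encoder
# into the same space; here we fake that embedding by hand.
query = np.array([0.85, 0.15, 0.05])
print(search(query))  # the two swimming-related photos rank first
```

The point is that both the certificate photo and the pool photos land near the "swimming" query in embedding space, so they get grouped together without any manual albums or tags.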
Google knows too much about us, and now MS is out to understand us better than we do ourselves. I thought it couldn't get scarier than the current "AI" spearhead initiatives, but I was wrong<p><a href="https://www.windowslatest.com/2024/05/20/microsoft-confirms-windows-11-recall-ai-hardware-requirements/" rel="nofollow">https://www.windowslatest.com/2024/05/20/microsoft-confirms-...</a>
To answer your question directly: yes, I have also seen that recently on Google Photos in the US. I found it weird and useless, and you're right that it's essentially a deep fake produced by Google on our behalf without being asked to. From news reports, it looks like Google teams are desperately trying to build AI into every product:
<a href="https://www.businessinsider.com/ex-googler-ai-work-driven-by-stone-cold-panic-2024-5" rel="nofollow">https://www.businessinsider.com/ex-googler-ai-work-driven-by...</a>
I mean, what is your problem?<p>Do you think Google is not analysing your pictures when it creates yearly highlight memories or powers semantic search?<p>That's why you use it.