Opening up a whole new dimension of adtech personalization/engagement here... more evil brainstorming:<p>- electronic billboards on the bus corner start doing this as you walk down the street<p>- your favorite Peacock sitcom starts injecting your friends, or recent vacation locations, or friends' recent vacation locations onto the main character's digital picture frame. Your buddy recently went to Sandals Resort in Jamaica? Now that's going to be emphasized in the unused screen real estate in the background of the bar scene to play on your FOMO.<p>- hell, sporting events already have digital billboards on the field barriers that are customized for different markets / broadcasters... why not customize them further for the individual stream
Standard internet outrage bait. There are no ads in the photos. The person uploaded their photo to Meta AI's "imagine me" feature which generates photos of you in exotic situations, and now the company is...putting them in exotic situations. That's literally what it is for.
That's... not an ad? User used Meta AI to touch up a selfie, now Meta AI is generating more selfies for you.<p>I can understand the annoyance from cross-linking apps in the same way it's annoying to get the Threads popup in your feed, but really, this is utter clickbait.
I'll play some defence for them here. You were playing with Meta's AI face tool, and now it's taken some results from that and swapped them in where ads would usually go. I'm assuming they don't do this if you just uploaded photos to Facebook/Instagram, you seemingly gave them a picture with the direct intent of them using it to make AI images.<p>It doesn't seem that much different from when I'm typing a chat reply in Snapchat and it starts automatically suggesting stickers with mine and my friends' faces doing silly things with cartoon bodies. Or using my avatar to try and upsell me to their premium subscription. Don't give them a picture of your face to mess around with if you don't like them messing around using a picture of your face.
It's only a matter of time before this gets put into the programmatic ad formats where you can do individual-level targeting. The advertiser won't ever have to see your image, as Meta will offer it as a service. More importantly, just like with PMax (Google's current optimization product), your account manager will heavily incentivize you towards running these AI campaigns.
I think the common wisdom is that if you use a Meta product, they will try every possible form of social engineering to drive engagement.<p>While this example was unexpected, it was predicted (<i>Minority Report</i> et al.) and is very much in line with their MO.
Meta AI asks permission to do this, but note that in some U.S. states personal publicity rights end upon death. Isn't it hilariously Wallacian that this technology can be used to make targeted ads featuring the viewer's deceased loved ones?
So, I had a terrible loss: the love of my life. I don't know how, but I'm seeing ads that literally resemble her likeness, and it's happened more than once. Not a look-alike, but her. Once I found out Photobucket was training its AI on users' photos, I removed them all and uploaded them to Google Photos, and now on YouTube I'll see an ad for a college and there she is, dressed up in a graduation dress and the colors of the college (a local college). It's like these evil companies are rubbing it in my face. I'm thinking of getting a flip phone. My heart is broken, and it breaks again every time she pops up in an ad, even though I know it's impossible to be her. The data analytics companies know of us through whatever our unique IDs are, picture scans or whatever the case, and now they are breaking my heart daily. It's been over a year and I'm still mourning, barely making it through my days, only to have her keep popping up. How evil.
There is an interesting phenomenon going on here. On one hand, this is pretty mundane: a software tool you’re currently using shows you examples of ways to use the software. On the other hand, this does technically constitute an “ad”.<p>What’s interesting to me is that some people seem to have a strong emotional reaction to the fact that it’s possible to describe this as “Instagram using my face in an ad,” even if the underlying event lacks the characteristics that would make that statement outrageous.
> Imagine yourself reflecting on life in an endless maze of mirrors where you're the main focus.<p>This feels almost a bit too on-the-nose. But no, apparently it's real?
What better way to increase the hyperreality needed to sell products than to have you in the ad! But also this is HN, so someone needs to lecture you on “what you mean” and also remind you it’s their product you used and signed up for and signed your rights away to, so it’s YOUR fault, or some such pompous condescending trash. I say, lucky you! I say, embrace the future! I say, plaster my face on adult diaper ads and sell them to me. I say, slap my face on that TRT ad from the podcast I listen to in my lonely apartment to feel a connection to the world while I write code that “matters”. It’s a beautiful world.
I'd like to see the "imagine what you'd look like if you exercised a lot more" or "had better posture" or "spent all summer in the sun" filters.
Jesus Christ, the victim blaming in this thread is insane. This is why deepfake laws are needed. Sure, OP used Meta AI and consented, but if I did that I'd probably be consenting to using the picture _for that one session_, where I'm in control. Definitely not for this; they shouldn't be able to put this in their ToS.
Facebook is doing a public service here by being overt.<p>Anyone else would just quietly take your face and those of your contacts and use them to generate an endless stream of faces which are unambiguously not you, yet compellingly familiar.<p>...maximizing their brainwashing effectiveness while minimizing the ick factor and avoiding triggering privacy-minded behaviors.<p>A lot of harm is done in the world by parties adaptively keeping their invasive conduct at just below the level which will trigger retaliation, legislation, etc. Data brokers, for example, will sell exceptionally private data to anyone with a few bucks, but it's only a 'data breach' when someone takes that data without paying. The existence of data breaches even provides nice plausible deniability for harms that arise from the business of privacy-invasion-for-profit.
I really hate ai generated images of real people. If we thought Photoshop was bad for body image issues this is like putting that effect on heroin. I hope there's a massive backlash against this AI shit tbh. Leave it to meta to find a psychologically toxic application for any given technology.
Wait until they generate your possibly long-dead grandmother (just using information from your and your parents' faces and voices) convincing you to try that brand of pies.
just a reminder that you are the product.<p>Sorry to read this, it is frustrating to see our own face in feeds targeted at us. No clue who thought that would be a win.
Any complaint that starts with "used meta..." just loses me. It's like, I stuck this needle into my left testicle and now that testicle hurts.
For starters, how is it not a violation (e.g., "personality rights") to use a person's likeness without permission?<p>As a further example, do we really want insurance companies serving ads using near and dear ones as potential disaster victims? This is really getting out of hand.
Maybe I'm in the minority here, but this is kinda cool! As long as the user data doesn't leave the Meta ecosystem (no reason to think it does right now, the ad in question here is from Meta itself), it's not a privacy concern since only you are being shown those unique ads with you in them.<p>Even if other advertisers start using the system, as long as the generated resulting images are never shared with the advertisers and are unique to each user, it's just a futuristic way to help you "imagine" what having XYZ product would be like, which is what most ads strive to do.<p>People have knee-jerk reactions to anything to do with ads because of the privacy concerns of yesterday, understandably. But if you actually step back and think about this, there's no reduction in <i>privacy</i> that I can see. If people are creeped out by it, I think Meta should maybe let them disable these ads with a setting.<p>But in general, making ads more effective <i>without</i> giving advertisers more data about us is a <i>great</i> thing for the continuation of free amazing internet services!