It's always interesting to note editorialized headlines. People "fleeing" Instagram, or "scrambling", or covid "surging", etc. They could just provide the data about how many users are leaving and let the data speak for itself.<p>Whether there's any data behind this, or whether it's an article based on a handful of random tweets, who knows, as it's a paywalled article.<p>On a meta level, what percentage of HN readers are paying for a subscription to the Washington Post? Why are links to paid articles even allowed when the vast majority of readers won't have access to read them?
As the saying goes, you can't put Pandora back in the box. AI art is here to stay, and it's here to dominate. The problem is the economic system within which art is created and assigned value; any attempt to solve the "problem" of AI art without addressing the underlying economic incentives is destined to fail in the same way that we have failed to prevent the use of sweatshops by clothing suppliers.<p>Even if strong, even overbearing legislation is put into place to protect artists, we're just going to end up with the biggest, most profitable offenders buying or using illegally trained models through middlemen, and feigning ignorance if and when they are found out.<p>Though most of the "has this model been trained on X" conundrum is likely to be irrelevant on a timescale of years (months?), as art styles are far less unique than artists would like to think. See Tencent's PhotoMaker or other modified CLIP approaches. Even a model not trained on a particular face can be conditioned to generate that face using a one-shot approach, because the vector representing the face can be constructed even if the face itself doesn't exist in the training data. I'm certain the same is true of artistic styles.
Unsurprisingly, artists are more neurotic than the average person. Online artists also seem to spend a lot of time lying to each other about how AI works.
Who remembers when Instagram changed its TOS giving itself license to do these things?<p><a href="https://www.theverge.com/2012/12/20/3790560/instagram-new-terms-of-service-from-overreaction-to-retraction" rel="nofollow">https://www.theverge.com/2012/12/20/3790560/instagram-new-te...</a>
No novelist wrote a novel without first reading other people's work; no painter painted a picture without studying other painters first. And then they sold their work for money. All knowledge is derivative. AI just happens not to be a person, that's all. But it isn't any different. As much as I hate AI, I don't see any problem with training models on other people's work.