It's scary, because CSAM (child sexual abuse material) is very, very broad, and quite vague in its definition¹. My three-year-old son stayed with my parents for a week, and he was being taught to swim in their swimming pool. Of course, I get sent pictures and clips of his progress by a proud grandma, and of course, him being a toddler, he considered swimming trunks completely optional (and of course, they are at his age in a private pool).

That's not CSAM, right? But who knows how the AI will classify those happy pics? Some quirk in its programming and training could push them way up whatever ranking is used. It's a nude child, after all, and that is most likely one of the few things such an algorithm can detect quite reliably.

The automated filter won't care about the context, though, and if any of the recent failures of algorithms ruining people's lives are anything to go by (the Dutch Toeslagenaffaire comes to mind), being flagged by the CSAM filter is a high-risk event, even if you can clear your name later through human intervention, because now you are on a list.

1: People tend to conjure up horrific images of underage children getting raped by adults when they hear 'CSAM' or 'child pornography', but when you read up on the legal definitions used, it becomes very vague quite fast. A seventeen-year-old boy sending a dick pic to his eighteen-year-old lover is producing child pornography and can in many jurisdictions be sentenced as such, and a nude image of any minor can be seen as CSAM if the pose they strike can be construed as 'erotic'. Yet there is no clear definition of what that means; it falls squarely into “I'll know it when I see it” territory.