This is just legal cover until such time as it's possible to enforce rules against child exploitation imagery, copyright infringement, etc.<p>It doesn't matter whether they can actually enforce it; Valve can use this policy as cover if they ever get sued.<p>Don't overthink the motivation. They won't even have a bulletproof way to detect AI imagery: it evolves every single day in an arms race, and detection is a full-time job. Even a FAANG or a state actor would need to dedicate whole teams to detection technology and would still have false negatives.<p>The same sorts of things already happen on YouTube and Twitch, for example, where certain types of content violate the TOS or copyright but enforcement is sporadic and selective: smaller operations often fly under the radar, bigger creators who net the org sufficient revenue can likely get away with more, and the automated detection tools are flawed.