Section 230 of the Communications Decency Act pretends platforms are somehow different from publishers. ISPs and such, sure--I see a benefit in not holding the means of reaching the internet at large accountable for everything one might encounter there.

Individual websites, though--Twitter, Facebook, example.com--how are they different from traditional publishing platforms? If NBC decided to let anyone who felt like it broadcast whatever they liked, without supervision or editing, it would be shut down the moment the first reports of "Hey, NBC is showing me child pornography and beheading videos" came in.

Social media's argument appears to be: "Well, we just provide a free-speech platform for horrible things. After we cause damage and people complain, we take the bad content down. In N years we'll have AI that can scan posts before anyone else sees them, so we can ban things secretly--which is somehow compatible with our free-speech mission."

Meanwhile, NBC has already figured out what every hacker ought to know: blacklists don't work; whitelists do. Someone has to be responsible--to see the content first, decide whether to broadcast it or not, and take the heat for a bad call. It feels like tech execs are just piling ramparts of code and rare earths between themselves and any responsibility. (There's a quick sketch of the blacklist/whitelist difference at the end of this post.)

What's wrong with a world in which someone at YouTube has to approve each cat video before it goes up? Where, if you want to post a live video of yourself murdering Muslims, you have to get your own domain and hosting account?
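To make the blacklist/whitelist point concrete, here's a minimal Python sketch--all names hypothetical, purely an illustration, not anyone's actual moderation code. The asymmetry is the whole point: a blacklist has to enumerate every bad thing in advance, while a whitelist only has to enumerate what a human has already reviewed and approved.

    # Illustration only: a blacklist publishes anything it doesn't yet
    # recognize; a whitelist publishes only what's been pre-approved.
    BLACKLIST = {"beheading", "gore"}          # the bad stuff we know about today
    WHITELIST = {"cats", "cooking", "music"}   # categories a human has vetted

    def blacklist_allows(tags):
        # Anything not on the list sails through, including novel abuse.
        return not any(t in BLACKLIST for t in tags)

    def whitelist_allows(tags):
        # Only content someone has already seen and approved goes out.
        return all(t in WHITELIST for t in tags)

    tags = {"beheading_v2"}                    # trivially renamed bad content
    print(blacklist_allows(tags))              # True  -- slips through
    print(whitelist_allows(tags))              # False -- held until reviewed

The blacklist fails open on anything new; the whitelist fails closed--which is exactly the editor-in-the-loop model NBC already runs.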