In a New York court on May 20th, lawyers representing victims of a mass shooting in Buffalo, New York, argued that Meta, Amazon, Discord, Snap, 4chan, and other social media companies all bear responsibility for radicalizing the shooter. The companies defended themselves against claims that their design features, including recommendation algorithms, promoted racist content to a man who killed 10 people in 2022, then facilitated his deadly plan. It’s a particularly grim test of a popular legal theory: that social networks are products that can be found legally defective when something goes wrong. Whether the theory holds may depend on how courts interpret Section 230, a foundational piece of internet law.