Numbers 2 and 3 are inherent to free-market capitalism and/or societies (like the US) where freedom of ideological speech is protected, so they aren't solvable. I was going to write "probably aren't solvable," but I actually don't feel any optimism about it.<p>2. <i>"System design that creates perverse incentives where users’ value and wellbeing is sacrificed, such as ad-based revenue models that commercially reward clickbait and the viral spread of misinformation."</i><p>Businesses want views, and the "perverse incentives" are unfortunate consequences of human nature. People respond to emotions, especially outrage, more than reason.<p>Adblockers <i>seem</i> like they're making a dent, but they're likely making the biggest difference at large, well-funded publishers. Small publishers will continue to throw low-quality content at the web and fill pages with ads. Or, worse, they'll use "native advertising," which is arguably more insidious.<p>Regardless, there will always be a hunger to attract viewers, and therefore an irresistible temptation to push people's buttons in an unhealthy way.<p>The only way I can see this being fixed is by fining people for spreading misinformation, and there's no way the US would create or enforce laws like that. That makes the US a safe haven for anyone who wants to publish misinformation.<p>3. <i>"The unintended negative consequences of benevolent design, such as the outraged and polarized tone and quality of online discourse."</i><p>This seems really similar to #2. Maybe he's referring to Twitter? I'm not sure, and I'm not sure I'd call Twitter's design benevolent. They're beholden to investors to grow their user base.<p>Either way, loud minorities tend to drown out moderates, and there's nothing you can do about that on a free-to-publish platform.<p>I don't even believe this is an issue inherent to the web.
It is, again, just the tendency for organizations (commercial, political, national, and otherwise) to push people's buttons to get them to pay attention.