This article lays out an argument that's clearly intended to be a strawman, but that I find completely reasonable. (Some elements are exaggerated: there's a difference between a website "not working in Sudan" and a website not working on five-year-old hardware, though modern maximalist websites often don't work properly on the kind of slightly out-of-date hardware that most real users have.)

There's another take on this: while the web is being used to implement desktop-application-like behavior and to distribute DRM-protected content, web technologies are incredibly ill-suited to nearly every task beyond displaying small static pages. Every attempt to force the web to perform this kind of trick is the admittedly impressive result of a pile of ugly hacks that waste enormous amounts of resources, whereas performing these tasks with technologies actually suited to them would be simpler for everyone involved and would make the content accessible to people with less powerful hardware or lower-bandwidth connections. I don't think anyone who has worked with both web technologies and native applications can seriously disagree with the idea that using web technologies for native-application tasks requires a pile of ugly hacks, and I don't think anyone who has used older hardware to access newer sites can seriously deny that this kind of overreach produces systems that are unusable without expensive new equipment. (We can also argue that, since advertising is neither the only revenue model for web-driven content nor the best by any measure, there's little sense in propping up the huge and mostly crooked ad ecosystem when we can easily avoid it entirely.)

It used to be that people didn't associate the internet solely with the web: if they needed SSH, they used SSH; if they needed FTP or Gopher, they used FTP or Gopher. The web later became a catch-all protocol. People started developing big, complicated wrappers to make existing software work through a browser, while organizations simultaneously configured their internal firewalls to whitelist only port 80. This is a mistake: the web was never designed to replace all of computing, and it will never be able to do so in an acceptable manner. We should be taking back the huge variety of the software ecosystem instead of forcing everything into web tech and ignoring everything that doesn't fit.