You all must've experienced a different past than I did. Sure, there were exceptions, but software in the 80s (lol), 90s, 00s, and 10s was generally slower, buggier, and uglier than what we have now. We're talking minutes to start some programs; sometimes a program just stopped working and you could never use it on that computer again without reinstalling your OS. Software regularly crashed - like once an hour for Word. Sure, there were some fast programs, but they often did very little or lacked any modern conveniences. Global, fast search basically didn't exist in most programs. Rendering issues all over the place. "Oops, that file you were working on is now permanently corrupted!" was not some bizarre occurrence, but something you regularly prepared for by making copies of your files as you worked.

I think this is similar to video games - people see one pixelated edge in a game and declare it crap, precisely because that kind of glitch rarely happens anymore.

Now if you remove "modern" from the question, and just ask how you explain the sloppiness of software in general, then all the same answers come back up. Today, when software development feels like it's about 90% gluing stuff together, it's because there is so much stuff out there that we can't even be aware of it all anymore.

It used to be (100+ years ago) that a person could be well-read on nearly everything. Then you could be well-read within your field and aware of everything else. Then, for most of the 20th century, you couldn't know everything in your field, but you could mostly know it and be aware of everything else. Sometime about 10 years ago we hit an inflection point where you can no longer even be aware of everything in your specialty. If you are a database "specialist", you can't even know of all the databases that exist, let alone understand them.

So weird, and so impactful to us and our society.

Anyway, old man rant over.