There's a vast gap between what we can run and what we're told we should run.

Why not run on old hardware, with software that works for you? Because of electricity? As others have pointed out, the environmental cost of manufacturing a new machine outweighs what you gain in performance per watt, even for systems that are ten years old, and in some cases older. We're literally sending hardware to landfills because companies like Microsoft want to make more money (which, incidentally, is why I think Microsoft is the cause of the Dark Ages of Computing, and why Bill Gates will eventually be remembered as someone who caused more waste and held back progress more than anyone else in this period).

I run servers, and even though many of them use modern hardware (ARM, Apple ARM, AMD Ryzen), I still run a fleet of AMD AM1 Athlon machines. They serve DNS, web, firewall, SFTP, NAT, email and so on in all sorts of environments. Why? Because to NAT and firewall a gigabit of traffic you don't need more than a quad-core, 2 GHz CPU, and because running all four cores at 100% still takes less than 20 watts for the whole system.

I'm even building one into a 1U case right now to go to colo, because the colo power budget is 100 watts or less and I also have four 10 TB drives and a RAID controller to add. Even these 2014 systems, unlike Intel CPUs of the time, can take 32 gigs of ECC memory, so they're still very usable.

So much software has artificial barriers. You need AVX. You need AVX2. You need SSE4.2. You need FMA3. But why? Do you REALLY need them, or are you fine running certain software a little slower? After all, you're not going to use your 2014 AMD Athlon to transcode to H.265 very often.

It makes me sad that so much hardware goes to landfill for completely ridiculous reasons. Add the FUD people share about how just writing zeros over a drive is somehow not good enough, and you end up with people literally destroying hardware rather than recycling it. It's not a good way to run a planet.
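
To illustrate the point about AVX/AVX2 gating: none of these extensions has to be a hard requirement. Here's a minimal sketch (my own illustration, not any particular project's code), assuming GCC or Clang on x86, of checking for AVX2 at runtime with __builtin_cpu_supports and falling back to plain scalar code instead of refusing to start:

    /* Runtime CPU feature dispatch instead of a hard AVX2 requirement.
     * Build with: gcc -O2 dispatch.c -o dispatch   (GCC or Clang, x86 only)
     */
    #include <stddef.h>
    #include <stdio.h>

    /* Plain C fallback: runs on anything, including a 2014 AM1 Athlon. */
    static float dot_scalar(const float *a, const float *b, size_t n) {
        float sum = 0.0f;
        for (size_t i = 0; i < n; i++)
            sum += a[i] * b[i];
        return sum;
    }

    /* The compiler is only allowed to emit AVX2/FMA code inside this function. */
    __attribute__((target("avx2,fma")))
    static float dot_avx2(const float *a, const float *b, size_t n) {
        float sum = 0.0f;
        for (size_t i = 0; i < n; i++)   /* compiler may vectorize this loop */
            sum += a[i] * b[i];
        return sum;
    }

    int main(void) {
        float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};

        __builtin_cpu_init();  /* initialize CPU feature detection */
        float r = __builtin_cpu_supports("avx2")
                      ? dot_avx2(a, b, 8)     /* fast path on newer CPUs    */
                      : dot_scalar(a, b, 8);  /* slower, but it still runs  */

        printf("dot = %f\n", r);
        return 0;
    }

The builtins and the target attribute are GCC/Clang specifics, but the general pattern (detect once at startup, dispatch to whatever the CPU actually supports) is exactly what gated software could do instead of hard-failing on older chips.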
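
And on the drive-wiping point: the technique being dismissed as "not good enough" is just a single sequential pass of zeros over the whole device, which is what dd if=/dev/zero of=/dev/sdX bs=1M does. A minimal C sketch of that same pass, with /dev/sdX as a placeholder for the target device (and, obviously, it destroys everything on it):

    /* Overwrite a block device with zeros, one buffer at a time.
     * Equivalent in spirit to: dd if=/dev/zero of=/dev/sdX bs=1M
     * WARNING: destroys all data on the target device.
     */
    #include <errno.h>
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(int argc, char **argv) {
        if (argc != 2) {
            fprintf(stderr, "usage: %s /dev/sdX\n", argv[0]);
            return 1;
        }

        int fd = open(argv[1], O_WRONLY);
        if (fd < 0) { perror("open"); return 1; }

        static unsigned char buf[1 << 20];  /* 1 MiB, static => all zeros */
        long long total = 0;

        for (;;) {
            ssize_t n = write(fd, buf, sizeof buf);
            if (n < 0) {
                if (errno == ENOSPC) break;  /* reached the end of the device */
                perror("write");
                close(fd);
                return 1;
            }
            total += n;
        }

        fsync(fd);   /* make sure the zeros actually reach the drive */
        close(fd);
        printf("wrote %lld bytes of zeros\n", total);
        return 0;
    }

That's the whole ceremony; nothing about it justifies putting a working drive through a shredder.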