(bit of a rant here, you've been warned)

As someone who *could've* developed an IT/programming career, but didn't because I felt things were already bloating back in the '00s, I agree with the majority: "harvesting your own food" can be rewarding but also a tedious and thankless job. It's certainly not for everyone, but if it works for some people then it is (let's put efficiency aside for a moment) perfectly valid. In fact, being more of an H/W guy, I find myself gravitating towards this approach more often than not. Leanness and reproducibility are key for my workflow (I went the RF-world path); I can't afford different end results when a dependency changes/breaks something.

IMHO, keeping up with the modern paradigms of S/W development looks like a never-ending nightmare. Yes, it's the modern way; yeah, it's the state of the art. Still, I didn't feel it was a wise investment of my time to learn all those "modern dev" ropes, and I still feel that way 20 years later. I'm nowhere near antiquated and I'm on top of all things tech (wouldn't read HN otherwise), it's just...

I see former friends/classmates who went this way, and they're in a constant cat-and-mouse game: 50% of the time they're learning/setting up something dev-chain related, the other 50% doing actual work, and 98% of it feeling way too stressed. I see modern Android devices with their multi-MB apps, bloated to hell and beyond, just for a simple UI that takes ages to open on multi-core, multi-GHz SoCs. I see people advocating that unused RAM is wasted RAM, never satisfied until every byte is put to good use, reluctant to admit that said good use is just leaving the machine there "ready" to do something, but not actually doing anything _productive_.

And yet.

Without that bloat, without the convenience of pre-made libraries and assist tools for almost every function one could desire, we wouldn't be where we are now. Imagine for a moment doing AI work, 3D movie rendering, data science, etc. with a DB-less approach on single-core machines, with every resource micro-managed to eke out the most performance. It's simply not feasible; we would still be in the '90s... just a bit more hipster.

This article resonates so well with me. And at the same time, it feels so distant.