> The real horsemen of the legacy apocalypse is the depth of the dependency tree.<p>I won't speak for Java or Python. For Node/JS, I do not see this being a problem for the future, because npm is so fragile that the entire thing falls over <i>today</i> if you just look at it wrong. The second you run "npm install" you have a mountain of instant tech debt. The primary reason is that the people writing JavaScript do not know what they are doing. They have not cut their teeth on libraries in C/C++ or other languages and have no clue how backwards compatibility or versioning works. If you doubt what I'm saying, I invite you to do your own quality inspection of any major JS library or framework out there. I'm not going to name names, but the vast majority of it is pure shit.<p>In my experience, JS code has a life expectancy of about a year and a half, give or take. After that, it is rewritten, which also contributes to the lack of quality in the JS ecosystem. There is a built-in assumption that the code won't survive past the time the average dev gets bored of it.<p>People today do not realize there are JS frameworks that predate React/Angular/etc. and <i>have already died</i>. Once-popular frameworks. And I'm not talking about jQuery. You don't see them because JS development = churn. You don't even see CoffeeScript mentioned today, and that was just a few years ago.
There's a fundamental question here which no one ever seems to ask: <i>Why</i> is the modern software industry in such a constant state of flux? Are we really becoming so much collectively smarter every year, and if not, why did the previous version of Software X make the wrong decision? Is there an end point where we all actually figure out how to develop software properly?<p>As I see it, truly mature software should introduce breaking changes twice a decade <i>at most</i>, and ideally less often than that. I don't doubt that most software updates provide a net benefit, but what of the inherent cost of change? Changes require every single user to put in extra work, to adapt to the new version's functionality. Why are software projects so cavalier with their users' time?<p>And this applies to end-user software too, by the way. Every time Slack redesigns its interface, users need to relearn where all the buttons are. Nothing can possibly justify rolling out ten million redesigns; if a facelift is in order, leave it in the oven for long enough to get it right, and then be done.
Quoting the article:<p><i>Around 60% of packages on npm have not been updated in a year or more. Despite the lack of maintenance these packages are still downloaded billions of times.</i><p>This is a problem across many other languages/frameworks as well. Many popular packages have a single maintainer and the entry in the package manager index is accessible only by that maintainer. If that maintainer stops paying attention, the problems could be worse than the package just bitrotting, as we've seen from supply-chain attacks like event-stream[1].<p>There are volunteer orgs like Jazzband[2] which take group ownership of popular packages to ensure ongoing maintenance, but I've not seen many of those so far.<p>[1] <a href="https://www.hillelwayne.com/post/stamping-on-eventstream/" rel="nofollow">https://www.hillelwayne.com/post/stamping-on-eventstream/</a>
[2] <a href="https://jazzband.co/" rel="nofollow">https://jazzband.co/</a>
> I’m just not inclined to agree that civil society can’t continue to run on millions of lines of COBOL for another 60 years. It certainly can.<p>> Java 8 and Python 2 on the other hand are a far more serious threat. When systems can’t get off end of life technology they miss security updates, performance enhancements, and new features.<p>This seems contradictory. Is it not also a problem that <i>COBOL</i> hasn't been getting security updates for years (I assume)? Are COBOL systems typically isolated from the outside world in some way?
Some companies that have large COBOL code bases don't hire "software developers" -- they hire people who have worked in other careers and want to switch to software development and then they extensively train them internally. This could explain the median age of COBOL developers remaining constant.
> The real horsemen of the legacy apocalypse is the depth of the dependency tree. Modern software development stacks abstraction on top of abstraction.<p>I think Python2 will be 2050's COBOL.
Python’s standard library is essentially a Swiss army toolkit for manipulating data. It has email and IMAP parsing built into it! I’m sure this doesn’t completely explain the dependency tree size, but it does explain the proliferation of simple packages in JavaScript. Even data structures are difficult to find in JavaScript. Try looking on npm for a max heap. Python has heapq built in. JavaScript has ten libraries that are all equally unpopular.
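To illustrate what "batteries included" buys you here: heapq only provides a min-heap, but the standard idiom for a max heap is to negate values on the way in and out, so no third-party package is needed at all (sample numbers are mine):

```python
import heapq

# heapq is a min-heap; negating every value turns "smallest first"
# into "largest first".
nums = [3, 1, 4, 1, 5, 9, 2, 6]

heap = [-n for n in nums]
heapq.heapify(heap)                     # O(n) heap construction

largest = -heapq.heappop(heap)          # 9
second_largest = -heapq.heappop(heap)   # 6
```

The equivalent in Node means picking one of those ten unpopular npm packages, or writing the heap yourself.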
This is one of the better tech articles I've seen on here recently. The analysis of why the Python vs. node ecosystems vary in dependency depth and breadth is good food for thought. Python has long been a "batteries included" language whereas node has not. Node makes it easy to publish packages, whereas my experience with Python is that it is quite difficult to get a handle on doing PyPI (versus Anaconda!) the right way. Publishing wheels versus source is confusing and it took me a while to understand the nuances. There's also the question of supporting python2 vs python3, which people from different quarters will criticize no matter what approach you take.<p>Publishing binaries on Python is also difficult and a highly skilled endeavor requiring additional knowledge of compiling shared objects, docker, and the baroque manylinux concept. Once you have a system down, it becomes easy, but it took me about 6-12 months of accepting increasingly complex requirements before I could get a decent handle on it. I have not tried publishing binaries on npm so I can't say what the relative experience is.
Kinda surprised to not see Perl in there in some form, since it's still around as the glue holding a lot of older systems together (and for new stuff, but that's not the point of this article). Then again, it's got a somewhat different story, since most (almost all) of the old Perl from decades ago will run fine on a brand new Perl interpreter released recently.
I can't help but look at this from the other direction: why does code constantly have to be updated? There's something to be said for something that, once built, simply <i>works</i>.<p>People love plenty of old things: antique furniture, oldtimer cars, classic books from centuries ago, monumental buildings. But software needs to be constantly rewritten and updated, and that takes a lot of work.<p>On the one hand I don't want to make the case for outdated languages and systems, but on the other, we are spending a lot of effort just keeping things up to date with the latest technologies. Sometimes that's really necessary of course; security holes need to be fixed, and frequently new features are necessary. Better ways of doing things have been discovered or developed. But man, there's a lot of effort going into these legacy systems.<p>Although I love learning new things, I also kinda hope that some day we'll reach systems and languages that are so well-designed that they don't need to be changed much, or at least keeping them up to date will become trivial.
I know the Rust Evangelism Strike Force is a meme, but Rust genuinely has a good solution to language rot with its "edition" concept, where the language can be updated and the compiler just converts all code to a mutually-interoperable internal representation. It’s not dissimilar to using something like Babel for JS, except that because this internal representation is extremely simple, it’s far easier for the language to change more, add restrictions, remove restrictions, and so forth, while still being interoperable. Sadly, Rust has no good equivalent when it comes to libraries, and it’s not unheard of to have two libraries that should be interoperable fail to compile because there’s no way to find a single version of a shared library that satisfies both of their version constraints, even though the actual types in question are identical. I’m not sure if the error messages still look like this, but they used to say something like `expected foo::Bar, got foo::Bar`.
> In all likelihood the reason the average age of COBOL programmers is stable is because COBOL programmers develop their depth of experience and expertise in other languages before moving over to COBOL later in their career.<p>No, it's probably that there's a steady but small influx of new developers coming in and the bell curve is throwing off the 'average'. Lies, damned lies and statistics.<p>Answering questions with statistics is a rookie mistake.
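To make the statistical point concrete, here is a toy simulation (entirely hypothetical numbers) showing how a small steady influx of younger entrants can hold the mean age roughly constant even while every individual in the population ages:

```python
# Hypothetical cohort: 100 existing COBOL developers, all age 45,
# plus a dozen 30-year-old entrants joining each year.
ages = [45] * 100

for year in range(10):
    ages = [a + 1 for a in ages]   # everyone ages one year
    ages += [30] * 12              # small influx of younger entrants

mean_age = sum(ages) / len(ages)   # stays close to the original 45
```

After a decade the original developers are all 55, yet the mean barely moves; a flat average tells you nothing about whether the workforce is being replenished or just slowly hollowing out.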
I'm banking on Java still being needed when I'm 65 (~20 years from now) and close to retirement. When my 401k is worthless, I'll still be able to make a living until I'm dead and buried. Seriously though, Java is the COBOL of tomorrow.
She found plenty to talk about without even mentioning the Lava Flow anti-pattern.<p>Her other essays are also insightful. The one about Steve Jobs is the best about him I have read, even considering Isaacson.
Couldn't read the article because I've "read all of your free stories this month". I would prefer a few ads over Medium's walled garden.