In most of the cases discussed in the article, it's not the software that rots but the users and/or organization that decay. When a system is created, it is generally designed to solve a known set of problems encountered by an existing population of users. Over time, the tool creators, original users, and the problem domain all change, yet the old system is often modified to chase those changes. A flexible system may be adaptable to a certain amount of change, but only if the current population of creators/maintainers and users understands the limits. If they do not, the software often ends up less useful than if it had never been touched.
Note, the article is not talking about bit rot; I think that's confusing a lot of people.<p>The point might be clearer as:<p>"Adaptability and efficiency are opposing priorities."<p>Ecosystems face this too. A stable environment will lead to adaptations that improve efficiency, while creating new dependencies on everything staying the same. In a sense, species are constantly competing to make the ecosystem more fragile.<p>If this holds, there are broad impacts on information systems outside of software. We like to fantasize about the mind being immortal. Maybe we could fix the telomere thing, figure out cancer, hop our brain to a clone, or upload our consciousness to some cloud.<p>But in my experience, being mentally alive involves some mix of plasticity and progressive refinement. You can't have both forever.
Software rots if you can no longer set up the exact tool chain and environment (compiler, build tool, OS version, external database, etc.) required to build and run it. Even interpreted languages suffer from this problem as features are subtly changed - by design or by accident. It doesn't matter whether your project is using an ancient version of VC++ or a two-year-old version of NodeJS. If it doesn't keep up with the latest releases of the tools, then it's already got one foot in the grave.
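A minimal sketch of one defense against this, in Python (the pinned version numbers and the helper names are my own illustrative assumptions, not from the comment): record the toolchain a build is known to work with, and fail fast with a clear message when the environment drifts.

```python
# Sketch: pin the toolchain a project is known to build with, and report
# drift explicitly. The pinned versions here are illustrative assumptions.
import sys

def version_matches(pinned, actual):
    """True if `actual` starts with the pinned version components."""
    return tuple(actual[:len(pinned)]) == tuple(pinned)

# Hypothetical pin for this project: "we last built cleanly on Python 3.10".
PINNED_PYTHON = (3, 10)

def check_toolchain(actual_python=None):
    """Return a list of drift messages; an empty list means the environment matches."""
    actual = actual_python if actual_python is not None else sys.version_info[:3]
    drift = []
    if not version_matches(PINNED_PYTHON, actual):
        drift.append(f"python: pinned {PINNED_PYTHON}, found {tuple(actual)}")
    return drift

if __name__ == "__main__":
    # Simulated environments, so the sketch runs anywhere.
    print(check_toolchain((3, 10, 4)))   # matching environment -> []
    print(check_toolchain((3, 12, 0)))   # drifted environment -> one message
```

The same shape extends to compilers, database versions, or anything else the build silently depends on; the point is that drift becomes a named failure rather than a mystery.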
Unless you're sandboxed in a self-contained hardware and software environment (such as an embedded system), you will eventually be screwed. ("Rot" sounds like a gradual degradation; in my experience it's generally not.)<p>Standards changes and OS updates are the biggest culprits. The Vista update broke a couple of my old programs (written for Windows 95) due to the user access control changes; another was broken because it interfaced to a piece of hardware on the parallel port, and both the parallel port and the manufacturer went the way of the dodo.<p>I have seen a stand-alone DOS 6 program running a machine at a factory. The PC has been replaced three times and is now only a few years old, but the operator says it still does the job. I also have an 8051-powered clock I built in 1987 that still happily ticks along if I plug it in.
Three reasons:<p>1. The code is the same but the people using it forget / never knew the reasons it was built that way - and so it looks rotten for the job.<p>2. Because requirements change and people try to make the old code do new things without cleaning up / refactoring correctly - so it now does neither job well, and looks rotten.<p>3. Because the environment / platform changes: the FTP server is moved to a new data center and the timeouts kill the jobs, etc. It looks rotten.<p>"Rot", more accurately, is just the code base failing to keep up with entropy <i>outside</i> the code base.
An interesting simile here is that software rots in the same way that Encyclopedias do.<p>(This is admittedly a fitting simile partly because the simile itself is being rotted by software like Wikipedia.)<p>In the world where new Encyclopedias (and Almanacs and Recipe Books) were printed and sold on an annual basis, the question was often why we need "this year's Encyclopedia" when the old one is still perfectly valid. Books in general decay pretty slowly and have a long shelf life, but the facts and views of the world inside them are frozen and possibly stale. Changes from year to year of an Encyclopedia are somewhat hard to notice, but in Middle School in the 90s I recall having to compare articles from a tobacco-yellowed Encyclopedia set from the 70s to the same articles in very early predecessors of Wikipedia. The worlds contained in those two sorts of Encyclopedias were very interestingly diverging. The yellowed Encyclopedia's facts were almost all still valid and "worked", but there were things that didn't hold up and lots of new facts that needed to be inserted in various places. If I were to edit an Encyclopedia, I'm not sure I would start from the version in that yellowed Encyclopedia if I could find a more recent set. Some of the predecessors to Wikipedia were direct descendants of that yellowed Encyclopedia, and yet for various reasons, historical and technical, Wikipedia itself did not inherit directly from that set in any meaningful way.<p>(It's interesting to note too that the physical media of software to date has a much shorter shelf life than the pulp medium of books, tobacco-smoke-filled library aging included, so an argument exists that software rots worse than Encyclopedias physically, at least.)
I find that integrated tests in software projects go a long way toward reducing rot. When something invariably breaks due to some external factor, the test suite greatly reduces the time to identify and fix the problem.
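One cheap way to get that effect, sketched in Python (the specific pinned behaviors are illustrative assumptions): a "canary" test that asserts the environmental facts the rest of the suite quietly relies on, so external drift fails in one obvious, well-named place.

```python
# Sketch of an environment "canary" test: pin the external facts the rest of
# the suite depends on, so drift fails here first with a clear message.
# The specific pins below are illustrative assumptions, not from the comment.
import sys
import json

def test_interpreter_is_python3():
    # The codebase assumes Python 3 semantics (e.g. true division).
    assert sys.version_info.major == 3
    assert 3 / 2 == 1.5

def test_json_roundtrip_still_holds():
    # Pin a library behavior we rely on: dump/load is lossless for this shape.
    record = {"id": 7, "tags": ["a", "b"]}
    assert json.loads(json.dumps(record)) == record

if __name__ == "__main__":
    test_interpreter_is_python3()
    test_json_roundtrip_still_holds()
    print("canary passed")
```

When an external dependency shifts under the project, a file like this fails first and names the assumption that broke, instead of letting the breakage surface somewhere deep in application code.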
"... the arrogance and self-indulgence of youth."<p>s/youth/younger programmers/<p>How can they make work for themselves?<p>1. Do what's already been done, not even knowing it's already been done.
2. Declare "rot" or some similar claim of obsolescence and proceed to redo what's already been done.<p>There's nothing necessarily awful about this unless they fail to do a better job than the earlier effort.<p>Alas, this is too often the case, for a variety of reasons.<p>In the early days, portability was a higher priority. Not to mention longevity. Because everything was expensive.<p>Today's software "rots" a lot faster than the software from the early days of computing, IMO.<p>And so the younger programmers have lots of "work" to do.<p>Yet I do not see much progress being made. Because I do not measure progress by productivity alone.<p>Programmers who can churn out code in a dozen different languages to do the same old things are a dime a dozen.<p>As a user, I do not want software that needs to be updated every week. That usually means poorly written software and gratuitous use of network bandwidth.<p>But I can see how programmers who love writing code would enjoy this state of affairs.
I think <i>rot</i> is not quite the correct metaphor. In my experience it's more likely to ossify, become sclerotic, build up scar tissue. As features are added or performance is tweaked, individual pieces become more complex and the connections between them multiply. If specific action (refactoring) isn't taken to fight this tendency, later developers will react to one piece being unmaintainable by making even more spurious connections and workarounds in adjacent pieces. That fixes the immediate problem, but makes things worse overall in the long term. Ultimately everything turns into the kind of tangled mess that everyone who has worked on an old multi-person project can recognize.<p>Unfortunately, a good refactoring requires understanding greater than the original author's[1], and therein lies another whole essay. ;)<p>[1] Related to <a href="http://www.linusakesson.net/programming/kernighans-lever/" rel="nofollow">http://www.linusakesson.net/programming/kernighans-lever/</a>
> Apache, the most important web server software today, is an old piece of technology whose name is a play on words (“a patched server”) indicating that it has been massively patched.<p>Is this true? I have never heard that Apache was a play on words.
There is an analogy to be drawn between software and societies, and the way their early adaptations to one environment block their later adaptations to another.
What does rot even mean, in terms of software?<p>As far as I can tell, he means "Why is complex software hard to <i>change</i>" which is a reasonable, though fairly easy to answer, question.<p>Software doesn't rot. New features or refactors screw it up. New needs or technologies may make it obsolete. But it doesn't rot, and people who talk that way are often busy-body rewriters who want to pitch the existing implementations, with all their Chesterton Fences[1], and begin anew.<p>[1] <a href="http://www.chesterton.org/taking-a-fence-down/" rel="nofollow">http://www.chesterton.org/taking-a-fence-down/</a>
Stewart Brand wrote a book called "How Buildings Learn" and in many ways it's a better version of this post.<p><a href="https://en.wikipedia.org/wiki/Shearing_layers" rel="nofollow">https://en.wikipedia.org/wiki/Shearing_layers</a>
> Newer programming languages are often interesting, but they are typically less flexible at first than older languages. Everything else being equal, older languages perform better and are faster<p>At the cost of more difficulty in writing, or other tradeoffs (e.g. security).<p>> Programmers, especially young programmers, often prefer to start from scratch. .. In part because it is much more fun to write code than to read code, while both are equally hard.<p>No way! You need to NPM literally everything.