The "PC losering" anecdote in Gabriel's original essay is vert dated.<p>In fact, neither design is the "better".<p>In not-so-modern-anymore POSIX, you can choose whether a system call will be restarted after a signal is handled, or whether it will terminate with an error. Both requirements are needed.<p>It is signals themselves that are "worse". But they let you have asynchronous behaviors without using threads.<p>Sometimes you want a signal handler to just set some flag. This is because you have to be careful what you do in a signal handler, as well as how much you do. And then if you want the program to react to that flag, it behooves you to have it wake up from the interrupted system call and not go back to sleep for another 27 seconds until some network data arrives or whatever.<p>In addition to sigaction, you can also abort a system call by jumping out of a signal handler; in POSIX you have sigsetjmp and siglongjmp which save and restore the signal mask. So that would be an alternative to setting a flag and checking. If you use siglongjmp, the signal itself can be set up in such a way that the system call is restarted. The signal handler can then choose to return (syscall is restarted) or bail via siglongjmp (syscall is abandoned). I wouldn't necessarily want to be forced to use siglongjmp as the only way to get around system calls being always restartable.<p>Anyway, the Unix design showed to be capable of being "worse for now", and have space to work toward "better eventually".<p>In the present story, the monolithic linker design isn't "worse". Let's just look at one aspect: crashing on corrupt inputs. Is that a bad requirement not to require robustness? No; the requirement is justifiable, because a linker isn't required to handle untrusted inputs. It's a tool-chain back-end. The only way it gets a bad input is if the middle parts of the toolchain violate its contract; the assembler puts out a bad object file and such. It can be a wasteful requirement to have careful contract checking between internal components.<p>Gabriel naturally makes references to Lisp in the Rise of Worse is Better, claiming that Common Lisp is an example of better. But not everything is robust in Common Lisp. For instance the way type declarations work is "worse is better": you make promises to the compiler, and then if you violate them, you have undefined behavior. Modifying a literal object is undefined behavior in Common Lisp, pretty much exactly like in ISO C. The Loop macro's clause symbols being compared as strings is worse-is-better; the "correct requirement" would have been to use keywords, or else symbols in the CL package that have to be properly made visible to be used.<p>I don't think that Gabriel had a well reasoned and organized point in the essay and himself admitted that it was probably flawed (and on top of that, misunderstood).<p>The essays is about requirements; of course the assumption is that everyone is implementing the requirements right: "worse" doesn't refer to bugs (which would be a strawman interpretation) but to a sort of "taste" in the selection of requirements.<p>Requirements have so many dimensions that it's very hard to know which directions in that space point toward "better". There are tradeoffs at every corner. Adopt this "better" requirement here, but then you have to concede toward "worse" there. 

If we look at a single requirement at a time, it's not difficult to acquire a sense of which direction is better or worse, but the combinations of thousands of requirements are daunting.

If there is a truth in Gabriel's essay, it is that adherence to principled absolutes is often easily defeated by flexible reasoning that takes the context into account.

3.1415926 is undeniably a better approximation of pi than 3.14. But if you had to estimate with pencil and paper how many tiles you need for a circular room, it would be worse to use 3.1415926. You would just do a lot of extra work, for no benefit; the estimate wouldn't be any better. Using the worse 3.14 is better than using 3.1415926; that may be the essence of "worse is better". On the other hand, if you have a calculator with a pi button, it would be worse to punch in 3.14 than to just use the button, and the fact that the button gives you pi to 17 digits is moot. A small bit of context like that can change the way the worse-is-better reasoning is applied.
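
To put made-up numbers on that (the 4 m room and 30 cm tiles below are invented purely for illustration):

    /* Throwaway illustration: the room radius and tile size are invented
       numbers, just to show the extra digits don't change the estimate. */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        const double radius = 4.0;            /* assumed room radius, meters */
        const double tile_area = 0.3 * 0.3;   /* assumed tile size, square meters */
        const double pi_approx[] = { 3.14, 3.1415926, 3.14159265358979 };

        for (int i = 0; i < 3; i++) {
            double area = pi_approx[i] * radius * radius;
            printf("pi ~ %-16.14g  area %.4f m^2  tiles %.0f\n",
                   pi_approx[i], area, ceil(area / tile_area));
        }
        return 0;
    }

With those numbers, all three approximations come out to the same 559 tiles, before you even add a waste allowance; the extra digits never survive the rounding.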