First, I love that this is from over 60 years ago. Here I am in the 21st century, happily thinking that genetic algorithms are relatively new techniques, and along comes Von Neumann on HN, reminding me that, no, of course this is foolish: they are, if not billions of years old, older than old, newer than new, infinitely permanent.<p>Second, more specifically, error correction is a natural extension of two techniques/mandates:<p>a) The need for energy: a machine that is not provided plentiful energy will soon cease to operate as a machine unless it can gather its own, and graduate to organism status.<p>b) Reproduction: many copies have the opportunity to survive errors that are catastrophic to a single organism (e.g. lightning strikes), and reproduction with mutation reduces the effects of minor but still limiting errors (e.g. cold-bloodedness).<p>Thanks for the link.
Can anyone comment on the relevance of this in modern CS or related fields? As a non-CS major, whenever I see material from Von Neumann, I can't help but be amazed by it. Yet I get the impression that while many people appreciate his work (he presents very compelling concepts, in my opinion), it didn't have the impact on CS or other fields that you would expect. Am I right or wrong? Where are these ideas of Von Neumann's applied today?
<a href="https://en.wikipedia.org/wiki/Reliability_theory" rel="nofollow">https://en.wikipedia.org/wiki/Reliability_theory</a><p><a href="https://dx.doi.org/10.1006%2Fjtbi.2001.2430" rel="nofollow">https://dx.doi.org/10.1006%2Fjtbi.2001.2430</a><p>"Reliability theory is a general theory about systems failure. It allows researchers to predict the age-related failure kinetics for a system of given architecture (reliability structure) and given reliability of its components. Reliability theory predicts that even those systems that are entirely composed of non-aging elements (with a constant failure rate) will nevertheless deteriorate (fail more often) with age, if these systems are redundant in irreplaceable elements. Aging, therefore, is a direct consequence of systems redundancy. Reliability theory also predicts the late-life mortality deceleration with subsequent leveling-off, as well as the late-life mortality plateaus, as an inevitable consequence of redundancy exhaustion at extreme old ages.<p>The theory explains why mortality rates increase exponentially with age (the Gompertz law) in many species, by taking into account the initial flaws (defects) in newly formed systems. It also explains why organisms “prefer” to die according to the Gompertz law, while technical devices usually fail according to the Weibull (power) law. Theoretical conditions are specified when organisms die according to the Weibull law: organisms should be relatively free of initial flaws and defects. The theory makes it possible to find a general failure law applicable to all adult and extreme old ages, where the Gompertz and the Weibull laws are just special cases of this more general failure law. The theory explains why relative differences in mortality rates of compared populations (within a given species) vanish with age, and mortality convergence is observed due to the exhaustion of initial differences in redundancy levels.
Overall, reliability theory has an amazing predictive and explanatory power with a few, very general and realistic assumptions. Therefore, reliability theory seems to be a promising approach for developing a comprehensive theory of aging and longevity integrating mathematical methods with specific biological knowledge."
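The central claim above (a redundant system of non-aging parts nevertheless "ages") is easy to check numerically. Here is a minimal sketch, not from the paper, using the textbook case of a parallel system of n identical components, each with a constant hazard rate: the system's instantaneous failure rate starts near zero and climbs with age as redundancy is exhausted, approaching the single-component rate.

```python
import math

def system_hazard(t, n=5, lam=1.0):
    """Instantaneous failure rate at age t of a parallel system of n
    identical components, each with constant hazard rate lam (i.e. each
    component is 'non-aging': exponentially distributed lifetime)."""
    p = 1.0 - math.exp(-lam * t)          # P(a given component is dead by t)
    survival = 1.0 - p ** n               # system lives while any component lives
    density = n * p ** (n - 1) * lam * math.exp(-lam * t)  # system failure density
    return density / survival

# Failure rate rises with age even though no individual part ages.
hazards = [system_hazard(t) for t in (0.5, 1.0, 2.0, 4.0)]
print(hazards)
```

With n = 5 and lam = 1, the printed hazards increase monotonically toward 1 (the lone-survivor rate), which is the "aging as redundancy exhaustion" effect the abstract describes.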