This reminds me of the distinction between entropy and complexity. A hot cup of water has high entropy; a cold ant has high complexity. Another example I've seen describes the transition between the low-entropy state of two different-temperature bodies of fluid in contact with each other, and the high-entropy state once they've equalized: complexity arises during the transition, in the convection currents and turbulent mixing.<p>Maybe spaghetti code is closer to entropy, and we should expect busy beavers to be more analogous to complexity. Busy beavers sit on the transition between terminating (low entropy?) and non-terminating (high entropy?) programs, and complexity arises temporally between the low- and high-entropy states... Not a perfect analogy, I know, and I'm not sure how to describe it precisely.
Ascribing qualities like "well structured" or "spaghetti code" to programs as tiny as ~40 bits seems an exercise in futility.<p>However, we can prove some precise results if we assume, for instance, that the number of well structured n-bit programs is at most c^n for some constant c&lt;2.<p>In that case no busy beaver beyond a certain fixed size can be well structured, since that would make it compressible.<p>So busy beavers, by virtue of being incompressible, cannot have much structure.
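For concreteness, the counting step can be sketched like this (a rough sketch; K denotes prefix Kolmogorov complexity, and the incompressibility of busy beavers is taken as given from the argument above):

```latex
% Any well structured n-bit program p can be specified by n together
% with its index in an enumeration of the well structured n-bit
% programs, of which there are at most c^n with c < 2:
\[
K(p) \;\le\; n\log_2 c + O(\log n) \;<\; n
\quad \text{for all sufficiently large } n.
\]
% But an incompressible n-bit program q satisfies
\[
K(q) \;\ge\; n - O(1),
\]
% so beyond some fixed size no busy beaver can be well structured.
```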
GIT (Goedel's Incompleteness Theorem 1) says that for every consistent formal system there is some statement about the natural numbers which can't be proved in it. This article made me think that maybe any statement about the naturals could be proved by F^n(M) where F is some sort of "strengthening" operation on the formal system M (probably Peano arithmetic), and n is an ordinal number. The problem of proving statements about integers then reduces to inventing notations for expressing ordinal numbers. It's then easy to believe that there might be diminishing returns on expressing larger ordinals, and therefore GIT doesn't matter much in practical reality.<p>I haven't made the above into a formal conjecture. Is there any truth to the above? Have I missed something obvious? Is there a result in the literature to this effect? Cheers.
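To make the idea concrete, here is one natural candidate for the strengthening operator F (an assumption on my part, not necessarily what you have in mind): add the consistency statement at each successor step, and take unions at limits.

```latex
\[
F(M) = M + \mathrm{Con}(M), \qquad
F^{\alpha+1}(M) = F\big(F^{\alpha}(M)\big), \qquad
F^{\lambda}(M) = \bigcup_{\alpha<\lambda} F^{\alpha}(M)
\ \ (\lambda \text{ a limit ordinal}).
\]
```

The limit stages are exactly where ordinal notations become the bottleneck: to keep F^λ(M) a usable formal system you need the union to be effectively (recursively) axiomatizable, which requires a notation for λ.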
I kind of wonder if there is some overlap between this problem (or some variation of it) and the, as I understand it, still-open question of how life manages to evolve as quickly as it does.
I don't think the "living nightmare" control flow graph is that bad?<p>B is once again a while(input == 0) loop, that exits to D. D and C together form a while(input == 1) loop, that exits to A. And A is again a switch between two routes, although in this case, rather than dispatching to one of two independent routes, it switches between a full route (go to B) and a partial route (jump straight to D); basically you could consider B and C to exist together in an if-block of sorts.
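Here's a rough structured-code rendering of my reading of that graph. The node names A, B, C, D mirror the article's CFG, but the machine's actual actions are elided; this just records which node executes on a given input sequence, so take it as a sketch rather than the article's machine:

```python
def trace(inputs):
    """Sketch of the control flow described above (my reading of it).

    Runs until the input sequence is exhausted and returns the list
    of CFG nodes visited, in order.
    """
    it = iter(inputs)
    visited = []
    try:
        while True:
            visited.append("A")           # A: switch between two routes
            if next(it) == 0:             # full route: go through B first
                visited.append("B")
                while next(it) == 0:      # B: while(input == 0) loop...
                    visited.append("B")
                # ...that exits to D
            # partial route: jump straight to D
            while True:                   # D and C: while(input == 1) loop
                visited.append("D")
                if next(it) != 1:
                    break                 # exits back to A
                visited.append("C")
    except StopIteration:
        pass
    return visited
```

For example, trace([1, 1, 0]) visits A, skips B, then bounces D → C → D before returning to A, which matches the "partial route" path described above.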
This article presumes too much foreknowledge of what is being talked about for me.<p>Does anyone have a pointer to what I could read to understand this?
It's amazing how many people jumped in to comment without even a minimal glance at the article.<p>Please folks, before jumping in to comment, at least make sure the article is about what you think it's about.
I was about to write a rant about how this author must be out of his mind, or simply a junior engineer lost in narcissism...<p>How wrong I was!<p><a href="https://en.m.wikipedia.org/wiki/Busy_beaver#:~:text=In%20theoretical%20computer%20science%2C%20the,are%20excluded%20from%20the%20game" rel="nofollow">https://en.m.wikipedia.org/wiki/Busy_beaver#:~:text=In%20the...</a>. The busy beaver game is to design a Turing machine that produces the most output while still halting. And this conjecture is about a general property of the programs that produce the most output.