Author here. The paper is currently in a state of flux, as I started adding material about the graphical notation, but didn't get very far before my attention drifted to other topics. So please have a look at the following page as well<p><a href="http://tromp.github.io/cl/cl.html" rel="nofollow">http://tromp.github.io/cl/cl.html</a><p>which links to an explanation of the graphical notation, to my corresponding IOCCC entry from 2012, and to a Wikipedia page on Binary Lambda Calculus that has since been deleted.
If you like this, see also [1], as well as the programming languages Joy, Iota, and Jot [2]. And maybe "Algorithmically probable mutations reproduce aspects of evolution such as convergence rate, genetic memory, and modularity": <a href="https://arxiv.org/abs/1709.00268v8" rel="nofollow">https://arxiv.org/abs/1709.00268v8</a><p>> In the context of his Metabiology programme, Gregory Chaitin, a founder of the theory of algorithmic information, introduced a theoretical computational model that evolves ‘organisms’ relative to their environment considerably faster than classical random mutation. While theoretically sound, the ideas had not been tested and further advancements were needed for their actual implementation. Here we follow an experimental approach heavily based on the theory that Chaitin himself helped found. We apply his ideas on evolution operating in software space on synthetic and biological examples and even if further investigation is needed this work represents the first step towards testing and advancing a sound algorithmic framework for biological evolution.<p>[1] <a href="https://wiki.haskell.org/Chaitin%27s_construction" rel="nofollow">https://wiki.haskell.org/Chaitin%27s_construction</a><p>[2] <a href="https://www.nyu.edu/projects/barker/Iota/" rel="nofollow">https://www.nyu.edu/projects/barker/Iota/</a>
As a layman, I have a personal aspiration to understand this precise paper. It's very appealing to be able to express and reason about the related problems of complexity and edit distances using a generative(?) notation like lambda calculus that you can hack on in accessible languages like Haskell.<p>I also like that the citations include The Emperor's New Mind; Gödel, Escher, Bach; the Brainfuck homepage, and Haskell. The effects of these ideas on a couple of generations of young minds are bearing fruit.<p>It feels like we're on the brink of gamifying a lot of important math.
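To give a flavor of how hackable this is in Haskell: here's a minimal sketch of de Bruijn-indexed lambda terms together with Tromp's binary encoding from the paper (abstraction is 00, application is 01, and the variable with index i is i ones followed by a zero). The `Term` type and function names are my own illustration, not anything from the paper itself.

```haskell
-- De Bruijn-indexed lambda terms.
data Term = Var Int        -- variable, 1-based de Bruijn index
          | Lam Term       -- abstraction
          | App Term Term  -- application

-- Binary Lambda Calculus encoding:
--   variable i      -> i ones followed by a zero
--   abstraction     -> "00" ++ encoding of the body
--   application f x -> "01" ++ encoding of f ++ encoding of x
encode :: Term -> String
encode (Var i)   = replicate i '1' ++ "0"
encode (Lam b)   = "00" ++ encode b
encode (App f x) = "01" ++ encode f ++ encode x

main :: IO ()
main = do
  -- The identity \x.x encodes to "0010".
  putStrLn (encode (Lam (Var 1)))
  -- Self-application \x.x x encodes to "00011010".
  putStrLn (encode (Lam (App (Var 1) (Var 1))))
```

So the program-size complexity of a term is just the length of this bit string, which is what makes edit-distance and complexity questions so concrete in this setting.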
Is there a video lecture version of this paper that someone can recommend?<p>This is a great paper. But I burn so much mental energy trying to focus when I read something vs. when I listen to a video lecture. So I end up retaining more by watching, and even more when I take some notes.
HN feels psychic sometimes!... this paper happens to tie together a lot of the foundations I was lacking (out of ignorance) that I need for a personal research project. So happy to have found it :)