This is good, but you want to use a functional programming (FP) language with lightweight syntax like Lisp that translates directly to/from the intermediate code (icode) tree without additional parsing. Genetic Programming by John Koza explains it in detail:<p><a href="https://en.wikipedia.org/wiki/Genetic_programming" rel="nofollow">https://en.wikipedia.org/wiki/Genetic_programming</a><p>I read the 3rd edition:<p><a href="https://www.amazon.com/Genetic-Programming-III-Darwinian-Invention/dp/1558605436" rel="nofollow">https://www.amazon.com/Genetic-Programming-III-Darwinian-Inv...</a><p>That way, all processing resources can go towards exploring the problem space for potential solutions close to the global minimum or maximum, instead of being wasted on code containing syntax errors that won't execute.<p>So the agent's real-world Python LLM code would first be transpiled to Lisp and evolved internally, then, after it's tested and shown to perform better empirically than the original code, be translated back and merged into the agent.<p>The challenge then becomes transpiling to/from imperative programming (IP) languages like Python, which is still an open problem:<p>-<p>Going from Lisp to Python (or running Lisp within Python) is trivial; I've seen implementations for similar IP languages like C++ in about a page of code. They pop up on HN frequently.<p>But going from Python to Lisp (or running Python within Lisp) is a lot harder if one wishes to preserve readability, which may or may not matter here. Naive conversions bind variables under pseudonyms, so a Python variable like my_counter becomes int_123, and the result works like an emulator, merely executing the operations performed by the Python code. Mutability gets buried in monadic logic or functional impurity, which passes the buck rather than getting real work done.
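To make the syntax-error point concrete: because Lisp programs already are trees, GP's crossover operator can splice any subtree into any argument position and the result always parses and runs. A minimal Python sketch, with nested lists standing in for S-expressions (all helper names are mine, not from Koza):

```python
import random

# S-expressions as nested lists: ["+", "x", ["*", "x", "x"]] is (+ x (* x x)).
# Because the program IS the tree, any subtree swap yields a valid program.

def subtrees(expr, path=()):
    """Yield (path, subtree) for every node; operator heads are skipped."""
    yield path, expr
    if isinstance(expr, list):
        for i, child in enumerate(expr[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace(expr, path, new):
    """Return a copy of expr with the subtree at path replaced by new."""
    if not path:
        return new
    copy = list(expr)
    copy[path[0]] = replace(copy[path[0]], path[1:], new)
    return copy

def crossover(a, b, rng=random):
    """Graft a random subtree of b into a random position in a."""
    path_a, _ = rng.choice(list(subtrees(a)))
    _, sub_b = rng.choice(list(subtrees(b)))
    return replace(a, path_a, sub_b)

OPS = {"+": lambda x, y: x + y, "*": lambda x, y: x * y}

def evaluate(expr, env):
    if isinstance(expr, list):
        return OPS[expr[0]](*(evaluate(c, env) for c in expr[1:]))
    return env.get(expr, expr)  # variable lookup, or a literal

parent_a = ["+", "x", ["*", "x", "x"]]   # x + x*x
parent_b = ["*", 2, "x"]                 # 2*x
child = crossover(parent_a, parent_b)
print(evaluate(child, {"x": 3}))  # always executes: no syntax errors possible
```

Every child produced this way is a runnable program, so fitness evaluation never burns cycles on unparseable candidates.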
Structs, classes, associative arrays, etc. lose their semantic meaning and appear as a soup of operations without recognizable structure.<p>To my knowledge, nobody has done the hard work of partitioning imperative code into functional portions which can be transpiled directly to/from FP code. Those portions would contain only const variables and have no connection to other threads of execution beyond their initial and final values, making them free of side effects and expressible in prefix/postfix/infix notation, as imperative or functional code, without any change to the logic.<p>Mutability could be represented as shadowed variables within ephemeral functional sub-scopes, or by creating a new value name for each mutation and freeing the intermediate values via reference counting or garbage collection. Think of each new value as running in a forked version of the current process, with only that value being different after copy-on-write. A simple for-loop from 1 to 1000 would run that many forked processes, keeping only the last one, which contains the final value of the iterator.<p>Mutability can also be represented as message passing between processes. The FP portions would then be ordinary Lisp, glued together with IO functions, possibly monadic. I don't like how Haskell does this, mainly because I don't fully understand how it works. I believe that ClojureScript handles mutability of its global state store by treating each expression as a one-shot process communicating with the store, so that the code only sees initial and final values.
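The "new value name for each mutation" idea is essentially what compilers call static single assignment (SSA) form. A minimal sketch of the for-loop example, using plain Python as common ground:

```python
from functools import reduce

# Imperative version: one variable, mutated 1000 times in place.
total = 0
for i in range(1, 1001):
    total = total + i  # rebinds `total`; the old value is discarded

# Functional version: every "mutation" is a brand-new value. reduce
# threads the accumulator through pure calls, like the forked-process
# picture above: each step sees only the previous final value, produces
# the next one, and the intermediates are garbage-collected.
total_fp = reduce(lambda acc, i: acc + i, range(1, 1001), 0)

print(total, total_fp)  # both 500500; same result, no side effects
```

The loop body becomes a pure function of (previous value, input), which is exactly the shape that can be lifted into Lisp unchanged.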
While I'm not sure I fully understand how ClojureScript's approach works, it feels like a more comprehensible way of doing things, and probably a better representation of how real life works, as explained to me in this comment about Lisp Flavored Erlang (LFE) and Erlang's BEAM (see parent comments for the full discussion):<p><a href="https://news.ycombinator.com/item?id=43931177">https://news.ycombinator.com/item?id=43931177</a><p>Note that FP languages like Lisp are usually more concerned with types and categories than IP languages, and may need stronger rules around variable types to emulate logic that we take for granted in IP languages. For example, Lisp might offer numbers of unlimited size or precision that need to be constrained to behave like a float32. Similar constraints could affect things like character encoding and locale.<p>-<p>I first learned about everything I just explained around 2005 after reading the book. I first had thoughts about brute-forcing combinations to solve small logic circuit and programming challenges during my electrical and computer engineering (ECE) courses at UIUC in the late 1990s, because it took so much mental effort and elbow grease to create solutions that are obvious in hindsight.<p>Then the Dot Bomb happened, the Mobile bubble happened, the Single Page Application bubble happened, and the tech industry chose easy instead of simple:<p><a href="https://www.infoq.com/presentations/Simple-Made-Easy/" rel="nofollow">https://www.infoq.com/presentations/Simple-Made-Easy/</a><p>This is why we chose easy hardware like GPUs over simple highly multicore CPUs, and easy languages like Ruby/React over simple declarative, idempotent, data-driven paradigms like HTTP/HTML/htmx.<p>The accumulated technical debt of always choosing the quick and easy path set AI (and computing in general) back decades.
The AI Winter, endless VC wealth thrown at non-problems chasing profit, massive wealth inequality: so many things stem from this daily application of easy at the expense of simple.<p>I wish I could work on breaking down IP languages like Python into these const functional portions, with mutability handled through message passing in LFE, to create an IP <-> FP transpiler for optimization, automatic code generation and genetic algorithm purposes. Instead, I've had to survive by building CRUD apps, watching the glacial pace of AI progress from the sidelines.<p>It may be too late for me, but maybe these breadcrumbs will help someone finally get some real work done.
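One last breadcrumb, making the float32 constraint above concrete: a transpiler would have to squeeze every arithmetic result back through a 32-bit representation to match the IP language's semantics, rather than letting the FP side's wider numerics leak through. A minimal sketch using Python's standard struct module:

```python
import struct

def as_float32(x: float) -> float:
    """Constrain a Python float (64-bit) to IEEE-754 single precision,
    as a transpiler would after each arithmetic op to emulate float32."""
    return struct.unpack("f", struct.pack("f", x))[0]

# float32 has a 24-bit significand: integers above 2**24 lose precision.
print(as_float32(2.0**24 + 1))   # 16777216.0, not 16777217.0
# Familiar decimal fractions also round differently at 32 bits:
print(as_float32(0.1) == 0.1)    # False
```

The same round-tripping discipline would apply to overflow, character encoding, and any other semantics the IP side takes for granted.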