I love seeing articles like this... my compiler design course in college was one of my favorites. Also, one of the most useful. Parsers and lexers are useful in so many places besides just code compilers.<p>With that said, I think there is one small issue in the article:<p>> The quintessential first step in any compiler is parsing the source string into an Abstract Syntax Tree<p>If there is a quintessential first step in writing a compiler, it is lexical analysis: using a lexer to break the program up into tokens, then a parser to build the AST. While you could go straight from the raw text to parsing, it's a lot easier if you lex it first.
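To make the lex-then-parse split concrete, here is a toy lexer sketch (not from the article; the token names and the tiny arithmetic syntax are made up for illustration):

```python
import re

# Hypothetical token spec for a tiny arithmetic language.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("LPAREN", r"\("),
    ("RPAREN", r"\)"),
    ("SKIP",   r"\s+"),          # whitespace: matched but not emitted
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def lex(src):
    """Break raw source text into (kind, text) tokens before parsing."""
    for m in MASTER.finditer(src):
        if m.lastgroup != "SKIP":
            yield (m.lastgroup, m.group())
```

The parser then consumes a flat token stream instead of raw characters, e.g. `list(lex("x = 3 + 4"))` yields `[("IDENT","x"), ("OP","="), ("NUMBER","3"), ("OP","+"), ("NUMBER","4")]`, which is much easier to write a grammar against.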
It's nice to have a small example of how to compile to LLVM, but the compiler is a bit more limited than what the blog post makes it appear.<p>It's not quite `a compiler for simply typed lambda calculus', but only for a small fragment without higher-order functions. One currently cannot write lambda terms that take functions as arguments.<p>I was curious how the compiler represents closures and manages memory, mainly because I'm looking for a small example of how to do garbage collection in LLVM. But it turns out that the parser doesn't allow function types yet and the compiler itself doesn't implement closures yet.
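To make the missing piece concrete: a higher-order term is one where a function is itself an argument or a result, and compiling it forces you to represent functions as values, i.e. closures (a code pointer plus a captured environment), whose heap allocations are exactly what a GC would later reclaim. In Python terms (just an illustration of the concept, not the post's syntax):

```python
def make_adder(n):
    # The returned lambda captures `n`: compiling this requires a closure,
    # i.e. a code pointer plus an environment record holding `n` -- the
    # heap allocation a garbage collector would eventually have to reclaim.
    return lambda x: x + n

def apply_to_one(f):
    # Higher-order: `f` arrives as a function value, not a known symbol.
    return f(1)

print(apply_to_one(make_adder(41)))  # closure creation + higher-order application
```

Without function types in the parser, neither `apply_to_one` nor the environment capture in `make_adder` can even be expressed, which is presumably why the compiler sidesteps closures and memory management entirely.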
I remember building funny little DSL-Tools with flex and bison.
<a href="https://github.com/westes/flex" rel="nofollow">https://github.com/westes/flex</a><p>I used it for AUTOSAR applications, where you create tailored embedded programs from a DSL specification.