The theory of combinatory logic is beautiful. For instance, with basis {S,K}, a term is either S or K or an application of one term to another. S and K satisfy<p><pre><code> S x y z = x z (y z)
K x y = x
</code></pre>
These terms can represent any computable function.
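To make the two rules above concrete, here is a minimal sketch (not from the comment) of a normal-order S/K reducer in Python, assuming a hypothetical encoding where a term is the string "S", "K", a free variable name, or a tuple `(f, x)` for application. It checks the classic fact that `I = S K K` behaves as the identity:

```python
# Minimal SK combinator reducer (illustrative sketch).
# Terms: "S", "K", a variable name, or a tuple (f, x) meaning "f applied to x".

def reduce_term(t):
    """Repeatedly apply the head redex: K x y = x, S x y z = x z (y z)."""
    while True:
        spine, args = t, []
        while isinstance(spine, tuple):      # unwind the application spine
            spine, arg = spine
            args.append(arg)
        args.reverse()                       # args now left-to-right
        if spine == "K" and len(args) >= 2:
            t = args[0]                      # K x y = x
            rest = args[2:]
        elif spine == "S" and len(args) >= 3:
            x, y, z = args[0], args[1], args[2]
            t = ((x, z), (y, z))             # S x y z = x z (y z)
            rest = args[3:]
        else:
            return t                         # no head redex left
        for a in rest:                       # reattach any remaining arguments
            t = (t, a)

# I = S K K acts as the identity: I x reduces to x.
I = (("S", "K"), "K")
print(reduce_term((I, "x")))  # → x
```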
This is clean and simple. I don't quite see what the theory of concatenative combinators achieves beyond complicating matters with stacks and quotations to obtain some concatenative property of programs.<p>The analogues of S and K in concatenative combinators are<p><pre><code> [C] [B] [A] s == [[C] B] [C] A
[B] [A] k == A
</code></pre>
which look a bit more complicated, but as the article says this is not even a basis:<p>> This almost gives us completeness; however, there is no way to form "dip" or "sip" using just "s" and "k" because, roughly speaking, they provide no way to dequote items buried in the stack.<p>What can you express in concatenative combinators that is not as easily expressed in combinatory logic?
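The s and k rules quoted above can be made runnable with a tiny stack evaluator. This is an illustrative sketch, not from the article: quotations are represented as Python lists of tokens, the stack top is the rightmost element, and executing a quotation means splicing its body onto the front of the remaining program:

```python
# Sketch of a tiny concatenative evaluator implementing the s and k rules.
# Quotations are Python lists; any other token is treated as a literal.

def run(program, stack=None):
    stack = [] if stack is None else stack
    queue = list(program)
    while queue:
        word = queue.pop(0)
        if isinstance(word, list):
            stack.append(word)        # a quotation pushes itself
        elif word == "k":
            a = stack.pop()           # [B] [A] k == A
            stack.pop()               # discard [B]
            queue = a + queue         # dequote and run A
        elif word == "s":
            a = stack.pop()           # [C] [B] [A] s == [[C] B] [C] A
            b = stack.pop()
            c = stack.pop()
            stack.append([c] + b)     # push [[C] B]
            stack.append(c)           # push [C]
            queue = a + queue         # dequote and run A
        else:
            stack.append(word)        # literal
    return stack

# k drops the second item and runs the top quotation:
print(run([["b"], ["a"], "k"]))  # → ['a']
```

Note that s has to rebuild a new quotation `[[C] B]` on the stack, which is exactly the kind of quotation manipulation that plain combinatory logic never needs.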
"Composition Intuition by Conor Hoekstra | Lambda Days 2023" (<a href="https://www.youtube.com/watch?v=Mj8jxYS-hi4" rel="nofollow">https://www.youtube.com/watch?v=Mj8jxYS-hi4</a>) is all about combinators.
Related:<p><i>The Theory of Concatenative Combinators (2007)</i> - <a href="https://news.ycombinator.com/item?id=20849180">https://news.ycombinator.com/item?id=20849180</a> - Aug 2019 (3 comments)<p><i>The Theory of Concatenative Combinators (2007)</i> - <a href="https://news.ycombinator.com/item?id=12328870">https://news.ycombinator.com/item?id=12328870</a> - Aug 2016 (2 comments)<p><i>The Theory of Concatenative Combinators</i> - <a href="https://news.ycombinator.com/item?id=11490051">https://news.ycombinator.com/item?id=11490051</a> - April 2016 (5 comments)