Every time I see a post from Lisp fans about macros, I want to be amazed, but I always just walk away confused. I can tell there's something interesting in there, but the quote-unquote-quasiquote syntax is just so dense that my brain is incapable of comprehending it.
You can achieve memoization of an expression inside a function without any global state or macros:<p><pre><code> (defun compute-hash (key hash f)
   "Get the value for KEY in HASH or compute it with F, enter it into HASH and return it."
   (multiple-value-bind (val win)
       (gethash key hash)
     (if win
         val
         (setf (gethash key hash) (funcall f key)))))

 (defun memoized (f)
   (let ((cache (make-hash-table)))
     (flet ((memo (x g) (compute-hash x cache g)))
       (lambda (&rest args) (apply f #'memo args)))))

 (defun fib (n)
   (if (<= n 1)
       n
       (+ (fib (1- n)) (fib (- n 2)))))

 ;; MEMO is a function that takes a key and a computing function.
 ;; If the key has been memoized, it returns the cached value; otherwise it
 ;; calls the computing function with the key and caches the result.
 (let ((example (memoized (lambda (memo x)
                            (format t "X: ~a~%" x)
                            (let ((result (funcall memo x #'fib)))
                              (format t "~a~%" (* 2 result)))))))
   (trace fib)
   (funcall example 5)
   (funcall example 5)
   (funcall example 5)
   (untrace fib))</code></pre>
Rust macros are more or less sufficient for this kind of rewriting, but it's arguably cheating: you have to annotate the function with the macro, and that annotation is what allows the macro to rewrite the whole function body.
> How do you implement `memoize`?<p>> I <i>think</i> that you basically can’t, in JavaScript. Or, more accurately: I can’t think of a way to do it.[1]<p>Oh, this is a case for WeakMaps, right?<p><pre><code> const MemoCache = new WeakMap();

 function memoize(f, x) {
   const cache = MemoCache.get(f) || new Map();
   MemoCache.set(f, cache);
   if (!cache.has(x)) {
     cache.set(x, f(x));
   }
   return cache.get(x);
 }</code></pre>
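To make those semantics concrete, here's a self-contained check (repeating `memoize` from above, with a made-up function `f`): because the cache is keyed only on the function, every call site shares it, so `memoize(f, 1) + memoize(f, 1)` invokes `f` just once.

```javascript
// memoize repeated from above so this snippet runs standalone.
const MemoCache = new WeakMap();

function memoize(f, x) {
  const cache = MemoCache.get(f) || new Map();
  MemoCache.set(f, cache);
  if (!cache.has(x)) {
    cache.set(x, f(x));
  }
  return cache.get(x);
}

let calls = 0;
const f = (x) => { calls += 1; return x * 10; };

// Both occurrences hit the same per-function cache, so f runs only once:
const sum = memoize(f, 1) + memoize(f, 1);
console.log(sum, calls); // 20 1
```

This is exactly the "global memoization" behavior, which is the easier of the two problems.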
Oh wait:<p>> 1. You could create a global memoization map keyed on the <i>function</i> that you’re calling, but this would actually have different semantics than I’m imagining. If I said `memoize(f, 1) + memoize(f, 1)` I would expect those to each invoke `f`, because instances of `memoize` shouldn’t share results. Why not? Because this is a fake example, and a global memoization is a different (easier!) thing than per-call-site memoization.<p>Like, I get what you're saying, but you could just cache the call site too?<p><pre><code> const MemoCache2 = new WeakMap();

 function memoize2(f, x) {
   const callsite = new Error().stack;
   const macro_cache = MemoCache2.get(f) || {};
   const micro_cache = macro_cache[callsite] || new Map();
   macro_cache[callsite] = micro_cache;
   MemoCache2.set(f, macro_cache);
   if (!micro_cache.has(x)) {
     micro_cache.set(x, f(x));
   }
   return micro_cache.get(x);
 }</code></pre>
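As a sanity check, here's a self-contained sketch (repeating `memoize2` from above, with a made-up `square` function) showing that a single call site evaluated repeatedly only invokes the underlying function once. One caveat worth flagging: V8 stack strings include line <i>and column</i> numbers, so two calls on the same source line still count as distinct call sites.

```javascript
// memoize2 repeated from above so this snippet runs standalone.
const MemoCache2 = new WeakMap();

function memoize2(f, x) {
  const callsite = new Error().stack;
  const macro_cache = MemoCache2.get(f) || {};
  const micro_cache = macro_cache[callsite] || new Map();
  macro_cache[callsite] = micro_cache;
  MemoCache2.set(f, macro_cache);
  if (!micro_cache.has(x)) {
    micro_cache.set(x, f(x));
  }
  return micro_cache.get(x);
}

let calls = 0;
const square = (x) => { calls += 1; return x * x; };

let total = 0;
for (let i = 0; i < 3; i++) {
  // The same textual call site on every iteration → one call to square.
  total += memoize2(square, 4);
}
console.log(total, calls); // 48 1
```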
I admit this is something of a trick, but it's trickery specifically to work around the fact that this person doesn't want to write `const my_f1 = memoize(f), my_f2 = memoize(f)` somewhere on the screen. Precisely because people who write JavaScript are not accustomed to macros, they aren't expecting `memoize(f, 1) + memoize(f, 1)` to be a proper memoization expression, and they aren't expecting weird stuff with WeakMaps and stack-trace inspection to identify call sites.
In the associated article linked to at "Leaving aside the absurdity of computing Fibonacci numbers recursively"[1] (which, yes, I agree with), the various algorithms are listed as (roughly):<p><pre><code> how to fibonacci           space complexity   time complexity
 -------------------------  -----------------  ----------------
 insane recursion           exponential        exponential
 memoized insane recursion  linear             linear
 </code></pre>
The space complexity of "insane recursion" without memoization is the maximum stack depth; the worst-case stack is<p><pre><code> fib(n)
   fib(n-1)
     fib(n-2)
       ...
         fib(1)
 </code></pre>
Which is <i>n</i> stack frames (and the stack frames are of constant size); the space complexity of the whole thing is thus linear in <i>n</i>. (While the call tree is itself exponential in size, the memory required is only the depth of that tree, since we can't call fib(n-1) and fib(n-2) simultaneously.[2])<p>(The time complexity is, of course, exponential, and I agree with the "insane" moniker. I also like your comment elsewhere in this thread about people hyperfocusing on the example and missing the larger point of the article … and I'm so sorry, but I've been sniped by this.)<p>[1]: <a href="https://ianthehenry.com/posts/fibonacci/" rel="nofollow">https://ianthehenry.com/posts/fibonacci/</a><p>[2]: the little demons in my mind are now trying to scheme up an insaner recursion that attempts this. Threads, maybe?
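For what it's worth, the linear-depth claim is easy to check empirically; a small instrumented sketch (my own, not the article's code) counts live stack frames during the naive recursion:

```javascript
// Naive recursive fib, instrumented to track the deepest simultaneous
// nesting of calls — i.e. the actual stack space the recursion uses.
let depth = 0;
let maxDepth = 0;

function fib(n) {
  depth += 1;
  maxDepth = Math.max(maxDepth, depth);
  const result = n <= 1 ? n : fib(n - 1) + fib(n - 2);
  depth -= 1;
  return result;
}

console.log(fib(15), maxDepth); // 610 15
```

The call count is exponential, but the deepest chain is fib(15) → fib(14) → … → fib(1): exactly 15 frames.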
In JavaScript, tagged template literals have the ability to identify the call site. This is really powerful, and it's used to create template identity in lit-html.<p>I've wanted the ability to reference the call site in functions, and I lobbied the V8 team for something like arguments.callsite, but was (probably rightly) politely told no.<p>But if you're willing to abuse tagged template literal syntax (tagged templates are really just function calls), you can do something like:<p><pre><code> const dumbExample = (x) => {
   while (someComplicatedStuffHappens()) {
     pretendLikeThisFunctionIsBig();
   }
   const result = memoize(doSomethingVeryExpensive, x)``;
   doMoreInterestingWork();
 };
 </code></pre>
memoize() must return a template tag function, which will be invoked with a TemplateStringsArray (because of the ``) that can act as a call-site identifier and serve as a key into a WeakMap of memoization dictionaries.<p>It's mostly a curiosity, because who wants that syntax? But it's interesting that JavaScript does have this special power hidden behind one feature.
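For the curious, here's roughly what that looks like (a sketch under my own naming: `siteCaches` and this tag-returning `memoize` are hypothetical, not lit-html code). The language guarantees that the same template literal produces the same strings array object on every evaluation, which is what makes it usable as a call-site key:

```javascript
const siteCaches = new WeakMap();

// memoize(f, x) returns a template tag. The TemplateStringsArray passed to
// that tag is the same object for every evaluation of a given `` literal,
// so it identifies the call site.
function memoize(f, x) {
  return (strings) => {
    let cache = siteCaches.get(strings);
    if (!cache) {
      cache = new Map();
      siteCaches.set(strings, cache);
    }
    if (!cache.has(x)) {
      cache.set(x, f(x));
    }
    return cache.get(x);
  };
}

let calls = 0;
const expensive = (x) => { calls += 1; return x + 1; };

let last = 0;
for (let i = 0; i < 3; i++) {
  last = memoize(expensive, 41)``; // same literal → same cache
}
console.log(last, calls); // 42 1
```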
Wow. I haven't really played with Lisp since college. But I just started reading <i>The Little Schemer</i> with some friends, and hope to move on to SICP some time this year or next. This blog post made me a little dizzy, but also a little excited about what I'm hoping to explore with these lessons.
Isn't this something that John Shutt solved with his vau calculus? Basically, each "macro" (actually more like an fexpr) invocation creates its own static environment, which neatly solves the hygiene problems outlined in this article?
I am going to be honest: I didn't really understand what an eigenvalue was until reading this. I'd read the definition, but I didn't really understand why you'd care about it. This was a great article.
> Leaving aside the absurdity of computing Fibonacci numbers recursively<p>Is it really absurd? If the compiler can turn it into iteration, then it's a big boy compiler. If not, then meh?