This dovetails with something I was thinking about recently in the context of Reproducible Builds (or the lack thereof). For most of its history Computer Science (at least as practiced by industry) has gone all-in on "functionally equivalent" transformations, and done almost nothing in the space of "output invariant" guarantees.

In my day job we sometimes need to reproduce the build of a firmware ROM or executable, sometimes decades after the engineer who last built it left. Getting a match is easier (or even possible at all) for older tech *only* because of the relative unsophistication of the compilers used. Even then it's reproducible only "by accident," not because the compiler vendor made any guarantee that language construct X reliably produces machine code and data layout Y.

But we need that! For getting accurate baselines. For security verification. And there's no reason *in principle* we should have to forgo updating compilers, IDEs, and OS environments in the indirect hope of not disturbing anything. Those are two separate things: if we had a through-line of higher-level language construct -> semantically defined transformation (regardless of optimization settings) -> machine code, vendors could keep updating their IDEs and compilers while making sure they still respect the invariants.

C++'s so-called zero-cost abstractions are a poor substitute for this: header (library) writers and C++ gurus write code *as if* it worked like the "guaranteed output transform" I describe, but no compiler actually has to respect it (never mind that the fine details of what the transformation actually is aren't nailed down), and it differs between Debug and Release builds, which is particularly bad for game development, as TFA makes clear.
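To make that last point concrete, here's a minimal sketch (the `Meters` wrapper is a made-up example, not from any particular library) of the kind of abstraction that's "zero-cost" only by convention:

    #include <cstdint>

    // A strongly-typed wrapper around a raw integer -- the classic
    // "zero-cost abstraction." In an optimized (Release) build mainstream
    // compilers are *expected* to inline all of this away so add() compiles
    // to a single integer add, but no standard guarantees it. In a Debug
    // (-O0) build the constructor and accessor typically remain real calls.
    struct Meters {
        std::uint32_t raw;
        explicit constexpr Meters(std::uint32_t v) : raw(v) {}
        constexpr std::uint32_t value() const { return raw; }
    };

    constexpr Meters add(Meters a, Meters b) {
        return Meters{a.value() + b.value()};
    }

    // With optimization on, this usually emits the same code as `x + y` on
    // plain uint32_t; with optimization off it usually does not -- exactly
    // the Debug/Release gap described above.
    std::uint32_t use(std::uint32_t x, std::uint32_t y) {
        return add(Meters{x}, Meters{y}).value();
    }

Under the "guaranteed output transform" regime I'm describing, the vendor would be obligated to document and preserve the transformation of `use` into a single add, at every optimization level, across compiler versions.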
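And for the firmware-matching problem earlier in this comment: the whole "did we reproduce the build" question ultimately reduces to a byte-for-byte comparison against the shipped artifact. A trivial sketch (the `romdiff` name is mine, assuming you have the original ROM image on hand):

    #include <cstddef>
    #include <fstream>
    #include <iostream>
    #include <iterator>
    #include <string>
    #include <vector>

    // Read an entire file as raw bytes.
    static std::vector<char> slurp(const std::string& path) {
        std::ifstream f(path, std::ios::binary);
        return {std::istreambuf_iterator<char>(f),
                std::istreambuf_iterator<char>()};
    }

    // Compare a freshly rebuilt ROM against the shipped artifact and report
    // the first point of divergence.
    int main(int argc, char** argv) {
        if (argc != 3) {
            std::cerr << "usage: romdiff <original> <rebuilt>\n";
            return 2;
        }
        auto a = slurp(argv[1]);
        auto b = slurp(argv[2]);
        if (a.size() != b.size()) {
            std::cerr << "size mismatch: " << a.size()
                      << " vs " << b.size() << "\n";
            return 1;
        }
        for (std::size_t i = 0; i < a.size(); ++i) {
            if (a[i] != b[i]) {
                std::cerr << "first mismatch at offset 0x"
                          << std::hex << i << "\n";
                return 1;
            }
        }
        std::cout << "identical\n";
        return 0;
    }

The tooling is the easy part, of course; the hard part is that nothing upstream of this check is guaranteed to be deterministic.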