I remember learning about compiler optimization at university. Sadly, all my colleagues knew (and were taught) was Java, and the Sun JVM we used at the time didn't even do most of the things we learned, not even tail-recursion optimization. Therefore the opinion of most of the students was that the professor should stop babbling about things that have no application in real life.<p>This was the one time I was on the side of academia...
Quibbles: the strlen() example is poorly explained. It doesn't hoist it because it's "built-in"; it hoists it because strlen() is marked "pure" (i.e. without side effects) in the declaration. Any function can get that attribute.
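A minimal sketch of that in action, with a hypothetical my_length() standing in for strlen(): because the function is declared pure, GCC is allowed to evaluate the call once and hoist it out of the loop.<p><pre><code> #include <stddef.h>

 /* Hypothetical stand-in for strlen(): declared pure, so GCC may assume
    it has no side effects and depends only on its argument and memory. */
 size_t my_length(const char *s) __attribute__((pure));

 size_t my_length(const char *s) {
     size_t n = 0;
     while (s[n]) n++;
     return n;
 }

 int count_vowels(const char *s) {
     int count = 0;
     /* The loop body never writes memory, so the pure call in the
        condition can legally be computed just once, before the loop. */
     for (size_t i = 0; i < my_length(s); i++) {
         switch (s[i]) {
         case 'a': case 'e': case 'i': case 'o': case 'u':
             count++;
         }
     }
     return count;
 }</code></pre>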
Oh, I remember getting upset last time with this "test". For example:<p><pre><code> 3. Multiplication by 2 to addition - integer
Will GCC transform an integer multiplication by 2 to addition?
</code></pre>
The expression (not a function) x * 2 is compiled to a left shift by one bit on almost all compilers. A shift has far fewer dependencies than ADD/LEA and better reciprocal throughput. Meh.
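If you want to check what your own target does, a minimal sketch; whether GCC picks a shift, an add, or an lea depends on the target and tuning flags.<p><pre><code> /* Compile with `gcc -O2 -S mul2.c` and inspect mul2.s to see which
    instruction GCC chooses for x * 2 on your target. */
 int times_two(int x) {
     return x * 2;
 }</code></pre>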
The comment about tail calls is wrong -- it's not an optimization, it's a guarantee about semantics of your program. Not optimizing tail calls is like making array dereference run in linear time instead of constant: it changes the programs you can write.
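A sketch of what that guarantee buys you, assuming a compiler that eliminates tail calls (e.g. GCC at -O2): without the guarantee this recursion needs one stack frame per step and overflows for large n; with it, it runs in constant stack space.<p><pre><code> /* Tail-recursive counter: the recursive call is the last thing the
    function does, so it can be compiled as a jump instead of a call. */
 long count_up(long n, long acc) {
     if (n == 0)
         return acc;
     return count_up(n - 1, acc + 1);  /* tail position */
 }</code></pre>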
I have a little doubt about number 6. A switch() without a "default:" label has undefined behavior if x is negative or above 5, while nothing would happen in the cascading if's. Wouldn't a correct optimization of that be<p><pre><code> void function(int x) {
     switch (x) {
     case 0: f0(); break;
     case 1: f1(); break;
     case 2: f2(); break;
     case 3: f3(); break;
     case 4: f4(); break;
     case 5: f5(); break;
     default: break;  /* x outside 0..5: do nothing */
     }
 }</code></pre>
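For comparison, a sketch of the cascading-if version that quiz question presumably starts from (f0()..f5() assumed to be plain void functions); out-of-range values of x simply skip every branch.<p><pre><code> void f0(void), f1(void), f2(void), f3(void), f4(void), f5(void);  /* assumed elsewhere */

 void function_ifs(int x) {
     if      (x == 0) f0();
     else if (x == 1) f1();
     else if (x == 2) f2();
     else if (x == 3) f3();
     else if (x == 4) f4();
     else if (x == 5) f5();
     /* any other x: no branch taken, nothing happens */
 }</code></pre>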