The really interesting part is the discussion of optimizations:<p>> Consider this example raised by Linus: the compiler might look at how the kernel accesses page table entries and notice that no code ever sets the "page dirty" bit. It might then conclude that any tests against that bit could simply be optimized out. But that bit can change; it's just that the hardware makes the change, not the kernel code. So any optimizations made based on the notion that the compiler can "prove" that bit will never be set will lead to bad things. Linus concluded: "Any optimization that tries to prove anything from more than local state is by definition broken, because it assumes that everything is described by the program."<p>Programmers tend to assume that optimization is transparent: code will behave the same with or without it. That's not really true, and as Linus's example demonstrates, compiler writers and compiler users can have different ideas about what the compiler is allowed to assume. Nor is there an obvious line between safe and unsafe assumptions.
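A minimal C sketch of the hazard Linus describes. The names (`PTE_DIRTY`, `page_is_dirty`) are hypothetical, not the kernel's actual page-table API; the point is that `volatile` (or, in the kernel, `READ_ONCE()`) tells the compiler the value can change outside the program's visible code, so the test cannot be "proven" dead:

```c
#include <stdint.h>

/* Hypothetical page-table-entry flag: the MMU hardware, not any C
 * code, sets this bit when the page is written. */
#define PTE_DIRTY (1u << 6)

/* The volatile qualifier forces a real load on every call. Without
 * it, a whole-program analysis could observe that no code ever sets
 * PTE_DIRTY and conclude the test below is always false, optimizing
 * it out -- exactly the broken "proof" from non-local state. */
static int page_is_dirty(volatile const uint32_t *pte)
{
    return (*pte & PTE_DIRTY) != 0;
}
```

The Linux kernel addresses the same problem with `READ_ONCE()`/`WRITE_ONCE()` macros rather than blanket `volatile` qualifiers, which limits the optimization barrier to the specific accesses that need it.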