I think this goes to show that *theoretical* backwards compatibility doesn't matter nearly as much as *de facto* backwards compatibility. As this IOCCC entry shows, every single C++ revision has been a breaking change relative to the previous one. A naive interpretation of this would say "well, C++ is not useful for enterprise use because the language is constantly changing". But in reality it doesn't matter, because the backwards-incompatible parts affect code nobody would intentionally write (excluding IOCCC entries!), and as a result C++ has a deserved reputation for stability.
Because of expression SFINAE, added in C++11 (which makes it possible to test at compile time whether an arbitrary expression is valid or not), technically *any* change to the language or standard library can change the behaviour of a valid C++ program.
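As a rough sketch of why (the has_starts_with trait name is mine, nothing standard): expression SFINAE can probe whether s.starts_with("x") is a well-formed expression. std::string only gained starts_with() in C++20, so the same translation unit prints different output under -std=c++17 and -std=c++20, purely because of a library addition:

    #include <cstdio>
    #include <string>
    #include <type_traits>
    #include <utility>

    // Primary template: used whenever the expression below is not valid.
    template <class T, class = void>
    struct has_starts_with : std::false_type {};

    // Partial specialization: selected only if t.starts_with("x") compiles.
    template <class T>
    struct has_starts_with<T,
        decltype(std::declval<const T&>().starts_with("x"), void())>
        : std::true_type {};

    int main() {
        // Prints "no" under -std=c++17 and "yes" under -std=c++20.
        std::printf("std::string::starts_with available: %s\n",
                    has_starts_with<std::string>::value ? "yes" : "no");
    }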
One behavioral change in C++11 is the noexcept specifier: user-defined destructors now default to noexcept(true). If you have pre-C++11 code that throws exceptions from within a destructor, e.g. to signal a violated postcondition contract, the program will call std::terminate when such an exception is thrown after the code is compiled as C++11.

Updating the code to work under C++11 requires adding a noexcept(false) specification to the throwing destructors.
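A rough sketch of what that fix looks like (the class name and message are mine, not from any real codebase); build it with and without the noexcept(false) to see the difference between propagation and std::terminate:

    #include <cstdio>
    #include <stdexcept>

    struct Checker {
        bool ok;
        Checker() : ok(false) {}
        // Remove noexcept(false) and, built as C++11, the throw below
        // calls std::terminate instead of reaching the catch handler.
        ~Checker() noexcept(false) {
            if (!ok)
                throw std::runtime_error("postcondition violated");
        }
    };

    int main() {
        try {
            {
                Checker c;  // destroyed at the end of this scope, throws
            }
            std::printf("not reached\n");
        } catch (const std::runtime_error& e) {
            std::printf("caught: %s\n", e.what());
        }
    }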
This is really cool. Though I have to say, I was hoping it would be examples of undefined behavior and how that changes between versions and compilers.

But still, pretty neat in terms of features.
A summary of all the tricks used can be found here:
http://uguu-archive.appspot.com/fuuko/source/c_version.c