Mostly good advice, sometimes even great, but the part about typedefs is total BS. Any non-trivial program will use values that have clearly different meanings but end up being the same C integer type. One's an index, one's a length, one's a repeat count, one's an enumerated value ("enum" was added to the language to support this very usage), and so on. It's stupid that C compilers don't distinguish between any two types that are the same width and signedness; why compound that stupidity? Both humans and static analyzers could tell the difference if you used typedefs, and quite a few bugs could be avoided as a result. Being able to change one type easily might also make some future maintainer's life much better. There's practically no downside except for having to look up the type in some situations (e.g. to see which printf format specifier to use), and that's a trivial problem compared to those that can result from <i>not</i> using a typedef.<p>Don't want to use typedefs? I think that's a missed opportunity, but OK. Don't use them. OTOH, anyone who tries to pretend that the bad outweighs the good, or to discourage <i>others</i> from using them, is an ass. Such advice for the kernel is downright hypocritical, given that that code uses size_t and off_t and many others quite liberally.