> C is an easy language to write code in. Especially if you are competent in some other block structured language - such as JavaScript, Python, Ruby and (maybe) Java.<p>I disagree. You can be competent in Python, Ruby, and Java and still have no clue how memory works underneath, which is a hard requirement for writing C. C is easy to learn but takes a long time to master - to reach the point where you aren't shooting yourself in the foot left and right.<p>> Pointers are bad only if you're an idiot.<p>You don't have to be an idiot to make a pointer mistake, hit an off-by-one error, or overflow a buffer. The people who wrote nginx, Apache, BIND, the Linux kernel, and FreeBSD are not idiots, yet such mistakes still happen in their code.
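To make that concrete, here's a minimal sketch (the function and names are made up for illustration) of the kind of off-by-one that slips past careful review:

    #include <string.h>

    void save_name(const char *src)
    {
        char buf[16];
        /* Looks safe: the length is checked before copying... */
        if (strlen(src) <= sizeof(buf))
            /* ...but strlen() doesn't count the '\0' terminator, so a
               16-character src makes strcpy() write 17 bytes. */
            strcpy(buf, src);
    }

The correct check is strlen(src) < sizeof(buf); one character of slack is the whole bug.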
<i>But what about buffer overflows? Buffers overflow when a fixed size buffer is allocated, some data is copied to the buffer and the idiot who wrote the code didn't check to see if the data was too big for the buffer!</i><p>Researchers have found memory corruption flaws in both Daniel J. Bernstein's and Wietse Venema's code. You want to try this "de-myth-ification" thing again?
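Even with a size check in place, the failure mode can be subtle. A sketch of one classic pattern (fd is assumed to be an open file descriptor):

    #include <unistd.h>

    void read_request(int fd)
    {
        char buf[64];
        /* The read itself is bounded: at most 64 bytes. */
        ssize_t len = read(fd, buf, sizeof(buf));
        if (len >= 0)
            buf[len] = '\0';   /* if len == 64, this writes buf[64]:
                                  one byte past the end of the array */
    }

Passing sizeof(buf) - 1 to read() leaves room for the terminator. Neither Bernstein nor Venema is careless, which is rather the point: this class of bug doesn't care how careful you are.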
"Wrong: use Assembler Language."<p>I'd be interested to see more research on this. I would hope that modern compilers can generate better optimised code than hand assembly for most cases by now.<p>Does anyone have any pointers to research on this topic?
> C generates compiled code. Code which is very efficient, but not as efficient as hand coded machine instructions.<p>...<i>written by a perfect programmer</i>.<p>> C is an easy language to write code in.<p>I think the word he was looking for was "simple". The field of land mines known as "undefined behaviour" disqualifies C from being an easy language.<p>> So you have to use pointers - which are nothing more than variables which contain addresses.<p>Except that they have their own semantics and syntax, as well as their own fun set of undefined behaviour.<p>> This means you can define local variables which shadow external variables quite freely. This makes your code easier to read and safer to develop ...<p>Because everyone loves keeping track of which version of the 'count' variable you're referring to! (Seriously, there is almost never a good reason to declare the same variable name twice in the same function; there's a sketch of the problem below.)<p>> imperative: C statements are tasks to be executed in sequence. Hence imperative.<p>The word "imperative" doesn't imply a sequence...<p>> C doesn't support objects<p>Pretty much anything that's not a function is considered an object in C. What C doesn't support is polymorphic classes. (Well... let's not get into that!)<p>Overall, this might be useful to the author if he finds he can't remember the details of the language for very long. Anyone not already familiar with the concepts he's talking about will be lost at best, or damaged in the average case.
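The promised shadowing sketch, with a deliberately bad 'count':

    int count = 0;                /* file-scope count */

    void tally(int n)
    {
        for (int i = 0; i < n; i++) {
            int count = 0;        /* shadows the file-scope count */
            count++;              /* increments the inner one only */
        }
        count += n;               /* the outer one changes only here */
    }

This compiles without complaint (you need -Wshadow on gcc or clang to get a warning); the reader is left to work out which 'count' is live on each line.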
I hope one of the parts deals with people's aversion to C declarations. The declaration syntax allows vastly complicated data definitions, and the fact that declaration looks like use simplifies things immensely.<p>(I also have a personal style quibble -- lots of people seem to think that putting spaces around every lexical element clarifies things; I think it just makes for gassy code.)
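An example of "declaration looks like use", since that's the part people trip on:

    int (*handlers[4])(const char *);

Read it the way you'd use it: (*handlers[i])(s) indexes the array, dereferences a function pointer, and calls it with a const char *, yielding an int. So handlers must be an array of four pointers to functions taking const char * and returning int. The declaration is just the use expression with the types filled in.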
> So a Java program which is compiled to byte code is actually run as an interpreted program using the jvm as an interpreter.<p>Of course, this is not quite true; large parts of the program are JIT-compiled to native code, and the result sometimes runs as fast as, or faster than, the equivalent C.
Anyone who thinks that Java byte code is simply interpreted at this point clearly has no right to talk about such things. This is about the third article by a C programmer in the last month whose author believes that Java (and C#) programs are interpreted rather than ultimately run as compiled machine code. Why has this misconception lasted so long? The first JITs were released last century...
<i>1. Use C if you want something to go as fast as possible.<p>Wrong: use Assembler Language.</i><p>That's ridiculous. By that logic I could say "Wrong: use machine code", then "Wrong: use a wire and a soldering iron", and keep going forever.