It's pretty funny that the article starts with "[an error occurred while processing this directive]" repeated three times. Maybe the language isn't the only cause of mistakes.<p>More seriously, the author makes the pretty decent point that certain classes of errors should be caught automatically. He uses buffer overflows as an example (near the bottom). I agree with his conclusion, but want to call out a hidden assumption: does "automatically" include only the compiler? Of course not! We also have static analyzers, automatic test generators, and other kinds of programs that can help with this. We can and should use those to improve correctness without either cluttering up the language or adding performance-robbing artifacts to the executable code. While it's true that C and C++ make it harder than it should be to write and use such tools, the languages and the tools need only a small nudge to improve that situation dramatically. They don't need to change their essential natures.<p>This "unbundling" of other language-related functionality from the compiler toolchain is an important possibility that should not be overlooked. "Do one thing and do it well," and let the user decide which things to do in which order. There are plenty of other good reasons to develop or prefer higher-level languages; the kinds of problems the author cites are almost irrelevant.
Where in this article does the author explain <i>why</i> C and C++ are awful? He only gives one or two examples. The main argument seems to be that Scheme is easier to teach, but I don't see how that makes C or C++ bad programming languages. I agree that C++ is a horrible mess, but C is not...and it's not hard to learn either (in fact, it was the language I learned programming with). I do agree that Scheme is a very nice language, but that doesn't make every other language out there bad.<p>Also, I don't agree with the notion that programming is all about high-level algorithms and abstraction. In the end you're programming <i>a physical computer</i> and you should be aware of that. In some (or even many) cases Scheme or Python or whatever (even C, perhaps) may be too abstract for what you're doing, so use what's appropriate.
I think it is very important that computer science students learn C early on. Learning concepts like memory management and word alignment (which he mentions) is critical imo.<p>For other majors which require some programming, it is best if they stick to something like Python, etc.<p>But the author does make a lot of good points regarding how many holes there are in these languages, and how much of a time sink they can be. In fact, I spent the last few days trying to track down an obscure bug where an object allocated in C++ was randomly being free'd (according to gdb, a valid address would become 0x1 all of a sudden) and I couldn't figure it out. I implemented some nasty workaround which makes the program run now...
I kinda agree with the author if you do your software development purely with a programming language. Otherwise it would be called hacking or prototyping, and this is not exclusive to C or C++. It could also happen with Java, Python or <insert your favourite language here>.<p>With software engineering, there is a plethora of other tasks and checks that need to be done before releasing it into the wild, i.e. requirements engineering, verification, reviews, etc.