Responding to the title: Not really, no. C isn't necessarily a great starting point, nor does knowing C necessarily help you become good at other languages.<p>I'm not saying learning C isn't useful, but it's far from a necessity, and learning other things early can be more effective.<p>The article itself doesn't really contain the claim the title makes, and it doesn't do much to back it up either, just some fairly generic "C is low-level" points.
C (or something very like it) is effectively the core of my mental VM. It's an extremely useful mental abstraction even though all I write is C++, JS, Python, and GLSL.<p>Being able to estimate "what does this code compile into" is extremely powerful for understanding code, as well as exploring the possibility space of solutions. ("Is it even possible to express this?")<p>It's not an exact mapping to hardware, but it's such a useful level of abstraction to have on tap, even if I never want to write it directly.
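To give a made-up example of the kind of estimate I mean (mine, not anything from the article; the padding figure assumes a typical 64-bit ABI), the C-level view makes costs like struct padding and per-element loop work visible:

```c
#include <stddef.h>
#include <stdio.h>

/* A C-level mental model makes costs visible: this struct is
   typically 16 bytes because of alignment padding, not 9. */
struct point {
    double x;   /* 8 bytes */
    char tag;   /* 1 byte, usually followed by 7 bytes of padding */
};

int main(void) {
    printf("sizeof(struct point) = %zu\n", sizeof(struct point));

    /* Summing an array is roughly "one load, one add per element" --
       which is also what a higher-level sum(xs) must boil down to. */
    double xs[4] = {1, 2, 3, 4}, total = 0;
    for (size_t i = 0; i < 4; i++)
        total += xs[i];
    printf("total = %g\n", total);
    return 0;
}
```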
This is a 'turtles all the way down' sort of issue. You could also say:<p>> Learn assembly and the rest will come<p>> Learn binary and the rest will come<p>> Learn the fundamentals of electricity and the rest will come<p>...<p>I think learning the layers below the one you typically operate at can be beneficial; however, I think the patterns and ideas (semaphores, scheduling, etc.) are probably the most beneficial part of an exercise like this -- not the actual C language.
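To make the semaphore example concrete, here is a minimal sketch of my own (not from the article) using POSIX unnamed semaphores and pthreads; compile with -pthread, and note that sem_init-style unnamed semaphores aren't available on macOS:

```c
#include <pthread.h>
#include <semaphore.h>
#include <stdio.h>

/* Counting semaphore limiting how many threads may be in the
   "working" region at once. */
static sem_t slots;

static void *worker(void *arg) {
    sem_wait(&slots);               /* acquire a slot; blocks if none free */
    printf("thread %ld working\n", (long)arg);
    sem_post(&slots);               /* release the slot */
    return NULL;
}

int main(void) {
    pthread_t t[4];
    sem_init(&slots, 0, 2);         /* at most 2 workers at a time */
    for (long i = 0; i < 4; i++)
        pthread_create(&t[i], NULL, worker, (void *)i);
    for (int i = 0; i < 4; i++)
        pthread_join(t[i], NULL);
    sem_destroy(&slots);
    return 0;
}
```

The point is less the C syntax than the pattern: a counted resource, acquire before use, release after.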
If you're a software developer who doesn't know C, then there's a whole class of software you can't read: network stacks, operating systems, and language tools, all written in C. This puts you at an extreme disadvantage imho.
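As one illustration (my own sketch, not taken from the comment above), this is the BSD-sockets boilerplate you run into constantly when reading that kind of code; the port number is arbitrary and most error handling is omitted:

```c
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* The socket/bind/listen/accept sequence that much C network code
   ultimately boils down to. */
int main(void) {
    int fd = socket(AF_INET, SOCK_STREAM, 0);

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof addr);
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(8080);            /* arbitrary example port */

    if (bind(fd, (struct sockaddr *)&addr, sizeof addr) < 0) {
        perror("bind");
        return 1;
    }
    listen(fd, 16);

    int client = accept(fd, NULL, NULL);    /* blocks until a client connects */
    const char msg[] = "hello\n";
    write(client, msg, sizeof msg - 1);
    close(client);
    close(fd);
    return 0;
}
```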
Learning C is a good idea because:<p>- Some APIs only support C. In other languages you can use third-party bindings or a C interoperability library/API, but this may add a performance overhead, and it may not shield you from C types.<p>- Some industries mostly use C, or languages similar to C.<p>- Some software has real-time requirements. A garbage collector can introduce non-deterministic delays.<p>- Some software needs fine-grained control over memory management and a predictable memory layout for its data.<p>- Some software needs a low performance overhead, e.g. when you run on low-spec hardware or embedded devices, need to save power on a battery-powered device, or want to make the most of current hardware.<p>Finally, choosing C doesn't mean you can only use C. C can interoperate with other languages when they offer a C API, as sketched below.
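For instance, here is a hypothetical C library function (the name mathlib_mean is made up for illustration); the plain-C signature is what makes it easy to call from other languages' FFI facilities:

```c
/* mathlib.c -- hypothetical example of a library exposing a C API
   that other languages can call through their FFI
   (Python ctypes, Java JNI, Go cgo, Rust FFI, ...). */

/* Plain C types, no hidden allocation, no callbacks into a runtime:
   that's what makes this easy to bind from almost any language. */
double mathlib_mean(const double *values, unsigned long count)
{
    if (count == 0)
        return 0.0;

    double sum = 0.0;
    for (unsigned long i = 0; i < count; i++)
        sum += values[i];
    return sum / (double)count;
}
```

Compiled into a shared library, something like this can be loaded from Python via ctypes, Go via cgo, Rust via its FFI, and so on, without those languages needing to know anything about how it was built.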
I think C is useful and cool, but I strongly disagree with the final sentence of the post. I think it's a terrible language for beginners, and a good language for proficient coders who want to get to the next level.<p>If you have a niece or nephew who likes computers and comes to you about it, and you try to foist The C Programming Language on them, you will have done them a terrible disservice. 90% of people will find Python and its friendly packages for everything so much more delightful than C.
Recently, I wanted to write something in C, just to see how long it would take me. It was just an experiment.<p>I got stuck on some elementary string manipulations.<p>So I said, forget this. I pulled out Python and solved it in a few minutes.<p>This proved to me that while C has its place, it just doesn't match the programmer productivity of more modern languages.<p>Nevertheless, we need a modern replacement for C. But honestly, I just don't think Rust is it.
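As a made-up example of the kind of thing that trips people up (not the commenter's actual task), splitting a delimited string is one line of Python (s.split(",")) but a manual copy-and-token loop in C; this sketch assumes POSIX strdup:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Splitting "red,green,blue" on commas: you manage a mutable copy
   (strtok modifies its argument) and the token loop yourself. */
int main(void) {
    const char *input = "red,green,blue";

    char *copy = strdup(input);
    if (!copy)
        return 1;

    for (char *tok = strtok(copy, ","); tok; tok = strtok(NULL, ","))
        printf("token: %s\n", tok);

    free(copy);
    return 0;
}
```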
I believe you can use Rust for things like refrigerator chips.<p>https://www.google.com/search?client=firefox-b-d&q=rust+arduino
I defer to one of the world's foremost authorities on C:<p>https://www.youtube.com/watch?v=Ye8mB6VsUHw