> Achieving abstraction is super hard when you can just reach down into some bytes and noodle around with them instead.<p>Disagree. In my experience everything got better once I decided to just stop worrying about that. If anything it's the opposite: so much pointless and confusing OOP boilerplate (plus blood, sweat, and all that) goes into "securing" code against misuse.<p>Invest some time in data structures with obvious meaning, or a procedural API that is easy to understand. "Nobody" will "ever" misuse it, and any misuse will be easy to detect. Much better tradeoff IMO.<p>And lastly, of course, abstraction has nothing to do with "security systems". It's against popular opinion, but you can in fact have good abstraction with nothing but plain functions and void pointers (see the sketch below). I like to view abstraction as mostly a conceptual thing that doesn't even happen in the code itself.
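A minimal sketch of what that can look like, using a made-up `counter` module (all names hypothetical): callers only ever see a void pointer handle and a handful of plain functions, and the struct layout never leaves the one file that defines it.

```c
/* Hypothetical "counter" module: the public surface is a void * handle plus
 * plain functions; the struct definition stays private to this file. */
#include <stdio.h>
#include <stdlib.h>

typedef void *counter_t;                 /* opaque handle (public header)  */

struct counter { int value; };           /* private layout (.c file only)  */

counter_t counter_new(void)          { return calloc(1, sizeof(struct counter)); }
void      counter_inc(counter_t c)   { ((struct counter *)c)->value++; }
int       counter_value(counter_t c) { return ((struct counter *)c)->value; }
void      counter_free(counter_t c)  { free(c); }

int main(void) {
    counter_t c = counter_new();
    counter_inc(c);
    counter_inc(c);
    printf("%d\n", counter_value(c));    /* prints 2 */
    counter_free(c);
    return 0;
}
```

The boundary is purely conceptual -- nothing stops a caller from casting the handle -- but in practice almost nobody does, and when they do it's easy to spot in review.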
> A lot of these complaints are just, well, it’s called Progress. If we designed C in 2018 and made it so we actually wanted to use it, then it would look very different. Probably a lot like Nim or Swift. I think that’s still a niche that is currently unfilled; a modern, powerful language that doesn’t try to provide the same guarantees as Rust and so can be much more minimalist.<p>I think Zig is quite a good candidate to fill that niche.
It is very compatible with C, but it offers meaningful improvements while still being very picky about what goes into the language.
Here is the discussion on the Rust Subreddit, with author comments [1].<p>[1]: <a href="https://old.reddit.com/r/rust/comments/9mioiv/porting_c_to_rust_a_case_study_minimp3/" rel="nofollow">https://old.reddit.com/r/rust/comments/9mioiv/porting_c_to_r...</a>
> What is it about C that makes people think L3_imdct_gr() is a perfectly good function name?<p>Early ANSI C, like Fortran 77, only required the first 6 characters of an external identifier to be significant, and compilers didn't go much further than that. At that point, it sort of becomes a "when in Rome" thing.
On the "possible bugs in minimp3" section, the links are all to the current master, which appears to have changed since this was written (e.g. <a href="https://github.com/lieff/minimp3/blob/master/minimp3.h#L232" rel="nofollow">https://github.com/lieff/minimp3/blob/master/minimp3.h#L232</a> doesn't point at the function with the bitshift anymore; I expect it should be <a href="https://github.com/lieff/minimp3/blob/644e0fb7fed34f803b6634f72e5ad8cc20a520f7/minimp3.h#L232" rel="nofollow">https://github.com/lieff/minimp3/blob/644e0fb7fed34f803b6634...</a>).<p>Protip: When viewing something like this in GitHub, press "y" and the URL bar will change to be a proper permalink to the current version of the code. This permalink can then be shared. Alternatively, with the line highlighted, press the … button in the gutter and it will offer a "Copy Permalink" option (which gives you the same permalink you get by pressing "y").
> <i>In the end, the results of my work are in the rinimp3 crate, because I suck at naming things.</i><p>Why not minimp3-rs -- that way people familiar with the original C library will know exactly where it comes from...
I admit, I was curious about the `some_struct foo[1];` thing.<p>This is all I could find. It isn't terribly compelling.[1]<p>[1] <a href="https://stackoverflow.com/questions/6390331/why-use-array-size-1-instead-of-pointer" rel="nofollow">https://stackoverflow.com/questions/6390331/why-use-array-si...</a>
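For what it's worth, the usual justification in that Stack Overflow thread is that a one-element array gives you ordinary stack storage whose name decays to a pointer, so the struct can be handed to functions that take a pointer without writing `&` everywhere. A small sketch with made-up names:

```c
/* The `some_struct foo[1];` idiom: stack storage, but the name already
 * behaves like a pointer wherever a `some_struct *` is expected. */
#include <string.h>

typedef struct { int bit_pos; unsigned char buf[512]; } some_struct;

static void reset(some_struct *s)          { memset(s, 0, sizeof *s); }
static void advance(some_struct *s, int n) { s->bit_pos += n; }

int main(void) {
    some_struct ctx[1];   /* value on the stack, name decays to a pointer */
    reset(ctx);           /* no &ctx needed                               */
    advance(ctx, 8);
    return ctx->bit_pos == 8 ? 0 : 1;   /* -> also works directly         */
}
```

Whether saving a few ampersands is worth the head-scratching is exactly the question the article raises.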
"The lack of real bool" -Which is fine as the minimum space it would take to store a bool is 1 byte anyway (unless youre using bitflags), and you probably want to give more detailed information than true/false anyway<p>"but a shitty way to engineer software as a whole. Achieving abstraction is super hard when you can just reach down into some bytes and noodle around with them instead.": Data oriented design is usually better both in ease of use and performance than any random class abstractions that exist; also, see Linux.<p>"Bloody hell you can’t tell whether a pointer points to a single object or an array just by looking at it": You can, usually. There are plural words in languages usually used do denote this, item<i>s</i>, also the size thing he mentioned. If it doesn't have it then its usually just bad practice or poor code quality.<p>"heckin’ ternary operators, just make your if statements not suck.": Ternary operators are great, and usually quite concise. Not sure what they are specifically referring to here :/<p>"The pre and post increment operators are just the worst damn thing in the world.": Again, this knowledge comes to experience, and actually makes things more concise.<p>C is not designed to be a ""beginner"" friendly language, but its essence is simple -- and I would recommend it for any beginner as it really drives home the majority of actual programming principles, and makes you think about what you are doing on a deeper level rather than coating things in a magical dust layer of classes with vtables and garbage collection.<p>"it’s called Progress.": However, with modern programming languages it's one step forward with two steps back most of the time.<p>I agree with stuff about automatic conversions between types, however some compilers will warn you (unless you told it to shut up) about any narrowing conversions that you do, and that's the main trap that people fall into.<p>The majority of debugging/compiling tools are designed primarily with C in mind and are fairly simple to use also.<p>The majority of the rant about C was mainly not based off issues with C itself, but with the code quality of minimp3, which is quite depressing as C itself does have some bad traits imo such as function ptr definitions are bulky, : for bitfields, no predictability for most 'undefined behaviours', dodgy bitshifting too and probably more things I can't think of at the top of my head.