> Why does this matter? It matters for the exact same reason that memory leaks are bad in general: they consume more memory than necessary.<p>Thing is, this doesn't usually matter. I have never gotten an out-of-memory error from a leak in Java. Now compare that to all the development time I've saved by not having to deal with pointer arithmetic. I consider it a huge win. It's all about the type of apps you're making.
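To be concrete, here's a minimal sketch (the class name and sizes are made up) of the kind of leak being discussed: objects the program no longer needs but that stay reachable, so the GC has to keep them. It wastes memory, but as noted above it rarely adds up to an actual OutOfMemoryError in a typical app.

    import java.util.HashMap;
    import java.util.Map;

    // Everything put into this static cache stays reachable forever,
    // so the GC can never reclaim it -- a classic "leak" in a GC'd language.
    public class SessionCache {
        private static final Map<Integer, byte[]> CACHE = new HashMap<>();

        static void remember(int sessionId) {
            CACHE.put(sessionId, new byte[4096]); // never evicted
        }

        public static void main(String[] args) {
            for (int i = 0; i < 10_000; i++) {
                remember(i);
            }
            // Roughly 40 MB held for no good reason -- wasteful,
            // but nowhere near an OutOfMemoryError on a default heap.
            System.out.println("cached sessions: " + CACHE.size());
        }
    }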
I don't know about the other things, but even Bjarne Stroustrup calls multiple inheritance the wrong way to go, and I don't think I'm better than him, so I disagree on that one. The other stuff - well, all languages are imperfect. That's why we have so many of them.
> One problem with garbage collection is that it needs to see the entire memory used by the program even if the program is not using big parts of it for anything.<p>The first generational GC was proposed for Lisp in 1983; see Lieberman & Hewitt. The heap is divided into generations based on the lifetime of objects, and typically only the youngest generation, which is kept small, is scanned. This is based on the observation that a lot of objects are only short-lived. Thus a GC typically only needs to look at a fraction of the memory.<p>GCs may also group similar objects into regions. That way only the relevant regions need to be looked at, which can free up space for the object that is currently being allocated.
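A rough, self-contained Java sketch of that observation (class name and allocation sizes are arbitrary); run it with -Xlog:gc on JDK 9+ and the log should mostly show short young-generation collections, even though the program churns through a lot of memory in total:

    import java.util.ArrayList;
    import java.util.List;

    public class Generations {
        public static void main(String[] args) {
            List<byte[]> longLived = new ArrayList<>(); // the small surviving minority
            for (int i = 0; i < 1_000_000; i++) {
                byte[] temp = new byte[1024];           // dead almost immediately
                if (i % 10_000 == 0) {
                    longLived.add(temp);                // only a tiny fraction is kept
                }
            }
            // Most GC cycles only scan the small young generation,
            // where nearly all of the temp arrays have already died.
            System.out.println("long-lived blocks: " + longLived.size());
        }
    }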
Probably worth jumping up a level in the URL to: <a href="http://warp.povusers.org/grrr/" rel="nofollow">http://warp.povusers.org/grrr/</a> —and noting that this comes from the guy's "What grinds my gears" page, which also links to a "Why I hate C" article.
Java is a decent language for its time, with a bad default tool set and a development culture that leans very heavily towards J2EE-style verbosity (javascript is trending that way now too, w/ angular, react, etc).<p>The JVM, however, is a national treasure.
I'm not sure about the rest of the points made, but I agree about the lack of deterministic destructors to free up external resources. C# has the same problem. It would be really good if the destructor ran when the reference count of an object reaches 0. That would make a lot of code that deals with OS resources much cleaner.
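For illustration, a small Java sketch (the file name is made up) of the workaround we're stuck with today: try-with-resources gives deterministic cleanup, but only because the call site explicitly asks for it, which is exactly the bookkeeping a reference-counted destructor would do automatically.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class ReadFirstLine {
        public static void main(String[] args) throws IOException {
            try (BufferedReader reader = new BufferedReader(new FileReader("example.txt"))) {
                System.out.println(reader.readLine());
            } // reader.close() runs here, at a known point
            // Forget the try-with-resources and the OS file handle lingers
            // until the GC eventually gets around to the object -- if it ever does.
        }
    }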
Risking a discussion of religiosity:<p>Java as a language has some rough edges: multiple inheritance and verbosity. Personally, I'd use Kotlin if there were a requirement to run on the JVM.<p>In terms of the JVM:<p>The Erlang VM is under-appreciated: each "process" (not an OS process, and lighter than an OS thread) has its own heap, so there are no global GC pauses; it's share-nothing, and let-it-crash doesn't take down the whole VM.<p>For anyone stuck on the JVM, look at Azul's free Zing for shorter/fewer GC pauses and a generally faster JVM.
What about the point of GC solving the issue of your app crashing, or of security holes everywhere, because malformed input ends up crashing the process or executing code? On the Debian security mailing list there's a patch for exactly that kind of bug almost every week, so it looks like it's pretty common.
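For contrast, a contrived Java sketch (the "packet" layout is made up) of why that class of bug mostly turns into a recoverable error under a memory-safe runtime: the lying length field produces an exception instead of an out-of-bounds read.

    public class ParseLength {
        public static void main(String[] args) {
            byte[] packet = {5, 'h', 'i'};        // claims a 5-byte payload, carries 2
            int declaredLength = packet[0];

            try {
                byte[] payload = new byte[declaredLength];
                // Every array access is bounds-checked; this throws instead of
                // reading past the buffer the way an unchecked C memcpy would.
                System.arraycopy(packet, 1, payload, 0, declaredLength);
            } catch (IndexOutOfBoundsException e) {
                System.out.println("rejected malformed input: " + e);
            }
        }
    }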