If you're curious about how to avoid this in your applications, take a look at universal hashing: <a href="https://en.wikipedia.org/wiki/Universal_hashing" rel="nofollow">https://en.wikipedia.org/wiki/Universal_hashing</a>
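The idea is to pick the hash function at random from a family when the table is created, so an attacker who doesn't know the chosen parameters can't precompute colliding keys. A minimal sketch in Java of one such family (multiply-shift); the class and method names here are just illustrative, not from any real library:

```java
import java.util.concurrent.ThreadLocalRandom;

// Sketch of the multiply-shift universal hash family: pick a random
// odd 64-bit multiplier per table, then h(x) = (a * x) >>> (64 - M)
// maps a 64-bit key to an M-bit bucket index. Because the multiplier
// is secret and chosen at runtime, colliding inputs can't be
// precomputed offline.
public class UniversalHash {
    private final long a;    // random odd multiplier, fixed per table instance
    private final int shift; // 64 - log2(number of buckets)

    UniversalHash(int log2Buckets) {
        this.a = ThreadLocalRandom.current().nextLong() | 1L; // force odd
        this.shift = 64 - log2Buckets;
    }

    int bucket(long key) {
        // Overflow wraps mod 2^64, which is exactly what the scheme needs;
        // the top log2Buckets bits of the product give the bucket index.
        return (int) ((a * key) >>> shift);
    }
}
```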
He really didn't understand Java's take on this.
Being zero-insensitive is obviously insecure, and Java knew that. But Java decided to fight these kinds of attacks better than most others did. Java still has a trivially insecure hash function, which it kept because the hash is specified as part of the public API (an API blunder). Instead, when a bucket accumulates too many collisions, which indicates an active attack, HashMap converts that bucket from a linked list into a tree. Those attacks are rare, so the common case stays fast.<p>Fixing the zero-insensitivity itself would have been trivial; perl did exactly that in 5.18. Java couldn't, so it came up with a proper and much better fix, unlike perl and everyone else.
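To see the collision handling in action: since String.hashCode() is documented and unkeyed, "Aa" and "BB" hash to the same value (2112), and so does any concatenation of such blocks. A small sketch (class name and block count are arbitrary); on a Java 8+ HashMap all these keys land in one bucket, which gets converted to a tree once it grows past a small threshold, so lookups stay logarithmic rather than degrading to linear scans:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class CollisionDemo {
    // Build 2^blocks strings that all share the same String.hashCode():
    // "Aa" and "BB" collide, so swapping one block for the other at any
    // position leaves the overall hash unchanged.
    static List<String> collidingKeys(int blocks) {
        List<String> keys = new ArrayList<>();
        keys.add("");
        for (int i = 0; i < blocks; i++) {
            List<String> next = new ArrayList<>();
            for (String k : keys) {
                next.add(k + "Aa");
                next.add(k + "BB");
            }
            keys = next;
        }
        return keys;
    }

    public static void main(String[] args) {
        List<String> keys = collidingKeys(12); // 4096 keys, one hash code
        Map<String, Integer> map = new HashMap<>();
        for (int i = 0; i < keys.size(); i++) {
            map.put(keys.get(i), i);
        }
        // All keys collide, yet lookups remain fast because the single
        // overloaded bucket is stored as a tree, not a linked list.
        System.out.println(keys.get(0).hashCode() == keys.get(keys.size() - 1).hashCode()); // true
        System.out.println(map.get(keys.get(100))); // 100
    }
}
```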
I'm looking at all of the lookup tables in that code and wondering how much slower it would be to do a depth-first search and calculate as you went.<p>And then with a trie thrown in.