I don't see how<p>> "Not surprisingly, [...] global counters [...] are poorly distributed as well."<p>is true.<p>I mean, yes, they are not uniformly distributed, but that was never the requirement. As the article itself states, the desired property is that "the values for distinct objects are more or less distinct". With a global counter you get <i>maximally</i> distinct hash codes: <i>more</i> distinct than any of the other approaches (and no less distinct than any user-implemented function), at least until 2^31 object allocations.<p>Yes, after 2^31 objects you will get repeated values, but that is trivially the case for <i>any</i> pattern of assigning 31-bit hash codes to distinct objects (and any of the pseudo-random approaches will get birthday-paradoxed <i>much</i> sooner, and <i>much</i> harder). The only case where this could matter is an adversarial scenario where someone is trying to DoS your hash map with crafted inputs. But according to the article itself, it would take 4+ minutes (120 ns * 2^31) of doing nothing but allocating objects to wrap the global counter even once. If an adversary can reliably achieve that already, what's the point in slowing down a hash map by an epsilon every four minutes?
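<p>To put rough numbers on the birthday-paradox point, here is a minimal Java sketch (counterHash and BirthdayDemo are made-up names for illustration, not any VM's actual mechanism): a global counter stays collision-free for ~2^31 allocations by construction, while uniformly random 31-bit hash codes typically produce their first repeat after about 1.25 * sqrt(2^31) ≈ 58,000 draws.<p><pre><code>import java.util.HashSet;
import java.util.Random;
import java.util.concurrent.atomic.AtomicInteger;

public class BirthdayDemo {
    // Illustrative only: a global counter hands out distinct 31-bit values
    // for ~2^31 calls before it wraps and starts repeating.
    static final AtomicInteger NEXT = new AtomicInteger();

    static int counterHash() {
        return NEXT.incrementAndGet() & 0x7FFFFFFF;
    }

    public static void main(String[] args) {
        // Counter scheme: distinct by construction until wrap-around.
        System.out.println("counter hashes: " + counterHash() + ", " + counterHash());

        // Pseudo-random 31-bit hashes hit the birthday bound instead:
        // expect the first repeat after roughly 58,000 draws, not 2^31.
        Random rng = new Random();
        HashSet<Integer> seen = new HashSet<>();
        int draws = 1;
        while (seen.add(rng.nextInt() & 0x7FFFFFFF)) {
            draws++;
        }
        System.out.println("first random collision after " + draws + " draws");
    }
}</code></pre>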