I've often wondered about an alternative but related metric called "organization".

Entropy would seem, in some sense, to measure "complexity", but I think it's more accurately described as "surprise". That's useful but limited: you can measure the "entropy" of a string (of keystrokes, or text) and estimate how likely it is to be "coherent" or "intelligent", but it's fuzzy. Too much entropy and you're at "randomness"; too little and you're at "banality". It seems like a more precise (but still 0..1 bounded) metric should be possible for measuring "order" or "organization". Entropy fails at this: zero entropy doesn't mean "total order", just "total boringness" (heh :))

I once considered something built around an archetypal canonical compression scheme (like LZ), but never fleshed it out. Considering it again now: what about the "self-similarity" of the dictionary, combined with the diversity of the dictionary?

It's more of a "two-axis" metric, but surely we can find a way to corral it into 0..1.

Very self-similar, and rather diverse? Highly organized.

Low self-similarity, and highly diverse? High entropy / highly disorganized.

Low self-similarity, and low diversity? Low entropy / high banality. I.e., simplicity, heh :)

High self-similarity, but low diversity? Organized, but "less organized" than something with more diversity.

I don't think this is quite there yet, but it syncs with intuition.

Any takers? :)
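To make the two-axis idea concrete, here's a toy sketch in Python. It approximates "self-similarity" by how much zlib (an LZ77-family compressor) shrinks the string, and "diversity" by the fraction of distinct byte values that appear, then collapses the two axes into one 0..1 "organization" score via a geometric mean. The function names, the proxies, and the combining rule are all my own guesses, not a worked-out metric:

```python
import math
import os
import zlib

def self_similarity(data: bytes) -> float:
    """1 minus the compression ratio: repeated structure compresses well."""
    if not data:
        return 0.0
    compressed = len(zlib.compress(data, 9))
    # Incompressible data can grow slightly under zlib; clamp at 0.
    return max(0.0, 1.0 - compressed / len(data))

def diversity(data: bytes) -> float:
    """Fraction of the 256 possible byte values actually used."""
    return len(set(data)) / 256 if data else 0.0

def organization(data: bytes) -> float:
    """Geometric mean of the two axes, bounded in [0, 1]."""
    return math.sqrt(self_similarity(data) * diversity(data))

# "Banality": one repeated symbol -- very self-similar, almost no diversity.
banal = b"a" * 4096
# "Randomness": diverse, but with no exploitable self-similarity.
random_bytes = os.urandom(4096)
# "Organized": a diverse alphabet with strong repeated structure.
organized = bytes(range(256)) * 16
```

On these three inputs the score behaves the way the four cases above suggest: the organized string scores near 1, the banal string low, and the random string near 0. The geometric mean is just one way to punish a zero on either axis; a weighted product or min() would be defensible too.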