Entropy is the existential phenomenon by which potential (of any kind: heat, electromagnetic, informational/structural) distributes over negative potential.

Math and physics geeks will turn beet red and say that entropy is a "scalar value [in their equations] of the distribution of delta probability." The distribution follows inverse square or whatever law is appropriate for the relevant manifold's surface area (check out Penrose's Road to Reality for an excellent tutorial on manifold distributions).

Conventional students will say "entropy is the waste energy that cannot be used or reclaimed," which sounds good in class yet doesn't fit reality. Ambient heat is the entropy of whatever generated it, and we all enjoy ambient heat.

Everyone else is stuck in a mystical uncertainty (which is a laugh, since uncertainty is a synonym for entropy).

Where is it? Probably any location within the area of distribution.

What is it? Probably whatever there is, in statistical proportion.

In casual terms, any time anyone mentions "uncertainty," they're talking about the entropic distribution of potentials.

Thermal equilibrium is merely the most primitive universal example (everything is moving from more potential to less, and heat is the final exhaust of all working processes).

Entropy is the distribution of potential over negative potential, in every valid usage.

The mistake Shannon made in his monumental instigation of this very subject is that the equation he borrowed, the one that expresses the distribution of electrons across an atom's valence, didn't account for the valence changing energy states. This gave modern information theory the erroneous conclusion that entropy is the "available states in a system," when in fact the entropic measure is of all possible states, even those not fitting the model (a human construction); see the sketch at the end of this comment.

Other than all of this, the book looks quite interesting!
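To make the "available states" point concrete, here is a minimal sketch (my own illustration, not from the book or the comment's sources): the standard Shannon formula H = -Σ p·log2(p) only counts the states the model enumerates, so the measured entropy changes whenever the model's state space changes.

    # Minimal sketch: Shannon entropy in bits over a discrete distribution.
    # The value depends entirely on which states the model enumerates.
    from math import log2

    def shannon_entropy(probs):
        """H = -sum(p * log2(p)) over the states the model includes."""
        return -sum(p * log2(p) for p in probs if p > 0)

    # Four equally likely states the model knows about:
    print(shannon_entropy([0.25] * 4))  # 2.0 bits

    # The same system re-modeled with a fifth state the first model omitted:
    print(shannon_entropy([0.2] * 5))   # ~2.32 bits

Nothing in the formula accounts for states the model leaves out; they simply contribute nothing to the sum.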