I've written a few articles about Entropy (I'm a physicist working in DS).<p>Almost all of them have Python code to illustrate concepts.<p>-<p>1. Entropy of a fair coin toss - <a href="https://bytepawn.com/what-is-the-entropy-of-a-fair-coin-toss.html" rel="nofollow">https://bytepawn.com/what-is-the-entropy-of-a-fair-coin-toss...</a><p>2. Cross entropy, joint entropy, conditional entropy and relative entropy - <a href="https://bytepawn.com/cross-entropy-joint-entropy-conditional-entropy-and-relative-entropy.html" rel="nofollow">https://bytepawn.com/cross-entropy-joint-entropy-conditional...</a><p>3. Entropy in Data Science - <a href="https://bytepawn.com/entropy-in-data-science.html" rel="nofollow">https://bytepawn.com/entropy-in-data-science.html</a><p>4. Entropy of a [monoatomic] ideal gas with coarse-graining - <a href="https://bytepawn.com/entropy-of-an-ideal-gas-with-coarse-graining.html" rel="nofollow">https://bytepawn.com/entropy-of-an-ideal-gas-with-coarse-gra...</a><p>5. All entropy related posts - <a href="https://bytepawn.com/tag/entropy.html" rel="nofollow">https://bytepawn.com/tag/entropy.html</a>
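For a quick taste of the first post's topic, here is a minimal sketch (my own, not taken from the linked articles) of the Shannon entropy of a possibly biased coin toss:

```python
# Minimal sketch: Shannon entropy of a coin that lands heads with
# probability p, H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits.
import math

def coin_entropy(p: float) -> float:
    """Entropy in bits of a coin with heads-probability p."""
    if p in (0.0, 1.0):
        return 0.0  # outcome is certain, so there is no uncertainty
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(coin_entropy(0.5))  # fair coin: 1.0 bit
print(coin_entropy(0.9))  # biased coin: ~0.469 bits
```

A fair coin maximizes the entropy at exactly one bit; any bias makes the outcome more predictable and the entropy lower.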
I did a deep dive on entropy a couple of years ago. I found the concept to be much harder to understand than I expected! Specifically, it was confusing to shift from the intuitive but wrong "entropy is disorder" to "entropy is about the number of possible microstates in a macrostate" (Boltzmann entropy): <a href="https://en.wikipedia.org/wiki/Boltzmann%27s_entropy_formula" rel="nofollow">https://en.wikipedia.org/wiki/Boltzmann%27s_entropy_formula</a><p>I was extra confused when I discovered that a spread-out cloud of hydrogen is lower entropy than the same cloud gravitationally bound together in a star. So entropy isn't just about "spreading out," either.<p>I found that Legos provide a really nice example to illustrate entropy, so I'll share it here.<p>Consider a big pile of Legos, the detritus of many past projects. Intuitively, a pile of Legos is high entropy because it is disordered, but if we want to move beyond order/disorder, we need to relate it to microstates and macrostates.<p>In those terms, a pile of Legos is high entropy because you can randomly swap the positions of the pieces and it will still be the same macrostate, i.e., a big pile of Legos. Nevertheless, each Lego piece is still in a very specific position, and if we could snapshot all of those positions, that would be the specific microstate. The macrostate of the pile therefore has an astronomical number of possible microstates: there are many ways to rearrange the pieces that still look like a pile.<p>On the other hand, consider a freshly built Lego Death Star. This is clearly low entropy. In terms of microstates, the reason is that very few Legos can be swapped or moved before it stops really being a Death Star. The entropy is low because there are very few microstates (specific Lego positions) that correspond to the given macrostate (being a Death Star).<p>This specific case helped me grok Boltzmann entropy. To extend it, consider a box with a small ice crystal in it: this has far fewer possible microstates than the same box filled with steam. In the steam, molecules can pretty much be swapped and moved anywhere and the macrostate stays the same. With the crystal, if you start randomly swapping molecules into different microstates, it quickly stops being an ice crystal. So an ice crystal is low entropy.<p>Now, the definition of what counts as a macrostate is very important in all of this… but this comment is long enough and I still haven't gotten to the gym…
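To make the counting concrete, here is a toy Python sketch (my own, using a simplified model the comment doesn't actually describe): N "bricks" that can each sit in a left or a right bin, with the macrostate defined only by how many bricks are on the left. Boltzmann entropy in units of k_B is then S = ln W, where W is the number of microstates compatible with that macrostate.

```python
# Toy illustration of Boltzmann entropy S = ln W (in units of k_B),
# loosely in the spirit of the Lego analogy above: N bricks that can
# each sit in a "left" or "right" bin; the macrostate is just how many
# bricks are in the left bin.
from math import comb, log

N = 100  # number of bricks

def entropy(k: int) -> float:
    """S = ln W for the macrostate 'k of the N bricks are in the left bin'."""
    W = comb(N, k)  # number of microstates compatible with this macrostate
    return log(W)

print(entropy(0))       # 0.0   -> a single arrangement ("Death Star"-like)
print(entropy(N // 2))  # ~66.8 -> about 1e29 arrangements ("pile"-like)
```

The all-on-one-side macrostate has exactly one microstate and zero entropy, while the evenly split macrostate has roughly 10^29 microstates, which is the counting version of "many ways to rearrange the pieces that still look like a pile."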
I appreciate the explanation, but the very first example doesn't sit well with me. Water spontaneously forming into ice cubes looks weird simply because we're not used to seeing it. Consider a time-lapse of an icicle forming as a sort of counter-example: <a href="https://m.youtube.com/watch?v=mmHQft7-iSU" rel="nofollow">https://m.youtube.com/watch?v=mmHQft7-iSU</a><p>(Not refuting entropy as the arrow of time at all, just noting that a visual example is not great evidence.)
I like my entropy story with more steam engines:<p>"The most misunderstood concept in physics", by Veritasium (YouTube, 2023) (<a href="https://youtu.be/DxL2HoqLbyA?si=5a_4lCnuv85lRb57" rel="nofollow">https://youtu.be/DxL2HoqLbyA?si=5a_4lCnuv85lRb57</a>)
So if I get this right, there is a vanishingly small (but nonzero) probability that a cracked egg returns to its initial state.
Imagine that happening and being caught on video. We'd all believe we were living in a simulation and had witnessed a glitch.<p>No one would believe the scientists explaining that, although highly improbable, an egg un-cracking does make scientific sense.
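To put a rough number on "highly improbable", here is a back-of-the-envelope Python sketch using a stand-in toy system (my own analogy, not a calculation about eggs): the probability that all N gas molecules in a box happen to be in the left half at the same instant is (1/2)^N, nonzero but unimaginably small once N is macroscopic.

```python
# Rough numbers for "astronomically improbable but not impossible":
# chance that all N molecules are in the left half of a box is (1/2)^N.
from math import log10

for N in (10, 100, 6.02e23):  # a handful of molecules ... a mole of gas
    # report log10 of the probability, since the number itself underflows
    print(f"N = {N:.3g}: probability ~ 10^({-N * log10(2):.3g})")
```

For a mole of gas the exponent is around -1.8e23, which is why such events are never observed even though they are not strictly forbidden.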
Now imagine that this was in grade-school curricula!<p>I really think education is mostly about providing higher-level intuitions - making correct thought habitual and thus easy.<p>Part of what's so attractive about this particular article is how it would mesh with related fields (chemistry, statistics, politics, evolution, astrophysics, climate science, etc.)
> entropy is just a fancy word for ‘number of possible arrangements’<p>It isn’t though.<p>Entropy is a fancy word for potential distribution over negative potential. Negative potential is the “surface area” over which potential may distribute. The “number of possible arrangements” casually fits into this, yet misses some unintuitive possibilities, like resistive variance or other characteristics not anticipated by whoever constructed the intellectual model.<p>Idealists insist entropy is a scalar state resolve of delta probability in their model. They are self-deceived. Entropy is the existential tendency for potential to distribute toward equilibrium.<p>As long as boffins can throw away results that do not correlate, they can insist it is anything they like.