What Is Entropy?

288 points by jfantl about 1 month ago

34 comments

TexanFeller about 1 month ago

I don't see Sean Carroll's musings mentioned yet, so repeating my previous comment:

Entropy got a lot more exciting to me after hearing Sean Carroll talk about it. He has a foundational/philosophical bent and likes to point out that there are competing definitions of entropy set on different philosophical foundations, one of them seemingly observer dependent:

- https://youtu.be/x9COqqqsFtc?si=cQkfV5IpLC039Cl5
- https://youtu.be/XJ14ZO-e9NY?si=xi8idD5JmQbT5zxN

Leonard Susskind has lots of great talks and books about quantum information and calculating the entropy of black holes, which led to a lot of wild new hypotheses.

Stephen Wolfram gave a long talk about the history of the concept of entropy which was pretty good: https://www.youtube.com/live/ocOHxPs1LQ0?si=zvQNsj_FEGbTX2R3

quietbritishjim about 1 month ago

I like the axiomatic definition of entropy. Here's the introduction from *Pattern Recognition and Machine Learning* by C. Bishop (2006):

> The amount of information can be viewed as the 'degree of surprise' on learning the value of x. If we are told that a highly improbable event has just occurred, we will have received more information than if we were told that some very likely event has just occurred, and if we knew that the event was certain to happen we would receive no information. Our measure of information content will therefore depend on the probability distribution p(x), and we therefore look for a quantity h(x) that is a monotonic function of the probability p(x) and that expresses the information content. The form of h(·) can be found by noting that if we have two events x and y that are unrelated, then the information gain from observing both of them should be the sum of the information gained from each of them separately, so that h(x, y) = h(x) + h(y). Two unrelated events will be statistically independent and so p(x, y) = p(x)p(y). From these two relationships, it is easily shown that h(x) must be given by the logarithm of p(x) and so we have h(x) = − log2 p(x).

This is the definition of information for a single probabilistic event. The definition of entropy of a random variable follows from this by just taking the expectation.
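
To make that construction concrete, here is a minimal Python sketch (not from the book or the article; the example distributions are made up for illustration): information content as -log2 p, with entropy as its expectation.

```python
from math import log2

def information(p: float) -> float:
    """Information content ('degree of surprise') of an event with probability p, in bits."""
    return -log2(p)

def entropy(dist: dict) -> float:
    """Shannon entropy: the expected information content of a distribution."""
    return sum(p * information(p) for p in dist.values() if p > 0)

# Additivity for independent events: h(x, y) = h(x) + h(y)
assert abs(information(0.5 * 0.25) - (information(0.5) + information(0.25))) < 1e-12

fair_coin = {"heads": 0.5, "tails": 0.5}
biased_coin = {"heads": 0.9, "tails": 0.1}
print(entropy(fair_coin))    # 1.0 bit: maximal uncertainty over two outcomes
print(entropy(biased_coin))  # ~0.469 bits: the likely outcome carries little surprise
```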

nihakue about 1 month ago

I'm not in any way qualified to have a take here, but I have one anyway:

My understanding is that entropy is a way of quantifying how many different ways a thing could 'actually be' and yet still 'appear to be' how it is. So it is largely a result of an observer's limited ability to perceive / interrogate the 'true' nature of the system in question.

So for example you could observe that a single coin flip is heads, and entropy will help you quantify how many different ways that could have come to pass. e.g. is it a fair coin, a weighted coin, a coin with two head faces, etc. All these possibilities increase the entropy of the system. An arrangement _not_ counted towards the system's entropy is the arrangement where the coin has no heads face, only ever comes up tails, etc.

Related, my intuition about the observation that entropy tends to increase is that it's purely a result of more likely things happening more often on average.

Would be delighted if anyone wanted to correct either of these intuitions.

asdf_snar about 1 month ago

I throw these quotes by Y. Oono into the mix because they provide viewpoints which are in some tension with those who take the -\sum_x p(x) log p(x) definition of entropy as fundamental.

> Boltzmann's argument summarized in Exercise of 2.4.11 just derives Shannon's formula and uses it. A major lesson is that before we use the Shannon formula important physics is over.

> There are folklores in statistical mechanics. For example, in many textbooks ergodic theory and the mechanical foundation of statistical mechanics are discussed even though detailed mathematical explanations may be missing. We must clearly recognize such topics are almost irrelevant to statistical mechanics. We are also brainwashed that statistical mechanics furnishes the foundation of thermodynamics, but we must clearly recognize that without thermodynamics statistical mechanics cannot be formulated. It is a naive idea that microscopic theories are always more fundamental than macroscopic phenomenology.

sources: http://www.yoono.org/download/inst.pdf http://www.yoono.org/download/smhypers12.pdf

xavivives about 1 month ago

Over the last few months, I've been developing an unorthodox perspective on entropy [1]. It defines the phenomenon in much more detail, allowing for a unification of all forms of entropy. It also defines probability through the same lens.

I define both concepts fundamentally in relation to priors and possibilities:

- Entropy is the relationship between priors and ANY possibility, relative to the entire space of possibilities.

- Probability is the relationship between priors and a SPECIFIC possibility, relative to the entire space of possibilities.

The framing of priors and possibilities shows why entropy appears differently across disciplines like statistical mechanics and information theory. Entropy is not merely observer-dependent, but prior-dependent, including priors not held by any specific observer but embedded in the framework itself. This helps resolve the apparent contradiction between objective and subjective interpretations of entropy.

It also defines possibilities as constraints imposed on an otherwise unrestricted reality. This framing unifies how possibility spaces are defined across frameworks.

[1]: https://buttondown.com/themeaninggap/archive/a-unified-perspective-of-entropy-and-probability/

glial about 1 month ago

One thing that helped me was the realization that, at least as used in the context of information theory, entropy is a property of an individual (typically the person receiving a message) and NOT purely of the system or message itself.

> entropy quantifies uncertainty

This sums it up. Uncertainty is the property of a person and not a system/message. That uncertainty is a function of both a person's model of a system/message and their prior observations.

You and I may have different entropies about the content of the same message. If we're calculating the entropy of dice rolls (where the outcome is the 'message'), and I know the dice are loaded but you don't, my entropy will be lower than yours.
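
A small numeric illustration of that last point (the loading below is invented for the example): the observer who knows the die is loaded assigns a lower entropy to the same roll than the observer who models it as fair.

```python
from math import log2

def entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

your_model = [1/6] * 6                     # you assume a fair die
my_model = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]  # I know face 1 is favored

print(entropy(your_model))  # ~2.585 bits per roll
print(entropy(my_model))    # ~2.161 bits per roll: same die, less uncertainty
```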

hatthew about 1 month ago

I'm not sure I understand the distinction between "high-entropy macrostate" and "order". Aren't macrostates just as subjective as order? Let's say my friend's password is 6dVcOgm8. If we have a system whose microstate consists of an arbitrary string of alphanumeric characters, and the system arranges itself in the configuration 6dVcOgm8, then I would describe the macrostate as "random" and "disordered". However, if my friend sees that configuration, they would describe the macrostate as "my password" and "ordered".

If we see another configuration M2JlH8qc, I would say that the macrostate is the same, it's still "random" and "unordered", and my friend would agree. I say that both macrostates are the same: "random and unordered", and there are many microstates that could be called that, so therefore both are microstates representing the same high-entropy macrostate. However, my friend sees the macrostates as different: one is "my password and ordered", and the other is "random and unordered". There is only one microstate that she would describe as "my password", so from her perspective that's a low-entropy macrostate, while they would agree with me that M2JlH8qc represents a high-entropy macrostate.

So while I agree that "order" is subjective, isn't "how many microstates could result in this macrostate" equally subjective? And then wouldn't it be reasonable to use the words "order" and "disorder" to count (in relative terms) how many microstates could result in the macrostate we subjectively observe?

IIAOPSW about 1 month ago

It's the name for the information bits you don't have.

More elaborately, it's the number of bits needed to fully specify something which is known to be in some broad category of state, but whose exact details are unknown.

voidhorse about 1 month ago

I think this is a pretty good introduction, but it gets a little bogged down in the binary encoding assumption, which is an extraneous detail. It does help to know *why* the logarithm is chosen as a measure of information, though; regardless of base, once you know that, "entropy" is straightforward. I'd agree that much of the difficulty arises from the uninformative name and the various mystique it carries.

To try to expand on the information measure part from a more abstract starting point: Consider a probability distribution, some set of probabilities p. We can consider it as indicating our degree of *certainty* about what will happen. In an equiprobable distribution, e.g. a fair coin flip (1/2, 1/2), there is no skew either which way; we are admitting that we basically have no reason to suspect any particular outcome. Contrarily, in a split like (1/4, 3/4) we are stating that we are more certain that one particular outcome will happen.

If you wanted to come up with a *number* to represent the amount of *uncertainty*, it's clear that the number should be higher the closer the distribution is to being completely equiprobable (1/2, 1/2), a complete lack of certainty about the result, and the number should be smallest when we are 100% certain (0, 1).

This means that the function has to be an order inversion on the probability values, that is, I(1) = 0 (no uncertainty). The logarithm, to arbitrary base (selecting a base is just a change of units), has this property under the convention that I(0) = inf (that is, a totally improbable event carries *infinite* information; after all, an impossibility occurring would in fact be the ultimate surprise).

Entropy is just the average of this function taken over the probability values (multiply each probability in the distribution by the log of the inverse of that probability and sum them). In info theory you also usually assume the probabilities are independent, and so the further condition that I(pq) = I(p) + I(q) is also stipulated.
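
The argument above, written out in standard notation (the base b is just the choice of units; b = 2 gives bits):

```latex
% Requirements: I is decreasing in p, I(1) = 0, and for independent
% events I(pq) = I(p) + I(q).  Up to the base, the logarithm is the
% only monotone solution:
\[
  I(p) = -\log_b p, \qquad I(1) = 0, \qquad \lim_{p \to 0^+} I(p) = \infty.
\]
% Entropy is the average of I over the distribution:
\[
  H(p_1, \dots, p_n) = \sum_{i=1}^{n} p_i \, I(p_i) = -\sum_{i=1}^{n} p_i \log_b p_i.
\]
```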

karpathy about 1 month ago

What I never fully understood is that there is some implicit assumption about the dynamics of the system. So what if there are more microstates of some macrostate, as far as counting is concerned? We also have to make assumptions about the dynamics, and in particular about some property that encourages mixing.

tsimionescu about 1 month ago

This goes through all the definitions of entropy except the very first one, which is also the one that is in fact measurable and objective: the variation in entropy is the heat energy that the system exchanges with the environment during a reversible process, divided by the temperature at which the exchange happens. While tedious, this can be measured, and it doesn't depend on any subjective knowledge about the system. Any two observers will agree on this value, even if one knows all of the details of every single microstate.
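
For reference, the Clausius definition the comment refers to, with a standard textbook example (the latent-heat value is the usual rounded constant, not taken from this thread):

```latex
\[
  dS = \frac{\delta Q_{\mathrm{rev}}}{T},
  \qquad
  \Delta S = \int \frac{\delta Q_{\mathrm{rev}}}{T}.
\]
% Example: reversibly melting 1 kg of ice at T = 273.15 K,
% with latent heat L \approx 334 kJ/kg:
\[
  \Delta S = \frac{Q}{T} = \frac{334\ \mathrm{kJ}}{273.15\ \mathrm{K}} \approx 1.22\ \mathrm{kJ/K}.
\]
```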

anon84873628 about 1 month ago

Nitpick on the article's conclusion:

> Heat flows from hot to cold because the number of ways in which the system can be non-uniform in temperature is much lower than the number of ways it can be uniform in temperature ...

Should probably say "thermal energy" instead of "temperature" if we want to be really precise with our thermodynamics terms. Temperature is not a direct measure of energy; rather, it is an intensive property describing the relationship between change in energy and change in entropy.
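
The relationship being alluded to, in the usual statistical-mechanics form:

```latex
\[
  \frac{1}{T} = \left( \frac{\partial S}{\partial E} \right)_{V,\,N}
\]
% Temperature measures how much entropy a system gains per unit of added
% energy (at fixed volume and particle number).  Heat flows from hot to
% cold because moving energy from a high-T body (small entropy loss) to
% a low-T body (larger entropy gain) increases the total entropy.
```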

brummm about 1 month ago

I love that the author clearly describes why saying entropy measures disorder is misleading.

bargava about 1 month ago

Here is a good overview of entropy [1].

[1] https://arxiv.org/abs/2409.09232

marojejian about 1 month ago

This is the best description of entropy and information I've read: https://arxiv.org/abs/1601.06176

Most of all, it highlights the subjective / relative foundations of these concepts.

Entropy and information only exist relative to a decision about the set of states an observer cares to distinguish.

It also caused me to change my informal definition of entropy from a negative ("disorder") to a more positive one ("the number of things I might care to know").

The Second Law now tells me that the number of interesting things I don't know about is always increasing!

This thread inspired me to post it here: https://news.ycombinator.com/item?id=43695358

dswilkerson about 1 month ago

Entropy is expected information. That is, given a random variable, if you compute the expected value (the sum of the values weighted by their probability) of the information of an event (the log base 2 of the multiplicative inverse of the probability of the event), you get the formula for entropy.

Here it is explained at length: "An Intuitive Explanation of the Information Entropy of a Random Variable, Or: How to Play Twenty Questions": http://danielwilkerson.com/entropy.html
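
Written as a formula, with the "twenty questions" reading attached (standard definitions, matching the description above):

```latex
\[
  H(X) = \mathbb{E}\!\left[\log_2 \frac{1}{p(X)}\right]
       = \sum_{x} p(x)\,\log_2 \frac{1}{p(x)}.
\]
% For a uniform distribution over 2^n equally likely possibilities,
% H(X) = n: exactly the number of optimal yes/no questions needed
% to pin down the outcome.
```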

bowsamic about 1 month ago

I didn't read in depth, but it seems to me at first glance (please correct me if I'm wrong) that, as with all articles on entropy, this explains everything except the classical thermodynamic quantity called entropy, which is 1. the quantity all these others are chosen to relate to, and 2. the one that is by far the most difficult to explain intuitively.

Information and statistical explanations of entropy are very easy. The real question is, what does entropy mean in the original context it was introduced in, before those later explanations?

im3w1l about 1 month ago

So here is an amusing thought experiment I thought of at one point.

Imagine a very high resolution screen, say a billion by a billion pixels. Each of them can be white, gray or black. What is the lowest entropy possible? Each of the pixels has the same color. How does the screen look? Gray. What is the highest entropy possible? Each pixel has a random color. How does it look from a distance? Gray again.

What does this mean? I have no idea. Maybe nothing.

*Also sorry for writing two top level comments, but I just really care about this topic*
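
One way to put rough numbers on that thought experiment, using Boltzmann's counting (order-of-magnitude only; "looks gray from a distance" is treated loosely as "essentially any pixel configuration" in the high-entropy case):

```latex
% S = k_B ln W, with W the number of microstates compatible with what
% you can see.  For N = 10^9 x 10^9 = 10^18 three-color pixels:
\[
  W_{\text{all gray}} = 1 \;\Rightarrow\; S = k_B \ln 1 = 0,
  \qquad
  W_{\text{random}} \approx 3^{N} \;\Rightarrow\; S \approx N k_B \ln 3 .
\]
% Both macrostates look gray from far away, but one is compatible with
% a single microstate and the other with almost all of them.
```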

flanked-evergl about 1 month ago

Not sure what the point of this article is; it seems to focus on confusion which could be cleared up with a simple visit to Wikipedia.

> But I have no idea what entropy is, and from what I find, neither do most other people.

The article does not go on to explain what entropy is; it just tries to explain away some hypothetical claims about entropy which, as far as we can tell, do hold, and does not explain why, if they were wrong, they do in fact hold.

im3w1l about 1 month ago

As a kid I wanted to invent a perpetuum mobile. From that perspective, entropy is that troublesome property that prevents a perpetuum mobile of the second kind. And any fuzziness or ambiguity in its definition is a glimmer of hope that we may yet find a loophole.

jwilber about 1 month ago

There's an interactive visual of entropy in the "Where To Partition" section (midway through the article): https://mlu-explain.github.io/decision-tree/

jwarden about 1 month ago

Here's my own approach to explaining entropy as a measure of uncertainty: https://jonathanwarden.com/entropy-as-uncertainty

FilosofumRex about 1 month ago

Boltzmann and Gibbs turn in their graves every time some information theorist mutilates their beloved entropy. Shannon & von Neumann were hacking together a new theory of communication, not doing real physics, and never meant to equate thermodynamic concepts to encoding techniques; but alas, now dissertations are written on it.

Entropy can't be a measure of uncertainty, because all the uncertainty is in the probability distribution p(x); multiplying it by its own logarithm and summing doesn't tell us anything new. If it did, it'd violate quantum physics principles, including the Bell inequality and Heisenberg uncertainty.

The article never mentions the simplest and most basic definition of entropy, i.e. its units (kJ/K), nor the third law of thermodynamics, which is the basis for its measurement.

"Every physicist knows what entropy is. Not one can write it down in words." (Clifford Truesdell)

Ono-Sendai about 1 month ago

Anyone else notice how the entropy in the 1000 bouncing balls simulation goes down at some point, thereby violating the second law of thermodynamics? :)

gozzoo about 1 month ago

The visualisation is great, the topic is interesting and very well explained. Can somebody recommend some other blogs with a similar type of presentation?

fedeb95 about 1 month ago

Given all the comments, it turns out that a post on entropy has high entropy.

vitus about 1 month ago

The problem with this explanation (and with many others) is that it misses why we should care about "disorder" or "uncertainty", whether in information theory or statistical mechanics. Yes, we have the arrow of time argument (second law of thermodynamics, etc), and entropy breaks time-symmetry. So what?

The article hints very briefly at this with the discussion of an unequally-weighted die, and how by encoding the most common outcome with a single bit, you can achieve some amount of compression. That's a start, and we've now rediscovered the idea behind Huffman coding. What information theory tells us is that if you consider a sequence of two dice rolls, you can then use even fewer bits on average to describe that outcome, and so on; as you take your block length to infinity, your average number of bits for each roll in the sequence approaches the entropy of the source. (This is Shannon's source coding theorem, and while entropy plays a far greater role in information theory, this is at least a starting point.)

There's something magical about statistical mechanics where various quantities (e.g. energy, temperature, pressure) emerge as a result of taking partial derivatives of this "partition function", and that they turn out to be the same quantities that we've known all along (up to a scaling factor -- in my stat mech class, I recall using k_B * T for temperature, such that we brought everything back to units of energy).

https://en.wikipedia.org/wiki/Partition_function_(statistical_mechanics)

https://en.wikipedia.org/wiki/Fundamental_thermodynamic_relation

If you're dealing with a sea of electrons, you might apply the Pauli exclusion principle to derive Fermi-Dirac statistics that underpins all of semiconductor physics; if instead you're dealing with photons which can occupy the same energy state, the same statistical principles lead to Bose-Einstein statistics.

Statistical mechanics is ultimately about taking certain assumptions about how particles interact with each other, scaling up the quantities beyond our ability to model all of the individual particles, and applying statistical approximations to consider the average behavior of the ensemble. The various forms of entropy are building blocks to that end.
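
A quick numerical check of the block-coding claim, as a Python sketch (the biased die and the helper names are mine, chosen for illustration; the expected Huffman length is computed via the standard identity that it equals the sum of the probabilities of the merged internal nodes):

```python
import heapq
from itertools import product
from math import log2, prod

def entropy(probs):
    """Entropy of the source, in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_expected_length(probs):
    """Expected codeword length (bits) of a Huffman code for one block.

    Repeatedly merging the two least likely nodes is Huffman's algorithm;
    the expected length equals the sum of the merged nodes' probabilities."""
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        total += a + b
        heapq.heappush(heap, a + b)
    return total

die = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]  # an unequally weighted die (illustrative)
print(f"source entropy: {entropy(die):.4f} bits/roll")
for n in (1, 2, 3):  # Huffman-code blocks of n rolls at a time
    block_probs = [prod(seq) for seq in product(die, repeat=n)]
    bits_per_roll = huffman_expected_length(block_probs) / n
    # source coding theorem: entropy <= bits_per_roll < entropy + 1/n
    print(f"block length {n}: {bits_per_roll:.4f} bits/roll")
```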

alex5207 about 1 month ago

Super read! Thanks for sharing.

sysrestartusr about 1 month ago

at some point my take became: if nothing orders the stuff that lies and flies around, any emergent structures that follow the laws of nature eventually break down.

organisms started putting things in places to increase "survivability" and thriving of themselves until the offspring was ready for the job, at which point the offspring started to additionally put things in place for the sake of the "survivability" and thriving of their ancestors (mostly overlooking their nagging and shortcomings because "love" and because over time, the lessons learned made everything better for all generations) ...

so entropy is only relevant if all the organisms that can put some things in some place for some reason disappear and the laws of nature run until new organisms emerge. (which is why I'm always disappointed at leadership and all the fraudulent shit going on ... more pointlessly dead organisms means fewer heads that can come up with ways to put things together in fun and useful ways ... it's 2025, to whomever it applies: stop clinging to your sabotage-based wannabe supremacy, please, stop corrupting the law, for fucks sake, you rich fucking losers)

nanna about 1 month ago

Yet another take on entropy and information focused on Claude Shannon and lacking even a single mention of Norbert Wiener, even though they invented it simultaneously and evidence suggests Shannon learned the idea from Wiener.

NitroPython about 1 month ago

Love the article, my mind is bending but in a good way lol

DadBase about 1 month ago

My old prof taught entropy with marbles in a jar and cream in coffee. "Entropy," he said, "is surprise." Then he microwaved the coffee until it burst. We understood: the universe favors forgetfulness.

ponty_rick about 1 month ago

As a software engineer, I learned what entropy was in computer science when I changed the way that a function was called, which caused the system to run out of entropy in production and caused an outage. Heh.

alganet about 1 month ago

Nowadays, it seems to be a buzzword to confuse people.

We IT folk should find another word for disorder that increases over time, especially when that disorder has human factors (number of contributors, number of users, etc.). It clearly cannot be treated in the same way as in chemistry.