What Is Entropy?

345 points, by ainoobler, 10 months ago

28 comments

Jun8, 10 months ago
A well known anecdote reported by Shannon:

"My greatest concern was what to call it. I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.'"

See the answers to this MathOverflow SE question (https://mathoverflow.net/questions/403036/john-von-neumanns-remark-on-entropy) for references on the discussion of whether Shannon's entropy is the same as the one from thermodynamics.
glial, 10 months ago
I felt like I finally understood Shannon entropy when I realized that it's a subjective quantity -- a property of the observer, not the observed.

The entropy of a variable X is the amount of information required to drive the observer's uncertainty about the value of X to zero. As a corollary, your uncertainty and mine about the value of the same variable X could be different. This is trivially true, as we could each have received different information about X. H(X) should be H_{observer}(X), or even better, H_{observer, time}(X).

As clear as Shannon's work is in other respects, he glosses over this.
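A minimal sketch of this observer-dependence (illustrative only; the coin flip and the two belief distributions are assumed, not from the comment):

```python
import math

def entropy_bits(p):
    # Shannon entropy (in bits) of a discrete distribution.
    return -sum(q * math.log2(q) for q in p if q > 0)

# Hypothetical coin flip X. Observer A has received no information about the
# outcome; observer B has received a hint and believes heads is far more likely.
belief_A = [0.5, 0.5]
belief_B = [0.9, 0.1]

print(entropy_bits(belief_A))  # 1.0 bit left to learn
print(entropy_bits(belief_B))  # ~0.47 bits -- same X, different H_observer(X)
```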
dekhn, 10 months ago
I really liked the approach my stat mech teacher used. In nearly all situations, entropy just ends up being the log of the number of ways a system can be arranged (https://en.wikipedia.org/wiki/Boltzmann%27s_entropy_formula), although I found it easiest to think in terms of pairs of dice rolls.
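A quick sketch of that dice-pair picture (illustrative; the roll total plays the role of the macrostate and k_B is set to 1):

```python
import math
from collections import Counter

# Macrostate: the total shown by two dice. Microstates: the ordered pairs
# that produce it. Boltzmann entropy (with k_B = 1): S = log(W).
ways = Counter(a + b for a in range(1, 7) for b in range(1, 7))

for total in sorted(ways):
    W = ways[total]
    print(f"total {total:2d}: W = {W}, S = log(W) = {math.log(W):.3f}")
# A total of 7 has the most arrangements (W = 6), hence the largest entropy;
# totals of 2 and 12 have only one arrangement each, hence S = 0.
```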
ooterness, 10 months ago
For information theory, I've always thought of entropy as follows:

"If you had a really smart compression algorithm, how many bits would it take to accurately represent this file?"

i.e., Highly repetitive inputs compress well because they don't have much entropy per bit. Modern compression algorithms are good enough on most data to be used as a reasonable approximation for the true entropy.
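A rough illustration of that compression view (a sketch; zlib stands in for the "really smart" compressor, and the inputs are made up):

```python
import os
import zlib

def compressed_bits_per_byte(data: bytes) -> float:
    # Crude entropy proxy: bits of zlib output per byte of input.
    return 8 * len(zlib.compress(data, 9)) / len(data)

repetitive = b"abcabcabc" * 10_000   # highly structured input
random_ish = os.urandom(90_000)      # essentially incompressible input

print("repetitive:", round(compressed_bits_per_byte(repetitive), 3), "bits/byte")
print("random:    ", round(compressed_bits_per_byte(random_ish), 3), "bits/byte")
# The structured input compresses to a small fraction of a bit per byte,
# while the random bytes stay close to 8 bits per byte.
```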
tasteslikenoise, 10 months ago
I've always favored this down-to-earth characterization of the entropy of a discrete probability distribution. (I'm a big fan of John Baez's writing, but I was surprised glancing through the PDF to find that he doesn't seem to mention this viewpoint.)

Think of the distribution as a histogram over some bins. Then, the entropy is a measurement of, if I throw many many balls at random into those bins, the probability that the distribution of balls over bins ends up looking like that histogram. What you usually expect to see is a uniform distribution of balls over bins, so the entropy measures the probability of other rare events (in the language of probability theory, "large deviations" from that typical behavior).

More specifically, if P = (P1, ..., Pk) is some distribution, then the probability that throwing N balls (for N very large) gives a histogram looking like P is about 2^(-N * [log(k) - H(P)]), where H(P) is the entropy. When P is the uniform distribution, then H(P) = log(k), the exponent is zero, and the estimate is 1, which says that by far the most likely histogram is the uniform one. That is the largest possible entropy, so any other histogram has probability 2^(-c*N) of appearing for some c > 0, i.e., is very unlikely and exponentially moreso the more balls we throw, but the entropy measures just how much. "Less uniform" distributions are less likely, so the entropy also measures a certain notion of uniformity. In large deviations theory this specific claim is called "Sanov's theorem" and the role the entropy plays is that of a "rate function."

The counting interpretation of entropy that some people are talking about is related, at least at a high level, because the probability in Sanov's theorem is the number of outcomes that "look like P" divided by the total number, so the numerator there is indeed counting the number of configurations (in this case of balls and bins) having a particular property (in this case looking like P).

There are lots of equivalent definitions and they have different virtues, generalizations, etc, but I find this one especially helpful for dispelling the air of mystery around entropy.
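A small numerical check of that estimate (a sketch under assumed parameters: k = 4 bins, N = 4000 balls, and a histogram P chosen so the ideal counts are whole numbers):

```python
import math

def entropy_bits(p):
    return -sum(q * math.log2(q) for q in p if q > 0)

def exact_log2_prob(counts):
    # log2 of the multinomial probability that N uniform balls land in
    # exactly these counts: multinomial(N; counts) * (1/k)^N.
    N, k = sum(counts), len(counts)
    log2_multinomial = (math.lgamma(N + 1)
                        - sum(math.lgamma(c + 1) for c in counts)) / math.log(2)
    return log2_multinomial - N * math.log2(k)

P = [0.4, 0.3, 0.2, 0.1]           # target histogram over k = 4 bins
N = 4000                           # number of balls thrown
counts = [int(N * p) for p in P]   # 1600, 1200, 800, 400

print("exact exponent:", exact_log2_prob(counts))                      # ~ -632
print("Sanov exponent:", -N * (math.log2(len(P)) - entropy_bits(P)))   # ~ -614
# The two exponents agree to leading order in N, which is what the theorem promises.
```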
Tomte, 10 months ago
PBS Spacetime's entropy playlist: https://youtube.com/playlist?list=PLsPUh22kYmNCzNFNDwxIug8q1Zz0Mj60H&si=y2XIvxDJOFyCpfIu
eointierney, 10 months ago
Ah JCB, how I love your writing, you are always so very generous.

Your This Week's Finds were a hugely enjoyable part of my undergraduate education and beyond.

Thank you again.
yellowcake0, 10 months ago
Information entropy is literally the strict lower bound on how efficiently information can be communicated (expected number of transmitted bits) if the probability distribution which generates this information is known, that's it. Even in contexts such as calculating the information entropy of a bit string, or the English language, you're just taking this data and constructing some empirical probability distribution from it using the relative frequencies of zeros and ones or letters or n-grams or whatever, and then calculating the entropy of that distribution.

I can't say I'm overly fond of Baez's definition, but far be it from me to question someone of his stature.
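A minimal sketch of that empirical-distribution calculation (the sample text and bit string below are arbitrary choices for illustration):

```python
import math
from collections import Counter

def empirical_entropy(symbols):
    # Entropy (bits per symbol) of the empirical distribution of `symbols`.
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

text = "the quick brown fox jumps over the lazy dog " * 100
bits = "0110100110010110" * 50

print(round(empirical_entropy(text), 3), "bits per character")
print(round(empirical_entropy(bits), 3), "bits per symbol")
# No uniquely decodable code can beat these averages if each symbol really is
# drawn independently from the corresponding empirical distribution.
```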
ccosm, 10 months ago
"I have largely avoided the second law of thermodynamics, which says that entropy always increases. While fascinating, this is so problematic that a good explanation would require another book!"

For those interested, I am currently reading "Entropy Demystified" by Arieh Ben-Naim, which tackles this side of things from much the same direction.
utkarsh858, 10 months ago
I sometimes ponder where new entropy/randomness is coming from, like if we take the earliest state of the universe as an infinitely dense point particle which expanded. There must have been some randomness, or say variety, which led it to expand in a non-uniform way, which led to the dominance of matter over anti-matter, or the creation of galaxies, clusters etc. If we take an isolated system in which certain static particles are present, can it be the case that a small subset of the particles will gain motion and thus introduce entropy? Can entropy be induced automatically, at least on a quantum level? If anyone can help me explain that, it would be very helpful and could help explain the origin of the universe in a better way.
niemandhier, 10 months ago
My goto source for understanding entropy: http://philsci-archive.pitt.edu/8592/1/EntropyPaperFinal.pdf
jsomedon, 10 months ago
Am I the only one that can't download the pdf, or is the file server down? I can see the blog page but when I try downloading the ebook it just doesn't work..

If the file server is down.. could anyone upload the ebook for download?
bdjsiqoocwk, 10 months ago
Hmmm, that list of things that contribute to entropy, I've noticed, omits particles which under "normal circumstances" on earth exist in bound states; for example, it doesn't mention W bosons or gluons. But in some parts of the universe they're not bound but in a different state of matter, e.g. quark-gluon plasma. I wonder how or if this was taken into account.
suoduandao3, 10 months ago
I like the formulation of 'the amount of information we don't know about a system that we could in theory learn'. I'm surprised there's no mention of the Copenhagen interpretation's interaction with this definition, under a lot of QM theories 'unavailable information' is different from available information.
vinnyvichy, 10 months ago
The book might disappoint some..

> I have largely avoided the second law of thermodynamics ... Thus, the aspects of entropy most beloved by physics popularizers will not be found here.

But personally, this bit is the most exciting to me.

> I have tried to say as little as possible about quantum mechanics, to keep the physics prerequisites low. However, Planck’s constant shows up in the formulas for the entropy of the three classical systems mentioned above. The reason for this is fascinating: Planck’s constant provides a unit of volume in position-momentum space, which is necessary to define the entropy of these systems. Thus, we need a tiny bit of quantum mechanics to get a good approximate formula for the entropy of hydrogen, even if we are trying our best to treat this gas classically.
GoblinSlayer, 10 months ago
There's a fundamental nature of entropy, but as usual it's not very enlightening for a poor monkey brain, so to explain it you need to enumerate all its high-level behavior, but its high-level behavior is accidental and can't be summarized in a concise form.
drojas, 10 months ago
My definition: Entropy is a measure of the accumulation of non-reversible energy transfers.

Side note: All reversible energy transfers involve an increase in potential energy. All non-reversible energy transfers involve a decrease in potential energy.
tromp, 10 months ago
Closely related recent discussion on The Second Law of Thermodynamics (2011) (franklambert.net):

https://news.ycombinator.com/item?id=40972589
tsoukase, 10 months ago
After years of thought I dare to say the second law of thermodynamics is a tautology. "Entropy is increasing" means every system tends to higher probability, which means the most probable is the most probable.
tromp, 10 months ago
Closely related recent discussion: https://news.ycombinator.com/item?id=40972589
prof-dr-ir, 10 months ago
If I would write a book with that title then I would get to the point a bit faster, probably as follows.

Entropy is *just* a number you can associate with a probability distribution. If the distribution is discrete, so you have a set p_i, i = 1..n, which are each positive and sum to 1, then the definition is:

S = - sum_i p_i log( p_i )

Mathematically we say that entropy is a real-valued function on the space of probability distributions. (Elementary exercises: show that S >= 0 and it is maximized on the uniform distribution.)

That is it. I think there is little need for all the mystery.
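A short numerical companion to those exercises (a sketch; it spot-checks, rather than proves, that 0 <= S(p) <= log(n), with the maximum at the uniform distribution):

```python
import math
import random

def S(p):
    # Entropy with the natural log, matching the comment's convention.
    return -sum(q * math.log(q) for q in p if q > 0)

n = 5
uniform = [1 / n] * n
print(S(uniform), math.log(n))  # both ~ log(n): the uniform distribution attains the maximum

# Spot-check the exercises on random distributions: 0 <= S(p) <= log(n).
for _ in range(1000):
    weights = [random.random() for _ in range(n)]
    p = [w / sum(weights) for w in weights]
    assert 0.0 <= S(p) <= math.log(n) + 1e-9
print("checks passed")
```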
ctafur, 10 months ago
The way I understand it is with an analogy to probability. To me, events are to microscopic states as a random variable is to entropy.
dmn322, 10 months ago
This seems like a great resource for referencing the various definitions. I've tried my hand at developing an intuitive understanding: https://spacechimplives.substack.com/p/observers-and-entropy. TLDR - it's an artifact of the model we're using. In the thermodynamic definition, the energy accounted for in the terms of our model is information. The energy that's not is entropic energy. Hence why it's not "useable" energy, and the process isn't reversible.
zoenolan, 10 months ago
Hawking on the subject:

https://youtu.be/wgltMtf1JhY
foobarbecue, 10 months ago
How do you get to the actual book / tweets? The link just takes me back to the foreword...
ThrowawayTestr, 10 months ago
MC Hawking already explained this:

https://youtu.be/wgltMtf1JhY
arjunlol, 10 months ago
ΔS = ΔQ/T
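A quick worked instance of that formula (assuming the standard latent heat of fusion of ice, roughly 334 kJ/kg; figures approximate):

```python
# Reversible, isothermal heat transfer: melting 1 kg of ice at its melting point.
# Assumed figure: latent heat of fusion of water ~ 334 kJ/kg (approximate).
Q = 334_000.0        # J absorbed by the ice
T = 273.15           # K, melting point at atmospheric pressure
delta_S = Q / T
print(f"dS = Q/T = {delta_S:.0f} J/K")  # roughly 1.2 kJ/K
```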
illuminant, 10 months ago
Entropy is the distribution of potential over negative potential.

This could be said "the distribution of whatever may be over the surface area of where it may be."

This is erroneously taught in conventional information theory as "the number of configurations in a system" or the available information that has yet to be retrieved. Entropy includes the unforeseen, and out of scope.

Entropy is merely the predisposition to flow from high to low pressure (potential). That is it. Information is a form of potential.

Philosophically, what are entropy's guarantees?

- That there will always be a super-scope, which may interfere in ways unanticipated;

- everything decays, the only mystery is when and how.