
It took me 10 years to understand entropy

398 points by dil8 about 3 years ago

44 comments

threatripper · about 3 years ago
I don't understand entropy, and this article did not change that. The issue I take is with the definition of "the most likely state".

Think of a series of random bits that can each be 0 or 1 with equal probability. How likely is it that they are all 0 or all 1? Not very likely: there is exactly one such configuration. How likely is it that they land in any other specific configuration of 0s and 1s? Exactly as likely. All states are equally likely. If you randomly flip bits you move from one state to another, but each one is equally likely to occur. There is no special meaning to a specific configuration if you don't give it one.

If you look at the average of all bits, you start grouping together all states with an equal number of 1s. Under that average there is only one configuration that is all 1s, but most configurations have roughly 50% 1s. If you now start flipping bits you will meander through all possible bit-states, but the average will most likely stay close to 50% 1s most of the time.

In physics we usually look at averages, such as the average velocity expressed as temperature. It therefore makes sense to group states together by their average, and then the states with very low or very high averages are few.

But below that averaging, it stops making sense to me. It's a completely different world. I don't know what entropy is supposed to mean on the level of individual states/configurations, and I don't understand what kind of macroscopic "averaging" function we should use to group those states. There could be more than one possibility, from which it would follow that there is more than one definition of macro-entropy. Ideally there should be one general definition of how to look at those microstates, from which our general definition of entropy follows. Sadly I didn't study physics, and this topic continues to confuse me. The usual explanations fail to enlighten me.
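A minimal sketch of the grouping described above (plain Python; the bit count N is an arbitrary choice): every individual bit-string is equally likely, but once strings are lumped by their number of 1s, the "roughly half 1s" macrostates contain vastly more microstates than the all-0s or all-1s ones.

```python
from math import comb, log

N = 20  # number of random bits (arbitrary choice)

# Each of the 2^N bit-strings (microstates) is equally likely.
# Group them by their number of 1s -- the macrostate an "average"
# measurement would see.
for ones in range(N + 1):
    omega = comb(N, ones)   # microstates in this macrostate
    prob = omega / 2**N     # probability of observing this macrostate
    s = log(omega)          # Boltzmann entropy ln(Omega)
    print(f"{ones:2d} ones: Omega={omega:7d}  p={prob:.4f}  S={s:.2f}")

# ones = 0 or N: Omega = 1, so S = ln(1) = 0.
# ones = N//2: Omega = comb(20, 10) = 184756, by far the most microstates.
```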
evouga · about 3 years ago
One aspect of entropy that I always find counterintuitive is that, unlike mass, charge, etc., it is not a physical quantity. In fact, from the point of view of an experimenter with perfect information about a physical system, the entropy of the system is *exactly* conserved over time (as made precise by Liouville's Theorem). The Second Law survives in this setting only in the most trivial sense that a constant function does not decrease.

It's only when you start making crude measurements (lumping positions into pixels, clouds of particles each with their own kinetic energy into a single scalar called "temperature," etc.) that you start to see a nontrivial entropy and Second Law. Different ways of lumping microstates into macrostates will give you different (and inconsistent) notions of entropy.
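A small illustration of that lumping-dependence (a made-up toy example, not from the comment): the same set of equally likely microstates yields different macrostate entropies under different coarse-grainings.

```python
from math import log
from collections import Counter

# 16 equally likely microstates, labelled 0..15.
MICROSTATES = range(16)

def macro_entropy(grouping):
    """Shannon entropy of the macrostate distribution induced by
    lumping equally likely microstates together via `grouping`."""
    counts = Counter(grouping(m) for m in MICROSTATES)
    total = sum(counts.values())
    return -sum(c / total * log(c / total) for c in counts.values())

# Coarse-graining 1: "pixelate" into 4 coarse cells of 4 states each.
# Coarse-graining 2: keep only one bit of information (parity).
print(macro_entropy(lambda m: m // 4))  # ln(4) ~ 1.386
print(macro_entropy(lambda m: m % 2))   # ln(2) ~ 0.693
```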
vivekd · about 3 years ago
This reminds me of a great article that I saw on Hacker News that really helped explain the concept of entropy to me. Linked here: https://news.ycombinator.com/item?id=24140808
ogogmad · about 3 years ago
I thought entropy (in the Shannon sense) was a property of discrete and finite probability distributions. It's essentially a measure of how random a sample from such a probability distribution is. Notably, continuous probability distributions don't have meaningful entropy (or in some sense, their entropy is always infinite). It's worth considering the similarities and differences between entropy and standard deviation.

I thought the 2nd law of thermodynamics was saying that with incomplete knowledge, the probability distribution of possible states becomes more and more spread out as time goes on. It's almost a limit on how well you can make predictions or simulations of physics *when the initial state of the system is not fully known*. Equivalently, it's a banal statement about chaos in the sense of chaos theory.

The only thing I don't get is how physicists get around the discrete and finite restriction. Maybe the state of the system is not what has entropy. Rather, one can define an arbitrary function f from the system to a finite set S, and then talk about the entropy of f(system at time t), because this is indeed a discrete and finite probability distribution which you can take the entropy of.

Hmmm. Maybe I understand entropy.
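A sketch of the construction in the last paragraph (all names hypothetical, plain Python): push an uncertain system state through a function f with a finite range, then take the Shannon entropy of the induced discrete distribution.

```python
from math import log2
from collections import defaultdict

def shannon_entropy(dist):
    """Shannon entropy in bits of a discrete distribution {outcome: p}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Hypothetical system: 100 states, the actual one unknown, so the
# state is modelled as a uniform probability distribution.
states = {s: 1 / 100 for s in range(100)}

# f maps the system into a finite set S (here: 10 coarse cells).
f = lambda s: s // 10

# Distribution of f(system at time t): discrete and finite,
# so its entropy is well defined.
image = defaultdict(float)
for s, p in states.items():
    image[f(s)] += p

print(shannon_entropy(states))  # log2(100) ~ 6.64 bits
print(shannon_entropy(image))   # log2(10)  ~ 3.32 bits
```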
javajosh · about 3 years ago
Nice writeup! BTW, statistical thermodynamics has a name for that set of possible microstates, perhaps the most pretentious-sounding name in all of physics: the "canonical ensemble".
l33t2328 · about 3 years ago
As the great von Neumann once quipped: "Why don't you call it entropy. In the first place, a mathematical development very much like yours already exists in Boltzmann's statistical mechanics, and in the second place, no one understands entropy very well, so in any discussion you will be in a position of advantage."
mellavora · about 3 years ago
The typical measure of entropy (Shannon or Gibbs; let's spare the details for later, after you've read up on the theory of large deviations) is

-sum(p log(p))

which is not that different from the formula for the mean,

sum(p * 1/n)

The critical difference is that the normalization constant is based on the probability of the state rather than assuming a uniform probability over all states.

So, in effect, the entropy is a measure of the mean. It is a measure adapted to the case where "mean" is ill-defined because the number of modes and/or the variation around those modes is not handled well by simpler metrics.
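One way to make that comparison concrete (an interpretive sketch, not the commenter's own code): -sum(p log p) is exactly the mean of the "surprisal" -log p taken under p itself, and for a uniform distribution it collapses to log(n).

```python
from math import log

def entropy(p):
    """Gibbs/Shannon entropy: the p-weighted mean of -log(p)."""
    return sum(pi * -log(pi) for pi in p if pi > 0)

n = 8
uniform = [1 / n] * n
skewed = [0.5, 0.2, 0.1, 0.1, 0.05, 0.03, 0.01, 0.01]

print(entropy(uniform))  # ln(8) ~ 2.079: the mean of a constant surprisal
print(entropy(skewed))   # ~ 1.48: smaller, dominated by the likely states
```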
stevebmark · about 3 years ago
The author mentions Boltzmann brains and that a human body could theoretically spontaneously form out of particles given a long enough time span. Of course, nothing like this can ever happen. It's the fallacy of thinking infinite time means infinite possibilities.
complex_pi · about 3 years ago
In short, the author makes a good summary of these ideas:

- Entropy in thermodynamic equilibrium is well understood. The early theory (before statistical mechanics was developed) fits well with our modern understanding.

- The analogies made about entropy are not always good, and indeed, if you try to match the physics with "entropy is disorder" it does not always work.

- In non-equilibrium situations it is, as the author points out, more complex.

Regarding the last item, even Stephen Hawking postulated some strange ideas about the universe having to rewind past some point in time, so that the Big Crunch would be the mirror of the Big Bang.
photochemsyn · about 3 years ago
Here's another head-spinning application of the concept of entropy, in quantum information theory:

https://www.cambridge.org/core/books/abs/quantum-information-theory/quantum-information-and-entropy/5BCF610B966AFB2468D0F0F46249DCD0

> "The first fundamental measure that we introduce is the von Neumann entropy. It is the quantum analog of the Shannon entropy, but it captures both classical and quantum uncertainty in a quantum state. The von Neumann entropy gives meaning to a notion of the information qubit. This notion is different from that of the physical qubit, which is the description of a quantum state in an electron or a photon. The information qubit is the fundamental quantum informational unit of measure, determining how much quantum information is in a quantum system."

Incidentally, chem.libretexts.org, a collection of open-source chemistry textbooks, has a good overview of the physical-chemical applications. The site is kind of a mess, but you'd want chapter 18.3:

https://chem.libretexts.org/Bookshelves/General_Chemistry/Map:_A_Molecular_Approach_(Tro)
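A small sketch of the von Neumann entropy described in that quote (assuming numpy; the example states are standard textbook ones): S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of the density matrix.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), via the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

# Pure state |0><0|: no uncertainty at all, zero entropy.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])

# Maximally mixed qubit I/2: one full bit of entropy.
mixed = np.eye(2) / 2

print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # 1.0
```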
AnthonyAguirre · about 3 years ago
Actually a quite nice article. After also spending years as a professional physicist not understanding entropy, I finally decided that I was not necessarily the problem, and spent the last 5 years or so trying to understand it better by reworking the foundations with my research group. (One of the papers the author cites is part of a series from our group developing "observational entropy" in order to do so.)

A lot of what makes this topic confusing is just that there are two basic definitions, Gibbs entropy (-\sum p_i \log p_i) and "Boltzmann" entropy (\log \Omega), and they're really rather different. There's usually some confusing handwaving about how to relate them, but the fact is that in a closed system one of them (generally) rises and the other doesn't, and one of them depends on a coarse-graining into macrostates and the other doesn't.

The better way to relate them, I've come to believe, is to consider them both as limits of a more general entropy (the one we developed; first in fact written down in some form by von Neumann but for some reason not pursued much over the years). There's a brief version here: https://link.springer.com/article/10.1007/s10701-021-00498-x

This entropy has Gibbs and Boltzmann entropy as limits, is good in and out of equilibrium, is defined in quantum theory with a very nice classical-quantum correspondence, and has been shown to reproduce thermodynamic entropy both in our papers and in the elegant one by Strasberg and Winter: https://journals.aps.org/prxquantum/abstract/10.1103/PRXQuantum.2.030202

After all this work I finally feel that entropy makes sense to me, which it never quite did before, so I hope this is helpful to others.

P.S. If you're not convinced a new definition of entropy is called for, ask a set of working physicists what it would mean to say "the entropy of the universe is increasing." Since von Neumann entropy is conserved in a closed system (which the universe is, if anything is), and there really was no definition of a quantum Boltzmann entropy (until observational entropy), the answers you'll get will be either a mush or properly furrowed brows.
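A rough numerical sketch of the observational entropy developed in the linked papers (the formula S_obs = \sum_i p_i \log(V_i / p_i) follows those papers; the toy numbers below are made up), showing how it interpolates between the Gibbs and Boltzmann limits:

```python
from math import log

def observational_entropy(macrostates):
    """S_obs = sum_i p_i * log(V_i / p_i), where p_i is the probability
    of macrostate i and V_i the number of microstates it lumps together."""
    return sum(p * log(v / p) for p, v in macrostates if p > 0)

# Toy coarse-graining: 3 macrostates as (probability, microstate count).
print(observational_entropy([(0.7, 100), (0.2, 50), (0.1, 10)]))

# Limit 1: one microstate per macrostate (V_i = 1) recovers the
# Gibbs entropy -sum p log p.
print(observational_entropy([(0.5, 1), (0.5, 1)]))  # ln(2)

# Limit 2: a single macrostate containing everything (p = 1)
# recovers the Boltzmann entropy log(Omega).
print(observational_entropy([(1.0, 160)]))          # ln(160)
```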
SnowHill9902 · about 3 years ago
Statistical mechanics is one way of representing entropy, but you don't need it. The second law of thermodynamics can be expressed in other, much more general terms. Also, it requires that the system be isolated, not merely "thermally isolated": there are other types of interaction, such as gravitational and electromagnetic.
chemmail · about 3 years ago
I had an art teacher who was very philosophical. One day he described to the class what entropy was. I took a lot of physics and even astrophysics; little did I know he had a better conceptual understanding and explanation than I've ever heard before. Too bad I don't remember exactly what he said.
dieselerator · about 3 years ago
That's the problem with randomness.

The required Dilbert reference: https://dilbert.com/strip/2001-10-25
shannifin · about 3 years ago
> "Entropy is not Disorder. One of the most popular beliefs about entropy is that it represents disorder."

This is what confused me the most about entropy in high school: the "order/disorder" lingo. Isn't "order" a metaphysical concept, something a conscious entity *thinks* about a system? How would nature know the difference? It took me some years to understand that that lingo is indeed misleading. (Still definitely not an expert, of course.)
deltaonefour · about 3 years ago
> Contrary to popular opinion, uniformly distributed matter is unstable when interactions are dominated by gravity (Jeans instability) and is actually the least likely state, thus with very low entropy. Most probable states, with high entropy, are those where matter is all lumped together in massive objects.

That means that over time the system becomes more ordered and starts organizing itself into spheres.

I once brought this question up on Physics Stack Exchange, and basically the answers were either some form of rolling their eyes at me or dismissing me outright. The people who did answer stated that as particles organize themselves into spheres, some other part of the universe gets hotter as a result, and that the seemingly self-organizing solar system I pointed to was just an isolated system.

This answer still seemed far-fetched to me. It still looks as if some overall self-organization is going on if the universe gets hotter on one side and matter gets organized into solar systems on the other.

It took me 3 years to somewhat understand what entropy is. If you have loaded dice that almost always roll 6s, then the dice showing all 6s is the highest-entropy state; rolling random numbers would then be a low-entropy state.

Entropy is simply a phenomenon of probability. As time moves forward, particles enter high-probability configurations, like rolling dice: as you roll more and more, rolling random numbers has a higher probability than rolling all 6s.

It just so happens that disordered arrangements have higher probabilities in most systems. But if you look at a system of loaded dice, or at the solar system, in those cases ordered configurations have the higher probabilities. That's really all it is. The entire phenomenon of entropy comes down to probability, and the root of probability is the law of large numbers.
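A quick simulation of that loaded-dice picture (a sketch; the bias and dice count are made up): when each die lands on 6 with high probability, the "all 6s" macrostate is the most probable one, so the ordered configuration and the most likely configuration coincide.

```python
import random
from collections import Counter

random.seed(0)
SIDES = [1, 2, 3, 4, 5, 6]
WEIGHTS = [0.02, 0.02, 0.02, 0.02, 0.02, 0.90]  # heavily loaded toward 6
TRIALS = 100_000

# Macrostate: how many 6s show up in a throw of 5 loaded dice.
counts = Counter(
    random.choices(SIDES, weights=WEIGHTS, k=5).count(6)
    for _ in range(TRIALS)
)
for sixes in sorted(counts):
    print(f"{sixes} sixes: p ~ {counts[sixes] / TRIALS:.3f}")
# "5 sixes" dominates (~0.9^5 ~ 0.59): with loaded dice, the ordered
# all-6s macrostate is the high-probability one.
```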
raven105x · about 3 years ago
Entropy: "to describe energy loss in irreversible processes". We have no clue about what is or is not reversible. Complex systems exhibit self-organizing behavior for no reason (that we understand), and we continue to identify more conditions under which this occurs. How does a Nobel Prize get handed out for identifying/quantifying "self-organization" (http://pespmc1.vub.ac.be/COMPNATS.html) without bringing everything we think we know about entropy under scrutiny? Self-organization does not consume energy any more than entropic decay emits it. Irreversibility is a poor assumption.
Gravityloss · about 3 years ago
Gas molecules in a box: entropy seems quite straightforward there. An even distribution is the most likely state and has the highest entropy.

In space, at large scales, gravity starts dominating, so stars and planets are actually a higher-likelihood state than an even distribution.

Isn't this just about statistical independence? In a small amount of gas (almost by definition of what a gas is), the particles don't have much effect on each other, so one can assume statistical independence.

In space, with gravity overwhelming other effects, the particles have a great deal of effect on each other. The statistics of their states are shaped by these dependencies, so the previous intuition about entropy can't hold.
nfc · about 3 years ago
I enjoyed the article but have a very minor nitpick. I didn't understand why the author added this sentence:

"However, the timescales involved in these calculation are so unreasonably large and abstract that one could wonder if these makes any sense at all."

Apart from the fact that we could wonder about anything and everything, I think the author does not state what evidence we have to suspect that large enough timescales would change the laws of physics.

It could be the case, of course, and it would be great to talk about such evidence if it exists, but without further justification I feel that this sentence is an unjustified opinion in what is otherwise a very nice article that helps the reader better understand entropy.
zwieback · about 3 years ago
Of course, you don't need to really understand entropy for it to be useful. It's definitely an interesting concept, but when I was crunching equations for thermodynamics, one of the weeder classes for ME, it became clear you need it for things to balance out. Once you've cranked through a dozen or so problems you get a feel for what it is, even if the physics and the spiritual side of it remain murky.

Now, 35 years later, when I marvel at my new engine or what have you, I still vaguely remember my entropy-problem days and appreciate that someone worked this stuff out.
sjg007 · about 3 years ago
I view entropy as a probability distribution over some set of configurations of something. Entropy is low if there's only one configuration and high if the configurations are uniformly distributed.

There's also some observer/interaction effect, which is like introducing a conditional probability that would cause crystallization in an otherwise homogeneous system. Essentially a catalyst.

I also find it fascinating that when it is super cold outside and you throw a pan of boiling water out the window, it turns to snow instantly, whereas a cup of room-temperature water does not. It probably fits in terms of activation energy as well.
Buttons840 · about 3 years ago
I recommend "Information Theory for Intelligent People": http://tuvalu.santafe.edu/~simon/it.pdf
topspin · about 3 years ago
That's pretty good as far as I'm concerned. It took me a couple of years to really grasp electrical impedance; the breakthrough for me was a concise book written in 1976 by Rufus P. Turner.

Subtle things take a while to get.
andrewgleave · about 3 years ago
The Science of Can and Can't [1] is interesting in how it looks to address a number of fundamentals, including the 2nd Law of Thermodynamics, via counterfactuals.

Edit: see [2] for background on Constructor Theory.

[1] https://www.chiaramarletto.com/books/the-science-of-can-and-cant/

[2] https://www.youtube.com/watch?v=8DH2xwIYuT0
cb321 · about 3 years ago
Of possible related interest: https://arxiv.org/abs/chao-dyn/9603009

I think Bricmont is a clear thinker/presenter on these matters, and this article actually showed up in a "for humanities people" anthology. [1]

[1] https://www.amazon.com/Flight-Science-Reason-Academy-Sciences/dp/0801856760
mensetmanusman · about 3 years ago
I had the pain, and pleasure, of taking and then (assistant) teaching thermodynamics at MIT.

One of the tidbits that always stuck with me is that astronomers have estimated the observable universe's total entropy. When you compare that value to the maximum possible entropy (the heat death of the universe) and then to the ridiculously low-entropy state of the beginning of the universe, we are currently about halfway along that "timeline".

It always brought to mind a grandfather clock: the clock stops when the weight hits the floor, and we are halfway there…
empiricus · about 3 years ago
My understanding of entropy: it is a measure of how big a system (matter + energy from a region of space) is, and how much its components have interacted with each other: entropy ~ log(number of possible system states). As the universe unfolds, systems that were originally isolated start to interact and form bigger systems; hence the number of possible states increases, and entropy increases too.
hilbert42 · about 3 years ago
Of all of physics, entropy is the most depressing part.
7speter · about 3 years ago
Of all places, it was a conversation about the Socratic forms in a political theory course I took in college that really brought the weight of the concept home to me. It went something like: "Unlike the realm the Socratic forms exist in, everything in our universe is subject to entropy; it is in everything's nature to degrade or decay over time."

Maybe there's more to that? I'm all ears.
leereeves · about 3 years ago
Small nit in case the author sees this: the image labelled "Entropy of each configuration of system with two dices where the observed macrostate is their sum" is either incorrect or mislabeled.

For example, 2 and 12 each have 1 microstate, and ln 1 = 0, so the entropy of 2 and 12 is 0, but the image says 0.028 (which is the probability of 2 or 12, not the entropy).
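The commenter's check, made executable (plain Python): counting the microstates behind each sum of two fair dice shows that sums 2 and 12 have Boltzmann entropy ln(1) = 0, while 1/36 ~ 0.028 is their probability, consistent with the mislabeling claim.

```python
from math import log
from collections import Counter
from itertools import product

# Microstates: the 36 ordered pairs of two fair dice.
# Macrostate: their sum.
omega = Counter(a + b for a, b in product(range(1, 7), repeat=2))

for total in sorted(omega):
    n = omega[total]
    print(f"sum {total:2d}: Omega={n}  p={n / 36:.3f}  ln(Omega)={log(n):.3f}")
# sum 2 and sum 12: Omega = 1, p = 0.028, entropy ln(1) = 0.0
```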
snikeris · about 3 years ago
> Boltzmann imagined that our universe could have reached thermodynamical equilibrium and its maximal entropy state a long time ago, but that a spontaneous entropy decrease to the level of our early universe occurred after an extremely long period of time, just for statistical reasons.

I'm interested in reading more about this. Any pointers?
cheese_van · about 3 years ago
I read with interest most well-written articles explaining entropy, and I often leave mildly satisfied that I understood it. Until the next day, when I again have to figure out the difference between "high" and "low" entropy in a particular model, and invariably I mix up the two.
raxxorraxor · about 3 years ago
As a computer scientist, it isn't helpful that entropy in thermodynamics and entropy in computer science (information content; I don't know a better English term) collide a bit.
RappingBoomer · about 3 years ago
But is it not hubris to think that we really know much about the origin and outcome of the universe? Is it wise to make decisions based on the modicum of knowledge that we currently have regarding thermodynamics and the universe?

I suspect that the scientists of a trillion years from now will know a lot more than we do now, so I don't really put that much confidence in current pronouncements regarding the beginning and possible end of the universe.

And yes, I do have a degree in science, with courses in physics & thermodynamics.
est · about 3 years ago
Shannon called the function "entropy" and used it as a measure of "uncertainty," interchanging the two words in his writings without discrimination.
sylware · about 3 years ago
Scientific method at work: it started with thermodynamic entropy, but scientists found out that this truth is much more deeply ingrained in our universe. We then got a mathematically generalized version, which is now used to explain the "arrow of time" that our time-reversible physics equations cannot explain alone.
dandanua · about 3 years ago
The problem is that entropy is a subjective notion.

It's a measure of our lack of knowledge about the state of a system.
m0llusk · about 3 years ago
Alternatively, it took ten years for Aurelien Pelissier's misunderstandings of entropy to decay.
sytelus · about 3 years ago
Anyone who thinks they understand entropy is living in a state of sin.
hatware · about 3 years ago
Google freewall? Guess I won&#x27;t read this article...
pkrumins · about 3 years ago
I think the word entropy is science’s largest mistake.
tamaharbor · about 3 years ago
English was not my college professor's native language. It took me a while to realize entropy was not the same as enthalpy. Very confusing.
fysicsnurd · about 3 years ago
Entropy does not increase. The universe has organized itself into people, brains, cities, iPhones.
poulpy123 · about 3 years ago
Not to brag, but it took me only 2 months to forget everything about it.