
A Mathematical Theory of Communication (1948) [pdf]

197 points by alokrai · about 5 years ago

13 comments

noema · about 5 years ago
Probably one of the dozen or so most important publications of the twentieth century. Ironically, though, it would be Norbert Wiener's interpretation of information (exactly the negative of Shannon's) that would cement itself in the popular lexicon, because it is far more intuitive. Where Wiener posits information as negative entropy (or "order"), Shannon interprets it as representing a degree of freedom, or uncertainty. The problem with Wiener's is that the underlying epistemology carries a subjective bent, where information is equated with "meaning" or semantic content. Shannon's, meanwhile, is ultimately superior as an objective metric, though far less intuitive (under his interpretation, a random string is technically the most information-saturated construct possible, because it possesses the highest degrees of freedom).
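The "random string is the most information-saturated" point is easy to check empirically. A minimal sketch (my own illustration, not from the thread) computing the empirical Shannon entropy of a string in bits per symbol:

```python
import math
from collections import Counter

def shannon_entropy(s):
    """Empirical Shannon entropy of a string, in bits per symbol."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A string uniform over 4 symbols attains the maximum of log2(4) = 2 bits/symbol;
# a highly repetitive (predictable) string carries far less.
random_like = "ACGTGTCAACGTTGCA"  # each of A, C, G, T appears 4 times
repetitive = "AAAAAAAAAAAAAAAB"
print(shannon_entropy(random_like))  # 2.0
print(shannon_entropy(repetitive))   # ~0.34
```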
markus92 · about 5 years ago
What I find telling is how easy Shannon's paper is to read. Even today it is used pretty much as-is at many EE colleges to teach communication theory.

Another cool fact: Shannon's master's thesis is most likely the most influential of all time: in it, he linked Boolean algebra to electrical circuits with switches, essentially inventing digital circuit theory.
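The correspondence from that thesis can be sketched in a few lines: switches in series realize AND, switches in parallel realize OR. A toy model (my own illustration of the idea, not Shannon's notation):

```python
# Model a switch as a boolean: True = closed (conducts), False = open.
def series(a, b):
    """Two switches in series: current flows only if both are closed (AND)."""
    return a and b

def parallel(a, b):
    """Two switches in parallel: current flows if either is closed (OR)."""
    return a or b

def xor_circuit(a, b):
    """XOR built from the switch primitives: (a OR b) AND NOT (a AND b)."""
    return series(parallel(a, b), not series(a, b))

for a in (False, True):
    for b in (False, True):
        print(a, b, "->", xor_circuit(a, b))
```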
danharaj · about 5 years ago
BTW, here's the paper by Hartley cited on the first page; it is also very readable and insightful. I found that reading it helped clarify some of the subtler points in Shannon's paper.

http://keszei.chem.elte.hu/entropia/Hartley1928text.pdf
dredmorbius · about 5 years ago
Recommended on an earlier thread: "From Aristotle to John Searle and Back Again: Formal Causes, Teleology, and Computation in Nature" by E. Feser.

https://muse.jhu.edu/article/618359

http://www.netherhallhouse.org.uk/wp-content/uploads/2018/03/formal_causes.pdf

Source: https://news.ycombinator.com/item?id=12080670
sl8r · about 5 years ago
If you're interested in this, check out Cover's Information Theory textbook; the rabbit hole goes much deeper. One of the most interesting examples is that when you're betting on a random event, Shannon entropy tells you how much to bet and how quickly you can compound your wealth. Cover covers (heh) this, and the original paper is Kelly's:

http://www.herrold.com/brokerage/kelly.pdf
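A minimal sketch of the Kelly connection described above, assuming a binary bet at net odds of b-to-1 with win probability p: the optimal fraction of wealth is f* = (p(b+1) - 1)/b, and at even odds the long-run doubling rate is 1 - H(p) bits per bet, where H is the binary entropy function. (My own illustration; the thread only names the result.)

```python
import math

def kelly_fraction(p, b):
    """Optimal fraction of wealth to bet, given win probability p
    and net odds of b-to-1. At even odds (b = 1) this reduces to p - q."""
    return (p * (b + 1) - 1) / b

def doubling_rate(p):
    """Expected log2-growth per even-odds bet under Kelly sizing:
    W = 1 - H(p), with H(p) the binary entropy in bits."""
    q = 1 - p
    H = -(p * math.log2(p) + q * math.log2(q))
    return 1 - H

print(kelly_fraction(0.6, 1))  # 0.2: bet p - q = 20% of wealth at even odds
print(doubling_rate(0.6))      # ~0.029 bits per bet
```

Note how entropy enters directly: the closer p is to 1/2 (maximum uncertainty), the smaller the achievable growth.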
igg · about 5 years ago
The article was renamed "The Mathematical Theory of Communication" in the 1949 book of the same name, a small but significant title change made once the generality of the work was recognized.
vga805 · about 5 years ago
I just recently finished the biography of Shannon called "A Mind at Play."

It was quite a joy, and I highly recommend it to anyone interested in these sorts of bios.
dbcurtis · about 5 years ago
> The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have *meaning* ...

I never fail to suppress a chuckle when I read that.
dredmorbius · about 5 years ago
A well-deserved perennial. Some prior discussions:

2016, 53 comments: https://news.ycombinator.com/item?id=12079826

2017, 11 comments: https://news.ycombinator.com/item?id=15095393
dr_dshiv · about 5 years ago
There seems to be a great, unresolved difference of opinion about whether information entropy and thermodynamic entropy are commensurable.

In trying to connect Shannon entropy to thermodynamic entropy, I always get stuck on the fact that you need to have a defined alphabet or symbol set.
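The alphabet dependence the comment gets stuck on can be made concrete: the very same sequence has different per-symbol entropy under different tokenizations. A small illustration (my own, not from the thread):

```python
import math
from collections import Counter

def entropy(symbols):
    """Empirical Shannon entropy of a symbol sequence, in bits per symbol."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

data = "01" * 8  # one physical sequence, two choices of symbol set

as_bits = list(data)                                       # alphabet {'0', '1'}
as_pairs = [data[i:i + 2] for i in range(0, len(data), 2)] # alphabet of 2-char blocks

print(entropy(as_bits))   # 1.0: '0' and '1' are equally frequent
print(entropy(as_pairs))  # 0.0: every block is '01', perfectly predictable
```

The number only becomes meaningful once the symbol set (and the source model over it) is fixed, which is exactly where attempts to equate it with thermodynamic entropy tend to get complicated.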
smitty1e · about 5 years ago
Shannon's work has had vast influence.

Here is a great, short overview of the topic: https://youtu.be/_PG-jJKB_do

There is no entropy regarding the question of whether Jade is an attractive presenter.
auntienomen · about 5 years ago
I think this gets posted on Hacker News once or twice a year. It's a pleasure every time.
abetusk · about 5 years ago
Anyone have good recommendations on LDPC or turbo code theory?