Probably one of the dozen or so most important publications of the twentieth century. Ironically, though, it would be Norbert Wiener's interpretation of information (exactly the negative of Shannon's) that would cement itself in the popular lexicon, because it is far more intuitive. Where Wiener posits information as negative entropy (or "order"), Shannon interprets it as a degree of freedom, or uncertainty. The problem with Wiener's is that the underlying epistemology carries a subjective bent, where information is equated with "meaning" or semantic content. Shannon's, meanwhile, is ultimately superior as an objective metric, though far less intuitive (under his interpretation, a random string is technically the most information-saturated construct possible, because it possesses the highest degrees of freedom).
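To make the "random string is maximally informative" point concrete, here is a minimal sketch (my own, not from the paper) that estimates per-symbol Shannon entropy from observed symbol frequencies: a uniformly random string over a 26-letter alphabet comes out near the log2(26) maximum, while a highly patterned string comes out far below it.

    # Minimal sketch: empirical (per-symbol) Shannon entropy of a string.
    import math
    import random
    import string
    from collections import Counter

    def shannon_entropy(s: str) -> float:
        """H = -sum(p * log2(p)) over observed symbol frequencies, in bits per symbol."""
        counts = Counter(s)
        n = len(s)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    random.seed(0)
    alphabet = string.ascii_lowercase
    random_s = "".join(random.choice(alphabet) for _ in range(100_000))
    ordered_s = "abab" * 25_000

    print(shannon_entropy(random_s))   # ~4.70 bits/symbol, near the maximum for 26 symbols
    print(shannon_entropy(ordered_s))  # 1.0 bit/symbol: only two symbols, evenly used
    print(math.log2(26))               # theoretical maximum for a 26-symbol alphabet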
What I think is telling is how easy to read Shannon's paper is. Even today it is used pretty much as is at many EE colleges to teach communication theory.<p>Another cool fact: Shannon's master's thesis is most likely the most influential one of all time: in it, he linked Boolean algebra to electrical circuits with switches, essentially inventing digital circuit theory.
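For anyone who hasn't seen the thesis idea, here is a toy illustration (my own construction, not Shannon's notation): treating a closed switch as True, switches in series behave like AND and switches in parallel like OR, so Boolean identities become circuit simplifications.

    # Toy sketch of the Boolean-algebra / switching-circuit correspondence.
    from itertools import product

    def series(a, b):    # current flows only if both switches are closed
        return a and b

    def parallel(a, b):  # current flows if either switch is closed
        return a or b

    # Circuit A: (x in series with y) in parallel with (x in series with z)
    # Circuit B: x in series with (y parallel to z) -- fewer switches, same behavior
    for x, y, z in product([False, True], repeat=3):
        a = parallel(series(x, y), series(x, z))
        b = series(x, parallel(y, z))
        assert a == b  # the Boolean identity xy + xz = x(y + z)
    print("both circuits agree on all 8 switch settings")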
BTW, here's the paper by Hartley cited on the first page; it is also very readable and insightful. I found it helped clarify some of the subtler points in Shannon's paper to read it as well.<p><a href="http://keszei.chem.elte.hu/entropia/Hartley1928text.pdf" rel="nofollow">http://keszei.chem.elte.hu/entropia/Hartley1928text.pdf</a>
Recommended on an earlier thread: "From Aristotle to John Searle and Back Again: Formal Causes, Teleology, and Computation in Nature" by E. Feser.<p><a href="https://muse.jhu.edu/article/618359" rel="nofollow">https://muse.jhu.edu/article/618359</a><p><a href="http://www.netherhallhouse.org.uk/wp-content/uploads/2018/03/formal_causes.pdf" rel="nofollow">http://www.netherhallhouse.org.uk/wp-content/uploads/2018/03...</a><p>Source: <a href="https://news.ycombinator.com/item?id=12080670" rel="nofollow">https://news.ycombinator.com/item?id=12080670</a>
If you're interested in this, check out Cover's Information Theory textbook — the rabbit hole goes much deeper. One of the most interesting examples is that when you're betting on a random event, Shannon entropy tells you how much to bet and how quickly you can compound your wealth. Cover covers (heh) this, and the original paper is Kelly's: <a href="http://www.herrold.com/brokerage/kelly.pdf" rel="nofollow">http://www.herrold.com/brokerage/kelly.pdf</a>
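As a rough sketch of that connection (my own example, assuming the simplest case of a biased coin at even-money odds rather than Kelly's general setup): the optimal fraction to wager is f* = 2p - 1, and the resulting growth rate in bits per bet works out to 1 - H(p), where H is the binary entropy.

    # Sketch: Kelly betting on a biased coin at even odds, and its link to entropy.
    import math

    def binary_entropy(p: float) -> float:
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def growth_rate(p: float, f: float) -> float:
        """Expected log2 growth per bet when wagering fraction f at even odds."""
        return p * math.log2(1 + f) + (1 - p) * math.log2(1 - f)

    p = 0.6                  # probability the bet wins
    f_kelly = 2 * p - 1      # Kelly fraction for even-money odds
    print(growth_rate(p, f_kelly))   # ~0.0290 bits per bet
    print(1 - binary_entropy(p))     # same value: doubling rate = 1 - H(p)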
The article was renamed "The Mathematical Theory of Communication" in the 1949 book of the same name, a small but significant title change made once the generality of the work was realized.
I just recently finished the biography of Shannon called "A Mind at Play."<p>It was quite a joy and I highly recommend it to anyone interested in these sorts of bios.
> The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have <i>meaning</i> ...<p>I can never suppress a chuckle when I read that.
There seems to be a great, unresolved difference of opinion about whether information entropy and thermodynamic entropy are commensurable.<p>In trying to connect Shannon entropy to thermodynamic entropy, I always get stuck on the fact that you need to have a defined alphabet or symbol set.
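Here is a small illustration of that sticking point (my own example): the entropy you compute for the very same data depends on which symbol set you decide to carve it into.

    # Sketch: the same data yields different entropy figures under different alphabets.
    import math
    from collections import Counter

    def entropy(symbols) -> float:
        counts = Counter(symbols)
        n = len(symbols)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    data = "01" * 5000  # a perfectly alternating bit string

    as_bits = list(data)                                      # alphabet {0, 1}
    as_pairs = [data[i:i + 2] for i in range(0, len(data), 2)]  # alphabet of 2-bit blocks

    print(entropy(as_bits))   # 1.0 bit/symbol: both bits equally frequent
    print(entropy(as_pairs))  # 0.0 bits/symbol: every 2-bit block is "01"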
Shannon's work has had vast influence.<p>Here is a great, short overview of the topic => <a href="https://youtu.be/_PG-jJKB_do" rel="nofollow">https://youtu.be/_PG-jJKB_do</a><p>There is no entropy regarding the question of whether Jade is an attractive presenter.