If anyone is on the fence about reading this, or worried about their ability to comprehend the content, I would tell you to go ahead and give it a chance. Shannon's writing is remarkably lucid and transparent. The jargon is minimal, and his exposition is fantastic.<p>As many other commenters have mentioned, it is impressive that such an approachable paper would lay the foundations for a whole field. I actually find that many subsequent textbooks obfuscate the simplicity of the idea of entropy.<p>Two examples from the paper really stuck with me. In one, he discusses the importance of spaces for encoding language, something I had never really considered before. In the second, he discusses how it is the redundancy of language that allows for crosswords, and how a less redundant language would make them harder to design (unless we started making them 3D!). It made me think more deeply about communication as a whole.
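To put a rough number on that redundancy point, here's a sketch of mine (not from the paper; the letter frequencies are approximate textbook values) comparing the first-order entropy of English letters to the maximum possible. Shannon's own redundancy estimate, which uses longer-range statistics, was around 50%, and that's what makes crosswords workable.

```python
import math

# Approximate relative frequencies of English letters
# (illustrative textbook values, not Shannon's exact table)
freqs = {
    'e': 0.127, 't': 0.091, 'a': 0.082, 'o': 0.075, 'i': 0.070,
    'n': 0.067, 's': 0.063, 'h': 0.061, 'r': 0.060, 'd': 0.043,
    'l': 0.040, 'u': 0.028, 'c': 0.028, 'm': 0.024, 'w': 0.024,
    'f': 0.022, 'g': 0.020, 'y': 0.020, 'p': 0.019, 'b': 0.015,
    'v': 0.010, 'k': 0.008, 'j': 0.002, 'x': 0.002, 'q': 0.001, 'z': 0.001,
}

# First-order entropy: H = -sum p * log2(p)
h = -sum(p * math.log2(p) for p in freqs.values())
h_max = math.log2(26)  # entropy if all 26 letters were equally likely

print(f"H    ~ {h:.2f} bits/letter")      # ~4.2
print(f"Hmax = {h_max:.2f} bits/letter")  # ~4.7
print(f"redundancy ~ {1 - h/h_max:.0%}")  # ~11% from letter frequencies alone;
                                          # Shannon's ~50% needs context too
```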
While well known for this paper and "information theory", Shannon's master's thesis* is worth checking out as well. It demonstrated that relay and switching circuits can be analyzed and designed with Boolean algebra, and it was one of the key ideas that enabled digital computers.<p>* <a href="https://en.wikipedia.org/wiki/A_Symbolic_Analysis_of_Relay_and_Switching_Circuits" rel="nofollow">https://en.wikipedia.org/wiki/A_Symbolic_Analysis_of_Relay_a...</a>
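The core idea of the thesis is easy to sketch: a relay contact is a Boolean variable, series connection is AND, parallel connection is OR. A toy illustration (my own, not Shannon's notation):

```python
from itertools import product

# Shannon's thesis observation, in miniature: switching circuits
# obey Boolean algebra. Series = AND, parallel = OR.
def series(a: bool, b: bool) -> bool:
    return a and b   # current flows only if both contacts are closed

def parallel(a: bool, b: bool) -> bool:
    return a or b    # current flows if either contact is closed

# Example: contact x in series with (y parallel z)
def circuit(x: bool, y: bool, z: bool) -> bool:
    return series(x, parallel(y, z))

# Verify it matches the Boolean expression x(y + z) over all inputs
for x, y, z in product([False, True], repeat=3):
    assert circuit(x, y, z) == (x and (y or z))
print("circuit == x(y + z) for all 8 input combinations")
```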
Not many know about it, but this paper (written in 1948) stemmed from a lesser-known paper Shannon wrote in 1945 called "A Mathematical Theory of Cryptography"[0].<p>[0]: <a href="https://evervault.com/papers/shannon" rel="nofollow">https://evervault.com/papers/shannon</a>
Shannon's original paper on the topic was written during WWII; I believe it was classified, and it is much more concise as an introduction. After that came the famous and much more comprehensive 1948 paper (Shannon's alone; Weaver contributed the expository introduction when it was republished as a book in 1949), which expanded into the noisy coding theorem. Meanwhile the original paper ("Communication in the Presence of Noise") was published in 1949, possibly after declassification. I highly recommend reading it first; it takes maybe an hour. Another terrific intro is a chapter of a book by Bruce Carlson, "Communication Systems: An Introduction to Signals and Noise...". I have a scan of the chapter linked here: <a href="https://drive.google.com/file/d/0B9oyGOnmkS7GTFlmQ2F1RWNFd28/view?usp=drivesdk&resourcekey=0-1XuLeFM81UbGviMW3ONqNQ" rel="nofollow">https://drive.google.com/file/d/0B9oyGOnmkS7GTFlmQ2F1RWNFd28...</a>
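The headline result of "Communication in the Presence of Noise" is the capacity of a band-limited Gaussian channel, C = W log2(1 + S/N). A quick back-of-the-envelope (the telephone-line numbers are my illustration, not from the paper):

```python
import math

def capacity(bandwidth_hz: float, snr_linear: float) -> float:
    # Shannon-Hartley: C = W * log2(1 + S/N), in bits per second
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative numbers: a ~3 kHz voice channel with 30 dB SNR
snr = 10 ** (30 / 10)                       # 30 dB -> 1000x
print(f"{capacity(3000, snr):.0f} bits/s")  # ~29,900 bits/s
```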
As an undergrad I struggled to understand why log was used to measure information. Could not find a reason in any textbook.<p>Took a deep breath and decided to download and read this paper. Surprise, surprise: it's super approachable and the reasoning for using log is explained on the first page.
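For anyone who wants the one-line version of that reasoning: independent choices should add. Two independent selections from m and n possibilities give m*n combined possibilities, and log is exactly the function that turns that product into a sum. A tiny check (my own illustration):

```python
import math

m, n = 8, 32   # two independent choices (e.g., 8 symbols, then 32 symbols)

# Information in bits: log2 of the number of equally likely outcomes
def info(k: int) -> float:
    return math.log2(k)

# The combined choice has m*n outcomes; its information is the sum
assert math.isclose(info(m * n), info(m) + info(n))
print(f"log2({m}) = {info(m):.0f} bits, log2({n}) = {info(n):.0f} bits, "
      f"log2({m * n}) = {info(m * n):.0f} bits")
```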
Among other things, this paper is surprisingly accessible. You can give it to a beginner without much math background and they'll be able to understand it. I actually find it better than most modern books on information theory.
Shannon did a lot of interesting things beyond this paper.<p>If you become more interested in Claude Shannon, I recommend the biography "A Mind At Play".<p><a href="https://en.wikipedia.org/wiki/A_Mind_at_Play" rel="nofollow">https://en.wikipedia.org/wiki/A_Mind_at_Play</a><p>A very interesting person.
I use this paper whenever I teach information theory. If you are mathematically inclined, I'd recommend reading the proofs of his two main theorems (the noiseless source coding theorem and the noisy-channel coding theorem); they're illuminating.
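If you want to see the first theorem in action before reading the proof: the average length of an optimal prefix code sits between H and H + 1 bits per symbol. A small sketch using a Huffman code (the source distribution is my choice, just for illustration):

```python
import heapq
import math

# Example source distribution (my choice, just for illustration)
probs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}

# Entropy H = -sum p * log2(p): the source coding theorem's lower bound
H = -sum(p * math.log2(p) for p in probs.values())

# Build a Huffman code: repeatedly merge the two least probable nodes,
# prefixing '0'/'1' onto the codewords of the merged subtrees
heap = [(p, i, {s: ''}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
i = len(heap)
while len(heap) > 1:
    p1, _, c1 = heapq.heappop(heap)
    p2, _, c2 = heapq.heappop(heap)
    merged = {s: '0' + w for s, w in c1.items()}
    merged.update({s: '1' + w for s, w in c2.items()})
    heapq.heappush(heap, (p1 + p2, i, merged))
    i += 1
code = heap[0][2]

avg_len = sum(probs[s] * len(w) for s, w in code.items())
print(f"H = {H:.3f} bits, average code length = {avg_len:.3f} bits")
assert H <= avg_len < H + 1   # the theorem's guarantee for an optimal code
```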
I find it incredible how "simple" his theories were and what an enormous impact they had. Is there anyone else who developed such seemingly "simple" theories?
The LaTeX code can be found at [1] (.tar.gz) or by clicking the 'directory' link towards the bottom of page [2].<p>[1] <a href="https://web.archive.org/web/20080516051043/http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.tar.gz" rel="nofollow">https://web.archive.org/web/20080516051043/http://cm.bell-la...</a><p>[2] <a href="https://web.archive.org/web/20080516051043/http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html" rel="nofollow">https://web.archive.org/web/20080516051043/http://cm.bell-la...</a>
Another great read from Shannon: "Communication Theory of Secrecy Systems" (BSTJ, 1949) <a href="https://archive.org/details/bstj28-4-656" rel="nofollow">https://archive.org/details/bstj28-4-656</a>
I recently went through two books: (1) Fortune's Formula and (2) A Man for All Markets. They both impressed upon me a deep appreciation for Shannon's brilliant mind.<p>Curious if there are any great resources/books you'd recommend on Information Theory.