Shannon's original article "A Mathematical Theory of Communication" is available online, for example at [1]; I find it very readable. The MIT article talks about information theory as applied to our digital world, but does not mention that it's now applied widely in theoretical physics; watch [2] for an intro. For example, the Hawking radiation from a black hole can function as an error correcting (erasure) code, allowing recovery of the information that fell into the black hole.<p>[1] <a href="http://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf" rel="nofollow">http://people.math.harvard.edu/~ctm/home/text/others/shannon...</a><p>[2] <a href="https://www.youtube.com/watch?v=v5UbN0xx4X0" rel="nofollow">https://www.youtube.com/watch?v=v5UbN0xx4X0</a>
Information theory is one of the very few things from academia that has really stuck with me and shaped how I look at problems in the world. Its current applications are so vast and immediate that I wonder how many other places it could still be applied.<p>Being able to say and practice things like "now let's consider what this might look like in the frequency domain..." can open up radical new approaches to solving problems. "Ohhhh, that signal/channel has a hard roll-off starting at 20 kHz... I probably need to increase/decrease my sample rate accordingly" vs. flying blind with traditional time-domain analysis. In practical terms, frequency-domain work is mandatory for things like high-efficiency video or audio codecs.<p>Understanding what entropy vs. information really means is a huge part of being able to competently build cryptographic primitives from first principles.
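The entropy-vs-information point is easy to see concretely. Here's a minimal sketch (the function name `shannon_entropy` is just for illustration) of Shannon's formula H = -Σ p·log2(p), measuring bits per byte of a message:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte: H = -sum(p * log2(p))."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Two equally likely symbols carry exactly 1 bit per byte...
print(shannon_entropy(b"aaaaaaaabbbbbbbb"))  # 1.0
# ...while uniformly distributed bytes hit the 8 bits/byte maximum.
print(shannon_entropy(bytes(range(256))))    # 8.0
```

This is exactly why low-entropy inputs (predictable keys, biased RNGs) are fatal for cryptographic primitives: the attacker's search space is 2^H, not 2^(8·length).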
Previous discussion of the Shannon limit -- specifically the top end of analog lines:<p><a href="https://news.ycombinator.com/item?id=4344349" rel="nofollow">https://news.ycombinator.com/item?id=4344349</a>
This MIT article is okay as an introduction to the subject.<p>However, I found the following talk to be better: <a href="https://simons.berkeley.edu/events/theoretically-speaking-mary-wootters" rel="nofollow">https://simons.berkeley.edu/events/theoretically-speaking-ma...</a><p>As it's a ~1-hour talk, Dr. Wootters is able to dig more deeply into Reed-Solomon codes (a very popular error-correction code for nearly 50 years), as well as applications to stranger stuff, like using RS codes in "test pooling" to save money on syphilis tests (and probably the same methodology being used for test pooling in today's COVID-19 world).<p>Dr. Wootters keeps things relatively accessible, never getting too deep into the weeds of the math (indeed, she sticks with GF(5), a prime field, instead of the extension fields that are more applicable in practice). Still, extension fields follow mostly the same concepts, and GF(5) is sufficient to cover them all.
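The core trick from the talk fits in a few lines. This is a sketch, not production RS: a 2-symbol message over GF(5), four codeword symbols, tolerating two erasures (the names `encode`/`recover` and the choice k=2 are mine; real RS codes use extension fields like GF(2^8) and general interpolation):

```python
P = 5  # GF(5): integers mod 5; "division" is multiplication by the modular inverse

def encode(msg, xs):
    """Evaluate the degree-1 message polynomial m0 + m1*x at each x (mod 5)."""
    m0, m1 = msg
    return [(m0 + m1 * x) % P for x in xs]

def recover(p0, p1):
    """Recover (m0, m1) from ANY two surviving (x, y) pairs by solving the line mod 5."""
    (x0, y0), (x1, y1) = p0, p1
    inv = pow((x1 - x0) % P, P - 2, P)  # inverse via Fermat's little theorem
    m1 = ((y1 - y0) * inv) % P
    m0 = (y0 - m1 * x0) % P
    return (m0, m1)

msg = (3, 2)                          # two message symbols in GF(5)
codeword = encode(msg, [0, 1, 2, 3])  # four codeword symbols -> survives two erasures
# Drop any two symbols; the two that remain still pin down the message.
print(recover((1, codeword[1]), (3, codeword[3])))  # (3, 2)
```

This is also the test-pooling idea in miniature: any k of the n results determine the answer, so you can afford to lose (or skip) n-k of them.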
So how does this relate to Hamming codes? [0]<p>[0] <a href="https://m.youtube.com/watch?v=X8jsijhllIA" rel="nofollow">https://m.youtube.com/watch?v=X8jsijhllIA</a>