Shannon's original paper, "A Mathematical Theory of Communication"[0], which essentially founded Information Theory as its own coherent and well-defined field, is also very readable. Some of the concepts were well known before he published it and some were original, but the coherent formulation, notation, and consolidation in that paper are what established the field now known as "Information Theory".<p>[0] <a href="http://worrydream.com/refs/Shannon%20-%20A%20Mathematical%20Theory%20of%20Communication.pdf" rel="nofollow">http://worrydream.com/refs/Shannon%20-%20A%20Mathematical%20...</a>
I found Hartley's paper [1] (also discussed in the video sequence) on quantifying information surprisingly accessible and a good read in general.<p>[1] <a href="http://www.dotrose.com/etext/90_Miscellaneous/transmission_of_information_1928b.pdf" rel="nofollow">http://www.dotrose.com/etext/90_Miscellaneous/transmission_o...</a>
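For anyone who wants to play with it, Hartley's measure of information from that paper is simple enough to compute directly: a message of n symbols, each drawn from an alphabet of s possibilities, carries H = n log s units of information. A minimal sketch in Python (function and parameter names are my own):

```python
from math import log2

def hartley_information(n_symbols: int, alphabet_size: int) -> float:
    """Hartley's measure H = n * log(s), here in bits (log base 2)."""
    return n_symbols * log2(alphabet_size)

# One binary symbol carries exactly one bit:
print(hartley_information(1, 2))   # 1.0
# Three decimal digits carry 3 * log2(10) ~ 9.97 bits:
print(hartley_information(3, 10))
```

Using log base 2 is a modern convention; Hartley left the base of the logarithm open.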
"History of the alphebet" [sic]<p>Not mentioned is the curious fact that there seems to have been only one independent discovery of the alphabet. All alphabets are descended from the same root, proto-sinaitic or have been made up by people who knew about alphabets (Korean). Irish runes and Khmer script thus share a common ancestor.
I think it's insane that Khan Academy should attempt to cover information theory, given that the Coursera course has a 1% passing rate.<p>I am two videos in, biting my lip so as not to make technical comments that don't matter at this level. They do a great job of simplifying a very complex subject.
Information Theory is one of my favourite courses from uni. The idea that you can take something as abstract as "information" and quantify it in a useful manner was eye-opening.