
A brief introduction to the beauty of Information Theory

171 points by justinucd about 5 years ago

5 comments

dentalperson about 5 years ago

Shameless plug: I'm trying to start an online reading group for "Elements of Information Theory" by Cover and Thomas.

If you're interested, please join (https://www.reddit.com/r/mathreadinggroup/) and the Discord (https://discord.gg/D9FjZXs).
mikorym about 5 years ago

This is an interesting topic and is closely related to what I am working on at the moment.

One interesting thing I can say is that while subsets of a set are the language of probability and of logic (the existential quantifier and the universal quantifier are the two adjoints of the inverse image function), the dual notion, partitions of a set, is the language of entropy.

You can unify the two via a self-dual set theory, where you essentially replace the notions of subsets (i.e., subobjects) and partitions (i.e., quotient objects) with an abstract subobject notion. This was part of the basis for my MSc, and my supervisor and a colleague of mine [1] then showed that you can have what they call a Noetherian form over the category of sets. It is essentially obtained by taking the pullback of the subobject and quotient object functors, though I can't remember if it is the usual pullback when seen as a category of categories.

[1] This is not all published, but you could have a look at the papers of Zurab Janelidze. The key point was his work on functorially self-dual group theory.
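For readers unfamiliar with the adjoint-quantifier remark above, the standard fact it refers to can be stated concretely (a sketch added for context, not part of the original comment): for a function f : A → B, the preimage map on power sets has the existential image as its left adjoint and the universal image as its right adjoint, while on the dual (partition) side entropy is computed blockwise.

```latex
% For f : A -> B, the preimage map  f^{-1} : P(B) -> P(A)  has two adjoints:
%   \exists_f  -|  f^{-1}  -|  \forall_f
\[
  \exists_f(S) = \{\, b \in B : \exists a \in S,\ f(a) = b \,\}, \qquad
  \forall_f(S) = \{\, b \in B : f^{-1}(\{b\}) \subseteq S \,\},
\]
\[
  \exists_f(S) \subseteq T \iff S \subseteq f^{-1}(T), \qquad
  f^{-1}(T) \subseteq S \iff T \subseteq \forall_f(S).
\]
% On the partition side, the entropy of a finite partition \pi = \{B_1,\dots,B_k\}
% of a probability space is
\[
  H(\pi) = -\sum_{i=1}^{k} p(B_i)\,\log_2 p(B_i).
\]
```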
tuxxy about 5 years ago

It's funny seeing this at the top of HN today. I just wrote a utility in Rust this past weekend to calculate the Shannon and metric entropy of a file (https://github.com/tuxxy/entropy/).

I really love the Shannon entropy equation. I don't know why, but I just find it so fascinating and elegant. I love it so much that I even got it tattooed on my wrist recently, haha! (https://twitter.com/__tux/status/1257737842308022275/photo/1)
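For readers who want to see the computation concretely, here is a minimal Rust sketch of byte-level Shannon entropy, roughly what such a utility does. This is not the linked tool's actual code, and the normalization by 8 bits used for "metric entropy" below is one common convention, assumed here:

```rust
use std::env;
use std::fs;

fn main() {
    // Read the file named on the command line into memory.
    let path = env::args().nth(1).expect("usage: entropy <file>");
    let bytes = fs::read(&path).expect("could not read file");
    if bytes.is_empty() {
        println!("empty file: entropy is 0");
        return;
    }

    // Count occurrences of each of the 256 possible byte values.
    let mut counts = [0u64; 256];
    for &b in &bytes {
        counts[b as usize] += 1;
    }

    // Shannon entropy in bits per byte: H = -sum_i p_i * log2(p_i).
    let total = bytes.len() as f64;
    let shannon: f64 = counts
        .iter()
        .filter(|&&c| c > 0)
        .map(|&c| {
            let p = c as f64 / total;
            -p * p.log2()
        })
        .sum();

    // One common normalization ("metric entropy" here): divide by the
    // maximum possible value, log2(256) = 8, to land in [0, 1].
    println!("shannon entropy: {:.4} bits/byte", shannon);
    println!("metric entropy:  {:.4}", shannon / 8.0);
}
```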
birktj about 5 years ago

I really like Shannon's original paper, "A Mathematical Theory of Communication" (http://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf). It is quite readable and probably the most important paper in the field of information theory.
jzer0cool about 5 years ago

I was first introduced to "Gray code" as an undergraduate, in the context of error correction. For those new to this topic, it could be a good place to start.
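As an aside (not from the original thread): the binary-reflected Gray code the commenter mentions is a one-line transformation, and its defining property is that consecutive values differ in exactly one bit. A minimal Rust sketch:

```rust
/// Convert an ordinary binary number to its binary-reflected Gray code.
fn to_gray(n: u32) -> u32 {
    n ^ (n >> 1)
}

/// Convert a Gray code back to the ordinary binary number by XOR-ing
/// together all of its right shifts (a prefix XOR over the bits).
fn from_gray(g: u32) -> u32 {
    let mut n = g;
    let mut shift = g >> 1;
    while shift != 0 {
        n ^= shift;
        shift >>= 1;
    }
    n
}

fn main() {
    // Consecutive codes differ in exactly one bit, which is why Gray codes
    // show up in rotary encoders, Karnaugh maps, and error-resistant counters.
    for i in 0u32..8 {
        println!("{} -> {:03b}", i, to_gray(i));
        assert_eq!(from_gray(to_gray(i)), i);
    }
}
```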