科技回声

A brief introduction to the beauty of Information Theory

171 points · by justinucd · about 5 years ago

5 comments

dentalperson · about 5 years ago

Shameless plug: I'm trying to start an online reading group for "Elements of Information Theory" by Cover and Thomas.

If you're interested, please join (https://www.reddit.com/r/mathreadinggroup/) and the Discord (https://discord.gg/D9FjZXs).
mikorym · about 5 years ago

This is an interesting topic, and it is closely related to what I am working on at the moment.

One interesting thing I can say is that while subsets of a set are the language of probability and of logic (the existential quantifier and the universal quantifier are the two adjoints of the inverse image function), the dual notion is the language of entropy: partitions of a set.

You can unify the two via a self-dual set theory, where you essentially replace the notions of subsets (i.e., subobjects) and partitions (i.e., quotient objects) with an abstract subobject notion. This was part of the basis for my MSc; my supervisor and a colleague of mine [1] then showed that you can have what they call a Noetherian form over the category of sets. It is essentially obtained by taking the pullback of the subobject and quotient object functors, though I can't remember if it is the usual pullback when seen as a category of categories.

[1] This is not all published, but you could have a look at the papers of Zurab Janelidze. The key point was his work on functorially self-dual group theory.
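The "partitions are the language of entropy" idea above can be made concrete with a small sketch (the helper name is illustrative, not from the comment): the entropy of a partition of a finite set, where each block's probability is its fraction of the elements.

```python
import math

def partition_entropy(partition: list[set]) -> float:
    """Entropy of a partition of a finite set:
    H = -sum over blocks B of p_B * log2(p_B), where p_B = |B| / n."""
    n = sum(len(block) for block in partition)
    return -sum(
        (len(b) / n) * math.log2(len(b) / n)
        for b in partition
        if b  # empty blocks contribute nothing
    )

# Finer partitions carry more information: the partition into singletons
# maximizes entropy, while the one-block (coarsest) partition has entropy 0.
print(partition_entropy([{1}, {2}, {3}, {4}]))  # 2.0
print(partition_entropy([{1, 2, 3, 4}]))        # 0.0
```

Refining a partition never decreases this quantity, which mirrors how conditioning on more information never increases uncertainty.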
tuxxy · about 5 years ago

It's funny seeing this at the top of HN today. I just wrote a utility in Rust this past weekend to calculate the Shannon and metric entropy of a file (https://github.com/tuxxy/entropy/).

I really love the Shannon entropy equation. I don't know why, but I just find it so fascinating and elegant. I love it so much that I even got it tattooed on my wrist recently haha! (https://twitter.com/__tux/status/1257737842308022275/photo/1)
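The core calculation such a utility performs can be sketched in a few lines (a Python sketch for illustration, not the linked Rust code): the Shannon entropy of a file's byte distribution, in bits per byte.

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of the byte distribution, in bits per byte (0 to 8)."""
    if not data:
        return 0.0
    n = len(data)
    # H = -sum p_i * log2(p_i) over the observed byte frequencies.
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

print(shannon_entropy(b"aaaa"))            # 0.0: a single symbol carries no surprise
print(shannon_entropy(bytes(range(256))))  # 8.0: uniform over all 256 byte values
```

High values on a file often indicate compressed or encrypted data, which is why entropy scans are a common malware-analysis heuristic.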
birktj · about 5 years ago

I like the original paper by Shannon, "A Mathematical Theory of Communication" (http://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf), a lot. It is quite readable and also probably the most important paper in the field of information theory.
jzer0cool · about 5 years ago

I was first introduced to "Gray code" as an undergraduate, in the context of error correction. For those new to this topic, it could be a good place to start.
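For readers who haven't met it, binary-reflected Gray code reorders the integers so that consecutive values differ in exactly one bit, which limits the damage a single misread bit can cause. A short sketch (function names are illustrative):

```python
def to_gray(n: int) -> int:
    # Binary-reflected Gray code: XOR n with itself shifted right by one.
    return n ^ (n >> 1)

def from_gray(g: int) -> int:
    # Invert by XOR-folding the higher bits back down.
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Consecutive codes differ in exactly one bit, so a transition read at the
# wrong instant (e.g. on a rotary encoder) is off by at most one position.
for i in range(8):
    print(f"{i} -> {to_gray(i):03b}")
```

The single-bit-change property is what makes Gray codes useful in position encoders and in Karnaugh maps, and it is a natural stepping stone toward error-detecting and error-correcting codes.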