
Information Theory for Machine Learning [pdf]

55 points, by rabidsnail, almost 9 years ago

2 comments

gajjanag, almost 9 years ago
As a learning exercise/fun project for the author, I think this is ok.

But for any serious study, I fail to see what this offers over the completely free, very readable, more carefully written, and more thorough "Information Theory, Inference, and Learning Algorithms" by David MacKay: http://www.inference.phy.cam.ac.uk/itila/book.html

As an example of the problems with the pdf in its current stage, Theorem 2 (asymptotic source coding) includes the term "negligible loss" without even defining what "loss" means in source coding. Lossless and lossy coding are very different things; all the preceding material is really discussing the lossless coding problem. Pedagogically, these need decoupling.
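For context, here is a minimal sketch of the standard lossless (asymptotic) source coding theorem, which is presumably the kind of statement the PDF's Theorem 2 is aiming at; the PDF's exact phrasing may differ. In this standard version, "loss" means block decoding error for an i.i.d. source X^n with entropy H(X); the encoder f_n and decoder g_n below are names chosen here for illustration.

% Standard lossless source coding theorem for an i.i.d. source X_1, X_2, ...
% with entropy H(X); f_n is a fixed-length encoder, g_n the matching decoder.
% Achievability: any rate above the entropy admits codes whose block
% decoding error probability vanishes as the block length n grows.
\[
  R > H(X) \;\Longrightarrow\; \exists\, f_n:\mathcal{X}^n\to\{1,\dots,2^{\lceil nR\rceil}\},\ g_n
  \ \text{ such that }\ \Pr\big(g_n(f_n(X^n)) \neq X^n\big) \xrightarrow[n\to\infty]{} 0.
\]
% Converse: any rate below the entropy keeps the block error probability
% bounded away from zero, so "negligible loss" is impossible in that regime.
\[
  R < H(X) \;\Longrightarrow\; \Pr\big(g_n(f_n(X^n)) \neq X^n\big) \not\to 0.
\]

Lossy (rate-distortion) coding instead fixes a tolerated per-symbol distortion D and is governed by the rate-distortion function R(D), which is exactly the separate theory the comment says should be decoupled from the lossless case.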
mike_hock, almost 9 years ago
Bookmark comment; ignore.