Variational Inference for Machine Learning [pdf]

79 points by alex_hirner over 8 years ago

2 comments

marmaduke, over 8 years ago
Stan and PyMC3 both implement automatic differentiation based variational inference, so you can write down your statistical model and not care "much" about derivatives.

http://mc-stan.org
https://github.com/pymc-devs/pymc3
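A minimal sketch of what that looks like with PyMC3's ADVI interface, added here for illustration: the toy dataset and priors below are invented, and argument names (e.g. `sigma` vs. `sd`) vary across PyMC3 releases.

```python
import numpy as np
import pymc3 as pm

# Hypothetical toy dataset: 200 draws from a Gaussian whose mean and
# scale we want to recover.
rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=2.0, size=200)

with pm.Model():
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)
    sigma = pm.HalfNormal("sigma", sigma=5.0)
    pm.Normal("obs", mu=mu, sigma=sigma, observed=data)

    # ADVI: the library differentiates the ELBO automatically,
    # so no hand-written gradients are needed.
    approx = pm.fit(n=20000, method="advi")

# Draw from the fitted Gaussian approximation to the posterior.
trace = approx.sample(1000)
print(trace["mu"].mean(), trace["sigma"].mean())
```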
coherentpony, over 8 years ago
> Many samples needed, especially in high dimensions

This isn't true. For Monte Carlo sampling, the convergence of unbiased estimators (for example the expectation) is independent of the dimension of the state space. In fact, this is exactly the reason to *prefer* Monte Carlo integration over, say, a Riemann sum.
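A rough numerical illustration of that point, using a hypothetical integrand f(x) = sum_i x_i^2 over the unit cube (not taken from the thread): the Monte Carlo standard error behaves like std(f)/sqrt(N) regardless of dimension.

```python
import numpy as np

rng = np.random.default_rng(42)

def mc_estimate(d, n_samples):
    """Monte Carlo estimate of E[f(X)], X ~ Uniform([0,1]^d), f(x) = sum_i x_i^2."""
    x = rng.uniform(size=(n_samples, d))
    fx = (x ** 2).sum(axis=1)
    # The standard error shrinks like std(f)/sqrt(N), with no explicit d dependence.
    return fx.mean(), fx.std(ddof=1) / np.sqrt(n_samples)

for d in (2, 10, 100):
    est, stderr = mc_estimate(d, n_samples=10_000)
    # Exact value of the expectation is d/3, since E[x_i^2] = 1/3 on [0, 1].
    print(f"d={d:4d}  exact={d / 3:8.3f}  estimate={est:8.3f} +/- {stderr:.3f}")
```

By contrast, a grid-based Riemann sum with the same 10,000-point budget in d = 100 dimensions would have roughly one grid point per axis, which is the curse of dimensionality the comment is contrasting against.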