
Automatic Differentiation: The most underused tool in the machine learning toolbox?

50 points, by gaika, over 16 years ago

3 comments

mattj, over 16 years ago
Easiest answer: If you're using neural nets (his example), you could just write the backprop algorithm. Chances are performance matters, so you can hand tune your code to generate the best assembly.

Most of machine learning work involves huge data sets. You divide your time between cleaning up / massaging your data until it's usable, coming up with models, deriving properties of the models, implementing inference for those models, and, most importantly, tuning your code so you can actually get meaningful results on huge datasets.

Doing the differentiation is, by far, the easiest part of all of that.

Also, in many cases, your model won't have a tractable form (like, say, requiring you to sum over all permutations in your data set at each step of your training). You have to come up with ways of approximating these results, often using sampling techniques.

Being able to find a derivative of a function that takes O(n!) time to calculate exactly isn't very useful - for gradient optimization methods, you'll often have to calculate the value more often than the gradient.

Basically, when finding a derivative is feasible, it's more useful and not much more work to derive it yourself.
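The hand-written backprop this comment describes is exactly what a reverse-mode automatic differentiation tool mechanizes: record the operations, then sweep backwards applying the chain rule. A minimal tape-based sketch (hypothetical class and names, not any particular library's API):

```python
# Tiny reverse-mode AD sketch: each Var records how to propagate
# gradients back to its inputs, replacing a hand-derived backprop pass.
class Var:
    def __init__(self, val):
        self.val = val
        self.grad = 0.0
        self._backward = lambda: None
        self._parents = []

    def __add__(self, other):
        out = Var(self.val + other.val)
        out._parents = [self, other]
        def _backward():
            # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Var(self.val * other.val)
        out._parents = [self, other]
        def _backward():
            # product rule: d(ab)/da = b, d(ab)/db = a
            self.grad += other.val * out.grad
            other.grad += self.val * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological order of the recorded graph, then a reverse sweep
        topo, seen = [], set()
        def build(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x = Var(3.0)
y = Var(4.0)
z = x * y + x          # z = xy + x
z.backward()
print(x.grad, y.grad)  # dz/dx = y + 1 = 5.0, dz/dy = x = 3.0
```

One reverse sweep yields the gradient with respect to every input at once, which is why reverse mode (backprop is the special case) suits functions with many inputs and one scalar output, such as a training loss.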
yummyfajitas, over 16 years ago
Ok, reading that article left me with one important question: wtf is automatic differentiation?

Luckily wikipedia exists.

http://en.wikipedia.org/wiki/Automatic_differentiation
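The short answer from that Wikipedia article: evaluate the function on numbers that carry their own derivatives, so the chain rule is applied mechanically at each operation rather than symbolically or by finite differences. A minimal forward-mode sketch using dual numbers (hypothetical class, not a real library):

```python
# Forward-mode AD via dual numbers: each value carries (val, der),
# and arithmetic propagates the derivative by the usual calculus rules.
class Dual:
    def __init__(self, val, der=0.0):
        self.val = val  # function value
        self.der = der  # derivative w.r.t. the seeded input

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def f(x):
    return x * x * x + 2 * x  # f(x) = x^3 + 2x, so f'(x) = 3x^2 + 2

y = f(Dual(2.0, 1.0))  # seed der=1 on the input to differentiate w.r.t. it
print(y.val, y.der)    # 12.0 14.0
```

No symbolic expression for f' is ever built; the derivative falls out of ordinary evaluation, which is the sense in which it is "automatic."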
tectonic, over 16 years ago
Very cool, I didn't know about this.

Python library for this: http://www.seanet.com/~bradbell/pycppad/index.xml