
Calculus on Computational Graphs: Backpropagation

55 points by inetsee over 9 years ago

6 comments

xtacy over 9 years ago
It's also known as "automatic differentiation" -- it's quite different from numerical/symbolic differentiation.

More information here:

- https://justindomke.wordpress.com/2009/02/17/automatic-differentiation-the-most-criminally-underused-tool-in-the-potential-machine-learning-toolbox/

- https://wiki.haskell.org/Automatic_Differentiation

The key idea is extending common operators (+, -, product, /, key mathematical functions) that usually operate on _real numbers_ to tuples of real numbers (x, dx) (the quantity and its derivative with respect to some variable) such that the operations preserve the properties of differentiation.

For instance (with abuse of notation):

    - (x1, dx1) + (x2, dx2) = (x1 + x2, dx1 + dx2)
    - (x1, dx1) * (x2, dx2) = (x1 * x2, x1 * dx2 + x2 * dx1)
    - sin((x, dx)) = (sin(x), cos(x) * dx)

Note that the right element of the tuple can be computed precisely from quantities readily available from the inputs to the operator.

It's also extensible to derivatives of scalars that are functions of many variables, by replacing dx with a vector of partial derivatives with respect to those variables (common in machine learning).

It's beautifully implemented in Google's Ceres optimisation package:

https://ceres-solver.googlesource.com/ceres-solver/+/1.8.0/include/ceres/jet.h
Comment #10149369 not loaded.
Comment #10150178 not loaded.
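To make the tuple arithmetic in the comment above concrete, here is a minimal Python sketch of forward-mode automatic differentiation with dual numbers; the Dual class, the sin wrapper, and the example function f(a, b) = a*b + sin(a) are illustrative assumptions, not code from Ceres or any library linked in the thread.

    import math

    class Dual:
        """A (value, derivative) pair; the derivative is taken with
        respect to one chosen input variable."""
        def __init__(self, x, dx=0.0):
            self.x = x
            self.dx = dx

        def __add__(self, other):
            # (x1, dx1) + (x2, dx2) = (x1 + x2, dx1 + dx2)
            return Dual(self.x + other.x, self.dx + other.dx)

        def __mul__(self, other):
            # Product rule: (x1*x2, x1*dx2 + x2*dx1)
            return Dual(self.x * other.x,
                        self.x * other.dx + other.x * self.dx)

    def sin(d):
        # Chain rule: sin((x, dx)) = (sin(x), cos(x) * dx)
        return Dual(math.sin(d.x), math.cos(d.x) * d.dx)

    # d/da of f(a, b) = a*b + sin(a) at (a, b) = (2, 3):
    # seed the variable of interest with derivative 1, the rest with 0.
    a = Dual(2.0, 1.0)
    b = Dual(3.0, 0.0)
    f = a * b + sin(a)
    print(f.x, f.dx)  # f.dx == b + cos(a) == 3 + cos(2) ~= 2.5839

With this scalar representation, differentiating with respect to n variables takes n forward passes (one seed each); replacing dx with a vector of partials, as the comment notes, yields all of them in a single pass.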
versteegen over 9 years ago
Anyone reading this who hasn't already should do themselves a favour and read the other articles on colah's blog. Beautifully presented demonstrations of ML algorithms, a number running live in your browser: http://colah.github.io/
jmount over 9 years ago
My demonstration Scala automatic differentiation library: http://www.win-vector.com/blog/2010/06/automatic-differentiation-with-scala/
outlace over 9 years ago
This is beautiful. I've never seen a more concise yet powerfully clear explanation of backpropagation. The explanation is fundamental in the sense that it relies on the fewest axioms.
plg over 9 years ago
Love these tutorials. Not sure that the LaTeX font is the right choice for a web page though.
misiti3780 over 9 years ago
This is easily the best explanation of back-propagation I have found on the web. Nice work.