
Visualizing the Math Behind Logistic Regression and Newton's Method

191 points by seanharr11 almost 8 years ago

7 comments

davedx almost 8 years ago
What is the difference between Newton's Method and Gradient Descent?

Edit: Found an answer: https://www.quora.com/In-optimization-why-is-Newtons-method-much-faster-than-gradient-descent
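The distinction can be sketched in a few lines (the function and step sizes below are my own illustration, not from the article or the Quora answer): gradient descent takes many fixed-size first-order steps, while Newton's method rescales each step by the curvature.

```python
# Minimize the convex function f(x) = x**2 + exp(x).
# Gradient descent uses only f'; Newton's method also uses f''.
import math

def grad(x):   # f'(x)
    return 2 * x + math.exp(x)

def hess(x):   # f''(x)
    return 2 + math.exp(x)

def gradient_descent(x, lr=0.1, steps=200):
    for _ in range(steps):
        x -= lr * grad(x)        # fixed-size step along -f'
    return x

def newton(x, steps=10):
    for _ in range(steps):
        x -= grad(x) / hess(x)   # step scaled by the curvature
    return x
```

Both iterations reach the minimizer (where f'(x) = 0), but Newton's method typically needs far fewer steps because the second derivative sets the step size automatically.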
JadeNB almost 8 years ago
Since the author is reading, a few small typos, followed by one slightly more substantial comment: 'simgoid' should be 'sigmoid' (S-shaped); `x y = log(x) + log(y)` should be `log(x y) = log(x) + log(y)`; 'guarentee' should be 'guarantee'; 'recipricol' should be 'reciprocal'.

I would like to see some mention of the fact that the division by the gradient is a meaningless, purely formal *motivation* for the correct step (inverting the Hessian) that follows.
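The "inverting the Hessian" step JadeNB refers to can be sketched as a Newton (IRLS) update for logistic regression. This is the generic textbook form, not the article's own code, and the synthetic data below is made up for illustration.

```python
# One Newton step for logistic regression: beta <- beta - H^{-1} grad,
# where H is the Hessian of the negative log-likelihood.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def newton_step(beta, X, y):
    p = sigmoid(X @ beta)                   # predicted probabilities
    grad = X.T @ (p - y)                    # gradient of neg. log-likelihood
    W = np.diag(p * (1 - p))                # per-observation variance weights
    H = X.T @ W @ X                         # Hessian (a matrix, not a scalar)
    return beta - np.linalg.solve(H, grad)  # "invert" the Hessian, don't divide

# Synthetic, non-separable data for illustration.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = (X[:, 1] + rng.normal(size=100) > 0).astype(float)

beta = np.zeros(2)
for _ in range(8):
    beta = newton_step(beta, X, y)
```

Note that `np.linalg.solve(H, grad)` solves the linear system rather than forming `H`'s inverse explicitly, which is the numerically preferred way to apply the inverse-Hessian step.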
theoh almost 8 years ago
Dialing up the complexity a bit from Newton's method, it would be interesting to know whether there are now better explanations of the conjugate gradient method online than this classic (or at least high-profile) intro: https://www.cs.cmu.edu/~quake-papers/painless-conjugate-gradient.pdf
mcphage almost 8 years ago
For the graphs of the home price / bathroom data set, what does the vertical axis represent? I don't see it labeled or discussed anywhere.
craigching almost 8 years ago
This is a really nice introduction to logistic regression, well done! My one quibble with the OP is the jump into Newton's method. Maybe a derivation to explain the five steps would help. Thanks!
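For readers wanting the derivation the comment asks for: the generic Newton update (not the article's specific five steps) falls out of a second-order Taylor expansion of the log-likelihood ℓ around the current parameter β.

```latex
\ell(\beta+\Delta) \approx \ell(\beta)
  + \nabla\ell(\beta)^{\top}\Delta
  + \tfrac{1}{2}\,\Delta^{\top}H(\beta)\,\Delta
```

Setting the derivative of the right-hand side with respect to Δ to zero gives H(β)Δ = −∇ℓ(β), i.e. the Newton step β_new = β − H(β)⁻¹∇ℓ(β).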
bigtoine123 almost 8 years ago
I searched everywhere for the difference between Newton's Method and Gradient Descent, but I couldn't find anything that useful. Can you suggest any website/article where I can learn the difference?
xU1ppskunDmy6oz almost 8 years ago
The author has far too little mathematical understanding to be teaching anybody (that's my impression at least). If you don't understand Newton's method before reading this, you won't understand it afterwards. "A method for finding the roots of a polynomial". Why polynomials? Does it work, always? Is it fast? Why would following the tangent repeatedly be a good idea? "We take the inverse instead of the reciprocal because it's a matrix". Not impressed.
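The "why polynomials?" objection can be illustrated directly: Newton's root-finding applies to any differentiable function, and convergence is not guaranteed. A small sketch (the examples are mine, not the article's):

```python
# Newton's method finds roots of any differentiable f, not just
# polynomials -- and it can fail to converge.
import math

def newton_root(f, df, x, steps=50):
    for _ in range(steps):
        x = x - f(x) / df(x)   # follow the tangent to its x-intercept
    return x

# Works on a non-polynomial: the solution of cos(x) = x.
r = newton_root(lambda x: math.cos(x) - x,
                lambda x: -math.sin(x) - 1.0,
                1.0)

# Diverges on f(x) = x**(1/3): each tangent step maps x to -2x,
# so the iterates run away from the root at 0.
x_bad = newton_root(lambda x: math.copysign(abs(x) ** (1 / 3), x),
                    lambda x: abs(x) ** (-2 / 3) / 3.0,
                    0.1, steps=5)
```

The first call converges quadratically; the second overshoots further on every step, which is exactly the kind of caveat the commenter felt the article glossed over.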