
Visualizing the Math Behind Logistic Regression and Newton's Method

191 points by seanharr11, almost 8 years ago

7 comments

davedx, almost 8 years ago
What is the difference between Newton's Method and Gradient Descent?

Edit: Found an answer: https://www.quora.com/In-optimization-why-is-Newtons-method-much-faster-than-gradient-descent
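In short: gradient descent follows the negative gradient with a fixed step size, while Newton's method rescales each step by the local curvature. A minimal sketch of the two update rules on a made-up one-dimensional objective (the function and learning rate below are illustrative, not from the article):

```python
# Minimize f(x) = x**4 - 3*x**2 + x, an arbitrary smooth example.
def grad(x):   # f'(x)
    return 4 * x**3 - 6 * x + 1

def hess(x):   # f''(x)
    return 12 * x**2 - 6

x_gd = x_newton = 2.0
for _ in range(20):
    # Gradient descent: first-order, fixed learning rate, linear convergence.
    x_gd -= 0.01 * grad(x_gd)
    # Newton's method: divides the step by local curvature;
    # quadratic convergence near a minimum (where f'' > 0).
    x_newton -= grad(x_newton) / hess(x_newton)

print(x_gd, x_newton)  # Newton is essentially converged; GD is still en route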
JadeNB, almost 8 years ago
Since the author is reading, a few small typos, followed by one slightly more substantial comment: 'simgoid' should be 'sigmoid' (S-shaped); `x y = log(x) + log(y)` should be `log(x y) = log(x) + log(y)`; 'guarentee' should be 'guarantee'; 'recipricol' should be 'reciprocal'.

I would like to see some mention of the fact that the division by the gradient is a meaningless, purely formal *motivation* for the correct step (inverting the Hessian) that follows.
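For readers puzzled by that last point: in one dimension the Newton step for optimization divides by the second derivative, and the honest multivariate generalization replaces that division with multiplication by the inverse Hessian (standard textbook forms, not the article's notation):

```latex
% One-dimensional Newton step for f'(x) = 0, and its multivariate analogue:
\[
  x_{k+1} = x_k - \frac{f'(x_k)}{f''(x_k)}
  \qquad\longrightarrow\qquad
  \beta_{k+1} = \beta_k - H_f(\beta_k)^{-1}\,\nabla f(\beta_k)
\]
```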
theoh, almost 8 years ago
Dialing up the complexity a bit from Newton's method, it would be interesting to know whether there are now better explanations of the conjugate gradient method online than this classic (or at least high-profile) intro: https://www.cs.cmu.edu/~quake-papers/painless-conjugate-gradient.pdf
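For anyone who wants the shape of the algorithm before diving into that paper, here is a minimal textbook conjugate-gradient solver for a small symmetric positive-definite system; the example matrix is made up, and Shewchuk's intro above derives each step properly:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Solve A x = b for symmetric positive-definite A (textbook CG)."""
    x = np.zeros_like(b)
    r = b - A @ x                      # residual
    d = r.copy()                       # search direction
    rs = r @ r
    for _ in range(len(b)):
        Ad = A @ d
        alpha = rs / (d @ Ad)          # exact line search along d
        x += alpha * d
        r -= alpha * Ad
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs) * d      # keep d A-conjugate to prior directions
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))        # ~ [0.0909, 0.6364]
```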
mcphage, almost 8 years ago
For the graphs of the home price / bathroom data set, what does the vertical axis represent? I don't see it labeled or discussed anywhere.
craigching, almost 8 years ago
This is a really nice introduction to logistic regression, well done! My one quibble with the OP is the jump into Newton's method. Maybe a derivation to explain the five steps would help. Thanks!
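The article's exact five steps aren't reproduced in this thread, but the standard Newton update for logistic regression (equivalently, iteratively reweighted least squares) condenses to a few lines. A sketch, where X, y, and the iteration count are illustrative and the form is the textbook one, not necessarily the article's:

```python
import numpy as np

def newton_logistic(X, y, n_iter=10):
    """Fit logistic regression by Newton's method (textbook IRLS form).
    X: (n, d) design matrix, y: (n,) labels in {0, 1}."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))   # sigmoid predictions
        grad = X.T @ (y - p)                  # gradient of the log-likelihood
        W = p * (1.0 - p)                     # per-sample weights
        hess = X.T @ (X * W[:, None])         # X^T W X (negated Hessian)
        beta += np.linalg.solve(hess, grad)   # step: beta += (X^T W X)^{-1} X^T (y - p)
    return beta

# Illustrative data: intercept plus one feature, true coefficients (0.5, 2.0).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-(X @ np.array([0.5, 2.0]))))).astype(float)
print(newton_logistic(X, y))  # roughly recovers [0.5, 2.0]
```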
bigtoine123, almost 8 years ago
I've searched everywhere for the difference between Newton's Method and Gradient Descent, but I couldn't find anything useful. Can you suggest any website/article where I can learn the difference?
xU1ppskunDmy6oz, almost 8 years ago
The author has far too little mathematical understanding to be teaching anybody (that’s my impression at least). If you don’t understand Newton’s method before reading this, you won’t understand it afterwards. “A method for finding the roots of a polynomial”. Why polynomials? Does it work, always? Is it fast? Why would following the tangent repeatedly be a good idea? “We take the inverse instead of the reciprocal because it’s a matrix”. Not impressed.
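On the "why polynomials, does it always work, is it fast" questions: Newton's method needs only a sufficiently smooth function, not a polynomial; near a simple root it converges quadratically, though it can diverge from poor starting points. These are standard numerical-analysis facts, not claims made in the article:

```latex
% For f(x^*) = 0 with f'(x^*) \ne 0 and f twice continuously
% differentiable, Taylor-expanding the iterate around x^* gives
\[
  e_{k+1} \;=\; x_{k+1} - x^* \;\approx\; \frac{f''(x^*)}{2\,f'(x^*)}\, e_k^2,
\]
% i.e. the error is roughly squared at every step once it is small.
```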