TechEcho
© 2025 TechEcho. All rights reserved.

Bridging the gap between neural networks and functions

73 points by prathyvsh almost 2 years ago

4 comments

politelemon almost 2 years ago
Hmm, might be just me, but this feels like a refresher for people who already understand NNs and transformers, so it will probably escape most devs. I've had a bit better luck with the fastai course, a series of YouTube videos: it's a slower pace, but explained quite well without requiring a lot of prior understanding.
ilaksh almost 2 years ago
One nice thing about the 1986 Hinton paper was that he described the equations very explicitly, in a way that even a math dummy like me could implement.

https://github.com/runvnc/mlp/blob/master/neuralnetwork.cpp

https://github.com/runvnc/nnpapers/blob/master/hinton86.pdf

This article is also a very good explanation.
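The generalized delta rule from that 1986 paper really does fit in a few dozen lines. Here is a minimal Python sketch of it (not a port of the linked C++ repo; the layer sizes, learning rate, and epoch count are my own illustrative choices): one hidden layer of sigmoid units trained on XOR by stochastic gradient descent.

```python
import math
import random

random.seed(0)

def sig(x):
    return 1.0 / (1.0 + math.exp(-x))

# 2 inputs -> 2 hidden sigmoid units -> 1 sigmoid output
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [0.0, 0.0]
W2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = 0.0

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
lr = 0.5

def forward(x):
    h = [sig(W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]) for j in range(2)]
    y = sig(W2[0] * h[0] + W2[1] * h[1] + b2)
    return h, y

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

e0 = mse()
for epoch in range(5000):
    for x, t in data:
        h, y = forward(x)
        # output delta: (y - t) * sigmoid'(net), where sigmoid' = y * (1 - y)
        d_out = (y - t) * y * (1 - y)
        # hidden deltas: propagate the output delta back through W2
        d_hid = [d_out * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
        # gradient-descent weight updates
        for j in range(2):
            W2[j] -= lr * d_out * h[j]
            W1[j][0] -= lr * d_hid[j] * x[0]
            W1[j][1] -= lr * d_hid[j] * x[1]
            b1[j] -= lr * d_hid[j]
        b2 -= lr * d_out
e1 = mse()
print(e0, e1)  # mean squared error before and after training
```

The whole method is those two "delta" lines: each layer's error signal is the next layer's deltas pushed back through the weights, scaled by the local derivative of the activation.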
kurthr almost 2 years ago
I think the original title was better: "Bridging the gap between neural networks and functions"

It discusses the standard backpropagation optimization method in differential form and the functional approximation of neural networks, but doesn't discuss transformers at all, as far as I could tell. I think the code might be helpful to some in understanding implementation, but so much is now done in accelerators that it doesn't really capture real implementations.
jppope almost 2 years ago
The font on the site made it really hard to read.