
Notes on neural networks from scratch in Clojure

41 points, by mjdowney, almost 2 years ago

2 comments

wokwokwok, almost 2 years ago
> Much of the magic inside of neural network libraries has less to do with cleverer algorithms and more to do with vectorized SIMD instructions and/or being parsimonious with GPU memory usage and communication back and forth with main memory.

I mean… that's not *really* fair, is it?

We've been able to build NN libraries for 30 years, but it's the transformer algorithm on top of them, and the stacked layers forming a coherent network, that are the complex parts, right?

Implement Stable Diffusion in Clojure (the Python code for it is all open source) and we quickly see that there is a lot of complexity, once *you're doing something useful*, that the primitive operations don't support.

It's not really any different from OpenCV, with its basic matrix operations and then paper-by-paper implementations of various algorithms. Building a basic pixel matrix library in Clojure wouldn't give you an equivalent to OpenCV either.

Is there really a clear, meaningful bridge between building low-level operations and building high-level functions out of them? When you implement sqrt, you've learnt a thing… but it doesn't help you build a rendering engine.

Hasn't this always been the problem with learning ML "from scratch"? You start with basic operations, do MNIST… and then… uh, well, no. Now you clone a Python repo that implements the paper you want to work on and modify it, because implementing it from scratch with your primitives isn't really possible.
Comment #36186015 not loaded
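The quoted claim about vectorization is easy to demonstrate in miniature. A minimal sketch (in Python with NumPy rather than the article's Clojure, and not taken from the article itself): the same dot product computed with a pure-Python scalar loop and with a single vectorized call, where the vectorized path is typically one to two orders of magnitude faster despite doing identical arithmetic.

```python
# Micro-benchmark illustrating the "vectorization, not cleverer math" claim:
# same dot product, scalar loop vs. NumPy's vectorized (SIMD-backed) routine.
import time
import numpy as np

n = 1_000_000
rng = np.random.default_rng(0)
a = rng.random(n)
b = rng.random(n)

# Naive scalar loop: one multiply-add per Python-level iteration.
start = time.perf_counter()
acc = 0.0
for x, y in zip(a.tolist(), b.tolist()):
    acc += x * y
loop_time = time.perf_counter() - start

# Vectorized: the same arithmetic dispatched to optimized native code.
start = time.perf_counter()
vec = float(np.dot(a, b))
vec_time = time.perf_counter() - start

print(f"loop: {loop_time:.4f}s  vectorized: {vec_time:.4f}s")
```

Both paths produce the same number (up to floating-point accumulation order); only the dispatch mechanism differs, which is the commenter's target: that speed layer is well understood, while the hard part is everything built on top of it.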
xrd, almost 2 years ago
This is such a terrific write-up. It's always felt like the ML space takes years of study to get a foothold, but this is a clear path to learning critical first principles. Thanks for writing this!
Comment #36185464 not loaded