
Arraymancer – Deep learning Nim library

211 points by archargelod about 1 year ago

6 comments

angusturner about 1 year ago
I would love for a non-Python-based deep learning framework to gain traction.

My initial impression, though, is that the scope is very broad. Trying to be scikit-learn, NumPy, and Torch all at once seems like a recipe for doing none of these things very well.

It's interesting to contrast this with the visions/aspirations of other new-ish deep learning frameworks. Starting with my favorite: Jax offers "composable function transformations + autodiff". Obviously there is still a tonne of work to do this well, support multiple accelerators, etc. But notably, I think they made the right call to leave high-level abstractions (like fully-fledged NN libraries or optimisation libraries) out of the Jax core. It does what it says on the box, and it does it really, really well.

TinyGrad seems like another interesting case study, in the sense that it is aggressively pushing to reduce complexity and LOC while still providing the relevant abstractions to do ML on multiple accelerators. It is quite young still, and I have my doubts about how much traction it will gain. Still a cool project, though, and I like to see people pushing in this direction.

PyTorch obviously still has a tonne of mind-share (and I love it), but it is interesting to see the complexity of that project grow beyond what is arguably necessary (e.g. having a "MultiHeadAttention" implementation in PyTorch is a mistake, in my opinion).
Comment #39862448 not loaded
Comment #39861724 not loaded
Comment #39865452 not loaded
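The "composable function transformations" the comment above praises can be illustrated without Jax itself: each transform takes a function and returns a new function, so transforms nest freely. A minimal plain-Python sketch, where a finite-difference grad and a list-based vmap stand in for the real Jax primitives:

```python
def grad(f, eps=1e-6):
    """Numerical derivative of a scalar function (central difference).
    A toy stand-in for an autodiff transform."""
    def df(x):
        return (f(x + eps) - f(x - eps)) / (2 * eps)
    return df

def vmap(f):
    """Map a scalar function over a list of inputs; a toy batching transform."""
    def batched(xs):
        return [f(x) for x in xs]
    return batched

f = lambda x: x ** 3            # f(x)  = x^3
df = grad(f)                    # f'(x) = 3x^2
batched_df = vmap(grad(f))      # transforms compose: batch of derivatives

print(round(df(2.0), 3))                                    # 12.0
print([round(y, 3) for y in batched_df([0.0, 1.0, 2.0])])   # [0.0, 3.0, 12.0]
```

Because every transform has the same shape (function in, function out), users can stack them in any order, which is the design property the comment is pointing at.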
jononor about 1 year ago
As someone who uses ML on embedded devices, it is great to see good alternatives in compiled languages. Nim seems like a very useful and pragmatic language in this regard, and certainly a huge step up from the C and C++ that are still very entrenched. I think solid libraries for deep learning are something we will see in practically all programming languages. In 10 years, a library covering the core use cases of today will be as standard as a JSON parser and a web server, for almost any ecosystem.
Comment #39865776 not loaded
CornCobs about 1 year ago
What syntax of Nim's is the "network: ..." used to declaratively construct the neural networks? Is it a macro? Looks really neat!
Comment #39861563 not loaded
Comment #39861559 not loaded
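In Nim, a declarative block like "network: ..." is the kind of construct typically implemented as a macro, which rewrites the block into ordinary code at compile time. As a point of contrast, here is a hypothetical Python sketch (all names invented for illustration) of getting a similar declarative feel at runtime, with plain functions and data instead of macros:

```python
def linear(n_in, n_out):
    # Placeholder layer spec; a real library would allocate weights here.
    return ("linear", n_in, n_out)

def relu():
    return ("relu",)

def network(*layers):
    """Collect layer specs into a model description (a plain list)."""
    return list(layers)

# Reads declaratively, but it is evaluated at runtime, unlike a Nim
# macro, which transforms the block before the program ever runs.
model = network(
    linear(784, 128),
    relu(),
    linear(128, 10),
)

print(len(model))   # 3
print(model[0])     # ('linear', 784, 128)
```

The trade-off is that a macro can type-check and specialize the network at compile time, while the runtime version only discovers mistakes when the description is interpreted.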
wodenokoto about 1 year ago
Having grown up with JavaScript, Python, and R, I'm kinda looking towards learning a compiled language.

I've given a bit of thought to Rust, since it's what Polars is natively written in and I want to move away from pandas.

Is Nim a good place to go?
Comment #39862974 not loaded
Comment #39861687 not loaded
Comment #39861742 not loaded
miki123211 about 1 year ago
IMO, no language without a Jupyter kernel can ever be a serious contender in the machine learning research space.

I was pretty skeptical of Jupyter until recently (because of accessibility concerns), but I just can't imagine my life without it any more. Incidentally, this gave me a much deeper appreciation and understanding of why people loved Lisp so much. An overpowered REPL is a useful tool indeed.

Fast compilation times are great and all, but the ability to modify a part of your code while keeping variable values intact is invaluable. This is particularly true if you have large datasets that are somewhat slow to load or models that are somewhat slow to train. When you're experimenting, you don't want to deal with two different scripts, one for training the model and one for loading and experimenting with it, particularly when both of them need to do the same dataset-processing operations. Doing all of this in Jupyter is just so much easier.

With that said, this might be a great framework for deep learning on the edge. I can imagine this thing, coupled with a nice desktop GUI framework, being used in desktop apps that run such models: things like LLM Studio, Stable Diffusion, voice changers utilizing RVC (as virtual sound cards and/or VST plugins), or even internal, proprietary models to be used by company employees. Use cases where the model is already trained and you already know the model architecture, but you want a binary that can be distributed easily.
Comment #39863623 not loaded
Comment #39863784 not loaded
Comment #39865299 not loaded
logicchains about 1 year ago
Interesting that it "Supports tensors of up to 6 dimensions". Is it difficult to support an arbitrary number of dimensions, e.g. does Nim lack variadic generics?
Comment #39861624 not loaded
Comment #39861621 not loaded
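On why rank limits appear at all: if a library encodes the number of dimensions statically (e.g. in the type), each rank may need its own instantiation unless the language offers something like variadic generics, so a cap like 6 is a pragmatic cutoff. When shape is ordinary runtime data, arbitrary rank comes for free, as this toy Python sketch shows:

```python
def zeros(shape):
    """Nested-list 'tensor' of zeros. The rank is simply len(shape):
    shape is runtime data, so there is no fixed dimension limit."""
    if not shape:
        return 0.0
    return [zeros(shape[1:]) for _ in range(shape[0])]

def rank(t):
    """Count nesting depth, i.e. the tensor's number of dimensions."""
    r = 0
    while isinstance(t, list):
        t = t[0]
        r += 1
    return r

t = zeros((2,) * 8)   # an 8-dimensional tensor, beyond a 6-D cap
print(rank(t))        # 8
```

The flip side is that runtime shapes push rank errors from compile time to run time, which is exactly the trade-off a statically ranked tensor type is trying to avoid.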