Building a Language and Compiler for Machine Learning

243 points by ViralBShah over 6 years ago

13 comments

eigenspace over 6 years ago
Mike Innes is one of the real rockstars in the Julia community. He has a knack for making surprisingly small and elegant packages which compose so naturally with the base language that they feel built-in.

After a `using Flux`, Julia is suddenly a machine learning language, instead of a language with a machine learning library. I'd argue it shouldn't be surprising that he found his way to Julia, because Julia is one of the few languages that allows one to make such packages.

His other packages, such as MacroTools.jl, Lazy.jl, and Zygote.jl, are also well worth checking out.
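(For readers who haven't tried it, a rough sketch of what "feels built-in" means in practice. The exact API has shifted across Flux versions, notably the move from Tracker to Zygote for AD, so treat this as illustrative rather than authoritative.)

    using Flux

    # An ordinary Julia function...
    f(x) = 3x^2 + 2x + 1

    # ...can be differentiated without special tensor types or graph building.
    df(x) = gradient(f, x)[1]
    df(2.0)   # 14.0

    # Layers are plain callable structs that compose with ordinary Julia code.
    model = Chain(Dense(10, 5, relu), Dense(5, 2), softmax)
    model(rand(Float32, 10))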
mark_l_watson over 6 years ago
As an old Lisp user, I am impressed by Flux (which I started using this weekend, after someone on HN recommended it to me) as transforming Julia, much like building Lisp up into a new language for whatever problem you are working on. I also appreciate how incredibly easy it was to get started with Flux, which 'just worked' with CUDA 10 and the GPU in my laptop, and the model zoo was great for getting started. Really quality stuff!
stabbles over 6 years ago
Since the blog post does not have many code samples, this non-trivial AD example with Zygote.jl is worth sharing (it's from their readme):

    julia> using Zygote

    julia> fs = Dict("sin" => sin, "cos" => cos);

    julia> derivative(x -> fs[readline()](x), 1.0)
    cos
    -0.8414709848078965

    julia> -sin(1.0) # 'true' derivative at 1.0
    -0.8414709848078965

So Zygote can apply AD to an anonymous function that looks up a function in a hash table from user input.

https://github.com/FluxML/Zygote.jl
ChrisRackauckas over 6 years ago
As someone who works on merging differential equations and machine learning, I have found this kind of work essential for what I do. Pervasive AD that allows merging neural networks and diffeq solvers is letting us explore all kinds of new models and new problems. Sure, it doesn't impact vanilla machine learning all that much (though Zygote.jl does allow for a lot of optimizations that wouldn't be possible with tracing-based AD), but it definitely opens up a new wave of AI possibilities.
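(To make the idea of a neural network inside a diffeq solver concrete, here is a hedged sketch in the spirit of this line of work; the package names are real, but the snippet is illustrative and not the commenter's code.)

    using Flux, DifferentialEquations

    # A small network defines the vector field of an ODE.
    nn = Chain(Dense(2, 16, tanh), Dense(16, 2))
    dudt(u, p, t) = nn(u)

    prob = ODEProblem(dudt, Float32[1.0, 0.0], (0.0f0, 1.0f0))
    sol = solve(prob, Tsit5())

    # With pervasive, source-to-source AD, solve(...) itself can be
    # differentiated with respect to the network's parameters, so the learned
    # vector field can be fit to observed trajectories.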
jlebar over 6 years ago
As an XLA:GPU person I'm curious how the performance of Julia natively compiling to CUDA compares to using XLA:GPU.

In particular, is this a promising approach, or do you see it as a dead end compared to generating GPU code natively? If it's promising, are there things we need to do in XLA:GPU to make it less awful for you?

(Reasons you might want to use XLA:GPU include: you don't have to reinvent all our performance and correctness hacks for cudnn, and maybe our kernels run faster since we're targeting such a limited domain?)
UncleOxidant over 6 years ago
I'm tasked with running several ML algorithms on a new hardware accelerator. Currently there is an LLVM toolchain for that new hardware, but no Python support is expected for a while, which means implementing a bunch of ML code in C or maybe C++ (not a very pleasant prospect). I'm wondering: since Julia has an LLVM backend, would it be possible to emit LLVM IR from Julia which could then be fed into our LLVM toolchain?

One thing that comes to mind here: does Julia use some kind of primitives for various things like matrix multiplication that might be difficult to export at the LLVM-IR level?
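(Not an answer from the authors, but for context: Julia ships reflection tools for inspecting the LLVM IR it generates for a given method specialization. Whether that IR can be retargeted to a custom LLVM toolchain, and how BLAS-backed operations like matrix multiplication appear in it, is a separate question; this is only an inspection sketch.)

    # InteractiveUtils is a standard library and is loaded by default in the REPL.
    using InteractiveUtils

    axpy(a, x, y) = a * x + y

    # Print the LLVM IR Julia generates for this particular specialization.
    @code_llvm axpy(2.0, 3.0, 4.0)
    # Equivalently: code_llvm(axpy, (Float64, Float64, Float64))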
shafte over 6 years ago
I'd be interested in a direct comparison with similar efforts undertaken by existing frameworks; for example Torch Script [1], which aims to produce a language which shares a syntactic frontend with Python while getting all the goodies that ahead-of-time compilation gives you (symbolic diff, operator fusion, etc.).

Seems to me that the primary challenge for any "next-generation" framework or language is getting people to actually use the thing. Sharing a front-end with Python and a backend with PyTorch seems like a good way to bootstrap that.

[1] https://pytorch.org/docs/master/jit.html?highlight=torchscript
lostmsu over 6 years ago
I wonder if this feature is in any way different from LINQ for expressions?

In C# you can say

    Expr<Func<float,float>> sin = Math.Sin;

and then write a function

    Expr Derivative(Expr expr) => ...

which will take the above sin and compute its derivative as another Expr, which can later be compiled using Expr.Compile().

In C# this was introduced to make SQL bindings.

So far, the only difference I see is that in C# there's a distinction between expression trees and functions themselves, but in Julia there's not.
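(For contrast, a minimal Zygote snippet making the point about Julia: gradients are taken of ordinary functions, working from their lowered code rather than a separate expression-tree type you have to opt into. Illustrative only.)

    using Zygote

    f = x -> sin(x) * x

    # gradient returns a tuple of derivatives, one per argument.
    Zygote.gradient(f, 1.0)   # ≈ (1.3818,), i.e. x*cos(x) + sin(x) at x = 1.0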
pjmlp over 6 years ago
I love the work being done in Julia, as competition is good, and maybe it will make the Python community more supportive of the ongoing JIT attempts.
Myrmornis over 6 years ago
> Meanwhile, the idea of ML models fundamentally being differentiable algorithms – often called differentiable programming – has caught on.

> We need a language to write differentiable algorithms, and Flux takes Julia to be this language.

Recently on HN there was some discussion of this paper by Conal Elliott on automatic differentiation in a pure functional language (Haskell): https://arxiv.org/abs/1804.00746

This is a rather large and vague question but I'm curious whether people have comments on the relative merits of Julia vs a pure functional language for supporting "differentiable programming" for ML?
StefanKarpinski over 6 years ago
Also posted here: https://news.ycombinator.com/item?id=18593453. Maybe a mod could combine the two?
tehsauce over 6 years ago
This looks awesome, and makes me wonder if anyone has done any real-time graphics experiments with Julia. With this great AD and GPU support, I would love to try using this in some graphics applications!
glemmaPaul over 6 years ago
This is gonna be very interesting when it can be combined with a distributed network of specialized CNNs for highly specialized tasks (if there isn't one already).