I recently ported a reinforcement learning algorithm from PyTorch to Julia. I did my best to keep the implementations the same, with the same hyperparameters, network sizes, etc. I think I did a pretty good job, because the performance was similar, solving the CartPole environment in a similar number of steps, etc.

The Julia implementation ended up being about 2 to 3 times faster. I timed the core learning loops (the network evaluations and the gradient calculations and applications), and PyTorch and Julia performed similarly there, so it wasn't that Julia was faster at learning. Instead, it was all the in-between work: the "bookkeeping" that Python does ended up being much faster in Julia, enough so that overall it was 2 to 3 times faster.

(I was training on a CPU, though. Things may be different if you're using a GPU; I don't know.)
Julia is such a wonderful language. There are many design decisions that I like, but most importantly to me, its ingenious idea of combining multiple dispatch with JIT compilation still leaves me in awe. It is such an elegant solution to achieving efficient multiple dispatch.

Thanks to everyone who is working on this language!
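For anyone who hasn't seen it, here is a minimal sketch of the combination being praised (the function and types are just illustrative): the generic function below has several methods, dispatch picks one by the runtime types of the arguments, and the JIT compiles a specialized native body for each concrete signature it actually encounters.

```julia
# One generic function, several methods; dispatch selects by runtime type,
# and the JIT compiles a specialized native version per concrete signature.
area(r::Real) = pi * r^2              # circle of radius r
area(w::Real, h::Real) = w * h        # rectangle

struct Square
    side::Float64
end
area(s::Square) = s.side^2            # works on user-defined types too

println(area(2.0))         # 12.566370614359172
println(area(3, 4))        # 12
println(area(Square(5)))   # 25.0
```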
I've been running the 1.6 release candidates, and the compilation speed improvements have been massive. There have been plenty of instances in the past where I've tried to 'quickly' show off some Julia code, and I end up waiting ~45 seconds for a plot to show or a minute for a Pluto notebook to run, and that's not to mention waiting for my imports to finish. It's still slower than Matlab for the first run, but it's at least in the same ballpark now.
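For context, the usual way people measure this latency is just timing the first call after loading. A rough sketch, assuming Plots.jl is installed (the numbers vary a lot by machine and Julia version):

```julia
# Rough "time to first plot" measurement: package load time plus the
# compilation triggered by the first call.
@time using Plots                      # package load time
@time display(plot(rand(10)))          # first call pays the JIT cost
@time display(plot(rand(10)))          # second call is fast
```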
On the package ecosystem side, 1.6 is required for JET.jl [0]. Even though Julia is a dynamic language, the compiler does a lot of static analysis (or "abstract interpretation" in Julia lingo). JET.jl exposes some of this to the user, opening a path for additional static analysis tools (or maybe even custom compilers).

[0]: https://github.com/aviatesk/JET.jl
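As a rough sketch of what that looks like in practice (the function below is made up; `@report_call` is JET's entry point as of its README, so check the current docs):

```julia
using JET

# A deliberately broken function: Ints have no `value` field.
function mysum(xs)
    s = 0
    for x in xs
        s += x.value
    end
    return s
end

# JET analyzes the call statically and reports the bad field access
# without actually running `mysum`.
@report_call mysum([1, 2, 3])
```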
See also Lyndon's blog post [1] about what has changed since 1.0, for anyone who's been away for a while.

[1]: https://www.oxinabox.net/2021/02/13/Julia-1.6-what-has-changed-since-1.0.html
Whatever improves loading times is more than welcome. It's not really acceptable to have to wait just because you import some libraries. I understand Julia does a lot of things under the hood and that there's a price to pay for that, but as a Python user it's a bit inconvenient.

But I'll sure give it a try, because Julia hits a sweet spot between expressiveness and speed (at least for the kind of stuff I do: matrix work, algorithms, graph computations).
I like Julia (mostly because of multiple dispatch). The only thing that's lacking is an industrial-strength garbage collector, something like what can be found in the JVM.

I know that you shouldn't produce garbage, but I happen to like immutable data structures, and those work better with optimised GCs.
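To make the allocation pressure concrete, here is a small sketch (not tied to any particular package) of the pattern that immutable-style code tends to produce:

```julia
# Non-mutating "updates" allocate a fresh object on every call, producing lots
# of short-lived garbage for the GC (an in-place `xs[i] = v` would allocate nothing).
update(xs::Vector{Int}, i::Int, v::Int) = (ys = copy(xs); ys[i] = v; ys)

function churn(n)
    xs = collect(1:n)
    for i in 1:n
        xs = update(xs, i, 0)   # n short-lived copies for the GC to reclaim
    end
    return xs
end

@time churn(10_000)   # the allocation count here is what stresses the GC
```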
I know it's minor, but I still hope they will fix scoping.

Not that my suggestion is good, but what they have now is bad.

https://github.com/JuliaLang/julia/issues/37187
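For readers who haven't followed the scoping discussions: I'm not summarizing the linked issue, but the classic pain point with Julia's top-level scoping rules looks like this (behavior differs between the REPL and a script):

```julia
# Top-level soft scope: this works in the REPL (it updates the global `total`),
# but in a non-interactive script it warns about an ambiguous assignment and
# then fails with an UndefVarError unless you write `global total` in the loop.
total = 0
for i in 1:10
    total += i
end
println(total)
```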
The feature I'm most excited about is the parallel — and automatic — precompilation. Combined with the iterative latency improvements, Julia 1.6 has far fewer coffee breaks.
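Concretely, as documented for 1.6's Pkg (the "Example" package here is just a registered demo package): package operations now kick off precompilation of the dependency graph automatically, and the explicit call runs across multiple processes.

```julia
using Pkg

Pkg.add("Example")   # in 1.6 this automatically precompiles new/changed packages
Pkg.precompile()     # explicit precompilation of the whole environment,
                     # now run in parallel
```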
Are the performance claims of Julia greatly exaggerated?

Julia almost consistently loses to Go, Crystal, Nim, Rust, Kotlin, and Python (PyPy, NumPy) here:
https://github.com/kostya/benchmarks

Is this because of bad typing, or because they didn't use Julia in an idiomatic manner?
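I can't speak to that particular repo, but the usual non-idiomatic pitfall in Julia benchmarks is type-unstable code and untyped globals. A sketch of the standard advice (the names are mine, not from the linked benchmarks; assumes BenchmarkTools.jl is installed):

```julia
using BenchmarkTools

n = 10_000              # non-const global: the compiler can't assume its type

# Hot loops belong inside functions; working directly on an untyped global
# at top level forces dynamic dispatch on every iteration.
function sum_squares(m)
    s = 0.0
    for i in 1:m
        s += i * i
    end
    return s
end

@btime sum_squares($n)  # interpolate the global so the benchmark sees a concrete value
```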
Is there a per-project way to manage dependencies yet? I find global package installation to be the biggest weakness of all the R projects out there. Anaconda can help, but it’s not widely used for R projects. And Docker... well, don’t get me started.
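For what it's worth, Julia's built-in Pkg does per-project environments (a `Project.toml`/`Manifest.toml` pair that lives next to your code). Roughly, using the functional API:

```julia
using Pkg

Pkg.activate(".")        # use ./Project.toml as this project's environment
Pkg.add("DataFrames")    # recorded in this project's Project.toml and Manifest.toml
Pkg.instantiate()        # on another machine: recreate the exact recorded environment
```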
Maybe I misread this, but the "1.6 blockers" milestone (described as "1.6 now considered feature-complete. This milestone tracks release-blocking issues.") still has 3 issues open, so how can 1.6 be ready?