I will go against the trend here and give a big thanks to the whole Julia team for all their wonderful work.<p>I've been a heavy Julia user for 4+ years and adore this ecosystem. I use Julia for parallel computing, modeling and solving large-scale optimization problems, stochastic simulations, etc. During the last year or so, creating plots and dashboards has become much easier too.<p>Julia makes it surprisingly easy to go from "idea" to "large-scale simulation". I've used it in production and just for prototyping/research. I can engage with Julia as deeply as I would with C code or as "lightly" as I engage with Matlab/R.<p>I'm excited to see what comes next.
My favorite change (even though it's not listed in the changelog) is that just-in-time compiled code now has frame pointers[1], making Julia code much more debuggable. Profilers, debuggers, etc. can now all work out of the box.<p>Extra excited that the project I happen to work on (the Parca open source project[2]) influenced this change [3][4]. Shout out to Valentin Churavy for driving this on the Julia front!<p>[1] <a href="https://github.com/JuliaLang/julia/commit/06d4cf072db24ca6df9187af2eaf290f6d0ac8c2">https://github.com/JuliaLang/julia/commit/06d4cf072db24ca6df...</a><p>[2] <a href="https://parca.dev/" rel="nofollow">https://parca.dev/</a><p>[3] <a href="https://github.com/parca-dev/parca-demo/pull/37">https://github.com/parca-dev/parca-demo/pull/37</a><p>[4] <a href="https://github.com/JuliaLang/julia/issues/40655">https://github.com/JuliaLang/julia/issues/40655</a>
Matlab users should switch to Julia. It's a real programming language, and better in many ways.<p>I provide the option of Julia in my tutorials. Students are lazy and don't want to explore something new. Most of them stick with Matlab.<p>What prevents Matlab users from switching? The syntax is similar.
This makes a big difference in usability: before, loading a big project was almost in the "coffee time" category; now it's more "wait a few seconds". It helps a lot to make the tool feel more responsive.
Congratulations on the release! Package extensions and package images are a huge boost to the usability of Julia.<p>To all Julia users: Go forth, and make use of PrecompileTools.jl in your packages! The latency only drops if you actually make use of precompilation, and it's pretty easy to use. I can't wait for more of the ecosystem to start making use of it.
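For anyone who hasn't tried it yet, here is a minimal sketch of what PrecompileTools.jl usage looks like inside a package. The module name `MyPackage` and the `sum_and_sort` function are made up for illustration; the `@setup_workload`/`@compile_workload` macros are the library's actual API:

```julia
module MyPackage  # hypothetical package for illustration

using PrecompileTools

sum_and_sort(xs) = (sum(xs), sort(xs))

@setup_workload begin
    # Setup code runs at precompile time but is not itself cached.
    data = rand(100)
    @compile_workload begin
        # Calls in here are compiled and cached into the package image,
        # so the first call after `using MyPackage` is fast.
        sum_and_sort(data)
    end
end

end # module
```

The key point is that you exercise your package's typical entry points inside `@compile_workload`, so the compiled code lands in the package image at precompile time instead of at first use.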
Nice improvements<p>----------------------------<p>JULIA 1.8.5<p>julia> @time using Plots<p>11.341913 seconds (14.83 M allocations: 948.442 MiB, 6.88% gc time, 12.73% compilation time: 62% of which was recompilation)<p>julia> @time plot(sin.(0:0.01:π))<p>3.342452 seconds (8.93 M allocations: 472.925 MiB, 4.44% gc time, 99.78% compilation time: 78% of which was recompilation)<p>-----------------------------------<p>JULIA 1.9.0<p>julia> @time using Plots;<p>2.907620 seconds (3.43 M allocations: 195.045 MiB, 7.52% gc time, 5.61% compilation time: 93% of which was recompilation)<p>julia> @time plot(sin.(0:0.01:π))<p>0.395429 seconds (907.48 k allocations: 59.422 MiB, 98.54% compilation time: 74% of which was recompilation)
I really like "Julia, the programming language" and had a great experience using it on the few occasions where it made sense. But whenever a colleague asks me if I can recommend it, I have to say "no".
The crux is that its "just-ahead-of-time" compiler disqualifies it for a lot of use cases: I would actually prefer it over Python for small scripts, but the compilation overhead is too long.
On the other hand, I would use it over C++ for some applications if it could easily produce portable binaries.<p>With the steady progress in improving precompilation, though, I'm optimistic I'll use it more often in the future.
I feel this release might finally make Julia worth considering again. Previously, the loading time for something as simple as opening a CSV and plotting it was a deal breaker.
I used to have a reasonably simple notebook for a paper which took about 35 minutes to compile on an old university-provided CPU, even when opening it for the second time.<p>Therefore, I'm really excited about the improvements in code caching! Thanks to Tim Holy, Jameson Nash, Valentin Churavy, and others for your work.
We had a big Julia push this month after 2 years of just messing around. It's better than APL to read (so is Sanskrit), but we hit a SCREECHING halt when we realized that handling our streaming data with Pluto notebooks on the web wasn't going to happen. Pluto notebooks are wonderful and can handle streaming data, just not on a hosted web page with multiple people using it. We tried Stipple.jl (part of Genie.jl) and that kept freezing (we suspect because of pacing issues, so 1 sec plus should be fine). The point of all of this is that we have found Julia to be GREAT for building the back-end stuff, but not for manipulating streaming data on the web. We can easily fix this with ZMQ and send the data to Python, but Julia was supposed to be a one-language solution. We're trying to dodge the web side of things with Humane, so maybe we'll be happier bunnies in 2024.
> Users can also create custom local "Startup" packages that load dependencies and precompile workloads tailored to their daily work.<p>That's big! Now I can add packages to my startup.jl without having to worry that every single REPL startup will be slowed down by them. This also eases the pain of things being moved away from the standard library, since we can just add them back to the base environment and load them at startup, making it basically the same thing.
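A sketch of what that could look like, assuming the setup described in the release notes (the package name `Startup` and the dependencies `Revise`/`OhMyREPL` are just examples of things one might load daily):

```julia
# ~/.julia/config/startup.jl
# Load a local "Startup" package instead of `using` each dependency directly;
# because it is a real package, it gets its own precompiled package image
# rather than slowing down every REPL start.
using Startup

# src/Startup.jl of the local package (added to the base environment):
module Startup
using Revise, OhMyREPL          # whatever you use daily (examples)
using PrecompileTools
@compile_workload begin
    # exercise your typical daily workload here so it precompiles
end
end
```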
Two very nice additions to the REPL that weren't mentioned in the highlights:<p>* `Alt-e` now opens the current input in an editor. The content (if modified) will be executed upon exiting the editor.<p>* A "numbered prompt" mode, which prints numbers for each input and output and stores evaluated results in `Out`, can be activated with `REPL.numbered_prompt!()` (basically `In[3]`/`Out[3]` markers like in Mathematica/Jupyter).
I'm quite interested in the interactive thread pool (although I assume it works based on the convention of everyone playing nice). Julia has a powerful parallelism model, but it couldn't be applied to responsive GUI and web frameworks that require low latency. So it's nice if you can indeed have, for example, the tasks handling HTTP requests focus on responding as fast as possible, while the background worker threads dealing with larger computations use all the speed of the Julia language without being constantly interrupted.
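My understanding of how this looks in practice, as a sketch: you start Julia with e.g. `julia --threads 3,1` (3 default + 1 interactive thread) and pick a pool per task with `Threads.@spawn`:

```julia
using Base.Threads

# Heavy number crunching goes to the default pool:
h = Threads.@spawn :default sum(rand(10^7))

# Latency-sensitive work goes to the interactive pool, if one exists
# (it only does when Julia was started with interactive threads, e.g.
# `julia --threads 3,1`):
if Threads.nthreads(:interactive) > 0
    t = Threads.@spawn :interactive (1 + 1)   # e.g. assemble an HTTP response
    @show fetch(t)
end

fetch(h)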
> Pkg.add can now be told to prefer to add already installed versions of packages (those that have already been downloaded onto your machine)<p>> set the env var `JULIA_PKG_PRESERVE_TIERED_INSTALLED` to true.<p>How is this different from setting `Pkg.offline(true)` and then doing the `add`? I don't know the intricacies of how it works, but that's what I've been doing when I just need to try something out in a temp environment.
> We came to the conclusion that a global fastmath option is impossible to use correctly in Julia.<p>I'd assumed that a global fastmath option was a bad idea <i>in general</i>, and that that was the reason for making this a no-op. Is there a reason it's particularly bad in Julia, some assumption the standard library makes or something?
"Together with PrecompileTools.jl, Julia 1.9 delivers many of the benefits of PackageCompiler without the need for user-customization."<p>Does that mean I still have to invoke special workflows and commands to get the compilation benefits, or does it work out of the box for normal julia invocations?
I like to explore alternatives to Python, and Julia has been one of the tools I am waiting to become mature enough to actually invest some time in. But every time I start reading threads, I see comments from actual users reporting half-hour and "coffee time" project compilations. Then the dreaded ecosystem problem. Then I think to myself: well, it's not the time yet.<p>Also, I wish Julia were as popular in Europe as it is overseas.
I didn't even know some of these things were being worked on until recently. I totally understand why devs don't treat development like a Twitter feed, posting every thought that pops into their head instead of working. However, it would be really interesting to follow some of these developments without having to deep-lurk all the PRs.<p>Sorry, pretty shallow complaint. Great work!
The remaining issues I had are: I heard there are still bugs in the standard library regarding changing index offsets (from 1 to 0, for example), and IIRC the language build also depends on a fork of LLVM (<a href="https://github.com/JuliaLang/llvm-project">https://github.com/JuliaLang/llvm-project</a>).<p>Are both of those still true? I'm a zero-index guy, but having index offsets is fine as long as the standard library is high quality. As for LLVM, I'd prefer it not need a fork, but that's less important.
> To analyze your heap snapshot, open a Chromium browser and follow these steps: right click -> inspect -> memory -> load. Upload your .heapsnapshot file, and a new tab will appear on the left side to display your snapshot's details.<p>Can the same be done with Firefox's `about:memory`'s `Load...` button, or is it Chromium specific?
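For reference, producing the snapshot in the first place is done from the Profile stdlib (Julia 1.9+):

```julia
using Profile

# Write a heap snapshot of the current process to a file; the resulting
# .heapsnapshot can then be loaded in Chromium's Memory tab as described above.
Profile.take_heap_snapshot("julia.heapsnapshot")
```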