I like Julia because in many ways, it's a better C than C:<p>* You can inspect a function's generated LLVM IR or (x86, ARM) assembly code from the REPL (code_llvm or code_native).<p>* You can interface with C libraries without writing any C. That let me wrap a 5 kLoC C library with 100 lines of Julia:<p><a href="https://github.com/WizardMac/DataRead.jl" rel="nofollow">https://github.com/WizardMac/DataRead.jl</a><p>* You can use the dynamic features of the language to write something quickly, then add type annotations to make it fast later.<p>* Certain tuples are compiled to SIMD vectors. In contrast, the only way to get SIMD code out of portable C is to pray that you have one of those "sufficiently smart compilers" (or drop down to nonportable intrinsics).<p>* Like C, there's no OO junk and the associated handwringing about where methods belong. There are structures and there are functions, and that's it. But multiple dispatch in Julia gives you the main benefits of OO without all the ontological crap that comes with it.<p>For me, Julia feels simultaneously higher-level and lower-level than C. The deep LLVM integration is fantastic because I can get an idea of how functions will be compiled without having to learn the behemoth that is the modern x86 ISA. (LLVM IR is relatively simple, and its SSA form makes code easy to follow.)<p>Anyway, I only started with Julia recently, but I'm a fan. I should also mention that the members of the developer community are very, very smart. (Most are associated with MIT.) BTW, I'm starting a Julia meetup in Chicago for folks in the Midwest who want to learn more: <a href="http://www.meetup.com/JuliaChicago/" rel="nofollow">http://www.meetup.com/JuliaChicago/</a>
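A quick sketch of a few of those points. The ccall and code-inspection calls are real Julia; the Point example is just illustrative, not from DataRead.jl:

```julia
# Multiple dispatch: plain structs plus functions, no classes.
struct Point
    x::Float64
    y::Float64
end

# The method is chosen by the runtime types of *all* arguments.
dist(p::Point, q::Point) = hypot(p.x - q.x, p.y - q.y)

# C interop without writing any C: call libc's strlen directly.
n = ccall(:strlen, Csize_t, (Cstring,), "hello")   # 5

# Inspect generated code from the REPL:
#   @code_llvm   dist(Point(0.0, 0.0), Point(3.0, 4.0))
#   @code_native dist(Point(0.0, 0.0), Point(3.0, 4.0))
```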
I have installed IJulia -- which involved compiling Julia from source.<p>What little I've tried with Julia so far is impressive.<p>What's the story, anyway? Did a bunch of MIT people decide they needed something faster than R and NumPy?<p>Julia co-creator Alan Edelman is a fellow alumnus of Hampshire College Summer Studies in Mathematics (HCSSiM). Scores points in my book. <a href="https://en.wikipedia.org/wiki/Alan_Edelman" rel="nofollow">https://en.wikipedia.org/wiki/Alan_Edelman</a><p>He wrote a terrific paper back in '95:
<a href="http://arxiv.org/abs/math/9501224" rel="nofollow">http://arxiv.org/abs/math/9501224</a>
"How many zeros of a random polynomial are real?"
As a beginner in scientific/data computing, is it worth building a foundation in Julia, or should I go with some other tool instead? I'm comfortable in Python, but I don't know if it's best to teach myself NumPy and friends when I have the opportunity to immerse myself in this shiny new technology.
Very cool. I think the point about transcribing equations directly from papers glosses over some of the hard issues in building numeric libraries: floating point precision and numerical stability.<p>It's not that uncommon to see code that would have worked perfectly if double-precision floating point numbers were actually real numbers, but which blows up badly once rounding errors creep in.<p>Unfortunately there's not really any way to abstract those issues away. I'd hope that the authors of widely used numeric libraries have carefully thought through them.
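A tiny Julia sketch of the kind of blow-up being described, using the textbook quadratic-formula example (illustrative, not from any particular library):

```julia
# Roots of x^2 - 1e8*x + 1: over the reals, both formulas below are exact.
# In Float64, the naive form subtracts two nearly equal numbers
# (catastrophic cancellation) and loses most of its significant digits.
a, b, c = 1.0, -1.0e8, 1.0          # true small root is 1.0e-8
s = sqrt(b^2 - 4a*c)

naive  = (-b - s) / (2a)            # 1e8 - s is almost entirely rounding error
stable = (2c) / (-b + s)            # algebraically identical, but no cancellation

# naive comes out badly wrong here; stable is accurate to machine precision.
```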