
Scientific computing’s future: Can Haskell, Clojure, or Julia top Fortran?

63 points by x43b about 11 years ago

9 comments

beloch about 11 years ago
A lot of people coming from a CPSC background view Fortran as a dinosaur. However, it's probably better viewed as a crocodile in the sense that, even if it is ancient, it has evolved to fill a specific niche very, very well. Fortran allows complex math (especially linear algebra) to be expressed more compactly than most low-level languages (e.g. C) are capable of, while still offering excellent control over how hardware is utilized. Just as it's relatively easy to dance around a crocodile out of the water, it shouldn't be too difficult for languages like Python to challenge Fortran when ease of use matters but performance is of lesser concern. However, going into the muddy water to wrestle with crocodiles is a different matter entirely! I would not be surprised if people rehash this conversation with an entirely new set of prospective croc-slayers another twenty years from now.
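The compactness claim above is easiest to see side by side. Fortran 90's whole-array syntax lets you write something like `C = A + 2.0*B` with no index bookkeeping, while C requires an explicit loop. A rough Python sketch of the same contrast (loop-style versus expression-style; the data here is illustrative):

```python
# Element-wise "c = a + 2*b", written two ways.
a = [1.0, 2.0, 3.0, 4.0]
b = [10.0, 20.0, 30.0, 40.0]

# C-style: explicit index bookkeeping.
c_loop = [0.0] * len(a)
for i in range(len(a)):
    c_loop[i] = a[i] + 2.0 * b[i]

# Fortran-90-style whole-array expression, approximated with a comprehension
# (NumPy would make it literally "a + 2.0 * b").
c_expr = [x + 2.0 * y for x, y in zip(a, b)]

print(c_loop == c_expr)  # True
```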
kunstmord about 11 years ago
I've dealt with a lot of legacy Fortran (and Pascal) code, and while I agree with the article that Fortran has to go, Haskell and Clojure seem VERY weird and pointless choices in this area of computing (see the comments on the Ars website; a lot of valid points there). But the biggest problems I've encountered in the Fortran code I dealt with were not exactly Fortran-related: 1) terrible variable naming (aaa = eee / ccc + 1. and so on); 2) gotos, huge chunks of code with no structure (or, even worse, structured with gotos); 3) disregard for numeric accuracy and overflow/underflow.

And this, imo, has more to do with the way CS is taught to scientists. I had a two-year course in C/C++, and we spent those two years writing all kinds of trees, lists and stuff like that. Needless to say, that is good and all, but it didn't exactly help us with writing scientific code later on; a lot of people wrote terrible code to get their AVL trees working, for example, just to get a passing grade. No one taught coding style, working with CVS, computer arithmetic and such. The same goes for the MATLAB course I took.

In my opinion, it would've been a lot wiser to teach people scientific computing using Python. It has tons of scientific libraries (a lot of people I know who are involved in scientific computations neglect to re-use code or use publicly available libraries; teaching people how to use third-party packages/libraries is important), forces programmers to indent (the amount of unindented C code I've dealt with makes me shudder), and makes them realize what makes a program fast or slow. Besides, using Numba/Cython/Theano/multiprocessing, it is possible to give a more or less painless introduction to the world of parallel/optimized computing. And only then start teaching C/C++/OpenMP/MPI/Fortran.

Now, I'm judging from my personal experience and from what I've seen at my university (which is the second-biggest research university in the country): there's a huge difference between how CS is taught to CS students and to science students (physics, mechanics). The knowledge that science students receive is subpar and, unfortunately, enough to start writing computational code.
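The "disregard for numeric accuracy" point deserves a concrete illustration. A classic trap is naively summing values of wildly different magnitude, where the small terms vanish entirely; Python's `math.fsum` does compensated summation and gets it right (the data below is a contrived example, not from the comment):

```python
import math

# Exact sum of this list is 1000.0 (each triple contributes 1.0).
values = [1e16, 1.0, -1e16] * 1000

naive = 0.0
for v in values:
    naive += v  # 1e16 + 1.0 rounds back to 1e16 in double precision

accurate = math.fsum(values)  # correctly rounded compensated summation

print(naive, accurate)  # the naive loop loses every 1.0 contribution
```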
评论 #7726820 未加载
jnbiche about 11 years ago
"Scientific computing" is such a broad term that it's not terribly useful.

For numeric computing, Fortran, C, and C++ will likely remain at the top for years to come.

For statistical and exploratory data analysis, R has long been king (and closed-source tools before R), but Python is rapidly coming out on top here. Clojure could challenge here, but it's far from having the popularity of R or even Python right now.

For machine learning, Matlab, along with its open-source analog Octave, has long been de rigueur, but Python is rapidly gaining ground here, too. I think this is where Julia is hoping to gain ground, at least initially.

So it's a bit odd to lump different areas of scientific computing together, but even odder to neglect the one language that has a chance of topping more than one of these areas. And I say that as someone who is moving from Python to Go and Rust for a lot of my software (but still staying with Python for data exploration).

Nonetheless, not a bad introduction to the languages in question.
gammarator about 11 years ago
As the Ars comments make clear, with the exception of Julia none of these languages has any chance of wide adoption under the broad umbrella of "scientific computing."

For a defense of the numerical/scientific computing tradition of which FORTRAN is the ne plus ultra, see this article: http://www.evanmiller.org/mathematical-hacker.html

It's telling of the Ars author's lispy blinders that he gives recursive examples for computing Fibonacci numbers. As the linked article makes clear, this is ridiculous because there's a closed-form solution:

    long int fib(unsigned long int n) {
        return lround((pow(0.5 + 0.5 * sqrt(5.0), n)
                     - pow(0.5 - 0.5 * sqrt(5.0), n)) / sqrt(5.0));
    }

"No recursion (or looping) is required because an analytic solution has been available since the 17th century."
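One caveat worth hedging on: the quoted closed form (Binet's formula) is exact mathematically, but in double-precision floating point it drifts away from the true integers once n gets large enough, whereas a simple iterative version with arbitrary-precision integers stays exact. A quick check, translating the quoted C snippet to Python:

```python
import math

def fib_binet(n):
    """Closed-form Fibonacci, as in the quoted C snippet (double precision)."""
    s5 = math.sqrt(5.0)
    return round(((0.5 + 0.5 * s5) ** n - (0.5 - 0.5 * s5) ** n) / s5)

def fib_iter(n):
    """Exact iterative version using Python's arbitrary-precision ints."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib_binet(10), fib_iter(10))  # both 55: exact for small n
# Double-precision rounding eventually breaks the closed form.
first_bad = next(n for n in range(200) if fib_binet(n) != fib_iter(n))
print("first mismatch at n =", first_bad)
```

So "no looping required" holds only while the result fits comfortably within double-precision accuracy; past that point the analytic route needs higher-precision arithmetic.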
haddr about 11 years ago
I was missing R and Python in the article. And who says we need a king? Maybe the rich ecosystem of many coexisting languages is better?
jey about 11 years ago
Julia is definitely going to be the winner. I've been using it for a few weeks and it is just so *natural* for scientific/technical computing and effectively covers a wide range of use cases. The type system and syntax allow code to be expressed in terms of the domain's natural objects and notation, without having to do awkward translations between the math and the code. It has the mathematical features of MATLAB without compromising on speed (Python) or expressivity (C and FORTRAN).
Xcelerate about 11 years ago
I do molecular dynamics simulations using LAMMPS on HPC systems. LAMMPS is written in C++. I'm not normally a fan of object-oriented languages, but this seems to work well for a system where you have an abstract base class (like an atomic pairwise potential) that allows users to easily derive their own potential class from it.

I wouldn't say LAMMPS is super-optimized for a particular application compared to some other MD codes, but it is very good for a wide variety of situations, kind of like C++. Just guessing, I'd say LAMMPS is easily within a factor of 2-3x of most hand-tuned assembly codes, but its generality really outweighs the performance penalty.

Personally, in terms of programming languages, Julia is really growing on me. I've been using it for performance-intensive, single-threaded programs and it works great. I'm actually considering experimenting with it for some of my web application projects (currently using Node.js for those) just because of how much I like the language design.
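The design pattern described here (an abstract pairwise-potential base class that users subclass) can be sketched in a few lines. All class and method names below are illustrative, not LAMMPS's actual C++ API, which is far richer:

```python
import math
from abc import ABC, abstractmethod

class PairPotential(ABC):
    """Abstract pairwise potential; subclasses supply the energy law."""

    @abstractmethod
    def energy(self, r):
        """Potential energy at separation r."""

    def total_energy(self, distances):
        # Shared machinery works for any derived potential.
        return sum(self.energy(r) for r in distances)

class LennardJones(PairPotential):
    """A user-derived potential: the classic 12-6 Lennard-Jones form."""
    def __init__(self, epsilon=1.0, sigma=1.0):
        self.epsilon, self.sigma = epsilon, sigma

    def energy(self, r):
        x = (self.sigma / r) ** 6
        return 4.0 * self.epsilon * (x * x - x)

lj = LennardJones()
print(lj.energy(2 ** (1 / 6)))  # well minimum at r = 2^(1/6) sigma: about -epsilon
```

The point of the pattern is that the simulation engine only ever talks to `PairPotential`, so adding a new interaction model never touches the core code.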
bayesianhorse about 11 years ago
For quite a few cases, the "symbolic" route might be the future. In Python, for example, there is sympy, which is mostly a computer-aided algebra toolkit, but it can translate formulas to Fortran, Theano and JavaScript.

Theano, on the other hand, is a symbolic toolkit designed to make linear algebra super-fast. It takes a symbolic representation of the computations and then compiles it into C code or CUDA for GPUs.

Theano has been used extensively in deep learning, but it has other applications as well.

PyMC is a library implementing Bayesian inference through Monte Carlo methods. Version 3 implements samplers based on Theano. The advantage here is that Theano can automatically deduce derivatives, which allows for more sophisticated algorithms and better performance.
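The core idea behind "automatically deduce derivatives" is mechanical differentiation of an expression tree. A minimal pure-Python sketch of that idea (the class names are illustrative, not sympy's or Theano's API; both do this at vastly greater scale):

```python
# Represent an expression as a tree, then derive its derivative by rule.
class Expr:
    def __add__(self, other): return Add(self, other)
    def __mul__(self, other): return Mul(self, other)

class Const(Expr):
    def __init__(self, v): self.v = v
    def eval(self, x): return self.v
    def diff(self): return Const(0)       # d/dx c = 0

class Var(Expr):
    def eval(self, x): return x
    def diff(self): return Const(1)       # d/dx x = 1

class Add(Expr):
    def __init__(self, a, b): self.a, self.b = a, b
    def eval(self, x): return self.a.eval(x) + self.b.eval(x)
    def diff(self): return Add(self.a.diff(), self.b.diff())  # sum rule

class Mul(Expr):
    def __init__(self, a, b): self.a, self.b = a, b
    def eval(self, x): return self.a.eval(x) * self.b.eval(x)
    def diff(self):                        # product rule
        return Add(Mul(self.a.diff(), self.b), Mul(self.a, self.b.diff()))

x = Var()
f = x * x + Const(3) * x     # f(x) = x^2 + 3x
df = f.diff()                # f'(x) = 2x + 3, derived mechanically
print(f.eval(2.0), df.eval(2.0))  # 10.0 7.0
```

Because the derivative is itself a tree, a toolkit like Theano can simplify it and compile it to fast native code, which is exactly what makes the gradient-based samplers in PyMC3 practical.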
tom_jones about 11 years ago
I personally prefer Scala.

It supports functional programming (combining it nicely with object-oriented programming), immutability, tail recursion, lazy evaluation (you have to specify what will be evaluated lazily), collections with parallel-processing support, actor-based processing, pattern matching and, of course, a REPL.

But none of that is mandatory; for example, when needed, you can also use mutable variables and collections.

It runs on the JVM, and you can mix Java code and libraries with Scala code and libraries. And the ecosystem of Java libraries is huge.