This article's author seems to be very dismissive of Haskell. Which is a little odd because, at least as described in the article (I haven't had time to look at the paper yet), the approach taken by this compiler is actually very similar to what is already done in Haskell.

In particular, Haskell already supports initializing data structures mutably and then "freezing" them. I know this technique is used in the vector library, for example.

Additionally, "pockets of imperative mutability" perfectly describes Haskell's "State Threads" (ST). Haskell already lets you have arbitrary state constrained to a particular scope and controlled by the type system. Using things like ST with immutability by default is just tracking state in the type system, and, as long as you don't worry about how it's implemented under the hood, ST isn't terribly difficult to use.

I'm sure the auto-threading compiler is very novel and interesting research, and I'm sure it's very useful. However, I think the author shouldn't be this dismissive of Haskell, especially because Haskell already does essentially everything described in the article.
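For the curious, here is roughly what the freeze pattern looks like with the vector library (a minimal sketch; buildSquares is just a made-up example function):

    import qualified Data.Vector as V
    import qualified Data.Vector.Mutable as MV
    import Control.Monad.ST (runST)

    -- Build the vector imperatively inside ST, then freeze it into an
    -- ordinary immutable Vector. Callers only ever see the pure result.
    buildSquares :: Int -> V.Vector Int
    buildSquares n = runST $ do
      mv <- MV.new n
      mapM_ (\i -> MV.write mv i (i * i)) [0 .. n - 1]
      V.freeze mv

The mutation is confined to the ST computation by the type system (the 's' parameter can't escape runST), which is exactly the "pocket of imperative mutability" the article is talking about.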
As far as the practical value of an auto-parallelizing compiler goes, I'd like to see it before I believe it. I've got no doubt that the compiler is pulling it off. But with how young the technology is, I'd be downright amazed if they've got something that can reliably outdo what a knowledgeable human can manage with a reasonable amount of effort. Not that I wouldn't like to see things get to that point.

For now, though, what's really interesting to me is the idea of language-level support for managing mutability. I agree with the author that that's something both functional and imperative languages have generally failed to provide much help with, and I think there really is a need for it. It's always possible to pass in a mutable subtype or a non-pure function as an argument to a procedure at run-time. So without some way to say, "This parameter requires pure arguments," it may not even be possible for a programmer to verify that a module can, e.g., cope with concurrency in code that uses any degree of inversion of control.
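To make the "this parameter requires pure arguments" idea concrete, here is roughly what it looks like when the type system carries that information (a Haskell sketch; the function names are invented):

    -- A caller can't smuggle side effects into the first version:
    -- the argument's type (a -> b) promises a pure function.
    mapPure :: (a -> b) -> [a] -> [b]
    mapPure = map

    -- The second version explicitly opts in to effectful arguments.
    mapEffectful :: (a -> IO b) -> [a] -> IO [b]
    mapEffectful = mapM

That's the kind of guarantee being asked for: without a way to state the first signature, a module can't promise its callers anything about the arguments it will be handed at run-time.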
It's what people have been saying for a while now: mutable state is problematic.

Very cool, of course.

Interestingly, Rust is moving to a similar model, where the (im)mutability of a data structure is "inherited" from the (im)mutability of the thing that references it. So it's trivial to determine something like "for this section, this object is immutable and not referenced from anywhere else".
"The ability to have 'pockets of imperative mutability'... connected by a 'functional tissue,' is not only clarifying, but works quite well in practice for building large and complex concurrent systems."

That's always been a winning combination, and if you look for it you'll see that many examples of well-written code hold to the pattern: large swaths of code that are referentially transparent, with smaller chunks of imperative code piping them together.

This can be done in almost any language, but of course support from the toolchain is nice to have.
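A toy illustration of that shape (Haskell here, but the pattern is language-agnostic; the file name and functions are invented):

    import Data.Char (toUpper)
    import Data.List (sort)

    -- Large, referentially transparent core: all the real work is pure.
    normalize :: String -> [String]
    normalize = sort . map (map toUpper) . lines

    -- Small imperative chunk that just pipes the pure pieces together.
    main :: IO ()
    main = readFile "input.txt" >>= mapM_ putStrLn . normalize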
I remember something like this in a Sun compiler. I don't remember exactly what was being optimized, but it gave a significant boost to single-threaded programs.