This is great, I've waited 20 years for this (computer engineering degree, 1999). For all the naysayers: what has gone wrong with computing, why Moore's law no longer delivers, and so on, is that we've gone from general-purpose computing to proprietary narrow-use computing, thanks to Nvidia and others. VHDL and Verilog are basically assembly language and are not good paradigms for multicore programming.<p>The best languages for taking advantage of chips that aren't compute-limited* are things like Erlang, Elixir, Go, MATLAB, R, Julia, Haskell, Scala, Clojure.. I could go on. Most of those are the assembly languages of functional programming and are also not really usable by humans for multicore programming.<p>I personally vote no confidence in any of this taking off until we have a JavaScript-like language for concurrent programming. Go is the closest thing to that now, although Elixir and Clojure are better suited for maximum scalability because they are functional languages built around immutable data. I would give MATLAB a close second because it makes dealing with embarrassingly parallel problems embarrassingly easy. Most of the top-rated AI articles on HN lately describe problems that are embarrassingly parallel, or embarrassingly easy when you aren't compute-limited. We just aren't used to thinking in those terms.<p>* For now, let's call compute-limited any chip that can't give you 1000 cores per $100