I think the confluence of new technologies and the re-emergence/rediscovery of older ones is going to be the best combination. Whether it goes that way is not certain, since the best technology doesn't always win out. Here, though, the money should win, since all of these would greatly reduce the time and energy spent in mining and validating:<p>* Vector processing computers, not von Neumann machines [1].<p>* Array languages, either new or in the APL family like J, K, or Q [2,3].<p>* The replacement of floating-point units with unum processors [4].<p>Neural networks are inherently arrays or matrices, and would do better on a purpose-designed vector array machine than on a re-purposed GPU, or even the TPU in the article, sitting inside a standard von Neumann machine.
Maybe a non-von Neumann architecture like the old Lisp Machines, but for arrays rather than lists (and no, this is not a modern GPU: the data has to stay on the processor, not be offloaded to external memory).<p>I started with neural networks in the late 80s and early 1990s, and I was mainly programming in C: matrices and FOR loops. I found J, the array language, many years later, unfortunately.
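To make the contrast concrete, here is a hedged sketch using NumPy as a stand-in for the array-language style (the weights and layer sizes are made up for illustration). The first function is the C-with-FOR-loops approach; the array version expresses the same single-layer computation as one expression, with the iteration implicit in the data, as it would be in J or K:

```python
import numpy as np

# Matrix-vector multiply, C-style: explicit index bookkeeping
def matvec_loops(w, x):
    out = [0.0] * len(w)
    for i in range(len(w)):
        for j in range(len(x)):
            out[i] += w[i][j] * x[j]
    return out

# Hypothetical tiny layer: 2 outputs, 2 inputs
w = np.array([[0.5, -1.0], [2.0, 0.25]])
x = np.array([1.0, 2.0])

print(matvec_loops(w.tolist(), x.tolist()))  # [-1.5, 2.5]
print(w @ x)                                 # same result, no explicit loops
```

The array form is not just shorter: because the whole operation is stated at once, a vector machine (or a vectorizing runtime) can execute it without the per-element control flow the loop version forces.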
Businesses have been making enough money off the advantage of the array-processing language A+, then K, that the per-seat cost of KDB+/Q (database/language) is easily justifiable. Other software like RiakTS is looking to get into the game using Spark/Shark and other pieces of kit, but a K4 query has been benchmarked at 230 times faster than Spark/Shark while using 0.2GB of memory vs. 50GB. The similar technologies just don't fit the problem space as well as a vector language.
I am partial to J as the more mathematically pure array language, in that it is based on arrays. K4 (soon to be K5/K6) is list-based at the lower level and is honed for tick data or time-series data. J is a bit more general-purpose or academic, in my opinion.<p>Unums are theoretically more energy-efficient and compact than floating point, and take away the error-guessing game. They are being tested with several different language implementations to validate their creator's claims and their practicality. The Mathematica notebook in which John Gustafson modeled his work is available as a free download from the book publisher's site.
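A hedged illustration of the "error-guessing game" in plain Python (this is ordinary IEEE 754 arithmetic plus exact rationals, not an actual unum implementation): floats silently round and leave the programmer to guess the accumulated error, whereas a unum would flag the result as inexact (the "ubit") and, in bound form, carry rigorous endpoints around the true value.

```python
from fractions import Fraction

# IEEE 754 doubles round silently; the programmer must guess the error.
print(0.1 + 0.2 == 0.3)   # False
print(0.1 + 0.2)          # 0.30000000000000004

# The exact answer vs. what the stored doubles actually represent:
exact = Fraction(1, 10) + Fraction(2, 10)
stored = Fraction(0.1) + Fraction(0.2)   # exact values of the rounded doubles
print(exact == Fraction(3, 10))          # True
print(stored == exact)                   # False: the rounding error is real
```

A unum environment makes that discrepancy explicit in the number format itself instead of leaving it for the programmer to discover.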
People have already done some exploratory implementations in Python, Julia, and even J. I believe the J one is a 4-bit implementation based on unums 1.0. John Gustafson presented unums 2.0 in February 2016.<p>[1] <a href="http://conceptualorigami.blogspot.co.id/2010/12/vector-processing-languages-future-of.html" rel="nofollow">http://conceptualorigami.blogspot.co.id/2010/12/vector-proce...</a><p>[2] jsoftware.com<p>[3] <a href="http://kxcommunity.com/an-introduction-to-neural-networks-with-kdb.php" rel="nofollow">http://kxcommunity.com/an-introduction-to-neural-networks-wi...</a><p>[4] <a href="https://www.crcpress.com/The-End-of-Error-Unum-Computing/Gustafson/p/book/9781482239867" rel="nofollow">https://www.crcpress.com/The-End-of-Error-Unum-Computing/Gus...</a>