Great job!

Technically, this is a supervised learning NN library that implements the canonical backprop algorithm. The networks all look feed-forward and fully connected, with neurons activated by a sigmoid function (the logistic 1 / (1 + e^(-x))). It appears cross-validation is used as well, but I haven't looked into how or which kind.

You probably want to add momentum or some other mechanism for escaping or avoiding local optima.
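
In case it helps, here's a rough sketch of what a momentum term looks like (plain JS, not the library's actual code; names are made up):

    // Standard logistic activation.
    function sigmoid(x) {
      return 1 / (1 + Math.exp(-x));
    }

    // One gradient-descent step for a single weight, with momentum.
    // `gradient` is dError/dWeight from backprop; `prevDelta` is the
    // update that was applied on the previous iteration.
    function updateWeight(weight, gradient, prevDelta, learningRate, momentum) {
      var delta = -learningRate * gradient + momentum * prevDelta;
      return { weight: weight + delta, delta: delta };
    }

Keeping a fraction of the previous update lets the search roll through small local minima and flat error regions instead of stalling.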

This is awesome. I think more resources like this will help spread the ML field to those who otherwise wouldn't be exposed to it. We should really have a core set of tools/libs like pybrain and opencv in every language.

I'd also like to point out an interesting little set of slides by the same author: http://harthur.github.com/txjs-slides/

harthur's Bayesian classifier module is also awesome: https://github.com/harthur/classifier. Using it to classify videos based on tags was the first time I'd made anything with ML.
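
For anyone curious what that looks like under the hood, here's a toy naive Bayes over tags (a standalone sketch of the idea, not the classifier module's actual API; all names are made up):

    // Count tag frequencies per category, with a running vocabulary.
    function train(examples) {
      var model = { categories: {}, vocab: {} };
      examples.forEach(function (ex) {
        var cat = model.categories[ex.category] ||
                  (model.categories[ex.category] = { docs: 0, tagCounts: {}, total: 0 });
        cat.docs++;
        ex.tags.forEach(function (tag) {
          cat.tagCounts[tag] = (cat.tagCounts[tag] || 0) + 1;
          cat.total++;
          model.vocab[tag] = true;
        });
      });
      return model;
    }

    // Score each category by summed log-probabilities (add-one smoothing)
    // and return the best one.
    function classify(model, tags) {
      var vocabSize = Object.keys(model.vocab).length;
      var best = null, bestScore = -Infinity;
      Object.keys(model.categories).forEach(function (name) {
        var cat = model.categories[name];
        var score = Math.log(cat.docs); // unnormalized prior
        tags.forEach(function (tag) {
          var count = cat.tagCounts[tag] || 0;
          score += Math.log((count + 1) / (cat.total + vocabSize));
        });
        if (score > bestScore) { bestScore = score; best = name; }
      });
      return best;
    }

    var model = train([
      { tags: ["cat", "funny", "pets"], category: "animals" },
      { tags: ["guitar", "cover", "acoustic"], category: "music" }
    ]);
    classify(model, ["funny", "cat"]); // -> "animals"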