Just a thought:

When Cover & Hart proved that the error for k-NN classification is no worse than twice the Bayes (optimal) error, "machine learning" as a phrase had not yet been observed in the wild.

http://ieeexplore.ieee.org/document/1053964/

EE, CS, stats -- these are your fundamentals...
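The precise statement, as I recall it from the paper (for the 1-NN rule with M classes, in the large-sample limit):

    R^* \le R \le R^* \left( 2 - \frac{M}{M-1} R^* \right) \le 2 R^*

where R^* is the Bayes risk and R is the asymptotic nearest-neighbor risk.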
For the JS crowd, here's an implementation of KNN in Node I did a few years ago: https://github.com/axiomzen/Alike

And its cousin, a KD-tree: https://github.com/axiomzen/look-alike
K-NN is one of the more concise classifiers to implement. I did a Python implementation a while ago that can fit into a tweet [1]. Since maps and lambdas are available in Ruby, this should be possible in Ruby too. Sorry for the bad presentation -- I am planning to migrate the blog soon.

[1] http://quipu-strands.blogspot.com/2014/08/knn-classifier-in-one-line-of-python.html
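Not the tweet-sized version, but a minimal plain-Python sketch of the same idea (squared Euclidean distance, majority vote among the k closest training points; the function and variable names here are just illustrative):

    from collections import Counter

    def knn_predict(train_X, train_y, x, k=3):
        # sort training points by squared Euclidean distance to x
        dists = sorted(
            ((sum((a - b) ** 2 for a, b in zip(row, x)), label)
             for row, label in zip(train_X, train_y)),
            key=lambda t: t[0],
        )
        # majority vote among the k nearest labels
        votes = Counter(label for _, label in dists[:k])
        return votes.most_common(1)[0][0]

    # tiny usage example
    X = [[1.0, 1.0], [1.2, 0.8], [8.0, 8.0], [8.5, 7.5]]
    y = ["a", "a", "b", "b"]
    print(knn_predict(X, y, [1.1, 0.9], k=3))  # -> "a"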