I'm just getting started with machine learning and neural networks, and like many others I'd like to ask for some guidance/resources/learning material. What I feel I'm especially lacking is something broad and general, an overview of existing techniques (though not as introductory as Ng's ML course, I assume). There are a lot of estimators and classifiers, a lot of techniques and tricks for training models, and a lot of details that go into designing a NN architecture. So how, for instance, do I even decide that a Random Forest is not enough for a given task and that I need to build some specific kind of neural net? Or maybe I don't actually need any of these fancy, famous techniques, and there's a well-defined statistical method that does exactly what I want?

What should I read to start grokking this kind of thing? I feel quite ready to go full "DIY math PhD" mode and work through some heavy reading if necessary, but where do I even start?
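To make the kind of decision I mean concrete: a minimal scikit-learn sketch that cross-validates a Random Forest baseline against a small neural net on the same data. The dataset and hyperparameters here are arbitrary placeholders, just to show the comparison, not a recipe:

```python
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

X, y = load_digits(return_X_y=True)

# Baseline: is a Random Forest already good enough for this task?
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf_score = cross_val_score(rf, X, y, cv=5).mean()

# Alternative: a small neural net evaluated the same way
mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
mlp_score = cross_val_score(mlp, X, y, cv=5).mean()

print(f"Random Forest CV accuracy: {rf_score:.3f}")
print(f"MLP CV accuracy:           {mlp_score:.3f}")
```

But this only tells me which of two models I already picked does better; what I'm really after is the background knowledge to choose candidates sensibly in the first place.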