Seems cool, but one of the things that most annoys me about studying machine learning is that I can dive as deep as possible into the theory, yet I can't see how it connects to practice, i.e. how it helps me choose the correct number of neurons in a layer, how many layers, the activation functions, whether I should use a neural network or another technique, and so on...<p>If someone has something explaining that, I'll be grateful
Very neat! Reminds me of Tom Yeh's "AI By Hand" exercises [0].<p>[0] <a href="https://www.byhand.ai/" rel="nofollow">https://www.byhand.ai/</a>
Looks neat! My only criticism would be that the solutions are given right after the questions, so I couldn't help reading the answer to a question before thinking it through myself.
This is really neat! I work in machine learning but still feel imposter syndrome about my math foundations (specifically linear algebra and matrix/tensor operations). Does anyone have any more good resources for problem sets with an emphasis on foundational deep learning skills? I find I learn best if I do a bit of hands-on work every day (and if I can learn things from multiple teachers’ perspectives)
Complete with solutions, beautiful, thank you for sharing!<p>I'd be interested in more of these pen-and-paper exercises, if there is such a term, for other topics.
Discussed at the time:<p><i>Pen and paper exercises in machine learning (2021)</i> - <a href="https://news.ycombinator.com/item?id=31913057">https://news.ycombinator.com/item?id=31913057</a> - June 2022 (55 comments)
Funny how mathematicians always try to sneak their linear algebra and matrix theory into ML. If you didn't know any better, you'd think academicians had invented LLMs and were the experts to be consulted.<p>If anything, academicians and theoreticians held ML back and forced generations of grad students to do symbolic proofs, like in this example, just because computational techniques were too lowbrow for them.