Hi HN!
About a year ago I bought A. Burkov's "The Hundred-Page Machine Learning Book". I thought it could be a good starting point for learning about transformers and GenAI.

But then I discovered here [1] that the logic is far more complex, and a bit more specific (it also involves a lot of black magic in the form of parameters to tune).

My impression is that a lot of 'plain vanilla' Machine Learning (ML) topics have faded away over the last 2 years... so does it still make sense to study ML today?

And if yes, over the next 2-5 years, what are the things GenAI cannot do, for which ML will continue to be useful?

[1] https://gioorgi.com/2024/gemma-gem/