Wait, what - another math textbook recommendation by academicians. ML and LLMs are arts of tinkering, not academic subjects.

Though Steven Johnson is the real deal and writes lots of code, Edelman is a shyster/imposter who used to ride the coattails of G. Strang and now shills for Julia, where he makes most of his money. You don't need textbooks, and you won't understand ML/LLMs just by reading them.

1. If you want to have a little fun with ML/LLMs, fire up Google Colab and run one of the tutorials on the web - Karpathy, Hugging Face, or the PyTorch examples. (A short PyTorch sketch of this kind of tinkering is at the bottom of this comment.)

2. If you don't want to do, but just want to read for fun, Howard & Parr's essay, recommended by someone else here, is much shorter and more succinct: https://explained.ai/matrix-calculus/ (this link renders better).

3. If you insist on academic textbooks, Boyd & Vandenberghe skips calculus and has more applications (engineering). Unfortunately, the code examples are in Julia! (A numpy take on a typical example is also at the bottom of this comment.)
<a href="https://web.stanford.edu/~boyd/vmls/vmls.pdf" rel="nofollow">https://web.stanford.edu/~boyd/vmls/vmls.pdf</a>
<a href="https://web.stanford.edu/~boyd/vmls/" rel="nofollow">https://web.stanford.edu/~boyd/vmls/</a>. link to python version<p>4. If you Want to become a tensor & differential programming ninja, learn Jax, XLA
<a href="https://docs.jax.dev/en/latest/quickstart.html" rel="nofollow">https://docs.jax.dev/en/latest/quickstart.html</a>
<a href="https://colab.research.google.com/github/exoplanet-dev/jaxoplanet/blob/main/docs/tutorials/introduction-to-jax.ipynb" rel="nofollow">https://colab.research.google.com/github/exoplanet-dev/jaxop...</a>