The new edition has been split into two parts. The PDF draft (921 pages) and Python code [1] of the first part are now available; the table of contents of the second part is here [2].<p>From the preface:<p>"By Spring 2020, my draft of the second edition had swollen to about 1600 pages, and I was still not
done. At this point, 3 major events happened. First, the COVID-19 pandemic struck, so I decided
to “pivot” so I could spend most of my time on COVID-19 modeling. Second, MIT Press told me
they could not publish a 1600 page book, and that I would need to split it into two volumes. Third,
I decided to recruit several colleagues to help me finish the last ∼ 15% of “missing content”. (See
acknowledgements below.)<p>The result is two new books, “Probabilistic Machine Learning: An Introduction”, which you are
currently reading, and “Probabilistic Machine Learning: Advanced Topics”, which is the sequel to
this book [Mur22].<p>Together these two books attempt to present a fairly broad coverage of the field
of ML c. 2020, using the same unifying lens of probabilistic modeling and Bayesian decision theory
that I used in the first book.
Most of the content from the first book has been reused, but it is now split fairly evenly between
the two new books. In addition, each book has lots of new material, covering some topics from deep
learning, but also advances in other parts of the field, such as generative models, variational inference
and reinforcement learning. To make the book more self-contained and useful for students, I have
also added some more background content, on topics such as optimization and linear algebra, that
was omitted from the first book due to lack of space.<p>Another major change is that nearly all of the software now uses Python instead of Matlab."<p>[1] <a href="https://github.com/probml/pyprobml" rel="nofollow">https://github.com/probml/pyprobml</a><p>[2] <a href="https://probml.github.io/pml-book/book2.html" rel="nofollow">https://probml.github.io/pml-book/book2.html</a>
This is probably my favorite introductory machine learning book. The fact that he places almost everything in the language of graphical models gives you such a good common ground to build on.<p>This really sets you up to realize that there is (and should be) a lot more to doing a good job in machine learning than simply minimizing an objective function. The answers you get depend on the model you create, as do the questions you can hope to answer.<p>I don't see a clear list of differences in this new edition. Does anyone know what's new?
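To make that point concrete, here is a minimal sketch (not from the book; the data and prior variances are made up for illustration) of how the probabilistic lens changes the answer: the MAP estimate of a regression weight under a Gaussian prior is exactly ridge regression, and tightening the prior shrinks the estimate, so the same data yields different answers under different models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + noise, with only a handful of points.
X = rng.normal(size=(10, 1))
y = 2.0 * X[:, 0] + 0.5 * rng.normal(size=10)

def map_weight(tau2, sigma2=0.25):
    """MAP estimate of w for y ~ N(w*x, sigma2) with prior w ~ N(0, tau2).

    This is exactly ridge regression with lambda = sigma2 / tau2,
    so the 'regularizer' is really a modeling assumption about w.
    """
    lam = sigma2 / tau2
    return np.linalg.solve(X.T @ X + lam * np.eye(1), X.T @ y)[0]

w_tight = map_weight(tau2=0.01)   # strong prior toward 0: heavy shrinkage
w_loose = map_weight(tau2=100.0)  # weak prior: close to maximum likelihood
```

Same data, same loss-minimization machinery, but the prior you chose determines the weight you get back; that modeling choice is invisible if you only think in terms of objective functions.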
For anybody truly serious about this field, I recommend the book below. It has some poor reviews on Amazon, which I was shocked to see, but it is my favourite book and taught me the core of probability theory and statistics in a way most books don’t. Your understanding of machine learning will be better than 90% of those out there if you can get through the principles in this book.<p>I topped statistics at the most prestigious university in my country at both the undergrad and postgrad level, and had no problem discussing advanced concepts with senior PhDs in quantitative fields, and I thank this book the most for beginning my journey. But, and this is important, make sure to do all the exercises!<p><a href="https://www.amazon.com/John-Freunds-Mathematical-Statistics-Applications/dp/032180709X" rel="nofollow">https://www.amazon.com/John-Freunds-Mathematical-Statistics-...</a>
Looking at the table of contents (for someone who is not familiar with the term 'Probabilistic Machine Learning'), is this just covering typical ML methods through the lens of probability?
Why should ML books be so big? In many cases they are several sub-books pasted together as one, or extensive bibliography reviews that just list progress without any pedagogy. I would suggest splitting them into 200-250-page parts that can serve independently.
While book 1 looks great, book 2 still appears to be very rough: <a href="https://probml.github.io/pml-book/book2.html" rel="nofollow">https://probml.github.io/pml-book/book2.html</a>
Also recommended: Probabilistic Programming and Bayesian Methods for Hackers, as another resource to explore this space:<p><a href="https://camdavidsonpilon.github.io/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers/" rel="nofollow">https://camdavidsonpilon.github.io/Probabilistic-Programming...</a>
The original 2012 book was awesome!<p>Would love to get my hands on the draft for "Probabilistic Machine Learning: Advanced Topics".<p><a href="https://probml.github.io/pml-book/book2.html" rel="nofollow">https://probml.github.io/pml-book/book2.html</a>
In my opinion (having read both books and TA’d courses using both), Murphy is a significantly better book than Bishop’s Machine Learning. I’m very excited about a sequel!