I find JAX really exciting. The idea of NumPy with autograd is exactly what Pythonistas want. The elephant in the room, though, is "why not PyTorch?"

Everyone knows JAX is what Google realised TensorFlow should have been once it saw how much of a joy PyTorch was to use. I actually think JAX does offer some real advantages, not least true NumPy interoperability. However, not mentioning *torch a single time in the blog post seems a little disingenuous for a Google-owned deep learning enterprise.
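For anyone who hasn't tried it, the "NumPy with autograd" appeal is roughly this (a minimal sketch; the function and array values are made up for illustration):

    import jax
    import jax.numpy as jnp

    def loss(w, x, y):
        # Ordinary numpy-style code: a squared-error loss
        pred = jnp.dot(x, w)
        return jnp.mean((pred - y) ** 2)

    # jax.grad transforms loss into a function returning d(loss)/dw
    # (it differentiates with respect to the first argument by default)
    grad_loss = jax.grad(loss)

    w = jnp.array([1.0, 2.0])
    x = jnp.array([[0.5, 1.5], [2.0, 3.0]])
    y = jnp.array([1.0, 2.0])
    print(grad_loss(w, x, y))  # gradient, same shape as w

You write what looks like plain NumPy and get gradients of arbitrary Python functions for free, no Module classes or graph-building required.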