Moving away from Lua was necessary to stay relevant within the machine learning community, and Python was the natural choice because Theano and TensorFlow had already established it there.

PyTorch could pick up the best API ideas from the other frameworks (including higher-level ones like Keras), and it was executed well. The core principle of easy debuggability really does matter for winning over developers; clean code, understandable code, and flexibility are all closely related to that, or mostly the same thing. (There is a small sketch of what that looks like in practice at the end of this comment.)

It is easy for a successful framework to become bloated and complicated, though, and I wonder how PyTorch will look in a few years. I also remember the first TensorFlow releases, where the whole source code was still quite easy to understand. Then TensorFlow added more and more features and several different API layers, and started deprecating earlier ones. PyTorch's internals are likewise already much more complex than they were initially.

One reason JAX is now popular is that it again started with a fresh API, and it is built on a genuinely new idea, code transformations, which seems nice and powerful (a small sketch of that is at the end as well).

Looking at these developments, I really wonder what the future will hold. It is good to have new ideas and new or improved APIs, and it is also good to adapt things for new kinds of hardware (GPUs, TPUs, maybe neuromorphic hardware later).
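
To make the debuggability point concrete, here is a minimal sketch of my own (not from anyone's docs) of how PyTorch's eager execution lets you poke at a model with ordinary Python tools:

    import torch

    model = torch.nn.Linear(4, 2)
    x = torch.randn(3, 4)

    # Eager execution: each line runs immediately, so intermediate
    # tensors can be inspected with plain print() or a normal debugger.
    h = model(x)
    print(h.shape, h.mean().item())   # ordinary Python introspection
    # import pdb; pdb.set_trace()     # stepping through works like any Python code

    loss = h.pow(2).mean()
    loss.backward()                   # gradients land on the parameters
    print(model.weight.grad.norm().item())

No graph-building or session step in between, which is exactly what made it easy to read and to debug.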
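
And for the "code transformations" remark, a minimal JAX sketch, again just my own illustration using the standard jax API: you write a plain numerical function and then apply transformations to the function itself.

    import jax
    import jax.numpy as jnp

    def loss(w, x, y):
        # a plain Python / NumPy-style function
        pred = x @ w
        return jnp.mean((pred - y) ** 2)

    # Transformations return new functions:
    grad_loss = jax.grad(loss)                       # d(loss)/dw as a function
    fast_grad = jax.jit(grad_loss)                   # XLA-compiled version of that
    batched   = jax.vmap(loss, in_axes=(None, 0, 0)) # vectorized over a batch axis

    w = jnp.zeros(3)
    x = jnp.ones((5, 3))
    y = jnp.ones(5)
    print(fast_grad(w, x, y))   # gradient computed by the transformed function

The API feels fresh partly because these transformations compose (grad of jit of vmap, etc.), which is the new idea underneath.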