Growing open source from Torch to PyTorch

78 points | by plinkplonk | almost 4 years ago

5 comments

albertzeyer | almost 4 years ago
It was necessary to move away from Lua to stay relevant within the machine learning community. Python was a natural choice because Theano and TensorFlow were already there.

PyTorch could draw on the best API ideas from the other frameworks (including higher-level ones like Keras), and it was executed well. Those core principles of easy debuggability are indeed very important for winning developers. Clean code, understandable code, flexibility: these are all closely related, or mostly the same thing.

It's easy for a successful framework to become bloated, complex and complicated, though. I wonder how PyTorch will look in a few years. I also remember the first TensorFlow releases, where the whole source code was quite easy to understand. Then TensorFlow added more and more things and many different kinds of APIs, started deprecating earlier ones, and so on. PyTorch's internal code is also already much more complex than it was initially.

One reason JAX is now popular is that it again started with a fresh API, despite being based on a new kind of idea, code transformations, which seems nice and powerful.

Looking at these developments, I really wonder what the future will look like. It's good to have new ideas and new or improved APIs. It's also good to adapt things to new kinds of hardware (GPUs, TPUs, maybe other neuromorphic hardware later).
Comment #28067201 not loaded
Comment #28070930 not loaded
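The comment above credits JAX's appeal to a fresh API built on code transformations. A minimal sketch of that idea, using the standard jax.grad and jax.jit transforms on a made-up loss function; the names and data are purely illustrative and not taken from the article:

```python
# Sketch of JAX's "code transformations": ordinary Python functions are
# transformed into new functions (gradients, compiled versions) instead of
# being assembled into an explicit graph up front.
import jax
import jax.numpy as jnp

def loss(w, x, y):
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

grad_loss = jax.grad(loss)      # function -> gradient of the function (w.r.t. w)
fast_grad = jax.jit(grad_loss)  # function -> XLA-compiled version of it

w = jnp.ones(3)
x = jnp.array([[1.0, 2.0, 3.0]])
y = jnp.array([4.0])
print(grad_loss(w, x, y))       # looks like a plain eager call
print(fast_grad(w, x, y))       # same call, traced and compiled on first use
```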
amkkma | almost 4 years ago
As a Julia user, thanks for this! Inspiring and packed with pearls. There's a lot we can learn from the Python community.
posharma | almost 4 years ago
PyTorch is amazing, and the article was a good read, although I'm confused: how can an ML framework not be obsessed with speed/performance?
Comment #28066201 not loaded
anonymoushn | almost 4 years ago
> So, over the years, I absorbed and appreciated that Torch was a user-centric product, which stood for immediate-mode, easy-to-debug, stay-out-of-the-way explicitness. It was targeted at people somewhat familiar with programming matters, and who could reason about things like performance, and if needed, write a C function and bind it in quickly.

This paragraph sort of surprises me. In my experience, if you want to do anything other than call out to numeric libraries, you can do it in Lua and it will work, or you can do it in Python and suddenly your machine learning pipeline spends 95% of its time running Python while your GPU idles. So the need to be able to drop down to C is much more severe in Python, and the difficulty of calling out to C is much greater.
Comment #28079062 not loaded
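The "write a C function and bind it in quickly" workflow described above for Lua Torch has a rough present-day analogue in PyTorch's inline C++ extensions. The sketch below is only an illustration of that pattern, with a made-up function name; it assumes a local C++ toolchain, since torch.utils.cpp_extension.load_inline compiles the source the first time it is called.

```python
# Illustrative only: bind a small C++ function into Python via PyTorch's
# inline extension loader. The arithmetic runs in C++ (ATen), so the Python
# interpreter stays out of the inner loop.
import torch
from torch.utils.cpp_extension import load_inline

cpp_source = """
torch::Tensor scaled_add(torch::Tensor a, torch::Tensor b, double alpha) {
    // Pure C++/ATen arithmetic; no Python involved per element.
    return a + b * alpha;
}
"""

ext = load_inline(
    name="scaled_add_ext",
    cpp_sources=cpp_source,
    functions=["scaled_add"],  # auto-generates the Python binding
)

a = torch.randn(1000)
b = torch.randn(1000)
print(ext.scaled_add(a, b, 0.5))
```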
blt | almost 4 years ago
This article does a good job explaining how PyTorch gained an advantage over TensorFlow. The 1.0 release of TensorFlow with graphs and feed_dicts was a little clunky but made sense. After 1.0 the second-system effect took hold quickly. Eager mode, Keras, TFX ... it all started to look like a mess.
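For readers who never used the 1.x API, the graph-and-feed_dict style mentioned above, contrasted with the immediate mode PyTorch championed, looks roughly like this sketch. The values are illustrative, and the TensorFlow half requires the legacy tf.compat.v1 interface:

```python
# TF 1.x style: build a symbolic graph first, then run it with fed values.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

x = tf.placeholder(tf.float32, shape=[None, 3])   # symbolic input
w = tf.Variable(tf.ones([3, 1]))
y = tf.matmul(x, w)                               # adds a node; computes nothing yet

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))  # explicit run + feed

# PyTorch's immediate mode: every line executes and yields a concrete value.
import torch
xt = torch.tensor([[1.0, 2.0, 3.0]])
wt = torch.ones(3, 1)
print(xt @ wt)
```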