Growing open source from Torch to PyTorch

78 points by plinkplonk almost 4 years ago

5 comments

albertzeyer almost 4 years ago
It was necessary to move away from Lua to stay relevant within the machine learning community. Python was a natural choice because Theano and TensorFlow were already there.

PyTorch could make use of the best API ideas from the other frameworks (also higher-level ones like Keras). And it was executed well. All these core principles of easy debuggability are indeed very important to win developers. Clean code, understandable code, flexibility: these are all very related to that, or mostly the same thing.

It's easy for a successful framework to get bloated, complex and complicated, though. I wonder how PyTorch will look in a few years. I also remember the first TensorFlow releases, where the whole source code was quite easy to understand. Then TensorFlow added more and more things and many different types of APIs, started to deprecate some earlier things, etc. The PyTorch internals are also already much more complex than they were initially.

One reason JAX is now popular is that it again started with a fresh API, even though it is based on a new kind of idea, code transformations, which seems nice and powerful.

When looking at these developments, I really wonder what the future will look like. It's good to have new ideas and new or improved APIs. It's also good to adapt things for new kinds of hardware (GPUs, TPUs, maybe other neuromorphic hardware later).
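The contrast this comment draws, between PyTorch's immediate mode and JAX's code transformations, can be made concrete. Below is a minimal illustrative sketch (shapes and values are arbitrary and not from the article): PyTorch runs each line eagerly, so intermediate tensors can be printed or stepped through with ordinary Python tooling, while JAX expresses the same computation as a pure function and derives gradient and compiled versions by transforming it.

```python
# PyTorch: immediate mode; intermediate values are real tensors that
# can be inspected with print() or pdb as the computation runs.
import torch

x = torch.randn(4, 3, requires_grad=True)
w = torch.randn(3, 2, requires_grad=True)
h = x @ w            # executes right now; `h` holds actual values
print(h.shape)       # debuggable mid-computation, no session or graph
h.sum().backward()   # gradients accumulate on x.grad and w.grad

# JAX: the same math written as a pure function, then transformed
# (grad for differentiation, jit for compilation) rather than run line by line.
import jax
import jax.numpy as jnp

def loss(w, x):
    return jnp.sum(x @ w)

grad_fn = jax.jit(jax.grad(loss))                 # compose transformations
g = grad_fn(jnp.ones((3, 2)), jnp.ones((4, 3)))   # gradient w.r.t. w
```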
amkkma almost 4 years ago
As a Julia user, thanks for this! Inspiring and packed with pearls. There's a lot we can learn from the Python community.
posharma almost 4 years ago
PyTorch is amazing, and the article was a good read. I'm confused, though: how can an ML framework not be obsessed with speed/performance?
anonymoushn almost 4 years ago
> So, over the years, I absorbed and appreciated that Torch was a user-centric product, which stood for immediate-mode, easy-to-debug, stay-out-of-the-way explicitness. It was targeted at people somewhat familiar with programming matters, and who could reason about things like performance, and if needed, write a C function and bind it in quickly.

This paragraph sort of surprises me. In my experience, if you want to do anything other than calling out to numeric libraries, you can do it in Lua and it will work, or you can do it in Python and suddenly your machine learning pipeline will spend 95% of its time running Python while your GPU idles. So the need to be able to drop down to C is much more severe in Python, and the difficulty of calling out to C is much greater.
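For context on what "write a C function and bind it in quickly" involves on the Python side, here is a hedged sketch of the simplest route, using the standard-library ctypes module; the shared-library name assumes a typical Linux/glibc system and is not from the original comment.

```python
import ctypes

# Load the system C math library (name/path assumes glibc on Linux).
libm = ctypes.CDLL("libm.so.6")

# Declare the C signature of cos(double) so ctypes converts arguments
# and the return value correctly.
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

print(libm.cos(0.0))  # 1.0, computed by the C routine
```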
blt almost 4 years ago
This article does a good job explaining how PyTorch gained an advantage over TensorFlow. The 1.0 release of TensorFlow with graphs and feed_dicts was a little clunky but made sense. After 1.0 the second-system effect took hold quickly. Eager mode, Keras, TFX ... it all started to look like a mess.
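For readers who never used it, the "graphs and feed_dicts" pattern looked roughly like the sketch below. This is an illustrative reconstruction using the TensorFlow 1.x API, not code from the article; under TensorFlow 2.x the same names live in tf.compat.v1 with eager execution disabled.

```python
import tensorflow as tf

# Build a static graph first: placeholders are symbolic, no data yet.
x = tf.placeholder(tf.float32, shape=[None])
y = x * 2.0  # also symbolic; nothing executes at this point

# Then run the graph in a session, feeding concrete values via feed_dict.
with tf.Session() as sess:
    print(sess.run(y, feed_dict={x: [1.0, 2.0, 3.0]}))  # [2. 4. 6.]
```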