
Python 3.5 and Multitasking

84 points by sonicrocketman over 9 years ago

13 comments

ikken over 9 years ago
I've been using the new async/await syntax to write a beautiful asynchronous websockets server and I fell in love with it. It handles hundreds of thousands of concurrent connections (on top of aiohttp) and the code is so much cleaner than it would be with, say, NodeJS with Express and Promises. It reads like serial code.

I think benchmarking asyncio with any kind of CPU-bound task misses the point. Previously we were relying on hacks like monkeypatching with gevent, but now we have a clean, explicit and beautiful way to write massively parallel servers in Python.
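A minimal sketch of the kind of handler being described, assuming a recent aiohttp; the /ws route and the echo behaviour are illustrative, not the commenter's actual server:

    # Echo websocket server using async/await on top of aiohttp (illustrative).
    from aiohttp import web, WSMsgType

    async def ws_handler(request):
        ws = web.WebSocketResponse()
        await ws.prepare(request)
        # Each connection is one coroutine; the event loop multiplexes many of them.
        async for msg in ws:
            if msg.type == WSMsgType.TEXT:
                await ws.send_str(msg.data)  # echo the message back
            elif msg.type == WSMsgType.ERROR:
                break
        return ws

    app = web.Application()
    app.router.add_get('/ws', ws_handler)

    if __name__ == '__main__':
        web.run_app(app, port=8080)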
btreecat over 9 years ago
> I did this for two reasons, the first being that I cannot, for the life of me, figure out how to use asyncio to do local file IO and not a network request, but maybe I'm just an idiot.

I don't think you are an idiot; I just think you didn't search hard enough. There is a reason there is no local file IO in asyncio.

Check these links for more info:

* https://stackoverflow.com/questions/87892/what-is-the-status-of-posix-asynchronous-i-o-aio

* http://blog.libtorrent.org/2012/10/asynchronous-disk-io/

From what I understand, the way libuv (which node.js uses) gets around OS limits is with a thread pool.

Also, this documentation might be helpful:

* https://docs.python.org/3.5/library/asyncio-dev.html#handle-blocking-functions-correctly

My experience doesn't mirror your frustration. I found asyncio quite useful for tinkering with a simple web scraper that hit multiple sites at once (each with a different response time) and munged all the data together into one data set.

Thanks for the write-up!
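A rough sketch of that sort of scraper, assuming a recent aiohttp; the URLs and the way results are merged are illustrative:

    # Fetch several sites concurrently and collect the results (illustrative URLs).
    import asyncio
    import aiohttp

    async def fetch(session, url):
        async with session.get(url) as resp:
            return url, await resp.text()

    async def scrape(urls):
        async with aiohttp.ClientSession() as session:
            # gather() runs all fetches concurrently; a slow site doesn't block the rest.
            pages = await asyncio.gather(*(fetch(session, u) for u in urls))
            return dict(pages)

    if __name__ == '__main__':
        loop = asyncio.get_event_loop()
        results = loop.run_until_complete(scrape(['https://example.com', 'https://example.org']))
        print({url: len(body) for url, body in results.items()})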
fijal over 9 years ago
Maybe it's worth noting, maybe not, but this extremely trivial example (e.g. serial.py) gets executed 20x faster just by using pypy, which speeds up serial execution. That is more than you would get from any sort of multiprocessing/threading shenanigans, *just* by using an optimizing VM (and granted, this example is very simple, but maybe addressing basic performance problems should come first).
keypusher over 9 years ago
concurrency in python is kind of a disaster, in my opinion. there are a lot of different options, but they all seem to have significant drawbacks, and not just limited to ease of use. i know concurrency is a hard problem, but i wish there were one really good, straightforward solution instead of 3 or 4 half-baked, convoluted ones (threading, multiprocessing, asyncio, subprocess in the stdlib, plus twisted, gevent, pulsar etc. as third-party).
csytan over 9 years ago
While I haven't had a chance to use 3.5's async/await syntax, I have used AsyncIO pretty heavily to deal with multiple sensor inputs/outputs on a Raspberry Pi.

The author is right. If you write a coroutine, *any code that uses it must also be a coroutine*. This is pretty annoying when you're trying to test something manually. It bubbles up this way until you eventually hit the event loop.

If you're trying to debug a coroutine in the interactive shell, you've got to do something like this:

    import asyncio

    loop = asyncio.get_event_loop()
    # Blocking call which returns when the hello_world() coroutine is done
    loop.run_until_complete(hello_world())
    loop.close()

That's my main beef with it. Debugging can also be painful because when you hit an exception, your stack trace will also involve the asyncio library. Aside from those complaints, I'm a fan. It works fine and reads better than callback-style code.
ak217 over 9 years ago
This mirrors my experience precisely. I was very excited about async/await, hoping that it would integrate coroutines into regular Python scripts without the need to manage some complex dispatch engine. I was equally disappointed to learn that it's business as usual, with painful and inadequate semantics out of the box.

At least we have pypy. The community should really be rallying behind that project.
1st1 over 9 years ago
First of all, see this pic: https://pbs.twimg.com/media/COLLg0TUAAA4j79.jpg:large

asyncio doesn't provide any nio (non-blocking IO) abstractions for files because (a) it's not really needed, and (b) there is no easy way to implement it.

(a) basically you shouldn't expect your code to block on disk IO. But even if it does block for a very short amount of time, it's probably fine.

(b) one way to implement nio for files is to use a thread pool. Maybe we'll add this in later versions of asyncio, but it will require writing some pretty low-level code in C (and reimplementing big chunks of asyncio in C too). Another option is to use modern APIs like aio on Linux, but as far as I know almost nobody uses it for real.

Bottom line -- you don't need coroutines or asyncio to do file IO. What you need asyncio (and frameworks like aiohttp) for is to do network programming in Python efficiently.
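A small sketch of the thread-pool approach mentioned above, using the standard library's loop.run_in_executor; the file being read is illustrative:

    # Offload blocking file IO to the default executor (a thread pool).
    import asyncio

    def read_file(path):
        # Ordinary blocking IO; it runs in a worker thread, not on the event loop.
        with open(path, 'rb') as f:
            return f.read()

    async def read_file_async(loop, path):
        return await loop.run_in_executor(None, read_file, path)

    if __name__ == '__main__':
        loop = asyncio.get_event_loop()
        data = loop.run_until_complete(read_file_async(loop, __file__))
        print(len(data))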
justinlardinois over 9 years ago
Tangentially related: of the people and projects that are using Python 3, why? I've found that aside from syntax and a few features here and there, Python 2 and 3 are more or less the same technology-wise, especially since 2.7 has a lot of features backported from 3. So to me it seems better to stick with 2, because there are so many existing libraries and CPython is the only implementation with complete Python 3 support.

If Python 3 had good concurrency and optimization (something neither version has right now), I'd consider using it, but is there an already existing reason that I'm just not seeing?
aidenn0 over 9 years ago
I don't use python all that much, but out of curiosity, what's the problem with multiprocessing? In the languages I *do* develop in, I find it much easier to reason about than multithreading.
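For reference, a minimal sketch of the stdlib multiprocessing pattern being asked about; the worker function and pool size are made up for illustration:

    # CPU-bound work spread across a pool of processes (sidesteps the GIL,
    # at the cost of pickling arguments and results between processes).
    from multiprocessing import Pool

    def cpu_bound(n):
        return sum(i * i for i in range(n))

    if __name__ == '__main__':
        with Pool(processes=4) as pool:
            print(pool.map(cpu_bound, [1000000] * 8))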
n0us over 9 years ago
When he brings up requests in comparison to urllib, he seems not to know about aiohttp: https://github.com/KeepSafe/aiohttp
Lofkin over 9 years ago
Better multitasking and concurrency syntax, with task scheduling and out-of-core support, here: http://dask.pydata.org/
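For a sense of what that looks like, a tiny dask.delayed sketch (assuming dask is installed); the functions are illustrative:

    # delayed() builds a lazy task graph; compute() runs it on dask's scheduler.
    from dask import delayed

    def inc(x):
        return x + 1

    def add(a, b):
        return a + b

    if __name__ == '__main__':
        a = delayed(inc)(1)
        b = delayed(inc)(2)
        total = delayed(add)(a, b)
        print(total.compute())  # 5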
vegabook over 9 years ago
In other words, a lost opportunity. Asyncio and this new syntax are hard for beginners and experienced python coders alike, *and still don't do multicore!* We're clocking up 3.x version numbers as if this will magically provide the illusion of progress, but the killer feature is still not there. Instead we get type annotations. In a dynamic language. Which doesn't compile and therefore doesn't need them. With no performance advantage. If I have to type-declare everything, I want 10x performance. Okay?

If Python were a listed company, the CEO would have been replaced long ago. I'm tired of watching my favourite language flail around like this. Will Continuum Analytics or Enthought *please* fork 2.7?
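The complaint about annotations can be seen directly: CPython treats them as metadata only and neither checks nor optimizes based on them at runtime. A tiny illustration:

    # Annotations are stored but not enforced or used for optimization.
    def add(x: int, y: int) -> int:
        return x + y

    print(add('a', 'b'))        # runs fine and prints 'ab'; nothing is checked
    print(add.__annotations__)  # {'x': <class 'int'>, 'y': <class 'int'>, 'return': <class 'int'>}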
gcb0 over 9 years ago
so is it a little syntax sugar on top of the multiprocessing module to please iOS developers?

there's nothing new that i can see under the hood