For an alternative, check out RQ: http://python-rq.org/

We use it in production and it's been rock-solid. The documentation is sparse but the source is easy to follow.
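If you haven't seen it, the core pattern is only a few lines. A sketch assuming a local Redis instance; `count_words` is a hypothetical task function that the worker process can import:

    from redis import Redis
    from rq import Queue

    from myapp.tasks import count_words  # hypothetical; must be importable by the worker

    q = Queue(connection=Redis())  # connects to localhost:6379 by default

    # enqueue returns immediately; a separate `rq worker` process runs the job
    job = q.enqueue(count_words, "http://python-rq.org/")
    print(job.id)

Start a worker in the same virtualenv with `rq worker` and the job gets picked up.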
I'd recommend looking at alternative serialization formats. Unpickling data from an untrusted source can execute arbitrary code, and that's a risk programmers writing distributed systems in Python should be educated about.
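To make that concrete, here is the classic demonstration: `__reduce__` lets a pickled payload name any callable to run at load time.

    import os
    import pickle

    class Evil:
        def __reduce__(self):
            # pickle will "reconstruct" this object by calling os.system
            return (os.system, ("echo owned",))

    payload = pickle.dumps(Evil())
    pickle.loads(payload)  # executes the shell command as a side effect

Formats like JSON or msgpack only carry data, so a hostile message can at worst be malformed, never executable.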
  Having a way to pickle code objects and their dependencies is a huge win, and I'm angry I hadn't heard of PiCloud earlier.
That's a nice use of the cloud library, without using the PiCloud service. Unfortunately, the PiCloud service itself is shutting down on February 25th (or thereabouts).
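The serializer from the cloud library lives on as the standalone cloudpickle package, so that part at least doesn't die with the service. A sketch, assuming cloudpickle is installed:

    import pickle
    import cloudpickle

    def make_scaler(factor):
        def scale(x):
            return x * factor  # closes over `factor`
        return scale

    # plain pickle fails on a nested closure; cloudpickle serializes it by value
    blob = cloudpickle.dumps(make_scaler(3))

    restored = pickle.loads(blob)  # stock pickle can load the result
    print(restored(7))  # -> 21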
Although Celery can use it, why is Amazon SQS treated as a second-class citizen in Python background worker systems?

I've yet to find a background worker pool that plays properly with SQS.
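The loop such a pool needs isn't complicated, which makes the gap stranger. A sketch using boto3; the queue URL is a placeholder, and this assumes AWS credentials are already configured:

    import boto3

    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/tasks"  # placeholder
    sqs = boto3.client("sqs")

    def handle(body):
        print("got task:", body)  # stand-in for real task dispatch

    while True:
        # long-poll: blocks for up to 20 seconds instead of hammering the API
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL,
            MaxNumberOfMessages=1,
            WaitTimeSeconds=20,
        )
        for msg in resp.get("Messages", []):
            handle(msg["Body"])
            # delete only after success; otherwise the message becomes
            # visible again after the visibility timeout and is retried
            sqs.delete_message(QueueUrl=QUEUE_URL,
                               ReceiptHandle=msg["ReceiptHandle"])

Getting the visibility-timeout semantics right is usually the hard part: a worker has to finish (or extend the timeout) before it expires, or another worker will see the same message.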
Thanks Jeff. As someone else mentioned, I love these little projects that demonstrate the basics of what the big projects actually do. Makes it much easier to understand the big picture.
I scratched an itch in this space and built a web-hook task queue in Python. I wrote it up here: http://ntorque.com -- would love to know if the rationale makes sense...
Are there any non-distributed task queues for Python? I need something like this for a tiny web application that just needs a persistent queue for background tasks, so tasks can resume if the application crashes or restarts.

Installing Redis or even ZeroMQ seems excessive to me, given that the application runs on a Raspberry Pi and serves at most 5 users at a time.
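At that scale the stdlib may be enough: SQLite gives you persistence across restarts with no extra services. A minimal sketch (the table layout and API here are my own invention, not from any existing project):

    import json
    import sqlite3

    class SQLiteQueue:
        """Tiny persistent FIFO queue backed by a single SQLite file."""

        def __init__(self, path="tasks.db"):
            self.db = sqlite3.connect(path)
            self.db.execute(
                "CREATE TABLE IF NOT EXISTS tasks "
                "(id INTEGER PRIMARY KEY AUTOINCREMENT, payload TEXT)")
            self.db.commit()

        def put(self, task):
            # store tasks as JSON rather than pickle; data only, no code
            self.db.execute("INSERT INTO tasks (payload) VALUES (?)",
                            (json.dumps(task),))
            self.db.commit()

        def get(self):
            # oldest row first, so the queue is FIFO
            row = self.db.execute(
                "SELECT id, payload FROM tasks ORDER BY id LIMIT 1").fetchone()
            if row is None:
                return None
            self.db.execute("DELETE FROM tasks WHERE id = ?", (row[0],))
            self.db.commit()
            return json.loads(row[1])

Caveat: a task fetched but not finished before a crash is lost; redelivery is where the real queue projects earn their keep.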
I like these one-off projects that Jeff is doing, but it would be particularly instructive to see one, or a combination, make it to 'real' status.