I have two situations I want to ask about. I'm currently using Redis to process tens of thousands of jobs a day. I've been enjoying it, and it seemed like the right tool for the job when I started, but my workload has some interesting characteristics.
Question 1) My jobs are small jobs that are part of a bigger job. Each job takes about 30 seconds, and when all related jobs are done, the "master job" is considered complete. I have multiple people submitting to the queue, each with around 10k jobs. Right now the first 10k jobs from user A get processed, then the next 10k jobs from user B. How could I mix the jobs so that jobs from user A and user B are intermingled, so that both make progress and can review their results in real time?

Question 2) Sometimes I inspect the small job results, and if they are bad, I delete the rest of the jobs. In Redis I would need to select all jobs, filter them on some criteria, and then remove them. I'm not sure how practical this is.

I have an accompanying PostgreSQL database. I was thinking about moving the queue to PostgreSQL, but I think not giving the workers DB access is safer/easier, though maybe I'm wrong? Another alternative I was considering is a DB/Redis hybrid where PostgreSQL watches the Redis queue and, whenever the queue drops below X items, keeps refilling it. That seems like overkill. To make both questions concrete, I've put rough sketches of what I mean below.
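For Question 1, here's a minimal sketch of the kind of intermingling I'm after, using one Redis list per user and round-robin polling on the worker side (assuming redis-py; the key names and user IDs are just placeholders, not what I actually run):

    import redis

    r = redis.Redis()
    USERS = ["user_a", "user_b"]  # placeholder user IDs

    def submit(user, payload):
        # each user gets their own list, so nobody queues behind
        # another user's 10k-job backlog
        r.rpush(f"jobs:{user}", payload)

    def next_job():
        # poll each user's list in turn; whoever has work gets a slot,
        # so all users make progress concurrently
        for user in USERS:
            job = r.lpop(f"jobs:{user}")
            if job is not None:
                return user, job
        return None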
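And for Question 2, instead of selecting and filtering jobs out of the queue, one idea I had is to tag each job with its master-job ID and have workers check a cancellation flag before running anything (same assumptions as above; just a sketch):

    def submit(user, master_id, payload):
        # record which master job each small job belongs to
        job_id = r.incr("job:seq")
        r.hset(f"job:{job_id}", mapping={"payload": payload,
                                         "master": master_id})
        r.rpush(f"jobs:{user}", job_id)

    def cancel_master(master_id):
        # tombstone the whole batch instead of scanning the queue
        r.sadd("cancelled_masters", master_id)

    def should_run(job_id):
        # workers call this before executing; cancelled jobs are skipped
        master = r.hget(f"job:{job_id}", "master")
        return not r.sismember("cancelled_masters", master)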
Could try RQ https://python-rq.org/ - it supports job dependencies, so a follow-up job can be enqueued to run only when an earlier job finishes. It can fork, run different workers, etc. And it's simple.
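Rough sketch of the dependency chaining (the task functions here are placeholders you'd define and import yourself):

    from redis import Redis
    from rq import Queue

    q = Queue(connection=Redis())

    # process_chunk and finalize_master are hypothetical task functions
    first = q.enqueue(process_chunk, chunk_id=1)

    # runs only after `first` finishes successfully
    followup = q.enqueue(finalize_master, depends_on=first)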