Celery workers don't fill concurrency slots

I have one worker with concurrency of 4. I see 4 processes started in flower and everything looks good.

If I do this in the shell, I see 4 processes running tasks while the remaining tasks stay reserved, and they're processed 4 at a time until the queue is empty:

[my_task.apply_async() for i in xrange(10)]

However, if I enqueue them one at a time, only the first two tasks run actively, and from then on it processes only two at a time:

my_task.apply_async()
my_task.apply_async()
my_task.apply_async()
my_task.apply_async()
...

Any ideas?



1 answer


This usually happens because sub-processes fill the concurrency slots. Celery uses prefork as its execution pool by default, and every time a task creates a subprocess of its own (another fork), that child counts as a running process and fills one of the concurrency slots.
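As an illustration (the app name, broker URL, and task body below are hypothetical, not taken from the question), a task like this would occupy two processes per invocation under the prefork pool:

```python
import subprocess

from celery import Celery

# Hypothetical app and broker, for illustration only.
app = Celery("proj", broker="redis://localhost:6379/0")

@app.task
def my_task():
    # The forked child counts toward the worker's concurrency,
    # so with --concurrency=4 only two such tasks run at once.
    child = subprocess.Popen(["sleep", "5"])
    child.communicate()  # blocks this slot until the child exits
```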

The easiest way to avoid this is to use eventlet, which lets each worker make multiple asynchronous calls. This requires, however, that none of your tasks make blocking calls such as subprocess.communicate(), since those would block all tasks on that worker.
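Assuming a standard Celery project layout (the proj module name is a placeholder), switching pools is just a command-line flag:

```shell
# Start the worker with the eventlet pool instead of prefork.
# --concurrency here is the number of green threads, so it can
# be much higher than the number of CPU cores.
celery -A proj worker --pool=eventlet --concurrency=100
```

The eventlet package must be installed (pip install eventlet) for the worker to accept this pool.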



Otherwise, if you do need blocking calls and you know each task will have at most one subprocess at a time, you can double CELERYD_CONCURRENCY (to 8) and rate-limit your tasks so that 8 tasks don't start at once (for example, using @app.task(rate_limit='10/m')). This is a bit of a hack, though, and using eventlet would definitely be preferred.
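A sketch of that workaround, with placeholder names; note that CELERYD_CONCURRENCY is the pre-4.0 setting name (newer Celery calls it worker_concurrency):

```python
from celery import Celery

# Hypothetical app and broker, for illustration only.
app = Celery("proj", broker="redis://localhost:6379/0")

# Double the slots so forked children don't starve real tasks...
app.conf.CELERYD_CONCURRENCY = 8

# ...and rate-limit the task so all 8 slots don't fill at once.
@app.task(rate_limit="10/m")
def my_task():
    ...
```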







