Prevent Celery Beat from running the same task multiple times

I have Celery jobs scheduled every 30 seconds. One task runs daily and another runs weekly at a specific time and day of the week. Each task checks a "start time" and a "next scheduled date", and the next scheduled date is not updated until the task completes.

However, I want to make sure Celery Beat runs each job only once. As it stands, Celery keeps executing a given task repeatedly until the next scheduled date is updated after the task finishes.
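For context, a Beat schedule like the one described might look like this (the task paths and times are illustrative, not taken from the question):

```python
from celery.schedules import crontab

# Hypothetical schedule matching the setup described above:
# one daily task and one weekly task at a fixed time and weekday.
app.conf.beat_schedule = {
    "daily-job": {
        "task": "myapp.tasks.daily_task",        # assumed task path
        "schedule": crontab(hour=7, minute=0),   # every day at 07:00
    },
    "weekly-job": {
        "task": "myapp.tasks.weekly_task",       # assumed task path
        "schedule": crontab(hour=7, minute=0, day_of_week=1),  # Mondays
    },
}
```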



1 answer


To do this, you need to implement some kind of "distributed lock". A simple and reliable approach is to use the Django cache with the memcached backend: set a "flag" in the cache when the task starts, and remove it when the task finishes. Another option is to use a Redis lock as the distributed lock. An example using Django's memcached-backed cache:

from celery import shared_task
from django.core.cache import cache


@shared_task
def my_task(arg1, arg2, lock_expire=300):
    lock_key = 'my_task'
    # cache.add is atomic: it sets the key only if it does not already exist
    acquire_lock = lambda: cache.add(lock_key, '1', lock_expire)
    release_lock = lambda: cache.delete(lock_key)

    if acquire_lock():
        try:
            pass  # execute your task code here
        except Exception:
            pass  # error handling here
        finally:
            # release the lock so other tasks can execute
            release_lock()
    else:
        print("Other task is running, skipping")

The code above implements distributed locking to ensure that only one instance of the task executes, no matter how many times it is enqueued. Only one task can acquire the lock :), the others simply skip the main block and finish. Does this make sense to you?
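To see why the acquire/release pattern works, here is a runnable simulation using an in-memory stand-in for the cache. `FakeCache` and `run_guarded` are illustrative names for this sketch, not Django or Celery APIs; the real code relies on `cache.add` being atomic on the memcached server.

```python
import time

class FakeCache:
    """Minimal stand-in for Django's cache API (illustration only)."""
    def __init__(self):
        self._store = {}

    def add(self, key, value, timeout):
        # Like Django's cache.add: set the key only if it is absent
        # (or its previous entry has expired), and report success.
        now = time.monotonic()
        entry = self._store.get(key)
        if entry is not None and entry[1] > now:
            return False
        self._store[key] = (value, now + timeout)
        return True

    def delete(self, key):
        self._store.pop(key, None)

cache = FakeCache()

def run_guarded(name, lock_expire=300):
    # Same acquire/try/finally structure as the task above.
    lock_key = "my_task"
    if cache.add(lock_key, "1", lock_expire):
        try:
            return f"{name}: ran"
        finally:
            cache.delete(lock_key)
    return f"{name}: skipped"

# Simulate a worker that is still holding the lock:
cache.add("my_task", "1", 300)
print(run_guarded("second run"))  # second run: skipped
cache.delete("my_task")
print(run_guarded("third run"))   # third run: ran
```

The `lock_expire` timeout matters in production: if a worker crashes before releasing the lock, the flag expires on its own instead of blocking the task forever.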

Good luck!







