Why do I get a Celery deadlock when saving data in Django?

I get a deadlock when using a Celery task to save new customers from a CSV file. This is what I have been working with so far:

for line in csv.reader(instance.data_file.read().splitlines()):
    for index, item in enumerate(line):
        number = int(item)
        # TODO: Turn into task
        Customer.objects.create_customer(
            mobile=number,
            campaign=instance.campaign,
            reward_group=instance.reward_group,
            company=instance.company,
        )

This runs without any errors.

However, when I move the same code into a Celery task, I get the following error:

Deadlock encountered while trying to acquire a lock; try restarting the transaction

That makes me think I did something wrong in my Celery setup. Can anyone spot what it is?

Here is the code that now dispatches a Celery task and produces the deadlock. I use shared_task since this task will eventually run on another machine without Django, but that doesn't matter at the moment.

The first line of the CSV import is processed fine, then I get the deadlock error:

for line in csv.reader(instance.data_file.read().splitlines()):
    for index, item in enumerate(line):
        number = int(item)
        celery_app.send_task('test.tasks.create_customer_from_import', args=[number, instance.id], kwargs={})

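Note: as far as I understand, if the task module is importable from the code that processes the CSV, the same dispatch could also be written by calling .delay() on the imported task, which avoids typos in the dotted task name. A minimal sketch, assuming the module path used in the send_task call above:

from test.tasks import create_customer_from_import

# Equivalent to celery_app.send_task('test.tasks.create_customer_from_import', ...)
create_customer_from_import.delay(number, instance.id)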

tasks.py

# Python imports
from __future__ import absolute_import

# Third-party imports
from celery import shared_task

from mgm.core.celery import app as celery_app

# App imports (the exact module path depends on your project layout)
from .models import Customer, CustomerUpload


@shared_task
def create_customer_from_import(number, customer_upload_id):
    customer_upload = CustomerUpload.objects.get(pk=customer_upload_id)
    new_customer = Customer.objects.create_customer(
        mobile=number,
        campaign=customer_upload.campaign,
        reward_group=customer_upload.reward_group,
        company=customer_upload.company,
    )
    return new_customer


celery.py

from __future__ import absolute_import

import os

from celery import Celery

from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'test.settings')

app = Celery('test-tasks')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


This is the CustomerManager:

class CustomerManager(models.Manager):
    def create_customer(self, mobile, campaign, reward_group, company, password=None):
        user = AppUser.objects.create_user(mobile=mobile)
        # Creates a new customer for a company and campaign
        customer = self.model(
            user=user,
            campaign=campaign,
            reward_group=reward_group,
            company=company
        )

        customer.save(using=self._db)
        return customer



1 answer


Your code doesn't look wrong, but you are probably hitting a deadlock because of the concurrency of multiple Celery workers. From http://celery.readthedocs.org/en/latest/faq.html#mysql-is-throwing-deadlock-errors-what-can-i-do :



MySQL has its default isolation level set to REPEATABLE-READ; if you don't really need that, set it to READ-COMMITTED. You can do this by adding the following to your my.cnf:

[mysqld]
transaction-isolation = READ-COMMITTED

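If editing my.cnf is not an option, the isolation level can also be set from the Django side. This is only a sketch, assuming the MySQL backend: init_command is executed on every new connection, and Django 2.0+ also accepts a dedicated isolation_level option.

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'mydb',  # placeholder database name
        'OPTIONS': {
            # Executed on every new connection (MySQLdb / mysqlclient):
            'init_command': 'SET SESSION TRANSACTION ISOLATION LEVEL READ COMMITTED',
            # On Django 2.0+ you can use the dedicated option instead:
            # 'isolation_level': 'read committed',
        },
    }
}

Even with READ-COMMITTED, occasional deadlocks are still possible under concurrent writes, and MySQL's general advice is simply to retry the transaction. One way to do that here, sketched against the task from the question, is to let Celery retry the task when the database raises an OperationalError:

from celery import shared_task
from django.db import OperationalError

# Customer and CustomerUpload imported from your models, as in the original tasks.py

@shared_task(bind=True, max_retries=3, default_retry_delay=1)
def create_customer_from_import(self, number, customer_upload_id):
    customer_upload = CustomerUpload.objects.get(pk=customer_upload_id)
    try:
        return Customer.objects.create_customer(
            mobile=number,
            campaign=customer_upload.campaign,
            reward_group=customer_upload.reward_group,
            company=customer_upload.company,
        )
    except OperationalError as exc:
        # MySQL reports a deadlock as OperationalError; back off briefly and retry
        raise self.retry(exc=exc)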