Celery with SQS and Amazon S3

I would like to use Celery to consume S3 event notifications that Amazon delivers to SQS. However, the S3 message format does not match what Celery expects.

How can I use these messages with minimal hacking? Should I write my own serializer? Or should I give up and build my own bridge using boto or boto3?

As an aside, I also plan to use Celery with another broker (RabbitMQ) for the rest of the application's messaging, if that matters.

+3


3 answers


For my particular use case, it turned out that the easiest approach was to create a bridge worker that polls SQS and submits jobs to Celery through its default broker.



It is not difficult to do (although boto and SQS could use some additional documentation), and since Celery is not built to connect to two different brokers at the same time, this feels like the best way to do it.
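A minimal sketch of such a bridge worker, using boto3 rather than boto. The queue URL, broker URL, and the task name `tasks.handle_s3_object` are placeholders for your own values, not anything prescribed by Celery or AWS:

```python
# bridge.py - a sketch of the bridge worker described above.
import json

import boto3
from celery import Celery

# Placeholders: substitute your own queue URL and broker URL.
QUEUE_URL = 'https://sqs.us-east-1.amazonaws.com/123456789012/s3-events'

# Celery client pointed at the application's default broker (RabbitMQ here).
app = Celery('bridge', broker='amqp://guest@localhost//')


def run():
    sqs = boto3.client('sqs')
    while True:
        # Long-poll SQS for up to 10 messages at a time.
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL,
            MaxNumberOfMessages=10,
            WaitTimeSeconds=20,
        )
        for msg in resp.get('Messages', []):
            body = json.loads(msg['Body'])
            # An S3 event notification carries one or more records.
            for record in body.get('Records', []):
                bucket = record['s3']['bucket']['name']
                key = record['s3']['object']['key']
                # Enqueue a Celery task on the default broker by name.
                app.send_task('tasks.handle_s3_object', args=[bucket, key])
            # Delete the SQS message only after the task has been enqueued.
            sqs.delete_message(
                QueueUrl=QUEUE_URL,
                ReceiptHandle=msg['ReceiptHandle'],
            )


if __name__ == '__main__':
    run()
```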

+2




You will need to create a service that listens for S3 notifications and then kicks off the appropriate Celery task.

You have a variety of options: S3 notifications can be delivered through SQS, SNS, or AWS Lambda.



In fact, the simplest option might be not to use Celery at all and just write code that runs on AWS Lambda. I haven't used that service myself (Lambda is relatively new), but it looks like it would mean you don't need to run a listener service or Celery workers at all.
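If the Lambda route appeals, the handler can be as small as this sketch. Lambda passes the S3 event straight to the function; `process_object` is a hypothetical stand-in for your own processing logic:

```python
# handler.py - a sketch of the Celery-free Lambda option.

def process_object(bucket, key):
    # Hypothetical placeholder: your processing code goes here
    # instead of a Celery task.
    print(f'processing s3://{bucket}/{key}')


def handler(event, context):
    # Lambda receives the S3 event directly; no queue or worker needed.
    for record in event.get('Records', []):
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        process_object(bucket, key)
```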

+3




Set up an AWS S3 event to call an AWS Lambda function. The function should convert the S3 event message into the Celery message format and then publish the Celery message to SQS. Celery then picks the message up from SQS.

S3 Event → Lambda → SQS → Celery
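Rather than hand-crafting Celery's wire format, one way to implement the conversion step is to let Celery itself publish the message from inside the Lambda function. A sketch under some assumptions: `celery[sqs]` is bundled into the deployment package, the worker defines a task named `tasks.process_s3_event`, and the queue name `s3-events` matches the worker's queue (all three are my placeholders, not part of the answer):

```python
# lambda_handler.py - a sketch of the Lambda -> SQS conversion step.
from celery import Celery

# A bare sqs:// broker URL makes kombu fall back to boto3's default
# credential chain, which in Lambda is the function's execution role.
app = Celery('producer', broker='sqs://')
app.conf.task_default_queue = 's3-events'


def handler(event, context):
    for record in event.get('Records', []):
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        # send_task formats and publishes a proper Celery message to SQS,
        # so the worker can consume it without any custom serializer.
        app.send_task('tasks.process_s3_event', args=[bucket, key])
```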

+1

