How to share a TensorFlow model between different Celery tasks

I have been doing some NLP work with models trained in TensorFlow. I provide some APIs that do word segmentation using these models, and requests are submitted through Celery.

The problem is as follows:

Celery dispatches jobs to several workers (about 4-5), so each worker has to load the models above, which takes up a lot of memory.

So, is there any way to share the models among the workers? I don't know much about the underlying mechanism of Celery workers.

Thanks.


1 answer


You can take a look at TensorFlow Serving, which serves your model over a gRPC (or REST) API. It supports batching, which sounds like what you are trying to do. The model is loaded once in the Serving process instead of in every worker. If for some reason you really need Celery (e.g. to run these tasks in the background), you can simply call the TensorFlow Serving API from your Celery tasks.
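A minimal sketch of that setup, assuming TensorFlow Serving is running locally on its default REST port with a model name of `nlp_model` (both hypothetical; adjust to your deployment). The worker only builds a JSON request and forwards it, so it never loads the model itself. Register `predict` as a Celery task with the usual `@app.task` decorator:

```python
import json
from urllib import request

# Hypothetical endpoint: TF Serving's REST predict API at its default port.
SERVING_URL = "http://localhost:8501/v1/models/nlp_model:predict"


def build_predict_payload(sentences):
    """Build the JSON body in the format TF Serving's REST predict API expects."""
    return json.dumps({"instances": sentences}).encode("utf-8")


def predict(sentences):
    """Forward a batch of sentences to TF Serving and return its predictions.

    In a Celery app, decorate this with @app.task so workers stay lightweight:
    the model lives only in the Serving process.
    """
    req = request.Request(
        SERVING_URL,
        data=build_predict_payload(sentences),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["predictions"]
```

Since all workers talk to the same Serving instance, memory usage no longer scales with the number of Celery workers.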


