Best Practice for Python Process Control

This is my first foray into system-level programming; up to this point I have been a LAMP/PHP web developer, working mainly with Drupal.

Because of a library with very specific functionality, I am using Python for an upcoming project. I need to start, reload as needed, monitor, and respond to the output of multiple Python script processes, ideally controlled via an HTTP API from a master program that stores a database of the processes to be started along with some metadata about them (parameters, pid, etc.). I plan to write this master program in PHP, since I have much more experience with it, so the Python side needs to expose a good HTTP API.

Is there an established best practice for this type of system? Some initial research led me to Supervisor (which apparently has a built-in XML-RPC interface), but I thought I would test the wisdom of people who have actually walked this road before moving forward with testing.
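
From what I can tell, that XML-RPC interface would let me do something like the following from Python (untested on my end; it assumes supervisord is running locally with its `[inet_http_server]` enabled on port 9001 and that a program named `worker` is already defined in its config, both of which are just placeholders here):

```python
# Sketch of talking to supervisord's built-in XML-RPC API from Python.
# Assumes supervisord is running with [inet_http_server] on port 9001
# and that a [program:worker] entry exists in supervisord.conf.
from xmlrpc.client import ServerProxy

server = ServerProxy("http://localhost:9001/RPC2")

# List every process supervisord knows about, with its state and pid.
for info in server.supervisor.getAllProcessInfo():
    print(info["name"], info["statename"], info["pid"])

# Stop and restart a specific process (i.e. "reload" it).
server.supervisor.stopProcess("worker")
server.supervisor.startProcess("worker")
```

Presumably PHP could hit the same `/RPC2` endpoint with any XML-RPC client, which is part of why Supervisor looks attractive to me.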



1 answer


I can't say that I have walked this road myself, but I am working toward it. I would look at the multiprocessing library for Python; its managers are network transparent, so processes can talk to one another across process and machine boundaries. A couple of routes you could take:

1. Create a single Python process that controls all the other processes, and make that process a server you can drive from your PHP application (see the sketch below).
2. Figure out how to get PHP to communicate with those networked Python processes directly. They would still need to be launched from a central Python process, though.
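
As a rough sketch of route 1 (untested, just to show the shape of it): a small Python controller keeps a table of child processes and exposes a bare-bones HTTP API that your PHP app can call. The endpoint names (`/start`, `/stop`, `/status`), the `name` parameter, and the convention of running `<name>.py` are all made up for illustration.

```python
# Sketch of route 1: a Python "master" process that launches and tracks
# worker scripts, exposed over a tiny HTTP API that PHP can call.
# Endpoint names and parameters here are illustrative, not a standard.
import json
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

processes = {}  # name -> subprocess.Popen


class ControlHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        url = urlparse(self.path)
        params = parse_qs(url.query)
        name = params.get("name", [None])[0]

        if url.path == "/start" and name:
            # Launch (or relaunch) a worker script as a child process.
            if name in processes:
                processes[name].terminate()
            processes[name] = subprocess.Popen(["python", f"{name}.py"])
            body = {"name": name, "pid": processes[name].pid}
        elif url.path == "/stop" and name and name in processes:
            processes[name].terminate()
            body = {"name": name, "stopped": True}
        elif url.path == "/status":
            # poll() is None while the child is still running.
            body = {n: {"pid": p.pid, "running": p.poll() is None}
                    for n, p in processes.items()}
        else:
            self.send_response(404)
            self.end_headers()
            return

        data = json.dumps(body).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(data)


if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), ControlHandler).serve_forever()
```

Your PHP master would then just issue requests like `GET http://127.0.0.1:8000/start?name=myscript` and parse the JSON responses. Supervisor's XML-RPC interface gives you roughly the same thing without having to write and harden the controller yourself, which is why it keeps coming up.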









