I want to create a "CGI script" in Python that stays in memory and serves multiple requests

I have a website that currently serves static HTML pages generated by a cron job that runs at night.

I would like to add some search and filter functionality via something like a CGI script, but my script will have enough startup time (maybe a few seconds?) that I want it to stay resident and serve multiple requests.

This is a side project I'm doing for fun, and it won't see heavy traffic. I wouldn't mind using something like Pylons, but I don't feel like I need or want an ORM layer.

What would be the sane approach here?

EDIT: I should point out that, given the load I'm expecting and the processing each request needs, I'm sure a single Python script in a single process can handle all requests without any slowdown, especially since my dataset will be resident in memory.

+2




2 answers


That's what WSGI is for ;)

I don't know the easiest way to turn a CGI script into a WSGI application, though (I've always done this through a framework). It shouldn't be too difficult, however.
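For what it's worth, a minimal sketch of such a WSGI application might look like the following (the load_index helper and the q query parameter are made-up placeholders for your actual search logic). The key point is that module-level code runs once per process, so the dataset stays resident across requests:

    from urllib.parse import parse_qs

    def load_index():
        # Hypothetical stand-in for your slow, few-second startup work;
        # this runs once at import time, not once per request.
        return {"python": ["page1.html", "page7.html"]}

    INDEX = load_index()  # stays resident for the life of the process

    def application(environ, start_response):
        # Called for every request; the expensive setup above is already done.
        params = parse_qs(environ.get("QUERY_STRING", ""))
        query = params.get("q", [""])[0].lower()
        hits = INDEX.get(query, [])
        body = "\n".join(hits).encode("utf-8") or b"no results"
        start_response("200 OK", [("Content-Type", "text/plain; charset=utf-8")])
        return [body]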



That said, Introduction to the Python Web Server Gateway Interface (WSGI) seems like a sensible introduction, and you'll also want to take a look at mod_wsgi (assuming you're using Apache ...)
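If you want to try the application above without setting up Apache first, the standard library's wsgiref server is enough for local testing; mod_wsgi would take over this role in production, keeping the process (and its in-memory data) alive inside Apache:

    from wsgiref.simple_server import make_server

    # Serve the application callable from the sketch above on port 8000.
    # The process, and therefore the in-memory index, lives until you stop it.
    with make_server("", 8000, application) as httpd:
        httpd.serve_forever()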

+4




Perhaps you should direct your search toward inter-process communication: build a separate search process that returns results to the web server. That search process can run all the time, assuming you have your own server.
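As a rough sketch of that idea (the port, authkey, and helper names below are all made up for illustration), the standard library's multiprocessing.connection module is one simple way to wire it up: a long-running daemon owns the dataset, and a tiny, fast-starting CGI script forwards each query to it:

    # search_daemon.py -- long-running process that owns the dataset
    from multiprocessing.connection import Listener

    def load_index():
        # Hypothetical slow startup work; paid once, when the daemon starts.
        return {"python": ["page1.html", "page7.html"]}

    index = load_index()

    with Listener(("localhost", 6000), authkey=b"change-me") as listener:
        while True:
            with listener.accept() as conn:
                query = conn.recv()              # string sent by the CGI script
                conn.send(index.get(query, []))  # reply with matching pages

    # search_cgi.py -- thin per-request CGI script; starts fast
    # because it holds no data of its own
    from multiprocessing.connection import Client

    def search(query):
        with Client(("localhost", 6000), authkey=b"change-me") as conn:
            conn.send(query)
            return conn.recv()

This keeps each CGI invocation cheap (a socket round-trip) while the daemon pays the startup cost only once.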



-1








