Can a PHP file handle multiple requests?

Instead of launching a new instance of the PHP script for every incoming HTTP request, is there a way for a single PHP script to handle multiple requests?

+2




6 answers


I haven't seen an HTTP implementation that does this. The closest I could get was waiting for all requests to return. You can do this from the command line by forking the process and sending it to the background, or you can use Gearman (distributed job processing) for this.



+1
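A minimal sketch of the Gearman approach, assuming the PECL gearman extension is installed and a gearmand server is running on localhost; the function name "process_request" and the workload are illustrative placeholders, not anything from the original answer:

    <?php
    // worker.php - one long-running process handles jobs instead of a new PHP instance per request
    $worker = new GearmanWorker();
    $worker->addServer('127.0.0.1', 4730);        // default gearmand port
    $worker->addFunction('process_request', function (GearmanJob $job) {
        $payload = $job->workload();              // data sent by the web-facing script
        return strtoupper($payload);              // placeholder for the real work
    });
    while ($worker->work());                      // keep handling jobs

    <?php
    // client.php - called from the web-facing script, hands the work off to the worker
    $client = new GearmanClient();
    $client->addServer('127.0.0.1', 4730);
    echo $client->doNormal('process_request', 'some data');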




PHP is built around the "share nothing" concept, which lets you load-balance and scale your application across a distributed network more easily. So no, this is not possible. If the startup cost is what worries you, you can tune the architecture to cache your objects / data / views as much as possible, for example with serialize().



+2
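A minimal sketch of caching a prepared object between requests with serialize(); the cache file path and the buildConfigFromDatabase() helper are illustrative assumptions:

    <?php
    $cacheFile = '/tmp/app_config.cache';         // illustrative cache location

    if (is_file($cacheFile)) {
        // Reuse the object built by an earlier request
        $config = unserialize(file_get_contents($cacheFile));
    } else {
        // Expensive initialisation happens only once
        $config = buildConfigFromDatabase();      // hypothetical helper
        file_put_contents($cacheFile, serialize($config));
    }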




If you make the script an HTTP server itself and run it as a long-lived process, then yes.

If it runs through Apache and mod_php, then no.

(Why do you want this?)

+1
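A minimal sketch of the "script as its own HTTP server" idea using a plain socket loop; the port, the counter, and the hard-coded response are illustrative, and a real server would need proper request parsing:

    <?php
    // server.php - run with `php server.php`; one long-lived process serves many requests
    $server = stream_socket_server('tcp://127.0.0.1:8080', $errno, $errstr);
    if ($server === false) {
        die("Could not start server: $errstr\n");
    }

    $hits = 0;                                    // state survives across requests
    while ($conn = stream_socket_accept($server, -1)) {
        fread($conn, 8192);                       // read (and ignore) the raw request
        $hits++;
        $body = "Request number $hits\n";
        fwrite($conn, "HTTP/1.1 200 OK\r\nContent-Length: " . strlen($body) . "\r\n\r\n" . $body);
        fclose($conn);
    }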




As far as I know, it can't. The closest thing I can think of is using an opcode cache (such as XCache or APC). They cache the compiled code so scripts execute faster, but I believe each request still gets its own script instance.

+1
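A minimal php.ini sketch for enabling APC as the opcode cache, assuming the PECL APC extension is installed; the shared-memory size is an arbitrary example value and its accepted format varies between APC versions:

    ; php.ini (or a conf.d snippet) - enable the APC opcode cache
    extension=apc.so
    apc.enabled=1
    apc.shm_size=64M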




What you want is to cache the data.

Your PHP script should just check whether valid data for the request is already in the cache. If not, read from your database, refresh the cache, and return the results to the user.

I would suggest looking into the various caching libraries and thinking carefully about how you will scale your cache. One place to start is Zend_Cache, possibly with memcached as the backend.

+1
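A minimal sketch of the Zend_Cache + memcached idea (Zend Framework 1 API); the cache id, lifetime, and the fetchArticlesFromDatabase() helper are illustrative assumptions:

    <?php
    require_once 'Zend/Cache.php';

    $cache = Zend_Cache::factory(
        'Core',
        'Memcached',
        array('lifetime' => 300, 'automatic_serialization' => true),
        array('servers' => array(array('host' => '127.0.0.1', 'port' => 11211)))
    );

    $cacheId = 'latest_articles';                 // illustrative cache key
    if (($articles = $cache->load($cacheId)) === false) {
        $articles = fetchArticlesFromDatabase();  // hypothetical DB call
        $cache->save($articles, $cacheId);
    }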




The scripts that handle the HTTP requests can fetch data from a small PHP daemon over a socket.

Here is a useful library for writing PHP daemons: http://github.com/kvz/system_daemon

And some documentation:

http://kevin.vanzonneveld.net/techblog/article/create_daemons_in_php/

0
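A minimal sketch of the request-handling side, assuming a daemon is already listening on a local TCP port; the port number and the one-line protocol are illustrative assumptions:

    <?php
    // Called from the normal per-request PHP script: ask the long-running daemon for data
    $daemon = stream_socket_client('tcp://127.0.0.1:9000', $errno, $errstr, 5);
    if ($daemon === false) {
        die("Could not reach daemon: $errstr\n");
    }

    fwrite($daemon, "GET latest_stats\n");        // illustrative one-line protocol
    $response = fgets($daemon);                   // daemon answers with one line
    fclose($daemon);

    echo $response;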








