file_get_contents() error with multiple concurrent requests

I am running multiple PHP processes from the command line (using php.exe). Right now there are about 16 of them; they all do the same work, just for different clients, so I won't go into more detail for brevity.

All of the processes call file_get_contents() on the same file, which is a local file, not a URL.

The problem is that in some cases (and it is rather intermittent) the call fails with an error saying the file does not exist. But the file clearly exists, and nothing ever writes to it, so my guess is that it fails because file_get_contents() is called on it by too many processes at the same time.

I managed to work around it with some convoluted code that retries file_get_contents() several times until it succeeds, but that feels like a hack. I was under the impression that having many concurrent processes read the same file would not be a problem. Am I missing something here?
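For reference, the retry workaround I ended up with looks roughly like this (a minimal sketch; the path, attempt count, and delay values are just placeholders, not my real ones):

```php
<?php
// Retry file_get_contents() a few times before giving up.
function read_file_with_retry(string $path, int $maxAttempts = 5, int $delayMicroseconds = 100000)
{
    for ($attempt = 1; $attempt <= $maxAttempts; $attempt++) {
        // Suppress the warning and check the return value instead.
        $contents = @file_get_contents($path);
        if ($contents !== false) {
            return $contents;
        }
        // Back off briefly before trying again.
        usleep($delayMicroseconds);
    }
    return false; // Still failing after the final attempt.
}

$data = read_file_with_retry('C:\\path\\to\\shared-file.txt');
if ($data === false) {
    fwrite(STDERR, "Could not read file after several attempts\n");
}
```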
