Safe concurrent access to a file shared between a Perl and a PHP script

A Perl script (which I have no control over) periodically appends lines to the end of a text file.

I need my PHP script (which will run as a cron job) to read lines from this file, process them, and then delete them from the file. But it seems the only way to remove a line from a file with PHP is to read the file into a variable, delete one line, truncate the file, and then overwrite the file.

But what happens if:

  • the PHP script reads the file,
  • the Perl script appends a new line,
  • the PHP script writes its (now stale) buffer back to the file?

In this case the appended line is lost: it gets overwritten when the PHP script finishes and rewrites the file.

Is there a way to lock a file from PHP in a way that Perl will respect? It looks like the flock() function is PHP-specific.

+2




4 answers


Do you have the freedom to change the design? Is deleting the processed lines from the file actually an important part of your processing?



If you have that freedom, how about simply letting the Perl-generated file grow? Presumably the authors of the Perl script already do some housekeeping on it. Maintain your own "log" of what you have processed. Then, when your script runs, it reads the Perl file from the point recorded in your log, processes the new entries, and updates the log.
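A minimal sketch of that offset-log idea in PHP (the file names, the `processNewLines` helper, and the callback are illustrative, not anything from the original scripts). Because the Perl script only ever appends, the data file is never rewritten; we only remember how many bytes we have already consumed:

```php
<?php
// Sketch of the offset-log approach. The Perl script only appends,
// so we never touch its file; we just track how far we have read.

function processNewLines(string $dataFile, string $offsetFile, callable $process): void
{
    // Last processed byte offset (0 on the first run).
    $offset = is_file($offsetFile) ? (int) file_get_contents($offsetFile) : 0;

    $fh = @fopen($dataFile, 'r');
    if ($fh === false) {
        return; // nothing to do yet
    }
    fseek($fh, $offset);

    // Process only complete lines added since the last run.
    while (($line = fgets($fh)) !== false) {
        if (substr($line, -1) !== "\n") {
            break; // partial line still being appended; pick it up next run
        }
        $process(rtrim($line, "\n"));
        $offset = ftell($fh);
    }
    fclose($fh);

    // Record progress atomically: write a temp file, then rename over it.
    file_put_contents($offsetFile . '.tmp', (string) $offset);
    rename($offsetFile . '.tmp', $offsetFile);
}
```

Run from cron, each invocation picks up exactly the lines appended since the previous one, and the atomic rename means a crash mid-run at worst reprocesses a few lines rather than skipping any.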

+3




If the Perl script you have no control over already implements file locking via flock, then you're fine: on most systems Perl's flock and PHP's flock() take the same OS-level advisory lock on the underlying file, so it is not PHP-specific at all. If it doesn't (and I'm afraid that is the more likely case), you're out of luck with this approach.
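Assuming the Perl side does take a `LOCK_EX` flock around its appends, the read-process-truncate cycle on the PHP side becomes safe. A sketch (the `drainFile` helper and callback are made up for illustration):

```php
<?php
// Sketch assuming the Perl writer holds an exclusive flock while appending.
// PHP's flock() cooperates with it, so no line can be appended between
// our read and our truncate.

function drainFile(string $path, callable $process): void
{
    $fh = @fopen($path, 'r+');
    if ($fh === false) {
        return;
    }

    // Blocks until the Perl writer releases its lock.
    if (!flock($fh, LOCK_EX)) {
        fclose($fh);
        return;
    }

    // Read everything, process it, then empty the file while still locked.
    while (($line = fgets($fh)) !== false) {
        $process(rtrim($line, "\n"));
    }
    ftruncate($fh, 0);

    flock($fh, LOCK_UN);
    fclose($fh);
}
```

Note this is advisory locking: it only works because both sides opt in, which is exactly why an uncooperative Perl script sinks the scheme.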



+1




Another possibility: instead of having the Perl script write to a regular file, have it write to a named pipe (FIFO). Your PHP script reads from the other end and writes the processed result to a real file, so there is never a shared file to fight over.
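A rough sketch of the reading end, assuming you have created the FIFO beforehand (e.g. `mkfifo /tmp/perl_to_php`) and pointed the Perl script's output at it; the path and the `pumpPipe` helper are my own illustrative names:

```php
<?php
// Sketch of the named-pipe variant. The Perl script writes to the FIFO;
// this PHP loop reads lines as they arrive and appends the processed
// result to a regular output file.

function pumpPipe(string $fifo, string $outFile, callable $process): void
{
    $in = @fopen($fifo, 'r');   // blocks until a writer opens the pipe
    if ($in === false) {
        return;
    }
    $out = fopen($outFile, 'a');

    // fgets() returns false once the writer closes its end.
    while (($line = fgets($in)) !== false) {
        fwrite($out, $process(rtrim($line, "\n")) . "\n");
    }
    fclose($in);
    fclose($out);
}
```

The catch is that this PHP side would need to run as a long-lived reader rather than a cron job, and a FIFO buffers nothing while no reader is attached, so it only fits if you can change how both ends are launched.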

+1




Perhaps you could let your PHP script work on a copy instead of operating on the same file? I think it could work with three files:

  • file 1: the file the Perl script writes to
  • file 2: a snapshot copy of file 1
  • file 3: the processed version of file 2

Then, when your PHP script runs, it checks whether file 1 is newer than file 2; if it is, it makes a fresh copy, processes that (possibly skipping the number of previously processed lines), and writes the result to file 3.
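The three-file check could look roughly like this (paths, the `processSnapshot` helper, and the `strtoupper` stand-in for real processing are all assumptions of this sketch):

```php
<?php
// Sketch of the three-file scheme:
//   $src  - file 1, written by the Perl script
//   $copy - file 2, our snapshot of file 1
//   $out  - file 3, the processed output
// $skip counts lines already processed in earlier runs; the function
// returns the updated count for the next run.

function processSnapshot(string $src, string $copy, string $out, int $skip = 0): int
{
    clearstatcache();
    if (is_file($copy) && filemtime($src) <= filemtime($copy)) {
        return $skip; // nothing new since the last snapshot
    }

    copy($src, $copy);  // work on the snapshot, not the live file
    $lines = file($copy, FILE_IGNORE_NEW_LINES);

    $fh = fopen($out, 'a');
    foreach (array_slice($lines, $skip) as $line) {
        fwrite($fh, strtoupper($line) . "\n"); // stand-in for real processing
    }
    fclose($fh);

    return count($lines);
}
```

Note the copy itself can still race with an in-progress append, but at worst you snapshot a partial last line and pick the rest up on the next run; the Perl script's own file is never modified.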

0








