Python: file stream safety

I am writing an application (application A) in Python that listens on a port, receives NetFlow records, encapsulates them, and reliably forwards them to another application (application B). Application A also checks whether each record was successfully delivered; if it was not, application A keeps the record, waits a few seconds, and then tries to send it again, and so on. This is the important part: if delivery fails, the record must be kept, but in the meantime many other records may arrive and need to be kept as well. The natural structure for this is a queue, but I need the queue to live in a file (on disk). I found, for example, this recipe http://code.activestate.com/recipes/576642/ but it says "On open, loads full file into memory", which is exactly what I want to avoid. I have to assume that this queue file may grow to as much as 2 GB.

So my question is: what would you recommend for storing these records? It has to cope with a lot of data, but on the other hand it should not be too slow, because during normal operation only one record is stored at a time and it is immediately read and deleted again, so the steady state is an empty queue. And it must be thread safe.
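For context, here is a minimal sketch of the flow I have in mind, assuming a hypothetical thread-safe PersistentQueue class (put(), peek() and task_done() are placeholder names, not an existing library):

    import time

    RETRY_DELAY = 5  # seconds to wait before retrying a failed send

    def receiver_loop(sock, queue):
        # Listen for NetFlow records and enqueue each one on disk.
        while True:
            record, _addr = sock.recvfrom(65535)
            queue.put(record)  # must be persisted before we move on

    def sender_loop(queue, send_to_b):
        # Drain the queue; a record is deleted only after B confirms it.
        while True:
            record = queue.peek()   # oldest record; blocks while queue is empty
            if send_to_b(record):   # True once application B confirms receipt
                queue.task_done()   # only now is it safe to delete it from disk
            else:
                time.sleep(RETRY_DELAY)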

Should I use a database (dbm, sqlite3, ...) or something like pickle or shelve?

I'm a little confused about this... thanks.

1 answer


You can use Redis as your store for this. It is very fast, it handles this kind of workload surprisingly well, and it can persist its state to disk in several ways, depending on the level of durability you want. Since Redis runs as an external process, you may not even need the strictest persistence policy: if your program crashes, everything is still held outside of it.
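For illustration, here is a minimal sketch of a reliable queue on top of Redis lists using the redis-py client (the key names and the send_to_b callback are placeholders). BRPOPLPUSH atomically moves the oldest record onto a "processing" list, so a crash between popping and sending cannot lose it:

    import time
    import redis

    r = redis.Redis(host="localhost", port=6379)

    QUEUE = "netflow:queue"            # pending records (example key name)
    PROCESSING = "netflow:processing"  # taken but not yet confirmed
    RETRY_DELAY = 5                    # seconds between send retries

    def enqueue(record):
        # LPUSH here plus BRPOPLPUSH below gives FIFO order.
        r.lpush(QUEUE, record)

    def send_forever(send_to_b):
        while True:
            # Atomically move the oldest record to the processing list;
            # blocks until a record is available.
            record = r.brpoplpush(QUEUE, PROCESSING, timeout=0)
            while not send_to_b(record):
                time.sleep(RETRY_DELAY)  # keep retrying the same record
            # Application B confirmed receipt: drop the record for good.
            r.lrem(PROCESSING, 1, record)

For durability you can enable the append-only file (appendonly yes in redis.conf) or rely on RDB snapshots, and on startup anything left in the processing list can be pushed back onto the queue to recover records that were in flight during a crash.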



See the documentation at http://redis.io/documentation and if you want more details on how to do this in Redis, I would be happy to elaborate.
