Use a timeout to prevent deadlock when opening a file in Python?
I need to open a file that lives on an NFS mount on my server. Sometimes the NFS mount fails in such a way that all file operations block indefinitely. To prevent this, I need a way to make Python's open function time out after a given period, for example something like open('/nfsdrive/foo', timeout=5). Of course, the built-in open does not accept a timeout or any similar keyword argument.
Does anyone know an effective way to give up on opening a (local) file if the open takes too long?
Note: I've already tried the urllib2 module, but its timeout parameter only applies to web requests, not to local file access.
You can try the stopit package:
from stopit import SignalTimeout as Timeout  # note: the class is SignalTimeout

with Timeout(5.0) as timeout_ctx:
    with open('/nfsdrive/foo', 'r') as f:
        # do something with f
        pass
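
By default stopit swallows the timeout exception at the end of the block, so you can check the context manager afterwards to see whether the limit was hit. A minimal sketch, assuming stopit's documented state attribute and TIMED_OUT constant:

# after the with block, check whether the 5-second limit was hit
if timeout_ctx.state == timeout_ctx.TIMED_OUT:
    print('opening or reading /nfsdrive/foo timed out')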
In multi-threaded environments (such as Django), SignalTimeout can be problematic, because Python delivers signals only to the main thread. ThreadingTimeout, on the other hand, can cause resource problems on some virtual hosts when too many "time-limited" functions run at once.
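
For completeness, the thread-based variant looks almost identical; a sketch using the same example path as above:

from stopit import ThreadingTimeout as Timeout

with Timeout(5.0) as timeout_ctx:
    with open('/nfsdrive/foo', 'r') as f:
        # do something with f
        pass

One caveat: the thread-based timeout raises its exception between Python bytecode instructions, so it may not be able to interrupt an open() that is blocked inside the kernel on a dead NFS mount, whereas the signal-based variant can interrupt the system call itself.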
P.S. My example limits the total time for opening and processing the file. If you want to restrict only the open() call itself, you need a different approach: open and close the file manually and handle the timeout exception yourself.
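
A minimal sketch of that manual approach, assuming stopit's swallow_exc flag and TimeoutException (the path is just an example):

from stopit import SignalTimeout as Timeout, TimeoutException

f = None
try:
    # time-limit only the open() call itself
    with Timeout(5.0, swallow_exc=False):
        f = open('/nfsdrive/foo', 'r')
    data = f.read()  # processing is no longer time-limited
except TimeoutException:
    print('open() timed out')
finally:
    if f is not None:
        f.close()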