Paramiko SSH exec_command hangs with large output
I am trying to back up a server using Paramiko, invoking a tar command over SSH. When there are only a few files, everything works well, but on a large folder the script blocks indefinitely. The following test shows me that the problem is related to the size of the output.
Is there a way to fix this and run the command?
Large output:
query = 'cd /;ls -lshAR -h'
chan.exec_command(query)
while not chan.recv_exit_status():
    if chan.recv_ready():
        data = chan.recv(1024)
        while data:
            print data
            data = chan.recv(1024)
    if chan.recv_stderr_ready():
        error_buff = chan.recv_stderr(1024)
        while error_buff:
            print error_buff
            error_buff = chan.recv_stderr(1024)
    exit_status = chan.recv_exit_status()
    if 0 == exit_status:
        break
Result (not OK: the script blocks and never returns):
2015-07-25 12:57:07,402 --> Query sent
Small output:
query = 'cd /;ls -lshA -h'
chan.exec_command(query)
while not chan.recv_exit_status():
    if chan.recv_ready():
        data = chan.recv(1024)
        while data:
            print data
            data = chan.recv(1024)
    if chan.recv_stderr_ready():
        error_buff = chan.recv_stderr(1024)
        while error_buff:
            print error_buff
            error_buff = chan.recv_stderr(1024)
    exit_status = chan.recv_exit_status()
    if 0 == exit_status:
        break
Result (OK):
2015-07-25 12:55:08,205 --> Query sent total 172K 4.0K drwxr-x--- 2 root psaadm 4.0K Dec 27 2013 archives 0 -rw-r--r-- 1 root root 0 Jul 9 23:49 .autofsck 0 -rw-r--r-- 1 root root 0 Dec 27 2013 .autorelabel 4.0K dr-xr-xr-x 2 root root 4.0K Dec 23 2014 bin 2015-07-25 12:55:08,307 --> Query executed (0.10)
If ls -R prints a lot of errors (likely here, since the current user is not root and therefore cannot access every folder), your code will eventually block.
This happens because the error stream's output buffer fills up, at which point ls stops and waits for you to read from that stream (to empty the buffer).
So you have a deadlock: your code waits for the regular output stream to finish, which it never will, while ls waits for you to read the error stream, which you never do.
You must read both streams in parallel.
Or, even easier, use Channel.set_combine_stderr to merge both streams into one.
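A minimal sketch of the parallel-read approach: poll both streams in one loop, draining whichever has data, and only stop once the command has exited and both buffers are empty. The helper name drain_channel is my own; it relies only on the standard Paramiko Channel methods (recv_ready, recv, recv_stderr_ready, recv_stderr, exit_status_ready, recv_exit_status).

```python
import time


def drain_channel(chan, chunk=1024, poll=0.05):
    """Read stdout and stderr from a Paramiko-style channel until the
    remote command has exited and both streams are fully drained."""
    out_parts, err_parts = [], []
    while True:
        got_data = False
        # Drain whatever is currently buffered on stdout.
        while chan.recv_ready():
            out_parts.append(chan.recv(chunk))
            got_data = True
        # Drain stderr too, so the remote process never blocks on a
        # full stderr buffer.
        while chan.recv_stderr_ready():
            err_parts.append(chan.recv_stderr(chunk))
            got_data = True
        # Stop only when the command has exited AND nothing is left
        # in either buffer.
        if (chan.exit_status_ready()
                and not chan.recv_ready()
                and not chan.recv_stderr_ready()):
            break
        if not got_data:
            time.sleep(poll)  # avoid busy-waiting
    return b"".join(out_parts), b"".join(err_parts), chan.recv_exit_status()
```

Usage would look like chan.exec_command(query) followed by out, err, status = drain_channel(chan). The simpler alternative from the answer is to call chan.set_combine_stderr(True) before exec_command, after which everything arrives on the single recv stream and the original one-stream loop cannot deadlock on stderr.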