Best way to monitor services on multiple servers using Python

What would be the best way to monitor services like HTTP / FTP / IMAP / POP3 / SMTP on multiple servers from Python? Should I use sockets and simply try to connect to each service port (HTTP 80, FTP 21, etc.), treating the service as up if the connection succeeds, or should I use the Python protocol libraries to connect to each service and handle exceptions / return codes / etc.?

For example, for FTP, is it better to do this:

import socket

# plain TCP connect to the FTP port
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    s.connect(("ftp.domain.tld", 21))
    print "ok"
except socket.error:
    print "failed"
finally:
    s.close()


or

from ftplib import FTP, all_errors

try:
    # protocol-level check: the FTP greeting/handshake must succeed
    ftp = FTP('ftp.domain.tld')
    print "FTP service OK"
    ftp.close()
except all_errors:
    print "FTP err"


The same question goes for the rest: socket vs. urllib2, imaplib, etc. And if I go the library route, how can I check SMTP?
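To make the library-level variant concrete, here is a rough sketch of the same kind of check for HTTP (urllib2) and IMAP (imaplib); the hostnames are only placeholders:

import urllib2
import imaplib

# HTTP: an actual GET request must succeed, not just the TCP connect
try:
    urllib2.urlopen("http://www.domain.tld/", timeout=10)
    print "HTTP service OK"
except Exception:
    print "HTTP err"

# IMAP: the server greeting must be received before logout
try:
    imap = imaplib.IMAP4("imap.domain.tld")
    imap.logout()
    print "IMAP service OK"
except Exception:
    print "IMAP err"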


1 answer


update 1:

Actually, with your code there is no real difference between the two options (for FTP). The second option should be preferred for code readability. But why not go further and actually log in to the FTP server, or even read a file?
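For example, a rough sketch of such a deeper check, assuming the server allows anonymous login and treating the hostname as a placeholder:

from ftplib import FTP, all_errors

try:
    ftp = FTP('ftp.domain.tld', timeout=30)
    ftp.login()              # anonymous login; pass user/password if needed
    ftp.retrlines('LIST')    # fetch a directory listing over the data channel
    ftp.quit()
    print "FTP service OK"
except all_errors:
    print "FTP err"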



update 0:

When monitoring, it is better to exercise the full stack, because otherwise you can miss problems that don't show up at the socket level. SMTP can be tested with smtplib. The best way is to send a mail and then read it back from the target with imaplib; alternatively, you can simply use smtplib to talk to the SMTP server. Doing everything end to end is better, but the development and configuration effort should be weighed against the impact of missing issues.
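A rough end-to-end sketch of that idea; the hostnames, addresses and credentials here are placeholders you would have to replace:

import smtplib
import imaplib
import time
from email.mime.text import MIMEText

# send a probe message through the SMTP server under test
msg = MIMEText("monitoring probe")
msg["Subject"] = "monitor-probe"
msg["From"] = "monitor@domain.tld"
msg["To"] = "monitor@domain.tld"

smtp = smtplib.SMTP("smtp.domain.tld", 25, timeout=30)
smtp.sendmail("monitor@domain.tld", ["monitor@domain.tld"], msg.as_string())
smtp.quit()

time.sleep(30)  # give the message some time to be delivered

# read the probe back from the target over IMAP
imap = imaplib.IMAP4("imap.domain.tld")
imap.login("monitor", "secret")
imap.select("INBOX")
typ, data = imap.search(None, '(SUBJECT "monitor-probe")')
if data[0]:
    print "mail delivered"
else:
    print "mail missing"
imap.logout()

If end to end is too heavy, a lighter library-level check is to connect with smtplib and issue a NOOP (smtplib.SMTP('smtp.domain.tld').noop()), which at least confirms the SMTP daemon answers.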
