How do I create a Cron job to back up MySQL and FTP backup to my backup server?

I want to set up a cron job that automatically backs up my MySQL database while the database is running and then FTPs that backup to my backup server.

I am guessing I can do this using a bash script.

Does anyone know a good way to do this?

Thanks in advance.



2 answers


This is a very simple approach using the lftp FTP client:

backup.sh:

#!/bin/sh
# Dump the database, compress it, then hand the upload to lftp.
mysqldump -f [database] | gzip > /backup/[database].dump.gz
lftp -f /backup/lftp.script
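
To cover the cron part of the question: a hypothetical crontab entry (installed with crontab -e) that runs backup.sh nightly; the script path, log path, and schedule here are assumptions, not part of the original answer.

```shell
# m h dom mon dow  command
0 3 * * * /usr/local/bin/backup.sh >> /var/log/db-backup.log 2>&1
```

Redirecting stdout and stderr to a log file keeps cron from mailing output on every run while preserving errors for inspection.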


lftp.script:



open backup.ftp.example.com
user [username] [password]
cd /backup
# Rotate the previous nine generations before uploading the fresh dump.
mv [database].dump.gz.8 [database].dump.gz.9
mv [database].dump.gz.7 [database].dump.gz.8
mv [database].dump.gz.6 [database].dump.gz.7
mv [database].dump.gz.5 [database].dump.gz.6
mv [database].dump.gz.4 [database].dump.gz.5
mv [database].dump.gz.3 [database].dump.gz.4
mv [database].dump.gz.2 [database].dump.gz.3
mv [database].dump.gz.1 [database].dump.gz.2
mv [database].dump.gz [database].dump.gz.1
put /backup/[database].dump.gz
quit
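
The chain of mv lines above implements a simple generation rotation on the server. As an aside, the same logic can be sketched as a local shell function (file names and the generation count are stand-ins, not part of the original script):

```shell
#!/bin/sh
# Rotate up to $2 generations of $1: $1.8 -> $1.9, ..., $1 -> $1.1.
rotate() {
    base=$1 i=$2
    while [ "$i" -gt 1 ]; do
        prev=$((i - 1))
        [ -f "$base.$prev" ] && mv "$base.$prev" "$base.$i"
        i=$prev
    done
    [ -f "$base" ] && mv "$base" "$base.1"
}

# Demo: two existing generations get shifted before a fresh dump would land.
cd "$(mktemp -d)"
touch dump.gz dump.gz.1
rotate dump.gz 9
ls dump.gz.*
```

Oldest-first ordering matters: moving .8 before .7 (and so on) means no generation is overwritten before it has been shifted out of the way.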


Note: this approach has several problems:

  • FTP is unencrypted, so anyone who can sniff the network can see both the password and the database contents. Piping the dump through gpg -e [key] encrypts the dump, but the FTP password remains unencrypted (sftp and scp are better alternatives).
  • If someone breaks into the database server, they can use the credentials in this script to access the FTP server and, depending on the rights, delete the backups (this has happened in the real world: http://seclists.org/fulldisclosure/2009/Jun/0048.html ).
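
Regarding the first point, a hypothetical variant of the dump line with gpg inserted into the pipeline so the backup never touches disk unencrypted ([database] and [key] are the same placeholders the answer already uses):

```shell
mysqldump -f [database] | gzip | gpg -e -r [key] > /backup/[database].dump.gz.gpg
```

This protects the data at rest and in transit, but note it does nothing for the FTP password itself.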


If the database is very large, the backup file may not fit on the server's hard drive. In that case, I suggest the following approach using pipes and ncftpput:

mysqldump -u <db_user> -p<db_password> <db_name> | gzip -c | ncftpput -u <ftp_user> -p <ftp_password> -c <ftp_url> <remote_file_name>
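
One caveat worth adding: by default a shell pipeline's exit status is that of its last command, so a failing mysqldump would be masked by a succeeding ncftpput. In bash, pipefail surfaces the failure; below, `false` stands in for a failing dump (a minimal sketch, not the original answer's code).

```shell
#!/bin/bash
set -o pipefail
false | gzip -c > /dev/null
echo "pipeline exit: $?"
```

With pipefail set, the pipeline reports exit status 1, which a wrapper script or cron mail can then act on.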




This works great for me.
