You’ve spent hours configuring your Virtual Private Server exactly the way you want. You have everything working, even that infuriating WSGI module in Apache to run multiple Python apps… Maybe I’ll do a write-up of that soon. But for now, what if your VPS provider has a bad day and loses everything? It would be silly not to back up your VPS. And it’s easy to do. I’ve written an automated script to take care of it.
Let’s create a backup script.
nano /home/username/scripts/backup.sh
Alright, let’s put the following in the file. Tailor it to your needs. I’ll explain after.
#!/bin/bash
#Backup script for VPS
#Specify that we want to use highest compression (gzip reads its options from the GZIP environment variable)
export GZIP=-9
#Have to remove the old one or we will put the old backup in the new backup!
rm /var/www/html/backup.tar.gz
#Create a directory to put all our backups. We'll use /tmp, that's what it's for
mkdir -p /tmp/backup
#Here are all the folders we are going to backup
tar cpzf /tmp/backup/etc.tar.gz /etc
tar cpzf /tmp/backup/var-www.tar.gz /var/www
tar cpzf /tmp/backup/home-username.tar.gz /home/username
#This one is special, it backs up mysql databases
mysqldump --defaults-extra-file=/home/username/.msql.cnf -u root --single-transaction --quick --lock-tables=false --all-databases | gzip -9 > /tmp/backup/mysql.sql.gz
#Compress the whole backup directory and put it where it will be downloadable
tar cpzf /var/www/html/backup.tar.gz /tmp/backup
#Cleanup the files in the temp directory
rm -R /tmp/backup
Easy peasy. So what did we just do? Well, first we export an environment variable so gzip uses its highest compression level. There are better ways to compress than gzip, but I find it easy and I’m not backing up a million things.
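If you do want tighter compression and don’t mind the extra CPU time, GNU tar can hand the work to xz instead of gzip. Here’s a minimal sketch of what one of the lines in the script would look like (same idea for the others); I’m assuming xz is installed, which it usually is.
#Same backup as the /etc line above, but with xz (capital J) for better compression
tar cpJf /tmp/backup/etc.tar.xz /etc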
Second we remove the old backup file. You very specifically want to do this because we are storing the backup in one of the folders we are backing up. We don’t want our old backup in our new backup! The size would balloon every time we ran this. If this is the first time you’re running this script, that command will fail, but no big deal. Next time it’ll delete the old backup before creating the new one. If you don’t have space concerns you can store several backups in different files. I do this on the server that is downloading the backup, not on the one I’m backing up.
Then we make a backup directory to store all our backup data while we work with it.
Next is the list of all the folders we are going to back up. Yours may vary. I’ve chosen to back up all my config files from /etc, all the website data in /var/www, and my home directory. Nothing else on the system seemed irreplaceable.
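If you’re not sure whether something is worth including, it helps to see how big each candidate directory actually is before adding it. A quick check, using the same paths as the script:
du -sh /etc /var/www /home/username
Anything huge and easily re-created (caches, downloaded packages) probably isn’t worth the space.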
Now comes the fun part. We need to back up MySQL. That’s less easy than just compressing a folder. We have to provide mysql with the root password to get to all the databases, which can be dangerous. So we will store that root password in a file and lock that file down as best we can.
nano /home/username/.msql.cnf
chmod 400 /home/username/.msql.cnf
This creates a file to store our password and makes it so that only the owner (and root) can read it. Of course this is still not incredibly secure. Anyone with sudo privileges on your machine can see your password in plain text, but I thought it was a good trade-off and better than putting it in the backup script. Put the following in the file.
[client]
password=ThisIsMyPassword
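Before leaving this to cron, it’s worth confirming that the credentials file actually works. A quick manual check, using the same file path as the script:
mysql --defaults-extra-file=/home/username/.msql.cnf -u root -e "SHOW DATABASES;"
If that lists your databases without prompting for a password, the mysqldump line in the script will work too.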
Now we can put all the archives together and move them somewhere our backup machine can download them from. Yes, I did compress everything again when putting it together, even though I could have just tarred it without compression since the pieces are already compressed. Why did I do this? Not sure, it seemed easier to stick with the same commands. My assumption is that I’m not actually compressing things any further, just wasting a few CPU cycles. Feel free to edit to your liking.
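If you’d rather not spend those cycles, the last step can be a plain tar with no compression, since everything inside is already gzipped. A sketch of that variant (if you use it, update the rm line and the filename your remote server downloads to match):
#No z flag: the pieces inside are already compressed, so just bundle them
tar cpf /var/www/html/backup.tar /tmp/backup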
Note: I am moving the backup to a website hosted by Apache. This requires that Apache is set up and able to host files, which it is by default just by typing “sudo apt install apache2” (and opening port 80 in ufw if applicable). This is very insecure as anyone can download this file, and you don’t want that. I have it downloading from a site that is only available over a tinc VPN, so my remote server connects to a VPN network to download the file. Much safer, since that website is not viewable outside the VPN. That part is up to you to set up. I’m just showing you how to put the file on the default Apache website for anyone to download if they know it’s there. You should probably use SFTP instead, but I had to get it to a Windows server at one point and this seemed like a good method to do so. Feel free to use my script and then transfer with SFTP instead of using your web server. SCP is also a fine option, or sharing NFS over your VPN connection. You get the picture. Just be warned that if you do what I wrote here, someone else can download your file.
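For what it’s worth, pulling the file over SSH from the remote Linux machine is a one-liner and avoids exposing it over HTTP at all. A sketch, assuming you have SSH access from the backup machine to the VPS (the host and destination path here are placeholders, adjust to taste):
scp username@your-vps-ip:/var/www/html/backup.tar.gz /home/username/backups/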
Finally, we clean up the directory we created in /tmp by removing it and all the files.
Note: I have not written a script to restore a VPS. It will be a little more involved and I’ve not had to do it yet. But at least you will have your database, files, and configurations to make setup in the future an easy task. You will have to set up a new user, apt-get all the stuff you need, and copy the relevant config files out of your /etc backup. You can straight copy your /var/www and home directories, assuming your username is the same. Then run some apache commands to enable sites, blah blah blah. Your config will vary, but it will be much more involved than the backup. However, you will have everything you require to make the setup MUCH easier than the first time and you won’t lose any irreplaceable data.
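One small addition that would make that future restore easier: have the script also record which packages are installed, so you know what to apt-get later. This isn’t in the script above, just a suggestion; on a Debian or Ubuntu system the line would sit alongside the other backup commands.
#Record the installed package list so a future restore knows what to reinstall
dpkg --get-selections > /tmp/backup/package-list.txt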
Finally we need to run the script automatically. Let’s use crontab for this. One hiccup here: this script has to be run by root, at least on my setup. I have a few areas inside /etc that my normal user account can’t read (SSL certificates). You may not have this problem. So start crontab as root.
sudo crontab -e
and add the following line at the end. This will run the script as root, every day, at 5 am.
0 5 * * * /home/username/scripts/backup.sh
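Before trusting cron with it, make the script executable and give it one manual run as root so any permission or path problems show up now instead of at 5 am:
chmod +x /home/username/scripts/backup.sh
sudo /home/username/scripts/backup.sh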
Are we done? No! The backup is still on our server. That doesn’t do us any good. We need to put it offsite. This is going to be setup dependent. Do you have Windows or Linux storing the backup?
If Linux is on your remote server, open crontab.
crontab -e
And put in a line telling it to download the backup every day at 5:15 am. Put in your own IP address, obviously.
15 05 * * * wget http://10.0.0.xx/backup.tar.gz
This will put the backup in your home directory. Or you can transfer it with SFTP directly from the VPS. Dealer’s choice on how to get this file out of there. SCP works too if you’re going to Linux.
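One quirk of that wget line: if backup.tar.gz is already sitting in your home directory, wget won’t overwrite it; it will save numbered copies (backup.tar.gz.1, .2, and so on) instead. If you’d rather keep dated copies, which ties in with keeping several backups on the downloading server as mentioned earlier, something like this works instead. It assumes a /home/username/backups directory already exists, and note the escaped % signs, since cron treats a bare % specially.
15 05 * * * wget -q -O /home/username/backups/backup-$(date +\%F).tar.gz http://10.0.0.xx/backup.tar.gz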
If Windows is on your remote system, set up a new scheduled task through the GUI and have it run the following PowerShell command to download the file automatically.
powershell.exe -command "& {$client = new-object System.Net.WebClient;$client.DownloadFile('http://10.0.0.xx/backup.tar.gz','c:\Users\username\backup.tar.gz')}"
Now we are good. Your system will back itself up every day and your remote server will download it every day. Do what you need to with the backups now that they are on your remote server.