Like a lot of people, I'll admit that my backup strategy is not nearly as good as it should be. And with 80TB of data, backing it up is going to take some serious thought and planning.
Luckily, the vast majority of that data is Plex media that is easily replaced, so I can save myself a huge amount of stress over where to dump 80TB of data.
In fact, when I look at what my important, irreplaceable data actually is, I'm left with a much more manageable number: around 200GB.
All of my data lives on a single Unraid server: a Dell R720 with a NetApp DS4246 disk shelf and a random selection of disks. On it I run multiple containers and a couple of VMs.
I then have a backup server built from the old parts of the machine the R720 replaced: a whitebox server with an i7-3770, 32GB of RAM and 16x2TB drives, running TrueNAS Core. After a bit of a rude awakening from the power company, to save power this server stays off unless it is performing a backup.
So what exactly am I backing up? Most of my important data is stored in Nextcloud, including all of my photos and my partner's. On top of that, I host my own mail server in a VM that backs up to the Unraid array, so I need to back up those files too, along with the appdata from my Docker containers and, finally, the databases from the few database containers.
OK, so now that I know what I need to back up and where it is going, I can start to think about how. Unraid plugins can help with this, like the CA Backup plugin, which stops your containers and creates a backup of the appdata. I didn't like using it because, at the time of writing, it can only back up containers from a single location, and I have a dedicated SSD for most of my containers, with a couple still hanging onto the appdata share on the cache because they didn't like being moved to the unassigned disk.
I started my journey with the backup app Duplicati running on my Unraid server. It had a nice, easy-to-understand GUI for setting up backups and watching their progress, but I was plagued with issues. It would work fine for a while, then just stop. There were constant problems with the database getting out of sync and needing to be rebuilt, which could take days.
I then decided to write my own script that I could modify easily to perform more advanced functions: keeping some containers running, dumping the databases, putting Nextcloud into maintenance mode, and serving a "be right back" page on Nextcloud's external access.
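A minimal sketch of what those pre-backup steps can look like. The container names (`nextcloud`, `mariadb`), the keep-running list and all paths here are placeholders for illustration, not my actual setup:

```shell
#!/bin/bash
# Containers to leave running while the backup happens.
KEEP_RUNNING='plex|swag'

# List running containers, minus the ones we want to leave up.
containers_to_stop() {
  docker ps --format '{{.Names}}' | grep -v -E "$KEEP_RUNNING"
}

pre_backup() {
  # Put Nextcloud into maintenance mode so its files stay consistent.
  docker exec -u www-data nextcloud php occ maintenance:mode --on

  # Dump databases rather than copying live database files.
  docker exec mariadb mysqldump -u root -p"$DB_PASS" --all-databases \
    > "/mnt/user/backups/db-$(date +%F).sql"

  # Stop everything else before copying appdata.
  containers_to_stop | xargs -r docker stop
}
```

After the copy finishes, the same steps run in reverse: start the stopped containers and turn maintenance mode back off with `occ maintenance:mode --off`.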
I used rclone to copy my directories over to the backup server, but I kept getting errors about files being open, and the script would stop on the first error. It also meant I had to manage file retention manually, dating and pruning every backup myself, and in my experience it was really slow. One benefit was that the files were stored plainly on the backup server, so I could mount the backup directory and browse the files if and when I needed to restore. Whether that is a good thing depends on where your backup is going; if it were heading to the cloud, encrypted backups might help you sleep better at night.
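For reference, the rclone step was essentially this. The `backup-server` remote name and the paths are examples, not my actual config:

```shell
# "backup-server" is a hypothetical rclone remote pointing at the
# TrueNAS box; source and destination paths are examples.
DEST="backup-server:/mnt/tank/backups/appdata-$(date +%F)"

# "sync" mirrors the source to the destination. A dated destination
# folder means retention has to be managed by hand, which was part
# of the problem with this approach.
if command -v rclone >/dev/null; then
  rclone sync /mnt/user/appdata "$DEST" --transfers 4 --progress
fi
```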
Enter Restic. I had heard Restic mentioned here and there around the home-lab community, but never looked into it until now.
For those who don't know, Restic is a CLI backup tool that creates snapshots of your data. It is incredibly lightweight. It stores your files in deduplicated chunks rather than plain files, which can be a good or a bad thing depending on how you feel about that, and it encrypts your backups as standard. It is also incredibly fast. It supports SFTP, S3 and Azure, and can even use rclone as a backend, which opens up every storage provider rclone supports.
To put that speed into perspective: my rclone backup took 3-4 hours to complete, while the first Restic backup took less than 2 hours. What's more, once the first seed backup has completed, it gets much faster, finishing in under 30 minutes. That's because Restic creates incremental snapshots of your data, so it doesn't need to re-transfer data that is already on the backup server.
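Snapshots also make retention far easier than pruning dated folders by hand. A sketch of a retention policy, with the repository details and the keep counts purely illustrative:

```shell
# Same example repository as above; adjust to your own setup.
export RESTIC_REPOSITORY='sftp:backup-server:/mnt/tank/restic'
export RESTIC_PASSWORD_FILE='/root/.restic-pass'

# Keep 7 daily, 4 weekly and 6 monthly snapshots, then reclaim the
# space held by chunks no surviving snapshot references.
if command -v restic >/dev/null; then
  restic forget --keep-daily 7 --keep-weekly 4 --keep-monthly 6 --prune
fi
```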
So, in summary: Restic gives you lightweight, fast, incremental snapshots of your data that can go to nearly any cloud or local storage you can throw at it.
In the next part I will walk you through my script for backing up my Unraid server to my local backup server.