Backing up docker volumes
In today’s episode of I-can’t-believe-I’m-doing-this-in-2024:
I needed to rebuild my webserver because it kept hard-freezing every week (another post for another day). Since everything runs under Docker, the setup is pretty turnkey: I just needed to copy my Docker volumes from the old host to the new one.
That turned out to be a lot more annoying than I wanted. See, for a long time this functionality simply didn’t exist; you had to use DIY StackOverflow scripts. Apparently it’s now built into Docker Desktop, but A. I’m ssh’d into a server and B. Docker Desktop is the trojan horse where they extort people for licenses. Either way, all I have is the docker daemon.
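For the record, those DIY scripts (and Vackup itself, as far as I can tell) mostly boil down to one trick: run a throwaway container that bind-mounts the volume, then tar it from inside. A minimal sketch, with an illustrative volume name:
# export: tar the volume contents out through a bind mount
sudo docker run --rm -v myvolume:/data:ro -v "$PWD":/backup busybox tar czf /backup/myvolume.tar.gz -C /data .
# import: unpack the archive back into a (fresh) volume
sudo docker run --rm -v myvolume:/data -v "$PWD":/backup busybox tar xzf /backup/myvolume.tar.gz -C /data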
So I had to make do myself. Mostly, I grabbed the Vackup script to power the primary logic. That only got me partway there, though, because docker compose yells and refuses to start if the special “I created this with docker compose” labels aren’t present on the volumes.
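The labels in question are the com.docker.compose.* ones. On a volume that compose created itself, they look roughly like this (project and volume names are illustrative):
sudo docker volume inspect myproject_dbdata --format '{{json .Labels}}'
# {"com.docker.compose.project":"myproject","com.docker.compose.version":"2.24.6","com.docker.compose.volume":"dbdata"}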
The backup script grabs all the relevant metadata as a JSON blob, and then saves a .tar.gz for every named volume:
#! /bin/bash
set -euo pipefail
SCRIPT_DIR=$(realpath "$(dirname "$0")")
current_time=$(date "+%Y.%m.%d-%H.%M.%S")
backup_dir="$HOME/backups/$current_time"
echo "Creating backup directory $backup_dir"
mkdir -p "$backup_dir"
volumes=$(sudo docker volume ls -q)
# $volumes is intentionally unquoted so each volume name becomes its own argument
metadata=$(sudo docker volume inspect $volumes --format json | jq 'map({name: .Name, labels: (.Labels // {})})')
echo "$metadata" > "$backup_dir/metadata.json"
echo "Downloading vackup"
curl "https://raw.githubusercontent.com/BretFisher/docker-vackup/refs/heads/main/vackup" -s -o "$SCRIPT_DIR/vackup.sh"
chmod +x "$SCRIPT_DIR/vackup.sh"
for volume in $volumes; do
  echo "Backing up $volume"
  sudo "$SCRIPT_DIR/vackup.sh" export "$volume" "$backup_dir/$volume.tar.gz"
done
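For reference, the resulting metadata.json is just an array of name/label pairs, roughly like this (names and label values are illustrative):
[
  {
    "name": "myproject_dbdata",
    "labels": {
      "com.docker.compose.project": "myproject",
      "com.docker.compose.version": "2.24.6",
      "com.docker.compose.volume": "dbdata"
    }
  }
]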
The restore script then parses metadata.json and reverses the process:
#! /bin/bash
set -euo pipefail
SCRIPT_DIR=$(realpath "$(dirname "$0")")
backup_dir=${1:?Missing backup directory}
metadata=$(cat "$backup_dir/metadata.json")
if [ -z "$metadata" ]; then
echo "Missing $backup_dir/metadata.json"
fi
echo "Downloading vackup"
curl "https://raw.githubusercontent.com/BretFisher/docker-vackup/refs/heads/main/vackup" -s -o "$SCRIPT_DIR/vackup.sh"
chmod +x "$SCRIPT_DIR/vackup.sh"
echo "$metadata" | jq -c '.[]' | while IFS= read -r item; do
name=$(echo "$item" | jq -r '.name')
labels=$(echo "$item" | jq -r '.labels | to_entries | map("--label \(.key)=\(.value)") | join(" ")')
backup_file="$backup_dir/$name.tar.gz"
if [ -z "$backup_file" ]; then
echo "Missing $backup_file"
fi
echo "Creating $name"
sudo docker volume create "${name}" $labels
echo "restoring $name"
sudo "$SCRIPT_DIR/vackup.sh" import "$backup_file" "${name}"
echo "$name restored"
done
(Yes, I know about the risks of executing random scripts off the internet. One-time purpose and all.) The only obvious downside is that this requires downtime, since the volumes aren’t captured in any kind of snapshot, but I was migrating anyway.
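In practice that just means stopping the stack before the backup and only starting it again after the restore on the new host, roughly like so (the compose project path and the script names are illustrative):
# old host
cd /srv/mysite && sudo docker compose down
~/backup-volumes.sh
# new host, after copying ~/backups over
~/restore-volumes.sh "$HOME/backups/2024.11.02-10.15.00"
cd /srv/mysite && sudo docker compose up -d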
Once again I’m both happy about the power of jq for parsing data in bash scripts, and forgetful of its syntax every time, needing to search/LLM for the right technique.
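Case in point, the to_entries incantation above that turns a label map back into docker flags:
echo '{"com.docker.compose.project":"myproject"}' | jq -r 'to_entries | map("--label \(.key)=\(.value)") | join(" ")'
# --label com.docker.compose.project=myproject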