I had this backup code working reliably for years, using the local file system, a VPS/dedicated server, or remote storage as the backup target. Then I finally got time to wrap up the README, iron out a few missing switches, and publish. It should be production ready and reliable, so it could be useful to others. Contributors are welcome.

https://github.com/dusanx/saf
How do you automate checking whether the backup worked correctly, in the face of saf bugs, rsync bugs/misconfiguration, or bit rot?

My solution is to pick a few random files (plus whatever is new) and compute their hashes on both the local and remote copies, but it's slow and probabilistic. ZFS also helps, but I feel it's too transparent to rely on (what if the remote storage changes filesystems?).
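For what it's worth, here is a rough sketch of that sampling approach (not part of saf; the paths and sample size are made up, and it assumes the remote copy is reachable as a locally mounted path, e.g. via sshfs or NFS, mirroring the source layout):

```python
#!/usr/bin/env python3
"""Spot-check a backup by hashing a random sample of files on both sides."""
import hashlib
import random
from pathlib import Path

SOURCE = Path("/home/user/data")    # hypothetical source directory
BACKUP = Path("/mnt/backup/data")   # hypothetical mounted remote copy
SAMPLE_SIZE = 20                    # how many random files to verify

def sha256(path: Path) -> str:
    """Stream the file through SHA-256 to avoid loading it all into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def main() -> None:
    files = [p for p in SOURCE.rglob("*") if p.is_file()]
    sample = random.sample(files, min(SAMPLE_SIZE, len(files)))
    failures = 0
    for src in sample:
        dst = BACKUP / src.relative_to(SOURCE)
        if not dst.exists():
            print(f"MISSING  {dst}")
            failures += 1
        elif sha256(src) != sha256(dst):
            print(f"MISMATCH {src}")
            failures += 1
    print(f"checked {len(sample)} files, {failures} problem(s)")

if __name__ == "__main__":
    main()
```

It stays probabilistic, of course; cranking up the sample size (or hashing everything new since the last run) just trades speed for coverage.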
Wow, I like this a lot: it looks easy to run and it can sync to multiple targets. My local backup consists of JBOD (not RAID, ZFS, or BTRFS), so I think this should work nicely. I've been using a shell script to do something similar for backups, but it lacked a lot of these features.