BackupPC is a useful frontend when your team is not that advanced with SSH; you can keep the compression local to the BackupPC server even when it pulls files from remote hosts, and so on.

app said: "Which of course highlights the vital follow-on question: how are you all testing your backups, and how often?!"

Important question here! Most of the time I only need a single file back (so it is more a kind of versioning). If you know of any tool for testing backups I would be interested. Thanks a lot for all your advice.

My backups and tests are strung together manually using rsync+cron and similar, nothing by way of specific tools.

For automated testing of my mail and web servers, I have small VMs running the same OS that restore from the latest backup daily. I check them occasionally to make sure they have the most recent things they should have (I could automate this bit but have never got around to it). I also have simple port monitoring on them, so if either is down for any length of time (for the mail server this usually means the restore failed and Zimbra didn't start properly) I get a text.

For current files, I have a script that scans for all files not modified in the last day (so they might not be on the latest daily backup image yet), sorts that list, and runs sha256 over it. I run it against both the live data and the backups via cron once a week; if there is any difference between the two I get a message (and I check manually occasionally for paranoia's sake).

For older snapshots, a similar checksum process runs occasionally and compares the results to the previous run rather than to the current data.

If your backup is large and sits on a storage VPS, your provider may not like you keeping a CPU core busy for as long as it takes to checksum everything; use cpulimit to throttle down as needed (this indirectly limits IO throughput over large files too, which is useful if that is your pain point, though it does less for IOPS when scanning many small files).
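The weekly checksum comparison described above could be sketched roughly like this; the paths, the one-day cutoff, and the demo directories are illustrative assumptions, not the poster's actual script:

```shell
#!/bin/sh
# Sketch of the integrity check described above (assumed paths; the
# real script is not shown in the thread). checksum_tree hashes every
# regular file not modified in the last day -- newer files may not be
# on the latest daily backup image yet -- and sorts the list so that
# two runs over different trees are directly comparable.
checksum_tree() {
    ( cd "$1" && find . -type f -mtime +0 -print0 \
        | sort -z \
        | xargs -0 -r sha256sum )
}

# Demo with two throwaway trees standing in for live data and backup.
live=$(mktemp -d); backup=$(mktemp -d)
echo "hello" > "$live/a.txt";   touch -d '2 days ago' "$live/a.txt"
echo "hello" > "$backup/a.txt"; touch -d '2 days ago' "$backup/a.txt"

checksum_tree "$live"   > /tmp/live.sha256
checksum_tree "$backup" > /tmp/backup.sha256

# A non-empty diff means live data and backup disagree; in the setup
# described above this is where the weekly alert would be sent.
if diff -q /tmp/live.sha256 /tmp/backup.sha256 >/dev/null; then
    echo "OK: live data and backup agree"
else
    echo "MISMATCH: investigate before trusting the backup"
fi
rm -rf "$live" "$backup"
```

To keep a long run like this from pinning a core on a shared VPS, the whole script can be started under cpulimit, for example `cpulimit -l 25 sh check.sh` to cap it at roughly a quarter of one CPU.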
I'm not that used to advanced security like this, but like most of us I try to make the best choices for long-term use. On the side, I've read a few discussions about encryption methods: some tools use good encryption; others use double encryption, which sounds like a bad idea (as does relying on OpenSSL) according to some people with a security background; and some roll homemade encryption even though good, proven algorithms already exist. What do you use for your backups that supports S3 as a backend? I also want encryption, compression, and deduplication. I would prefer something free, so Duplicacy, which I use and like a lot at home for personal backups, is not an option. I've heard a few good things about Kopia and feel that may be the way I go. For my Linux server backups I really love BorgBackup, but I'm now looking for something that natively supports S3.
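For the Kopia route, a minimal sketch of pointing it at an S3 bucket might look like the following; the bucket name, credentials, and paths are placeholders, and the exact flags should be checked against the Kopia documentation for your version:

```shell
# Sketch only: create a Kopia repository in an S3 bucket, then snapshot
# a directory. Kopia encrypts everything with a repository password it
# prompts for at creation time, and its content-addressed chunking
# provides the deduplication.
kopia repository create s3 \
    --bucket=my-backup-bucket \
    --access-key="$AWS_ACCESS_KEY_ID" \
    --secret-access-key="$AWS_SECRET_ACCESS_KEY"

# Compression is controlled per policy rather than globally.
kopia policy set --global --compression=zstd

# Take, and later list, snapshots of the data directory.
kopia snapshot create /srv/data
kopia snapshot list
```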