
I need to back up my data and I have not found a good way so far.

Say I have a 1 TB non-system disk with 50-100 GB of user data (binary files, source code, images, etc.), and another big disk where I could save backups. I could use rsync or just cp, but I don't think that is what I want.

I want incremental backups: restore a file/folder/the whole drive from some point in time; load a backup from some point in time onto another disk (copy it, or just open it read-only); see changes between backups; and being able to add an optional comment would be nice. Does anybody know a good CLI backup tool? Maybe some snapshot tools? Or git? But git for 50 GB of user data; isn't that nonsense? :D

Paulo Tomé

1 Answer


+: Back In Time would be a good option: "Back In Time is a simple backup solution for Linux Desktops. It is based on rsync and uses hard-links to reduce space used for unchanged files." It is primarily GUI based, but it also has a command-line client. Its backups are incremental by default, one could say: files that already exist in a previous snapshot are hard-linked into the next backup rather than copied again. You can set intervals after which old backups are deleted. To restore, you just copy the folder back (or use the GUI or the command-line client). A config file (not exposed in the GUI) can be used to set further options, such as scripts that run before or after a backup, or that get triggered on errors. Backups can be triggered by certain events (like connecting your USB backup disk) and, of course, by cron jobs. A log file shows what was newly backed up, and you can have different backup profiles with completely different settings.

-: no disk images can be created; only files/folders are backed up. You cannot add a separate comment (but your profiles have names and backups have timestamps).
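Since restoring from such hard-link snapshots is plain file copying, it can be done without any special tool; a minimal sketch, with made-up snapshot paths:

```shell
#!/bin/sh
# Restoring from a hard-link snapshot by hand (hypothetical demo paths).
set -e

# Pretend this is one snapshot in a backup tree.
mkdir -p /tmp/demo-snapshots/2024-01-02/home/user
echo "restored content" > /tmp/demo-snapshots/2024-01-02/home/user/notes.txt
mkdir -p /tmp/demo-restore

# cp -a preserves permissions and timestamps; any hard links shared with
# other snapshots become independent copies at the destination.
cp -a /tmp/demo-snapshots/2024-01-02/home/user /tmp/demo-restore/
cat /tmp/demo-restore/user/notes.txt
```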

Jaleks