Questions tagged [backup]

Backing up is the process of making copies (backups) of data that can be used to restore the original after a data-loss event.

1360 questions
413
votes
2 answers

Compress a folder with tar?

I'm trying to compress a folder (/var/www/) to ~/www_backups/$time.tar where $time is the current date. This is what I have: cd /var/www && sudo tar -czf ~/www_backups $time" I am completely lost and I've been at this for hours now. Not sure if…
qwerty
  • 4,271
  • 5
  • 16
  • 13
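A working version of what the asker seems to be after might look like this (a sketch; a scratch directory stands in for `/var/www` so the example is self-contained, and the date format is an assumption):

```shell
src=$(mktemp -d); echo '<h1>hi</h1>' > "$src/index.html"   # stand-in for /var/www
dest=$(mktemp -d)                                          # stand-in for ~/www_backups
stamp=$(date +%Y-%m-%d)
# -C enters the source dir so the archive holds relative paths
tar -czf "$dest/$stamp.tar.gz" -C "$src" .
tar -tzf "$dest/$stamp.tar.gz"
```

Note that the archive name is one quoted argument; the asker's `~/www_backups $time"` fails because the unquoted space splits it into two arguments (and the stray quote is a syntax error).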
205
votes
11 answers

Rsync filter: copying one pattern only

I am trying to create a directory that will house all and only my PDFs compiled from LaTeX. I like keeping each project in a separate folder, all housed in a big folder called LaTeX. So I tried running: rsync -avn *.pdf ~/LaTeX/ ~/Output/ which…
Seamus
  • 3,553
  • 7
  • 25
  • 25
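A filter chain that copies only the PDFs while keeping the directory layout could look like this (a sketch with scratch directories; `-m` is `--prune-empty-dirs`, so folders with no PDFs are dropped):

```shell
src=$(mktemp -d); dest=$(mktemp -d)
mkdir -p "$src/proj1"
touch "$src/proj1/paper.pdf" "$src/proj1/paper.tex"
# traverse all directories, take *.pdf, exclude everything else
rsync -a -m --include='*/' --include='*.pdf' --exclude='*' "$src"/ "$dest"/
find "$dest" -type f
```

The order matters: the directory include (`*/`) must come before the catch-all exclude, or rsync never descends into subdirectories.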
121
votes
7 answers

Create a tar archive split into blocks of a maximum size

I need to backup a fairly large directory, but I am limited by the size of individual files. I'd like to essentially create a tar.(gz|bz2) archive which is split into 200MB maximum archives. Clonezilla does something similar to this by splitting…
Naftuli Kay
  • 38,686
  • 85
  • 220
  • 311
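One standard answer is to stream tar through `split`. A scaled-down sketch (16 KB parts here; `-b 200M` gives the real 200 MB limit):

```shell
src=$(mktemp -d); dest=$(mktemp -d)
dd if=/dev/urandom of="$src/payload" bs=1024 count=64 2>/dev/null
# stream the compressed archive into fixed-size pieces
tar -czf - -C "$src" . | split -b 16k - "$dest/backup.tgz.part-"
# restore by concatenating the pieces back into one stream
cat "$dest"/backup.tgz.part-* | tar -tzf -
```

Because `split` names parts in lexicographic order (`part-aa`, `part-ab`, …), a plain shell glob reassembles them in the right sequence.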
118
votes
20 answers

What is the fastest way to send massive amounts of data between two computers?

This is a situation I am frequently in: I have a source server with a 320GB hard-drive inside of it, and 16GB of ram (exact specs available here, but as this is an issue I run into frequently on other machines as well, I would prefer the answer to…
IQAndreas
  • 10,145
  • 21
  • 59
  • 79
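Most of the fast answers boil down to a tar pipe with a thin transport (netcat, or ssh) in the middle. The pattern itself, shown locally with scratch directories:

```shell
src=$(mktemp -d); dest=$(mktemp -d)
echo data > "$src/file"
# local pipe; across two machines the middle becomes e.g.
#   tar -cf - . | nc otherhost 7000      (sender)
#   nc -l 7000 | tar -xf -               (receiver)
tar -cf - -C "$src" . | tar -xf - -C "$dest"
```

Skipping compression and encryption is often the win on a fast LAN: the disks, not the wire, become the bottleneck.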
97
votes
6 answers

Clear unused space with zeros (ext3,ext4)

How do I clear unused space with zeros? (ext3, ext4) I'm looking for something smarter than cat /dev/zero > /mnt/X/big_zero ; sync; rm /mnt/X/big_zero Like how FSArchiver looks for "used space" and ignores unused, but the opposite. Purpose: I'd…
Grzegorz Wierzowiecki
  • 13,865
  • 23
  • 89
  • 137
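The brute-force approach is still the portable one; a sketch scaled down to a scratch directory (on a real filesystem you would let `dd` run until it hits ENOSPC, and tools like `zerofree` or `fstrim` are smarter where applicable):

```shell
mnt=$(mktemp -d)                  # stand-in for the real mount point
# fill free space with zeros, flush to disk, then delete the filler
dd if=/dev/zero of="$mnt/zero.fill" bs=1024 count=16 2>/dev/null
sync
rm "$mnt/zero.fill"
```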
93
votes
9 answers

Recovering accidentally deleted files

I accidentally deleted a file from my laptop. I'm using Fedora. Is it possible to recover the file?
C.S.
  • 1,783
  • 2
  • 18
  • 21
83
votes
7 answers

What directories do I need to back up?

What are the directories one should back up, in order to have a backup of all user-generated files? From a vanilla Debian install, I can do enough apt to get the packages that I want. So if I don't want to back up the entire system, where all in the…
user394
  • 14,194
  • 21
  • 66
  • 93
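One common shape of the answer: keep a list of user-data directories plus the package selections, so everything else can be reinstalled. A sketch (the directory list below is an assumption, not a canonical set):

```shell
list=$(mktemp)
cat > "$list" <<'EOF'
/etc
/home
/root
/var/mail
/usr/local
/opt
EOF
# then, as root:
#   dpkg --get-selections > /backup/pkglist
#   tar -czf /backup/system.tar.gz -T "$list"
echo "would archive: $(tr '\n' ' ' < "$list")"
```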
81
votes
8 answers

rsync all files of remote machine over SSH without root user?

I have this command to backup a remote machine. The problem is that I need root rights to read and copy all files. I have no root user enabled for security reasons and use sudo the Ubuntu way. Would I need some cool piping or something to do…
redanimalwar
  • 1,027
  • 1
  • 10
  • 13
72
votes
1 answer

Force rsync to overwrite files at destination even if they're newer

I have an rsync backup script I run, which also restores files back where they came from when I ask. But if the files at the destination are newer than those in the backup when I try to restore, it will not replace them. I really want to replace the…
jedipixel
  • 829
  • 1
  • 6
  • 4
61
votes
4 answers

How to tell rsync to preserve time stamps on files when the source tree has a mount point

Related to this question Short description of the problem: when the source tree has a mount point inside it, time stamps on files under that mount point are not preserved when copied to the target tree, even with the -a option. Detailed…
Nasser
  • 901
  • 2
  • 8
  • 12
55
votes
9 answers

Easy incremental backups to an external hard drive

For a while I used Dirvish to do incremental backups of my machines, but it is slightly cumbersome to configure, and if you do not carry a copy of your configuration it can be hard to reproduce elsewhere. I am looking for backup programs for Unix,…
miguel.de.icaza
  • 5,249
  • 3
  • 30
  • 25
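A common lightweight answer is rsync with `--link-dest`: each run produces a full-looking snapshot, but files unchanged since the previous run are hard links into it, so repeat snapshots cost almost nothing (this is the idea behind rsnapshot and Dirvish). A sketch:

```shell
drive=$(mktemp -d)                 # stand-in for the external disk
src=$(mktemp -d); echo a > "$src/f"
rsync -a "$src"/ "$drive/2024-01-01"/
# second run: unchanged files become hard links into the first snapshot
rsync -a --link-dest="$drive/2024-01-01" "$src"/ "$drive/2024-01-02"/
ls -li "$drive"/*/f                # same inode listed twice
```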
40
votes
4 answers

Fastest way to combine many files into one (tar czf is too slow)

Currently I'm running tar czf to combine backup files. The files are in a specific directory. But the number of files is growing. Using tar czf takes too much time (more than 20 minutes and counting). I need to combine the files more quickly and in…
Najib-botak Chin
  • 493
  • 2
  • 5
  • 7
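gzip, not tar, is usually the bottleneck: single-threaded compression caps throughput. The common answers are to skip compression entirely or to swap in a parallel compressor. A sketch:

```shell
src=$(mktemp -d)
for i in 1 2 3 4 5; do echo "data $i" > "$src/file$i"; done
# plain tar, no compression: close to raw disk speed
tar -cf "$src.tar" -C "$src" .
# parallel gzip, if pigz happens to be installed:
#   tar -cf - -C "$src" . | pigz > "$src.tar.gz"
tar -tf "$src.tar"
```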
34
votes
4 answers

How to run a command when a directory's contents are updated?

There is a directory A whose contents are changed frequently by other people. I have made a personal directory B where I keep all the files that have ever been in A. Currently, I just occasionally run rsync to get the files to be backed up from A to…
oadams
  • 2,305
  • 5
  • 21
  • 20
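The standard tool is inotifywait from inotify-tools (assumed installed on the target machine). Since the watcher loops forever, the sketch writes it to a script rather than running it:

```shell
cat > watch-A.sh <<'EOF'
#!/bin/sh
# mirror new and changed files from A into B on every change event
inotifywait -m -r -e create,modify,moved_to A |
while read -r path event file; do
    rsync -a A/ B/          # no --delete: B keeps everything A ever had
done
EOF
chmod +x watch-A.sh
```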
33
votes
9 answers

How to keep track of changes in /etc/

I would like to keep track of changes in /etc/. Basically I'd like to know if a file was changed, by yum update or by a user, and roll it back if I don't like the change. I thought of using a VCS like git, LVM or btrfs snapshots, or a backup program for…
taffer
  • 1,553
  • 11
  • 19
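etckeeper automates exactly this, but the underlying idea is just a git repository in /etc. A sketch against a scratch directory standing in for /etc:

```shell
etc=$(mktemp -d)                          # stand-in for /etc
echo 'nameserver 1.1.1.1' > "$etc/resolv.conf"
git -C "$etc" init -q
git -C "$etc" add -A
git -C "$etc" -c user.name=admin -c user.email=admin@localhost commit -qm baseline
echo 'nameserver 8.8.8.8' > "$etc/resolv.conf"   # a user (or yum) edits a file
git -C "$etc" status --short                     # shows what changed
git -C "$etc" checkout -- resolv.conf            # roll the change back
cat "$etc/resolv.conf"
```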
29
votes
8 answers

What to use to backup files, preserving ACLs?

When using the tar utility to store files in backups, one loses the extended ACLs. Is there some commonly used and not hackish solution (like: create a script that will recreate the ACLs from scratch) to preserve the ACLs?
silk
  • 1,512
  • 2
  • 14
  • 12
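Two common routes: GNU tar built with ACL support (`--acls` on both create and extract), or dumping the ACLs separately with getfacl and re-applying them after a plain restore. A sketch of the second route (guarded, since the acl tools may not be installed):

```shell
dir=$(mktemp -d); touch "$dir/report.txt"
acl=$(mktemp)
# dump ACLs next to a plain archive; re-apply them on restore
getfacl -R "$dir" > "$acl" 2>/dev/null || echo 'getfacl not available'
tar -czf "$dir.tar.gz" -C "$dir" .
# restore side: tar -xzf "$dir.tar.gz" -C /target && setfacl --restore="$acl"
```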