I have a directory containing a large number of files (such as one log for every day of the year).
I would like to remove all files created before, say, 22/11. How can I achieve that? Must I use `find` with `-exec rm`? I'm using ksh.
- The usual caveat is that Unix filesystems don't generally contain a record of when the file was *created* - only the times that the inode and the content were *last modified* are available. – Toby Speight Oct 06 '16 at 15:54
3 Answers
Using find is still the preferred way of deleting files. See http://mywiki.wooledge.org/UsingFind for more.
One way of doing this is to create a reference file whose modification time is the cut-off time-stamp, e.g.:
touch -t 201311220000 /tmp/timestamp
Now delete the files that are not newer than that time-stamp (assuming they are in the current directory). With GNU find:
find . -type f ! -newer /tmp/timestamp -delete
or with non-GNU find:
find . -type f ! -newer /tmp/timestamp -exec rm {} \;
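To see the whole workflow end to end, here is a minimal sketch in a throwaway directory (the file names are invented; the portable `-exec rm` form is used):

```shell
# Work in a throwaway directory so nothing real is at risk.
dir=$(mktemp -d)

# Create two files and backdate them to either side of the cut-off.
touch -t 201311150000 "$dir/old.log"   # modified before 22 Nov 2013
touch -t 201311250000 "$dir/new.log"   # modified after 22 Nov 2013

# Reference file carrying the cut-off time-stamp (22 Nov 2013, 00:00).
touch -t 201311220000 "$dir/timestamp"

# Delete regular files not newer than the reference, excluding the
# reference file itself.
find "$dir" -type f ! -name timestamp ! -newer "$dir/timestamp" -exec rm {} \;

ls "$dir"   # old.log is gone; new.log and timestamp remain
```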
- What about this? Does it work? `find -type f -mtime +5 -exec rm -f {} \;` – user1058398 Nov 27 '13 at 13:56
- @user1058398 That will delete files older than 5 days from now. – Valentin Bajrami Nov 27 '13 at 14:23
- @Avatar no, because the `-type f` limits the action to files only. – Valentin Bajrami Jun 15 '17 at 08:20
With GNU or some BSD finds:
find . ! -newermt 2013-11-22 ! -type d -delete
Note that it checks the last modification time of the files. On some BSDs, you can use -newerBt in place of -newermt to check the file's inode birth time if available instead.
Note that it will also delete the files created at 2013-11-22 00:00:00.0000000000 exactly. Not that any clock is that precise anyway, but that could cause problems for files whose timestamp has been arbitrarily set, such as with `touch -d 2013-11-22T00:00:00 some-file` (or `touch -d 2013-11-22` with some `touch` implementations). You could always change it to `! -newermt '2013-11-21 23:59:59.999999999999'` (GNU) or `! -newermt '2013-11-21 23:59:59'` (BSDs, though that would miss the files last modified within the last second of 2013-11-21).
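To see that boundary behaviour in action, a small sketch (GNU find and GNU touch assumed; file names invented):

```shell
dir=$(mktemp -d)

# One file stamped exactly at the cut-off instant, one later that day.
touch -d '2013-11-22 00:00:00' "$dir/boundary"
touch -d '2013-11-22 12:00:00' "$dir/later"

# "Not newer than 2013-11-22" includes the file stamped at midnight
# exactly, since an equal mtime is not strictly newer:
find "$dir" ! -newermt 2013-11-22 ! -type d
# prints only $dir/boundary
```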
find /path/to/directory/ -mtime +<number of days> -name '<file name>' -exec rm -rf {} \;
example:
find /Netap_fileshare_backup/SQL/DB_backups/xeo/ -mtime +15 -name 'ORA_XEO*' -exec rm -rf {} \;
In this case it will remove all files that start with "ORA_XEO" and are more than 15 days old.
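As a sandboxed sketch of the same pattern (GNU touch's relative `-d` syntax is assumed; the names are made up):

```shell
dir=$(mktemp -d)

touch "$dir/ORA_XEO_recent.dmp"               # modified just now
touch -d '20 days ago' "$dir/ORA_XEO_old.dmp" # old, name matches
touch -d '20 days ago' "$dir/unrelated.dmp"   # old, name does not match

# -type f restricts the match to plain files, so plain rm -f suffices
# and no directory is ever handed to rm -rf.
find "$dir" -type f -mtime +15 -name 'ORA_XEO*' -exec rm -f {} \;

ls "$dir"   # ORA_XEO_old.dmp is gone; the other two files remain
```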
- Though better watch it with the `rm -rf`: the first command would remove everything contained in any directories that are older than those 15 days. (Also, why the parentheses around `-name`?) – ilkkachu Oct 06 '16 at 14:01
- But if we want to add a condition it can be useful, for example if we want to remove all files that end with ".jar" or ".cp" and start with "ex": `example.jar` will be removed, `example.cp` will be removed, `example.tar` won't be removed. – calafate Oct 06 '16 at 15:03
- In this case we can use: `find /path/to/directory/ -mtime + \( -name '*.jar' -o -name '*.cp' \) -name 'ex*' -exec rm -rf {} \;` – calafate Oct 06 '16 at 15:07
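The grouping in that command can be checked in a scratch directory (names invented; the `-mtime` test is dropped here so the result doesn't depend on file ages):

```shell
dir=$(mktemp -d)
touch "$dir/example.jar" "$dir/example.cp" "$dir/example.tar" "$dir/other.jar"

# \( ... -o ... \) makes the OR bind before the implicit AND
# with the following -name 'ex*' test.
find "$dir" -type f \( -name '*.jar' -o -name '*.cp' \) -name 'ex*'
# prints example.jar and example.cp, not example.tar or other.jar
```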
- That will remove some files from the _ rather than all files older than _ unless you run it exactly at midnight and it finishes within a second. With GNU `find` you should use the `-daystart` modifier to coerce the `-mtime` value to midnight. – roaima Oct 06 '16 at 15:58
- Yes, thanks. In my case I run this with crontab every Sunday at midnight, so I don't have a problem with the hour. But thanks anyway, `-daystart` is a good tip to use! – calafate Oct 06 '16 at 16:22
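A short sketch of what `-daystart` changes (GNU find, touch and date assumed; the timestamps are pinned relative to today's midnight so the output doesn't depend on the time of day this runs):

```shell
dir=$(mktemp -d)

# One file modified right now, and one at noon yesterday
# (12 hours before today's midnight).
touch "$dir/today.log"
touch -d "$(date +%F) 12 hours ago" "$dir/yesterday.log"

# With -daystart, -mtime counts whole calendar days rather than
# 24-hour periods measured back from "now":
find "$dir" -type f -daystart -mtime 0   # prints today.log
find "$dir" -type f -daystart -mtime 1   # prints yesterday.log
```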