43

I am trying to write an if statement to test whether there are any files matching a certain pattern. If there is a text file in a directory it should run a given script.

My code currently:

if [ -f /*.txt ]; then ./script fi

Please give some ideas; I only want to run the script if there is a .txt in the directory.

Gilles 'SO- stop being evil'
user40952
  • Are you sure "the directory" is supposed to be `/`? Also, you're missing a semicolon before `fi`. – depquid Jun 13 '13 at 15:44
  • The cleanest robust solution I've encountered is to use `find` as explained [here on stackoverflow](https://stackoverflow.com/a/4264351/411282). – Joshua Goldberg Nov 18 '19 at 17:48

13 Answers

51
[ -f /*.txt ]

would return true only if there's one (and only one) non-hidden file in / whose name ends in .txt and if that file is a regular file or a symlink to a regular file.

That's because wildcards are expanded by the shell prior to being passed to the command (here [).

So if there's a /a.txt and /b.txt, [ will be passed 5 arguments: [, -f, /a.txt, /b.txt and ]. [ would then complain that -f is given too many arguments.
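A throwaway sketch (the scratch directory and file names are invented for the demo) makes the failure visible:

```shell
# With two matching files, the expanded [ command has too many operands
# and fails with an error instead of testing anything useful.
dir=$(mktemp -d)
cd "$dir" || exit 1
touch a.txt b.txt

[ -f *.txt ] 2>/dev/null   # [ actually sees: -f a.txt b.txt
status=$?
echo "exit status: $status"   # non-zero: [ rejected the extra operand
```

The exact error text varies between shells ("binary operator expected", "too many arguments", ...), but the test never does what was intended.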

If you want to check that the *.txt pattern expands to at least one non-hidden file (regular or not), in the bash shell:

shopt -s nullglob
set -- *.txt
if [ "$#" -gt 0 ]; then
  ./script "$@" # call script with that list of files.
fi
# Or with bash arrays so you can keep the arguments:
files=( *.txt )
# apply C-style boolean on member count
(( ${#files[@]} )) && ./script "${files[@]}"

shopt -s nullglob is bash specific (shopt is, nullglob actually comes from zsh), but shells like ksh93, zsh, yash, tcsh have equivalent statements.

With zsh, the test for "are there files matching a pattern" can be written using an anonymous function and the N (for nullglob) and Y1 (to stop after the first match) glob qualifiers:

if ()(($#)) *.txt(NY1); then
  do-something
fi

Note that these find the files by reading the contents of the directory; they don't try to access the files themselves at all, which makes them more efficient than solutions that call commands like ls or stat on the list of files computed by the shell.

The standard sh equivalent would be:

set -- [*].txt *.txt
case "$1$2" in
  ('[*].txt*.txt') ;;
  (*) shift; script "$@"
esac

The problem is that with Bourne or POSIX shells, if a pattern doesn't match, it expands to itself. So if *.txt expands to *.txt, you don't know whether that's because there's no .txt file in the directory or because there is one file called *.txt. Using [*].txt *.txt makes it possible to discriminate between the two.
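A quick sketch (invented scratch directory) of what the sentinel buys you:

```shell
dir=$(mktemp -d)
cd "$dir" || exit 1

set -- [*].txt *.txt
no_match=$1$2            # neither pattern matched: both stayed literal

touch '*.txt'            # a file literally named *.txt
set -- [*].txt *.txt
literal_file=$1$2        # both patterns now match that file

echo "$no_match"         # [*].txt*.txt
echo "$literal_file"     # *.txt*.txt
```

Only in the no-match case do the two patterns concatenate to the exact sentinel string `[*].txt*.txt` that the case statement checks for.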

Now, if you want to check that the *.txt matches at least one regular file or symlink to regular file (like your [ -f *.txt ] suggests you want to do), or that all the files that match *.txt are regular files (after symlink resolution), that's yet another matter.

With zsh:

if ()(($#)) *.txt(NY1-.); then
  echo "there is at least one regular .txt file"
fi
if ()(($#)) *.txt(NY1^-.); then
  echo "there is at least one non-regular .txt files"
fi

(remove the - if you want to do the test prior to symlink resolution, that is consider symlinks as non-regular files whether they point to regular files or not).

Stéphane Chazelas
  • `[ -f /*.txt ]` is quite fast in comparison to `compgen`. – Daniel Böhmer Nov 23 '16 at 22:06
  • @DanielBöhmer `[ -f /*.txt ]` would be wrong, but in my test on a directory that contains `3425` files, `94` of which are non-hidden txt files, `compgen -G "*.txt" > /dev/null 2>&1` appear to be as fast as `set -- *.txt; [ "$#" -gt 0 ]` (20.5 seconds for both when repeated 10000 times in my case). – Stéphane Chazelas Nov 24 '16 at 09:51
  • Failure: the "standard shell `sh` solution" also counts non-regular files (e.g. directories): any directory that matches the pattern will be included. –  Apr 24 '22 at 21:33
17

You could always use find:

find . -maxdepth 1 -type f -name "*.txt" 2>/dev/null | grep -q . && ./script

Explanation:

  • find . : search the current directory
  • -maxdepth 1: do not search subdirectories
  • -type f : search only regular files
  • -name "*.txt" : search for files ending in .txt
  • 2>/dev/null : redirect error messages to /dev/null
  • | grep -q . : grep for any character, will return false if no characters found.
  • && ./script : Execute ./script only if the previous command was successful (&&)
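Note that -maxdepth is a GNU/BSD extension rather than POSIX. A hedged sketch of the same check using the portable -prune idiom instead (the directory layout below is made up for illustration):

```shell
# -prune keeps find from descending into subdirectories, mimicking -maxdepth 1
dir=$(mktemp -d)
cd "$dir" || exit 1
mkdir sub
touch sub/nested.txt                 # must not count: it's one level down

find . ! -name . -prune -type f -name '*.txt' | grep -q . \
  && before=yes || before=no         # no .txt directly here yet

touch here.txt
find . ! -name . -prune -type f -name '*.txt' | grep -q . \
  && after=yes || after=no

echo "$before $after"                # no yes
```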
terdon
  • `find` only returns false if it has trouble looking for files, not if it doesn't find any file. You want to pipe the output to `grep -q .` to check if it finds something. – Stéphane Chazelas Jun 13 '13 at 16:57
  • @StephaneChazelas you're quite right of course. Weird though, I'd tested it and it seemed to work. Must have done something strange because it doesn't any more. When will find "have trouble finding files"? – terdon Jun 13 '13 at 17:10
  • @terdon, like when some directory is inaccessible, or I/O errors or any error returned by any system call it makes. In that case, try after `chmod a-x .`. – Stéphane Chazelas Jun 13 '13 at 17:33
16

A possible solution is also Bash builtin compgen. That command returns all possible matches for a globbing pattern and has an exit code indicating whether any files matched.

compgen -G "/*.txt" > /dev/null && ./script

I found this question while looking for faster solutions, though.
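If you use this check in several places, one option is a tiny wrapper; exists_glob is a made-up name for this sketch, which relies only on the fact that compgen -G exits non-zero when the pattern matches nothing:

```shell
# Bash-only sketch: compgen is a bash builtin
exists_glob() {
  compgen -G "$1" > /dev/null
}

dir=$(mktemp -d)
cd "$dir" || exit 1

exists_glob '*.txt' && before=yes || before=no   # nothing matches yet
touch notes.txt
exists_glob '*.txt' && after=yes || after=no

echo "$before $after"   # no yes
```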

Daniel Böhmer
  • 488
  • 4
  • 15
9

Here's a one-liner to do it:

$ ls
file1.pl  file2.pl

files exist

$ stat -t *.pl >/dev/null 2>&1 && echo "file exists" || echo "file doesn't exist"
file exists

files don't exist

$ stat -t -- *.txt >/dev/null 2>&1 && echo "file exists" || echo "file doesn't exist"
file doesn't exist

This approach makes use of the || and && operators in bash. These are the "or" and "and" operators.

So if the stat command returns a $? equal to 0 then the first echo is called, if it returns a 1, then the second echo is called.

return results from stat

# a failure
$ stat -t -- *.txt >/dev/null 2>&1
$ echo "$?"
1

# a success
$ stat -t -- *.pl >/dev/null 2>&1
$ echo "$?"
0

This question is also covered extensively over on Stack Overflow.

slm
    Why use the non standard `stat` when `ls -d` can do the same? – Stéphane Chazelas Jun 13 '13 at 17:03
  • I thought `ls -d` lists a directory? Didn't seem to work when I just tried listing a directory with files in it `ls -d *.pl` for example. – slm Jun 13 '13 at 17:36
  • You can replace the statement to the left of `&&` by `ls *.txt` and it will work as well. Make sure you send the stdout and stderr to `/dev/null` as suggested by @slm. – unxnut Jun 13 '13 at 17:50
  • If you use `ls *.txt` and there are no files present within the directory this will return a `$? = 2`, which will still work with the if-then, but this was one of my reasons for choosing `stat` over `ls`. I wanted a 0 for success, and a 1 for a failure. – slm Jun 13 '13 at 17:57
  • `ls -d` is to list directories instead of their content. So `ls -d` just does the `lstat` on the file, just like GNU `stat` does. What non-zero exit status commands return on failure is system specific, it makes little sense to make assumptions on them. – Stéphane Chazelas Jun 13 '13 at 18:58
  • Is it OK to assume that 0 is a success, and anything else is a failure then? – slm Jun 13 '13 at 18:59
  • It's the shell that expands the wildcard, your stat or `ls -d` is just there to return an error when the pattern expands to itself when there's no matching file. In this case, because the pattern matches itself, it will generally work (though could fail if some files have disappeared or appear in between the time the shell reads the content of the directory and `stat` does an `lstat(2)` on those files, or if `lstat(2)` fails for any reason), but in cases like `stat -- *.[cC]`, if there's no `c` nor `C` file in the current directory, but there's a file called `*.[cC]`, your `stat` will succeed. – Stéphane Chazelas Jun 13 '13 at 19:02
  • Sorry if I"m being dense but I don't understand the difference still b/w `stat` vs. `ls -d`. The man page states the following for `lstat()` - lstat() is identical to stat(), except that if path is a symbolic link, then the link itself is stat-ed – slm Jun 13 '13 at 19:29
6

As Chazelas points out, your script would fail if wildcard expansion matches more than one file.

However, there is a trick I use (even though I don't like it very much) to get around this:

PATTERN=(/*.txt)
if [ -f "${PATTERN[0]}" ]; then
    ...
fi

How does it work?

Wildcard expansion gives us an array of filenames; we take the first element if there are any matches. With no match (and no nullglob), the element is the literal pattern /*.txt, which the -f test then rejects.
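As the comments on this answer point out, running -f on only the first element checks just that one file. A hedged variant of the same trick that enables nullglob (so the array is truly empty on no match) and counts elements instead (paths are invented for the demo):

```shell
shopt -s nullglob                 # unmatched globs expand to nothing (bash)
dir=$(mktemp -d)

matches=( "$dir"/*.txt )
count_before=${#matches[@]}       # 0: the array is empty, not the literal pattern

touch "$dir/a.txt" "$dir/b.txt"
matches=( "$dir"/*.txt )
count_after=${#matches[@]}        # 2

echo "$count_before $count_after"
```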

peizhao
  • IMO this is the least bad answer here. They all seem pretty horrible though, as though a basic feature is missing from the language. – plugwash Dec 27 '15 at 18:29
  • @plugwash it's intentional... *nix shell scripts have some basic flow control and a few other odds and ends, but at the end of the day their job is to glue together other commands. If bash sucks... it's because the commands you use from it suck – cb88 Aug 11 '16 at 20:45
  • That's the wrong logic (and you're missing quotes). That checks whether the first matching file is a regular file. It may well be a non-regular file while several other `.txt` files are of type _regular_. Try for instance after `mkdir a.txt; mkfifo b.txt; echo regular > c.txt`. – Stéphane Chazelas Nov 24 '16 at 09:14
1

I find this approach pretty simple and readable in bash, without invoking too many magic bashisms:

if [[ $(ls /*.txt 2>/dev/null) != "" ]]; then ./script; fi
neu242
0

Simple as:

cnt=$(ls *.txt 2>/dev/null | wc -l)
if [ "$cnt" != "0" ]; then ./script; fi

wc -l counts the lines that ls prints for the expanded wildcard.

  • Downvote: This collects an amazing number of beginner antipatterns in a small amount of code. [You should not parse `ls` output](http://mywiki.wooledge.org/ParsingLs) and almost never examine `$?` directly because `if` already does that. Also, [using `wc` to see if something happened](http://www.iki.fi/era/unix/award.html#wc) is similarly misdirected. – tripleee May 13 '17 at 11:36
0

I like the array solution above, but it could become wasteful with large numbers of files: the shell would use a great deal of memory to build the array, and only the first element would ever be tested.

Here is an alternate structure that I tested yesterday:

$ cd /etc; if [[ $(echo * | grep passwd) ]]; then echo yes; else echo no; fi
yes
$ cd /etc; if [[ $(echo * | grep password) ]]; then echo yes; else echo no; fi
no

It is actually the output of the grep (a non-empty string, tested by [[ ]]) that determines the path through the control structure, rather than grep's exit value. This also tests with regular expressions rather than shell patterns. Some of my systems have the "pcregrep" command, which allows much more sophisticated regex matches.

(I did edit this answer to remove an "ls" in the command substitution after reading the above criticism for parsing it.)

Charlie
0

A more versatile method is:

IFS=
shopt -s nullglob
txt=('/'*'.txt')
if [ -f "$txt" ]; then ./script; fi

But this method works only if there is a single match.

Mario Palumbo
0

You can simply use the ls command together with command substitution and quoting:

if [ "$( ls *the folder you want to check* | grep *any pattern* )" != "" ]; then
    : *do anything you like*
fi
Hauke Laging
0

I would use a simple case statement within a for loop.
Given this list of files:

a1.txt
a2.jpg
y4.txt
z5.doc

Then execute my_script only against files with the .txt extension:

for file in * ; do
    case "${file}" in
        *.txt) ./my_script "${file}"
        ;;
    esac
done
baselab
-1

A way, in bash: if you want to know whether there are files matching the pattern *.txt, consider:

(
    shopt -s nullglob
    compgen -W *.txt &>/dev/null

    case $? in
        0) echo 'one match' ;;
        1) echo 'more than one match' ;;
        3) echo 'no match at all' ;;
    esac
)

The subshell ( ) is there only so that the shopt setting is reset to its default when the snippet finishes. You can use

shopt -u nullglob 

afterwards instead.

Note

It's different from compgen -G, because here we can discriminate between more cases.

Gilles Quénot
  • No, that's not how `compgen -W` is meant to be used, nor how it works. – Stéphane Chazelas Jul 19 '20 at 06:19
  • Removed the mention 'proper way', but still relevant answer. This code works as intended in my answer. Just a little hack – Gilles Quénot Jul 19 '20 at 12:57
  • `compgen -W something` splits `something` on $IFS and prints the resulting words one per line, except the empty ones. `compgen -W` does give a syntax error indeed. `compgen -W something something-else` could do anything, especially if `something-else` starts with `-` (think of a `-Creboot .txt` for instance). – Stéphane Chazelas Jul 19 '20 at 13:21
  • Ah, yes, sad that `compgen -W -- *.txt` doesn't help here – Gilles Quénot Jul 19 '20 at 13:26
-3

If you want to use an if clause, evaluate the count:

if (( $(ls *.txt 2>/dev/null | wc -l) )); then ...
Rusty75