
I tried to open a lot of files (10,000+) with `cat` and merge them into one like this:

cat * > ~/Desktop/lol.xml

But it returned this error:

-bash: /bin/cat: Argument list too long

This means that the argument list is too long. What other way could I do this?

DisplayName
  • You could check the answer [here](http://stackoverflow.com/a/18699165/1742825). – Ramesh Nov 10 '14 at 16:25
  • 1
    Also, some more useful read from [here as well](http://unix.stackexchange.com/questions/118244/fastest-way-to-concatenate-files). – Ramesh Nov 10 '14 at 16:30

2 Answers

find . -maxdepth 1 -type f -exec cat {} + > ~/Desktop/lol.xml

This calls `cat` with the maximum possible number of arguments; for any remaining arguments, new instances of `cat` are started.
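A small sketch of how `-exec … {} +` batches arguments (the scratch directory and file names below are made up for illustration; the output path is arbitrary):

```shell
# Build a scratch directory with a couple of sample files.
dir=$(mktemp -d)
cd "$dir"
printf 'a\n' > one.txt
printf 'b\n' > two.txt
# -exec ... {} + packs as many file names into one cat invocation as fit
# under the system limit; find starts additional cat processes only if
# the list overflows. Redirect outside the directory so the output file
# is not picked up by find itself.
find . -maxdepth 1 -type f -exec cat {} + > /tmp/cat_plus_demo.out
```

With only two files this runs a single `cat`, but the same command scales to the 10,000+ files in the question.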

Hauke Laging
  • Huh, that was quick, could you provide an explanation for this? It seems like it launches another `cat` process for every file. – DisplayName Nov 10 '14 at 16:25
  • 2
    `-maxdepth` is not POSIX, you can use `find ! -name . -prune -type f` instead. – cuonglm Nov 10 '14 at 16:28
  • Is this compatible with FreeBSD (OS X) tools? Just checking before. – DisplayName Nov 10 '14 at 16:33
  • @DisplayName I don't know. `-exec +` is part of the standard but has not always been. If that doesn't work then you can get a similar effect with `find ... -print | xargs -d '\n' cat >...` assuming there are no newlines in the file names. – Hauke Laging Nov 10 '14 at 17:21
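The `xargs` fallback from the last comment can be sketched like this (note `xargs -d` is a GNU extension, so this variant is not portable to stock BSD/OS X tools; paths and file names here are made up):

```shell
# Scratch directory with sample files.
dir=$(mktemp -d)
cd "$dir"
printf 'x\n' > a.txt
printf 'y\n' > b.txt
# find -print emits one file name per line; xargs -d '\n' splits on
# newlines only, so names containing spaces survive intact (names
# containing newlines would not, as the comment warns).
find . -maxdepth 1 -type f -print | xargs -d '\n' cat > /tmp/xargs_demo.out
```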

There's a limit (`ARG_MAX`) on the total size of the arguments a command can receive when it is executed. A workaround is to use a `for` loop: the glob is expanded by the shell itself, and each `cat` invocation receives only a single file name:

for file in *; do cat "$file"; done > ~/Desktop/lol.xml

The maximum can be displayed with:

$ getconf ARG_MAX
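Putting the loop together in a self-contained sketch (scratch directory and output path are made up for illustration):

```shell
# Create a few sample files in a scratch directory.
dir=$(mktemp -d)
cd "$dir"
for i in 1 2 3; do printf '%s\n' "$i" > "file$i.txt"; done
# The shell expands * internally, so ARG_MAX is never hit no matter how
# many files there are; each cat sees exactly one argument. The single
# redirect after `done` opens the output file once for the whole loop.
for file in *; do cat "$file"; done > /tmp/loop_demo.out
```

The trade-off versus the `find -exec … +` answer is one `cat` process per file, which is slower for 10,000+ files but works with any POSIX shell.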
Gilles Quénot