How can I list the number of lines in the files in /group/book/four/word, sorted by the number of lines they contain?
The ls -l command lists the files but does not sort them.
You should use a command like this:
find /group/book/four/word/ -type f -exec wc -l {} + | sort -rn
find : searches for files under the path you want. If you don't want it to be recursive, and your find implementation supports it, add -maxdepth 1 just before the -exec option.
-exec : tells find to execute wc -l on every file found.
sort -rn : sorts the results numerically in reverse order, from greater to lower.
(This assumes file names don't contain newline characters.)
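For instance, the non-recursive variant could be sketched like this (assuming your find supports -maxdepth, a common GNU/BSD extension; /group/book/four/word/ is the directory from the question):

```shell
# Count lines only in regular files directly under the directory
# (no recursion), sorted from most lines to fewest. Note that when
# wc is given several files it also prints a "total" line, which
# will show up in the sorted output.
find /group/book/four/word/ -maxdepth 1 -type f -exec wc -l {} + | sort -rn
```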
Probably the simplest version, if you don't need recursion:
wc -l /group/book/four/word/* | sort -n
wc counts lines (option -l) in every non-hidden file (*) under /group/book/four/word/, and sort sorts the result (through the pipe |) numerically (option -n).
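One caveat with this simple version: when wc is given more than one file, it also prints a total summary line, which ends up in the sorted output. A sketch for filtering it out (assuming no file is literally named total):

```shell
# Drop wc's trailing "total" summary line before sorting.
wc -l /group/book/four/word/* | grep -v ' total$' | sort -n
```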
Someone posted a comment on this answer mentioning grep -rlc, before deleting it. Indeed, grep is a great alternative, especially if you need recursion:
grep -rc '^' /group/book/four/word/|tr ':' ' '|sort -n -k2
will count (option -c) lines recursively (option -r) matching '^' (that is, the beginning of a line) in the directory /group/book/four/word/. You then have to replace the colon with a space, e.g. using tr, to help sort, which you want to sort numerically (option -n) on the second column (option -k2).
Update: See Stephane's comment about possible limitations and how you can actually get rid of tr.
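One way to avoid tr, along the lines of that comment, is to tell sort itself to split fields on the colon (this still misbehaves for file names containing a colon):

```shell
# Same recursive count, but sort splits on ':' directly:
# -t: sets the field separator, -k2 -n sorts numerically on the count.
grep -rc '^' /group/book/four/word/ | sort -t: -k2 -n
```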
With zsh:
lines() REPLY=$(wc -l < $REPLY)
print -rC1 /group/book/four/word/*(.no+lines)
We define a new sorting function lines that replies with the number of lines in the file, and use the o+lines glob qualifier, which together with n (for numeric sort) defines how the results of the glob are ordered (. is also added to only match regular files).
That makes no assumption about what characters the file names may contain, other than that hidden files (those starting with .) are omitted. Add the D glob qualifier if you want them as well.
Defining a lines function is useful when you often do something like that, but for a one-off, you could also do it in one go with:
print -rC1 /group/book/four/word/*(.noe['REPLY=$(wc -l < $REPLY)'])
From another shell, simply run:
zsh -c '
print -rC1 /group/book/four/word/*(.noe['\''REPLY=$(wc -l < $REPLY)'\''])'
Or to store it in a ksh93/bash array if you really had to use those shells:
typeset -a array
eval "
array=(
$(
zsh -c '
() {
print -r -- "${(qq)@}"
} /group/book/four/word/*(N.noe['\''REPLY=$(wc -l < $REPLY)'\''])'
)
)
"
(Here we use proper single-quote quoting, with the qq parameter expansion flag, so that the evaluation is safe.)
You don't specify whether you also want the files in any subdirectories of /group/book/four/word. The find solution in jherran's answer will descend into subdirectories. If that is not wanted, use the shell instead:
for file in ./*; do [ -f "$file" ] && wc -l "$file"; done | sort -n
If your file names can contain newlines, you can use something like:
for file in ./*; do
[ -f "$file" ] &&
printf "%lu %s\0" "$(wc -l < "$file")" "$file"
done | sort -zn | tr '\0' '\n'
Finally, if you do want to descend into subdirectories, you can use this in bash 4 or above:
shopt -s globstar
for file in ./**/*; do [ -f "$file" ] && wc -l "$file"; done | sort -n
Note that versions of bash prior to 4.3 followed symlinks when recursively descending the directory tree (like zsh's or tcsh's ***/*).
Also, all the solutions above will ignore hidden files (those whose name starts with a .; use shopt -s dotglob to include them) and will include the line count of symbolic links (which the find approach will not).
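If the line counts of symbolic links are not wanted either, you can add a test to the loop; note that [ -f ] alone is true for a symlink to a regular file, so -L has to be checked separately (a sketch, not from the original answer):

```shell
# Count lines only in regular files that are not symlinks.
for file in ./*; do
  [ -f "$file" ] && [ ! -L "$file" ] && wc -l "$file"
done | sort -n
```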
If you are willing to install fd, a really fast file finder written in Rust (it's great to have anyway):
fd --type=file -0 . | xargs -0 wc -l | sort -n
Basically, fd lists the files, xargs passes that list to wc (which stands for word count, but the -l option makes it count lines), and finally the output is sorted from fewest lines to most using sort -n. The -0 and xargs -0 options keep file names containing spaces or newlines intact.
Since the solution provided by @SkippyleGrandGourou didn't work for me, here is my recursive solution using find:
find <folder> -name "<filter>" -exec wc -l {} \; | sort -n
Example:
find . -name "*.jsp" -exec wc -l {} \; | sort -n