I know I can get disk usage of the files/directories in a directory like this:

for file in $(ls); do du --hum --sum $file; done

That seems to break down if the files/directories have spaces in their names. So I tried this:

find . -maxdepth 1 -type d -print0 | xargs -0 du --hum --sum

That yields only this:

2.3G    .

Whereas there are 8 subdirectories in my directory.

jsf80238
  • See: https://mywiki.wooledge.org/ParsingLs, https://mywiki.wooledge.org/WordSplitting and [Why does my shell script choke on whitespace or other special characters?](https://unix.stackexchange.com/questions/131766/why-does-my-shell-script-choke-on-whitespace-or-other-special-characters) – ilkkachu Apr 06 '21 at 23:17
  • Also, is there something wrong with just `du -h -s ./*`? Anyway, that `find` fails because it includes `.` in the output, and `du` is smart enough to notice all the other directories are under it. – ilkkachu Apr 06 '21 at 23:22
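The comment's diagnosis suggests the fix: keep `.` out of `find`'s results so `du` reports each subdirectory on its own line. A minimal sketch (directory names and the `/tmp` path are made up for the demo; `-mindepth 1` excludes the starting point itself):

```shell
# Hypothetical demo directory; -mindepth 1 keeps "." out of find's
# output, so du no longer collapses everything into one grand total.
mkdir -p /tmp/du-demo/"dir with spaces" /tmp/du-demo/plain_dir
cd /tmp/du-demo

# NUL-delimited names survive spaces; du now gets one line per subdirectory
find . -mindepth 1 -maxdepth 1 -type d -print0 | xargs -0 du -h -s
```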

2 Answers


You can also just use the `--max-depth` option and drop `--sum` from your du invocation, like so:

du --hum --max-depth=1

It will also display directories with spaces.

Here's example output demonstrating that directories with spaces will show up:

4.0K    ./regular_dir1
4.0K    ./regular_dir2
4.0K    ./dir with spaces
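That output can be reproduced in a throwaway directory (names taken from the example above; the `/tmp` path is made up, and the reported sizes depend on the filesystem's block size, so they may differ from the 4.0K shown):

```shell
# Recreate the answer's layout and let du report one line per
# top-level directory, plus a final total for "." itself.
mkdir -p /tmp/depth-demo/regular_dir1 /tmp/depth-demo/regular_dir2
mkdir -p /tmp/depth-demo/"dir with spaces"
cd /tmp/depth-demo
du --human-readable --max-depth=1
```

Note that, unlike the per-directory `--sum` calls, this form also prints a trailing total line for `.`.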
EWJ00

Not as good as EWJ00's answer, but this also works:

find . -maxdepth 1 -type d -print0 | while IFS= read -r -d '' file; do du --hum --sum "$file"; done
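A quick check, in a throwaway directory (the names and `/tmp` path are hypothetical), that the loop handles spaces; `IFS=` and `-r` are added so `read` neither trims whitespace nor interprets backslashes, and `-d ''` splits on the NULs that `-print0` emits (bash syntax):

```shell
# Two demo directories, one with a space in its name
mkdir -p /tmp/loop-demo/"a b" /tmp/loop-demo/c
cd /tmp/loop-demo

# One du invocation per directory found: ".", "./a b" and "./c"
find . -maxdepth 1 -type d -print0 |
  while IFS= read -r -d '' file; do du -h -s "$file"; done
```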
jsf80238