coreutils

finding directories with many files in a file system


From: Bernhard Voelker
Subject: finding directories with many files in a file system
Date: Thu, 18 Jul 2013 11:25:29 +0200
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:17.0) Gecko/20130329 Thunderbird/17.0.5

I have e.g. a file system where most of the inodes are in use,
let's say 450k of 500k.  What command(s) would I use to find out
which sub-directories are eating most of the inodes?
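
(For reference, per-file-system figures like the above come from
df's -i option; the output below is illustrative, reusing the
numbers above:

  $ df -i .
  Filesystem      Inodes  IUsed  IFree IUse% Mounted on
  /dev/sda1       500000 450000  50000   90% /
)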

Well, I could use something like this:

  $ find . -xdev -type d \
    | while IFS= read -r f ; do \
        printf '%d %s\n' "$(find "$f" -xdev | wc -l)" "$f" ; \
      done \
    | sort -k1,1n

But this a) is lame and b) doesn't count hard links well:
a file with several links is counted once per name rather than
once per inode.
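
A rough sketch that counts each inode only once, assuming GNU
find's -printf and filenames without embedded newlines:

  $ find . -xdev -printf '%i %h\n' \
      | sort -u -k1,1 \
      | cut -d' ' -f2- \
      | sort | uniq -c | sort -k1,1n

Here "sort -u -k1,1" keeps one line per inode number, so hard
links are counted once; %h attributes each inode to its parent
directory, and the counts are per directory rather than
cumulative.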

Do we have such a command (or option) already?
If not, what about a new du(1) option that reports inode counts
instead of block/byte statistics, i.e. "du --inodes"?
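
Usage could then mirror the block-count case, something like
(hypothetical invocation, reusing du's existing -x /
--one-file-system flag):

  $ du --inodes -x . | sort -n | tail

i.e. print the ten directories holding the most inodes on this
file system.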

Have a nice day,
Berny


