coreutils

Re: finding directories with many files in a file system


From: Joseph D. Wagner
Subject: Re: finding directories with many files in a file system
Date: Thu, 18 Jul 2013 08:55:33 -0700
User-agent: Roundcube Webmail/0.8.6

On 07/18/2013 2:25 am, Bernhard Voelker wrote:

I have e.g. a file system where most of the inode space is used,
let's say 450k of 500k inodes. What command(s) would I use to find
out which sub-directories are eating most of the inodes?

Well, I could use something like this:

$ find . -xdev -type d \
  | while read f ; do
      printf "%d %s\n" "$(find "$f" -xdev | wc -l)" "$f" ;
    done \
  | sort -k1,1n

But this a) is lame and b) doesn't count hardlinks well.

Do we have such a command (option) already?
If not, what about a new du(1) option that reports inodes
instead of blocks/bytes statistics, i.e. "du --inodes"?

Have a nice day,
Berny
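
A somewhat safer variant of Berny's pipeline, using -print0 so that filenames with spaces or backslashes survive the loop (this assumes GNU find and bash; like the original, it counts directory entries rather than inodes, so hard links are still over-counted):

```shell
# Count entries under each subdirectory, largest last.
# Note: this counts names, so a file with several hard links is
# counted once per name, not once per inode.
find . -xdev -type d -print0 |
while IFS= read -r -d '' dir; do
    printf '%d %s\n' "$(find "$dir" -xdev | wc -l)" "$dir"
done |
sort -k1,1n
```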

It is sometimes hard to find a generic solution to a filesystem-specific problem.

Which filesystem are you using?

Joseph D. Wagner
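
For readers arriving at this archive later: the option Berny proposes did land in GNU coreutils (du gained --inodes in release 8.22, after this thread). With a new enough du, the task reduces to a single hard-link-aware command:

```shell
# GNU coreutils >= 8.22: report inode usage per directory instead of
# disk usage; -x stays on one file system, like find's -xdev.
du --inodes -x . | sort -n | tail -n 5
```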


