
Re: finding directories with many files in a file system


From: Philip Rowlands
Subject: Re: finding directories with many files in a file system
Date: Fri, 19 Jul 2013 00:11:35 +0100
User-agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:17.0) Gecko/20130620 Thunderbird/17.0.7

On 18/07/2013 10:25, Bernhard Voelker wrote:
> I have e.g. a file system where most of the inode space is used,
> let's say 450k of 500k.  What command(s) would I use to find out
> which sub-directories are eating most of the inodes?
>
> Well, I could use something like this:
>
>    $ find . -xdev -type d \
>      | while IFS= read -r f ; do \
>          printf '%d %s\n' "$(find "$f" -xdev | wc -l)" "$f" ; \
>        done \
>      | sort -k1,1n
>
> But this a) is lame and b) doesn't count hard links well.
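
(As an aside: per-filesystem inode usage such as the 450k/500k figure
above is what "df -i" reports; the device, mount point, and numbers
below are only illustrative.)

   $ df -i /srv/data
   Filesystem      Inodes  IUsed IFree IUse% Mounted on
   /dev/sda1       500000 450000 50000   90% /srv/data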

This gives the non-cumulative total per directory:

   $ find . -xdev -printf '%h\n' | sort | uniq -c | sort -n

but doesn't handle hard links. You could use -printf '%h %i\n' instead
and post-process the duplicate inodes (since findutils 4.5.4, printing
%i no longer costs a stat per file), as sketched below.
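
A minimal sketch of that post-processing, assuming GNU find/sort and no
newlines in file names; the inode is printed first so a directory name
containing spaces survives as the rest of the line, and which directory
a multiply-linked file gets credited to is arbitrary:

   # keep one line per inode (sort -u compares only the key field),
   # then strip the inode and count directories as before
   $ find . -xdev -printf '%i %h\n' \
       | sort -k1,1n -u \
       | sed 's/^[0-9]* //' \
       | sort | uniq -c | sort -n

And if a cumulative total (like the script quoted above produces) is
wanted, one possible awk roll-up that credits each file to every
ancestor directory, still ignoring hard links:

   $ find . -xdev -printf '%h\n' \
       | awk -F/ '{ p = $1; n[p]++
                    for (i = 2; i <= NF; i++) { p = p "/" $i; n[p]++ } }
                  END { for (d in n) printf "%d %s\n", n[d], d }' \
       | sort -k1,1n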

Cheers,
Phil


