[TriLUG] Hunting Down Large Files

Knowles, Christopher cknowles at sumitomoelectric.com
Thu Mar 17 07:53:51 EST 2005


Well, given that you need space to install it, you'll probably want to go with 
Chip's suggestion at first...

But for graphical clicky tools, I really like kdirstat.

http://kdirstat.sourceforge.net/

It recursively scans the directory of your choice and lets you delete, move, 
or compress files, directories, or whatever.  Pretty nice.  It has helped me 
with my occasional hoarding habits.
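
If you just want a quick text-mode version of the same overview, a rough 
sketch like this should do (assuming GNU du and sort are installed):

  # total size of each item in the current directory (in KB), biggest last
  du -sk * | sort -n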

CJK

On Thursday 17 March 2005 01:48 am, Aaron Bockover wrote:
> Nautilus gave me a shocking message a few moments ago: not enough disk
> space.
>
> A quick df -h showed that my primary volume was indeed 100% used. Sad
> really. I am a data hoarder. It's compulsive.
>
> My question is this: is there a shell tool or script in existence that
> can be run on a given base directory and calculate the total size of the
> *files* in a directory (not recursively), and report those larger than
> some limit? Also something to report single large files larger than some
> limit?
>
> If a tool like this exists, it might help me reduce some of my clutter.
> I know I should probably get rid of those massive ISOs from five years
> ago, but what if I need RH 6 next week?! I'm trying to avoid that route.
>
> If something like this doesn't exist, I think I may have to write one.
> I'd love to hear thoughts on how others manage their heaps of data.
> Fortunately most of mine is somewhat organized, but organization comes
> in phases... dig through data -- organize data -- collect data --
> realize data is unorganized -- repeat.
>
> Regards,
> Aaron Bockover
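
For the scripted version of what Aaron is asking above, a minimal sketch 
along these lines might work (assuming GNU find and du; the size limit and 
/path/to/tree are just placeholders to adjust):

  # size of each directory counting only its own files, not subdirectories
  # (GNU du's -S/--separate-dirs does the "not recursively" part), biggest last
  du -Sk /path/to/tree | sort -n | tail -n 20

  # report single files larger than roughly 500 MB, with sizes
  find /path/to/tree -type f -size +500000k -exec ls -lh {} \;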


