[TriLUG] Hunting Down Large Files
Aaron Bockover
abockover.trilug at aaronbock.net
Thu Mar 17 01:48:39 EST 2005
Nautilus gave me a shocking message a few moments ago: not enough disk
space.
A quick df -h showed that my primary volume was indeed 100% used. Sad,
really. I am a data hoarder. It's compulsive.
My question is this: is there an existing shell tool or script that,
given a base directory, calculates the total size of the *files* in each
directory (not recursively, i.e. not counting subdirectories' contents)
and reports the directories that exceed some limit? And, similarly,
something to report individual files larger than some limit?
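For reference, here is a rough sketch of the sort of thing I have in
mind, assuming GNU find and du (du's -S/--separate-dirs counts only the
files directly inside each directory, and find's -printf is a GNU
extension); the arguments and the 100M default are just placeholders:

```shell
#!/bin/sh
# Usage: bigfiles.sh [base-dir] [size-limit]
# base directory to scan (first argument; defaults to the current dir)
base="${1:-.}"
# size threshold in find's -size syntax (second argument; default 100 MB)
limit="${2:-100M}"

# 1) individual files larger than the limit, biggest first
#    (%s = size in bytes, %p = path; GNU find only)
find "$base" -type f -size "+$limit" -printf '%s\t%p\n' | sort -rn

# 2) per-directory totals in KB, counting only each directory's own
#    files (-S excludes subdirectory contents); top 20, biggest first
du -S -k "$base" | sort -rn | head -20
```

Something like that, but with a nicer report, is what I'm picturing.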
If a tool like this exists, it might help me reduce some of my clutter.
I know I should probably get rid of those massive ISOs from five years
ago, but what if I need RH 6 next week?! I'm trying to avoid that route.
If something like this doesn't exist, I think I may have to write one.
I'd love to hear thoughts on how others manage their heaps of data.
Fortunately most of mine is somewhat organized, but organization comes
in phases... dig through data -- organize data -- collect data --
realize data is unorganized -- repeat.
Regards,
Aaron Bockover