[TriLUG] Hunting Down Large Files
Aaron S. Joyner
aaron at joyner.ws
Thu Mar 17 07:40:47 EST 2005
Chip Turner wrote:
>Who needs tools when the OS comes with all you need!
>
>
Good call Chip, you beat me to it! :) Now for a little obsessive
optimization...
># top five largest files/dirs
>du -s * | sort -n | tail -5
>
>
Looks great!
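One small tweak, if you want numbers that are stable regardless of your
block-size settings (a sketch, assuming a du that supports -k, which
POSIX requires): force kilobyte units before sorting.
du -sk * | sort -n | tail -5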
># largest 5 files
>ls -lSr | tail -5
>
>
You can use head instead of tail here, and perhaps save a tiny touch of
time, since head stops reading as soon as it has its lines while tail
has to wait for the full listing. Something like this:
ls -lS | head -6
(That's -6 rather than -5, because ls -l prints a "total" line first.)
It doesn't seem like much, but across a series of directories full of
tiny files it can add up, especially when we start stacking it in a
minute... (the stacking actually doesn't work out, as find and shell
redirection don't mix for me as well as I'd hoped; see the sketch
below).
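For anyone curious what that stacking would look like: find's -exec
runs a single command, not a pipeline, so the pipe has to live inside a
shell of its own. A hypothetical sketch (and a slow one, since it forks
a shell per directory):
find / -type d -exec sh -c 'ls -lS "$1" | head -6' sh \{\} \;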
># largest 5 space consumers, either dirs or files, recursively
>du | sort -n | tail -5
>
>Add a -x to du if you don't want to walk across filesystem boundaries.
>
>
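To illustrate that flag (a sketch, assuming you're hunting on the root
filesystem and want du to skip /proc, NFS mounts, and the like):
du -x / | sort -n | tail -5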
And the icing on the cake: if you want one command which tells you the
20 largest files on your filesystem, you can use something like this:
find / -type d -exec ls -sS \{\} \; > /tmp/asjout.tmp; sort -rn /tmp/asjout.tmp | head -20
It's only sort of one command, as I had to cheat and use a temporary
file because of the way find generates its output. :( Also, beware that
this command can take a while to run. It doesn't throw away error
output either, so you'll want to run it as root, both to get accurate
counts and to minimize trash on the screen.
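For what it's worth, the temp file seems avoidable by piping find's
output straight into sort, and 2>/dev/null mops up the error output
mentioned above. A sketch, assuming GNU find and xargs with the
-print0/-0 null-separator extensions:
find / -type f -print0 2>/dev/null | xargs -0 ls -s | sort -rn | head -20
Asking ls about files directly (rather than directory listings) also
keeps the "total" lines out of the sort.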
Running it across the 250G of disk in this box, which is mostly full,
took about 69 seconds. That wasn't my first run of find, though, so the
filesystem metadata was probably all cached at that point. Your results
may vary. :)
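If you'd like to time it yourself, bash's time keyword covers the whole
pipeline (a sketch, using the temp-file-free variant above):
time find / -type f -print0 2>/dev/null | xargs -0 ls -s | sort -rn | head -20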
Enjoy!
Aaron S. Joyner