[TriLUG] question about 2 GB file size limitation in Linux
Jeremy P
jeremyp at pobox.com
Thu Dec 20 12:05:38 EST 2001
On Thu, 20 Dec 2001, Geoffrey Douglas Purdy wrote:
> I'm running into a problem analyzing large files ( > 2 GB ) under
> Linux. The analysis application fails with input files greater than 2
> GB in size but works on input of 1.9 GB or less. I hope to identify
> whether the 2 GB limitation lies at the application level, in the ext2
> filesystem, or the Linux kernel.
>
> I can perform copy, cat, tail operations on files as large as 5 GB
> without problems so I suspect that the limitation isn't the 2.4 kernel
> or ext2, however I've seen a "2 GB file size limit in Linux" mentioned
> frequently. Does this limit refer only to older kernels?
The 2GB file limit is now only a problem in older versions of [g]libc, not
the kernel or filesystem. Make sure you're running the latest glibc (and
understand the caveats with upgrading it). But you're right that it's
probably an application issue -- files over 2 GB require the 64-bit large
file API (a 64-bit off_t), so if your application wasn't built to use it,
it will fail on them. It might not be too hard to modify the app if you
know a little C (and it is open source). If your app can work on a
stdin/stdout basis, that may help get around the problem too.
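As a rough sketch (the filename handling and offsets below are made up for
illustration, not anything from your analysis app), this is what the
"transparent" large file API looks like in C. Compile with
-D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE (or whatever `getconf LFS_CFLAGS`
reports on your system) so that off_t and the ordinary open()/lseek()/read()
calls become 64-bit clean:

/* bigread.c -- minimal sketch of reading past the 2 GB mark.
 * Build: gcc -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -o bigread bigread.c
 */
#include <stdio.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>

int main(int argc, char *argv[])
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <file>\n", argv[0]);
        return 1;
    }

    /* With _FILE_OFFSET_BITS=64 this open() accepts files > 2 GB;
     * without it, opening such a file fails (typically EOVERFLOW). */
    int fd = open(argv[1], O_RDONLY);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    /* off_t is 64 bits here, so seeking beyond 2 GB works. */
    off_t where = lseek(fd, (off_t)3 * 1024 * 1024 * 1024, SEEK_SET);
    if (where == (off_t)-1) {
        perror("lseek");
        close(fd);
        return 1;
    }

    char buf[4096];
    ssize_t n = read(fd, buf, sizeof buf);
    printf("read %ld bytes at offset %lld\n", (long)n, (long long)where);

    close(fd);
    return 0;
}

Often the only change an application really needs is a rebuild with those
two defines, as long as it doesn't stash file offsets in plain int/long
variables.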
A slightly outdated document here describes some of the issues:
http://www.suse.de/~aj/linux_lfs.html
Read the glibc documentation for details on the large file API.
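In particular, the glibc manual covers the explicit 64-bit interfaces
(open64, lseek64, stat64, fopen64 and so on, enabled by defining
_LARGEFILE64_SOURCE), which can be handy if you only want to fix a few
call sites instead of rebuilding everything with a 64-bit off_t. A minimal
sketch, again with made-up values:

/* Explicit 64-bit calls; the rest of the program keeps its 32-bit off_t. */
#define _LARGEFILE64_SOURCE
#include <stdio.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>

int main(int argc, char *argv[])
{
    if (argc != 2)
        return 1;

    struct stat64 st;
    if (stat64(argv[1], &st) == 0)       /* size reported correctly even > 2 GB */
        printf("%s is %lld bytes\n", argv[1], (long long)st.st_size);

    int fd = open64(argv[1], O_RDONLY);  /* no 2 GB limit on this descriptor */
    if (fd >= 0) {
        lseek64(fd, (off64_t)5 * 1024 * 1024 * 1024, SEEK_SET);
        close(fd);
    }
    return 0;
}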
Hope this helps,
Jeremy