[TriLUG] compiling binaries w/ large file support
Douglas Kojetin
djkojeti at unity.ncsu.edu
Sat Jul 24 11:03:21 EDT 2004
Hi All-
This question is a bit out of my realm of knowledge (I'm not a
programmer; I more or less hack at things until they work, mostly in
scripting languages), but I'm desperate for a workaround. I have the
source code (C) to a program that I use on a regular basis. The code
was initially written years ago, and while it has been updated
regularly, it apparently lacks one thing that I am having trouble
with: large file support.
Specifically, I am working with many input files (in this particular
case, 14723 of them, each about 284K). The output file the program
creates for this conversion is approximately 2.2G. The conversion gets
to a certain point and then dumps core, spitting out this number:
2147481600 (just under 2^31 bytes, i.e. the 2 GB signed 32-bit limit).
A Google search for that number shows that a few other people have hit
this file size limit with other programs/processes.
Apparently the way to resolve this issue is to recompile the binary
with large file support (LFS). A few websites suggest adding this to
CFLAGS in the Makefile:
-D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE
I did this, but it did not appear to resolve the issue. I also noticed
that I can use 'strace' to see whether the program's open() calls use
the O_LARGEFILE flag -- which they apparently do not.
I know the most efficient way of getting this resolved might be to have
the original author of the code do it, but can anyone point me toward
some references on adding the O_LARGEFILE flag to the open() calls?
(I'm using gcc, by the way.)
Thanks,
Doug