[TriLUG] clock() from time.h
Randall Barlow
randall_barlow at ncsu.edu
Thu May 18 13:22:17 EDT 2006
Hi,
I'm developing a C++ program that times itself and needs to run on
Windows (yeah, I know) as well as Linux. Now, this isn't a big problem,
but it makes me curious. I use the clock() function from time.h to get
the time at the beginning of my simulation and at the end, and then use
the difference to measure how long the simulation ran. I'm noticing a
difference in behavior between Windows and Linux.
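Roughly, the timing code looks like this (a minimal sketch, not my
actual code; the busy loop just stands in for the real simulation):

#include <time.h>
#include <stdio.h>

/* hypothetical stand-in for the real simulation */
static void run_simulation()
{
    for (volatile long i = 0; i < 100000000L; ++i)
        ;
}

int main()
{
    clock_t start = clock();
    run_simulation();
    clock_t end = clock();

    /* clock() returns ticks; divide by CLOCKS_PER_SEC to get seconds */
    double elapsed = (double)(end - start) / CLOCKS_PER_SEC;
    printf("simulation ran for %.2f seconds\n", elapsed);
    return 0;
}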
In Windows, I end up getting how much real time the simulation
took (i.e., if I suspend the machine for half a day and then let the
simulation finish, that half a day would be in the result). However, in
Linux, it seems to give me how much processor time the simulation took.
Can anybody explain why the same function from the same header file
behaves so differently?
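A small test along these lines shows what I mean (a sketch, assuming
POSIX sleep() on the Linux side; on Windows I'd use Sleep() from
windows.h instead):

#include <time.h>
#include <stdio.h>
#include <unistd.h>   /* sleep(); POSIX, so this is the Linux version */

int main()
{
    clock_t c0 = clock();
    time_t  t0 = time(NULL);

    sleep(5);   /* the process burns almost no CPU while sleeping */

    clock_t c1 = clock();
    time_t  t1 = time(NULL);

    printf("clock(): %.2f s   time(): %ld s\n",
           (double)(c1 - c0) / CLOCKS_PER_SEC,
           (long)(t1 - t0));
    return 0;
}

On Linux the clock() figure comes out near zero while time() reports
about five seconds, which matches what I'm seeing in the real program.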
Randy