[TriLUG] clock() from time.h
Rodney Radford
rradford at mindspring.com
Thu May 18 13:56:33 EDT 2006
The man page states that clock() "returns an approximation of processor time used by the program", which matches what you are getting under Linux. However, this link, http://www.cygwin.com/ml/cygwin/2003-10/msg01086.html, implies that under Windows clock() actually returns elapsed wall-clock time since the process started, rather than CPU time. (Note that the sum of user time and system time is still CPU time, not wall-clock time - they are genuinely different quantities, which is exactly the difference you are observing.) Now that doesn't answer your question of 'why', but it does agree with what you are seeing. Perhaps the issue is that the Microsoft C runtime doesn't track per-process CPU time the same way, so it falls back to reporting elapsed time.
Which did you want to see - CPU time or wall-clock time? Depending on the answer, there may be a better solution than using clock().
-----Original Message-----
>From: Randall Barlow <randall_barlow at ncsu.edu>
>Sent: May 18, 2006 1:22 PM
>To: Triangle Linux Users Group discussion list <trilug at trilug.org>
>Subject: [TriLUG] clock() from time.h
>
>Hi,
>
> I'm developing a C++ program that times itself, and needs to run on
>Windows (yeah, I know) as well as Linux. Now, this isn't a big problem,
>but it makes me curious. I use the clock() function from time.h to get
>the time at the beginning of my simulation and at the end, and then use
>the difference to measure how long the simulation ran. I'm noticing a
>difference in behavior between Windows and Linux.
>
> In windows, I will end up getting how much real time the simulation
>took (i.e., if I suspend the machine for half a day and then let the
>simulation finish, that half a day would be in the result). However, in
>Linux, it seems to give me how much processor time the simulation took.
>Can anybody explain why the same function from the same header file
>behaves so differently?
>
>Randy
>
>--
>TriLUG mailing list : http://www.trilug.org/mailman/listinfo/trilug
>TriLUG Organizational FAQ : http://trilug.org/faq/
>TriLUG Member Services FAQ : http://members.trilug.org/services_faq/