[TriLUG] High resolution timer calls and the kernel
Sam Kalat
samkalat at sneakrets.com
Tue Jan 13 12:41:53 EST 2004
I have a real-time computer vision project I'm working on that does some
analysis of incoming 10 fps video from a pair of webcams. The cycle of the
cameras is reliable, so each frame comes in 100 msec after the last. The only
exception is the first frame, which seems to take a while as the camera
initializes.
So what I want to do is as much processing as possible on the current frame
before switching to the next one. I have an algorithm that can be cut
short and still be useful. Rather than alter the parameters to get the frame
rate I want, I have a fixed frame rate and want to tune the parameters in
real time to get as much done as possible without exceeding a 100 msec
curfew.
I ran into a few strange things when it came to keeping track of the time in
small increments. I was hoping someone could explain.
I think the timer call I used was gettimeofday(). One call takes about 2
usec, which is small compared to the 1300 usec or so for one frame of video
capture, or the curfew of 100 msec = 100000 usec.
If I make a loop of repeated calls for the time, however, the duration of
these calls increases. I made a counter for how many times I could call for
the time before breaking 100 msec, along with capturing video at that rate.
It looks like this:
Frame   Loops
    0        1
    1   399935
    2   383604
    3   367687
  ...
   21     1178
   22        1
   23        1
So if you do nothing but ask for the time, like an annoying kid screaming "are
we there yet?", the response goes from immediate to quite slow. I don't
think this happens when there is real processing that puts some delay between
the calls. I tried to simulate this with nanosleep() but that actually
sleeps a good bit more than I requested, so I didn't get very far with a
control to compare to.
My guess was that the kernel throttled back on my process because it was
making too many calls that required a kernel response. Not knowing crap about
kernels, I thought I'd raise the question here and see if I could get help.
Hardware is a Logitech QuickCam 4000; the kernel I tested was from a while
back, 2.4.20 or so.
I recently heard that gettimeofday() is only accurate to 18 msec or so. It
seemed to be working better than that for me, but I need to double-check.
Either way, the behavior above is still relevant.
Cheers
Sam