Hi, I just discovered your list and what luck! I've already crawled
through 2005's archives, and that answered a lot of my questions.
However, I've still got a couple more.
Firstly, I've tried the oscilloscope trick for measuring shutter
speed, http://www.home.zonnet.nl/m.m.j.meijer/D_I_Y/webcam_shutter.htm,
and I see that with:
setpwc -f 30 *and* streamer -c /dev/video0 -t 120 -r 30 -o movie001.jpeg
things aren't quite right. Quite often two sequential photos were
identical! Is this because streamer has no mechanism to tell one photo
from another? Or is this the driver?
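In case it helps, something like the following would count the
duplicates systematically. It's only a sketch: it assumes streamer
numbers its output files movie001.jpeg, movie002.jpeg, and so on
(adjust PATTERN if your run names them differently), and it simply
counts frames that are byte-for-byte identical to the one before them.

/* dupcheck.c -- count byte-identical consecutive frames.
 * Assumes streamer wrote movie001.jpeg, movie002.jpeg, ...;
 * adjust PATTERN and the starting index if your run differs.
 * Build: gcc -o dupcheck dupcheck.c */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define PATTERN "movie%03d.jpeg"

/* Read a whole file into memory; returns NULL if it can't be opened. */
static unsigned char *slurp(const char *path, long *len)
{
    FILE *f = fopen(path, "rb");
    if (!f)
        return NULL;
    fseek(f, 0, SEEK_END);
    *len = ftell(f);
    rewind(f);
    unsigned char *buf = malloc(*len);
    if (buf && fread(buf, 1, *len, f) != (size_t)*len) {
        free(buf);
        buf = NULL;
    }
    fclose(f);
    return buf;
}

int main(void)
{
    unsigned char *prev = NULL;
    long prev_len = 0;
    int dupes = 0, total = 0;

    for (int i = 1; ; i++) {
        char name[64];
        long len;
        snprintf(name, sizeof(name), PATTERN, i);
        unsigned char *cur = slurp(name, &len);
        if (!cur)
            break;              /* no more frames */
        total++;
        if (prev && len == prev_len && !memcmp(cur, prev, len))
            dupes++;            /* identical to the previous frame */
        free(prev);
        prev = cur;
        prev_len = len;
    }
    free(prev);
    printf("%d of %d frames are exact duplicates of their predecessor\n",
           dupes, total);
    return 0;
}

Run it in the directory streamer wrote to and it prints the duplicate
count for the whole capture.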
Furthermore, with the oscilloscope I observed that the frames were
taken faster than 30 fps, closer to 32 fps. Of course, with all the
duplicate pictures being taken, I'm not convinced that the camera's
actual frame rate is anything other than as advertised. Is this
streamer's fault or the camera's?
In fact, how does the driver decide when to record a picture? When a
system call is made to pwc, does it wait for the next photo to arrive,
or does it grab the photo currently at the head of the queue? If it
grabs the one at the head, and a second call comes in right after the
first, does pwc wait until the current photo is cleared from memory
before continuing? While reading the docs I got the firm impression
that the driver numbers each photo, which is how it can report which
photos have been dropped, so I'm inclined to think the driver is aware
of the camera's internal state.
What sort of mechanism does the camera have to ensure proper timing? I
imagine it has an internal chip that generates a signal for the
shutter speed. If this is the case, is there any way to use the
internal clock as a time stamp on all images issuing from the camera?
It doesn't have to be anything visual; the last few pixels could be
coded to give the time, for instance.
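If the driver speaks the V4L2 streaming (mmap-style) API, which is an
assumption I haven't verified for pwc, then each dequeued buffer
already carries a sequence number and a timestamp filled in on the
driver side. Something like this sketch would print them (a real
program would set the format with VIDIOC_S_FMT first; this one relies
on the driver's default):

/* seqstamp.c -- sketch: print per-frame sequence number and timestamp.
 * Assumes a V4L2-capable driver and the streaming (mmap) I/O method.
 * Build: gcc -o seqstamp seqstamp.c */
#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(void)
{
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    /* Ask the driver for four capture buffers. */
    struct v4l2_requestbuffers req;
    memset(&req, 0, sizeof(req));
    req.count  = 4;
    req.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    if (ioctl(fd, VIDIOC_REQBUFS, &req) < 0) { perror("VIDIOC_REQBUFS"); return 1; }

    /* Queue every buffer. We never mmap them because we only want the
     * metadata here, not the pixel data. */
    for (unsigned i = 0; i < req.count; i++) {
        struct v4l2_buffer buf;
        memset(&buf, 0, sizeof(buf));
        buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index  = i;
        if (ioctl(fd, VIDIOC_QBUF, &buf) < 0) { perror("VIDIOC_QBUF"); return 1; }
    }

    enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    ioctl(fd, VIDIOC_STREAMON, &type);

    /* Dequeue 120 frames and print what the driver attaches to each. */
    for (int n = 0; n < 120; n++) {
        struct v4l2_buffer buf;
        memset(&buf, 0, sizeof(buf));
        buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        if (ioctl(fd, VIDIOC_DQBUF, &buf) < 0) { perror("VIDIOC_DQBUF"); break; }
        /* buf.sequence counts frames (a gap means a drop); buf.timestamp
         * is set by the driver when it finishes receiving the frame. */
        printf("seq=%u  t=%ld.%06ld\n", buf.sequence,
               (long)buf.timestamp.tv_sec, (long)buf.timestamp.tv_usec);
        ioctl(fd, VIDIOC_QBUF, &buf);   /* hand the buffer back */
    }

    ioctl(fd, VIDIOC_STREAMOFF, &type);
    close(fd);
    return 0;
}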
On Linux, is the delay between reality and the moment the signal comes
out of the driver always constant? Approximately how much time does
the camera need to read the image off the CCD and send it onto the USB
bus? Once it's in the machine, what happens to the USB stream? Is it
buffered somewhere else, meaning it can show up at the driver's input
at variable times? (This is important for us, because a purely
physical delay is easy to model as long as it's constant.)
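If that driver timestamp exists, the last hop at least is measurable:
a helper like the one below, called right after the VIDIOC_DQBUF in
the sketch above, prints how far behind the driver's stamp we are when
userspace finally sees the frame. It says nothing about the
CCD-to-USB leg inside the camera, but if the printed lag is stable,
that part of the chain can be treated as a constant delay:

/* Call right after VIDIOC_DQBUF with the freshly dequeued buffer.
 * Measures only the driver-timestamp-to-userspace hop. */
#include <stdio.h>
#include <sys/time.h>
#include <linux/videodev2.h>

void print_lag(const struct v4l2_buffer *buf)
{
    struct timeval now;
    gettimeofday(&now, NULL);
    long sec  = now.tv_sec  - buf->timestamp.tv_sec;
    long usec = now.tv_usec - buf->timestamp.tv_usec;
    if (usec < 0) {             /* borrow a second if needed */
        usec += 1000000;
        sec  -= 1;
    }
    printf("lag behind driver timestamp: %ld.%06ld s\n", sec, usec);
}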
Lastly, a little question that's been nagging at me: why does Philips
promise 60 fps when, even under Windows, the camera can't reach that
speed?
Anyway, a little background on our project: I'm building a ball and
plate system, which is a two-dimensional analog of the ball and beam
system. Basically, the challenge is to control a pool ball rolling on
a table by manipulating the tilt of the table. The ball's position is
recorded by, you guessed it, a TouCam II Pro. So when the camera
output lags reality by more than 50 ms, things start to get a little
hairy. That's why we're moving to Linux, and that's why I'm pestering
you guys so.
There you have it, lots of questions; feel free to respond to all,
some, a few, or none at all. (We're not too fond of the "none" option,
though.) Thanks so much for your time.
Cheers,
Kenn
--
www.eissq.com
New, but not improved
_______________________________________________
pwc mailing list
http://lists.saillard.org/mailman/listinfo/pwc