Help: Synchronization problem of CMOS camera

Discussion in 'General Electronics' started by Jim, Aug 27, 2004.

  1. Jim

    Jim Guest

    I am using a CMOS camera to detect a sample which is illuminated with a
    pulsed laser. The pulse duration is 10 ns, and the duration of the
    light pulse to be detected is 10-20 ns. The pulse repeats once every
    second, and the laser provides a TTL synchronization signal which rises
    10 ns before the light pulse, stays high for 10 µs and then falls back
    to zero.

    The exposure time of the CMOS camera is set to 100 ms (the maximum
    exposure time for this camera). I use the TTL signal to synchronize the
    camera: after each rising edge plus a delay, I request the camera to
    capture a single frame.

    I had expected that, if the delay is 900-1000 ms, the next light pulse
    would fall within the exposure time of the camera, so the camera could
    detect it. However, the experiment showed that the light pulse is
    detected only when the delay is 100-300 ms. It seems that the exposure
    period comes before my sampling request, instead of after it.
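
    To make my assumption explicit, here is a minimal timing sketch
    (Python, using the numbers quoted above); the assumption that the
    exposure window opens at the sampling request is exactly what the
    experiment suggests does not hold:

        # Minimal timing sketch, assuming the exposure window opens when the
        # frame is requested (i.e. `delay_s` seconds after the TTL rising edge).
        # All durations are the ones quoted above.

        PULSE_PERIOD_S = 1.0     # the laser fires once per second
        EXPOSURE_S = 0.100       # 100 ms maximum exposure

        def next_pulse_is_captured(delay_s: float) -> bool:
            """True if the next laser pulse (1 s after the TTL edge) lands
            inside an exposure window opening `delay_s` after that edge."""
            exposure_start = delay_s
            exposure_end = delay_s + EXPOSURE_S
            return exposure_start <= PULSE_PERIOD_S <= exposure_end

        print(next_pulse_is_captured(0.950))   # True under this assumption
        # Yet in practice only delays of 0.100-0.300 s work, which points to
        # the exposure happening before the sampling request, not after it.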

    What troubles me more is that only part of the pixels detect the
    light, and that part rolls up or down from frame to frame. For
    example, my camera has 320×240 pixels. In the first detected frame,
    only the pixels in rows 1-40 detected the light; in the second frame,
    only rows 31-70; in the third, rows 51-90, and so on. During the
    whole measurement, the delay between the light pulse and my sampling
    request is fixed.

    I cannot explain this and cannot figure it out. Does anybody know the
    reason for this phenomenon?

    Thanks a million!
     
  2. Bob May

    Bob May Guest

    A camera's imaging pixels are always looking at the target, which means
    you may well capture the image before the readout of the image.
    I don't know the camera (and probably wouldn't be familiar with it
    anyway), so I don't know what its image-taking sequence is. For video
    cameras, the image is taken, then moved to a storage section of the
    chip, and finally read out from the chip. The exposure time depends on
    the time between the start of the imaging cycle and the transfer of the
    image to the storage section.
    Since you are seeing only part of the image lit up (generally the lower
    part if the light arrives late), you need to adjust the timing so that
    the imaging section of the camera sees the whole light flash before the
    exposure is completed.
    This is one of those situations where you adjust to the reality rather
    than try to figure out why that reality is contrary to your
    expectations.
     
  3. Jim

    Jim Guest

    I got the answer.
    Unlike CCDs, most CMOS cameras use an electronic rolling shutter: the
    pixels are exposed row by row (or block by block), not all at once.
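
    A rough way to see why the lit band walks down the frame; the line time
    and per-row exposure below are invented illustrative numbers, not the
    real sensor's, and the flash is treated as instantaneous:

        # Illustrative rolling-shutter model (Python).  Each row's exposure
        # window starts a fixed line time after the previous row's, so a
        # 10-20 ns flash, which is effectively instantaneous, is recorded
        # only by the rows whose window happens to be open at that instant.
        # LINE_TIME_S and EXPOSURE_S are made-up numbers for illustration.

        ROWS = 240                # the camera has 240 rows (320x240)
        LINE_TIME_S = 100e-6      # assumed row-to-row start offset
        EXPOSURE_S = 4e-3         # assumed per-row exposure time

        def rows_that_see_flash(flash_time_s: float) -> list[int]:
            """Rows whose exposure window contains the flash instant
            (times measured from the start of row 0's exposure)."""
            lit = []
            for row in range(ROWS):
                start = row * LINE_TIME_S
                if start <= flash_time_s <= start + EXPOSURE_S:
                    lit.append(row)
            return lit

        # If the flash drifts slightly relative to the start of the row
        # sequence from frame to frame, the lit band rolls down the image,
        # much like the 1-40 / 31-70 / 51-90 bands described above.
        for frame, flash_t in enumerate([5e-3, 8e-3, 11e-3], start=1):
            lit = rows_that_see_flash(flash_t)
            print(f"frame {frame}: lit rows {lit[0]}-{lit[-1]}")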
     