
LCD VGA input circuit

Discussion in 'Electronic Design' started by [email protected], Mar 8, 2007.

  1. Guest

    How do LCD monitors get the correct dot clock frequency to sample the
    incoming VGA?
    If all you have is an HSYNC frequency, do they do a table lookup or
    something more fancy?
    What chips are typically used to recover the clock, or is it built
    into some huge SoC?
     
  2. Joel Kolstad

    Joel Kolstad Guest

    Something more fancy: a phase-locked loop, derived from HSync.

    Since the LCD "knows" it has, e.g., 1280 pixels to display, a PLL is
    configured such that it generates 1280 pulses (pixel clocks) between the
    active edges of HSync. Take a look at the data sheet for a digitizer
    meant for LCDs, e.g., the Analog Devices AD9884:
    http://www.analog.com/en/prod/0,2877,AD9884A,00.html
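
    A minimal sketch of that relationship in C, with illustrative numbers
    (the real divider is the total clocks per line, slightly more than the
    visible 1280; later posts get into that):

        #include <stdio.h>

        int main(void)
        {
            /* The divided-down VCO is phase-compared against HSYNC, so
             * once locked the VCO runs at plldiv times the HSYNC rate. */
            double f_hsync = 63.981e3;  /* measured HSYNC rate, Hz    */
            int    plldiv  = 1688;      /* programmed clocks per line */

            double f_pixel = (double)plldiv * f_hsync;
            printf("pixel clock = %.3f MHz\n", f_pixel / 1e6);  /* ~108.0 */
            return 0;
        }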

    ---Joel
     
  3. Guest

    I appreciate how the clock is generated; I want to know how the thing
    knows _which_ dot clock to generate. I can think of many situations
    where the same HSYNC would have a different dot clock.
    Look at it this way. You are in a cardboard box. All you have is a
    tiny slot through which I slip you a piece of paper with 15743 Hz
    written on it. Now guess which video mode I'm in, and what the
    correct dot clock is.
    Yup, but I want to know how the thing knows it needs 1280 dot clocks
    per line. It must have a timer, measure the HSYNC, look up which
    video mode is closest, then program the genlock?
    "A Voltage Controlled Oscillator (VCO) generates a much higher pixel
    clock frequency. This pixel clock is divided
    by the value PLLDIV programmed into the AD9884A, and phase compared
    with the HSYNC input."

    You see, you still need to program the value yourself via the I2C
    interface. You don't just toss in an HSYNC and have it magically
    determine the dot clock.
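
    A hypothetical sketch of that measure-and-look-up step in C. The timing
    numbers are standard VESA values; the table, function names, and the
    measured input are made up for illustration:

        #include <stdio.h>
        #include <math.h>

        struct mode {
            const char *name;
            double      f_hsync_khz;  /* nominal HSYNC rate    */
            int         htotal;       /* total clocks per line */
        };

        /* Note the ambiguity the cardboard-box example points out:
         * nothing stops two modes from sharing an HSYNC rate. */
        static const struct mode modes[] = {
            { "640x480@60",   31.469,  800 },
            { "800x600@60",   37.879, 1056 },
            { "1024x768@60",  48.363, 1344 },
            { "1280x960@60",  60.000, 1800 },
            { "1280x1024@60", 63.981, 1688 },
        };

        /* Return the mode whose HSYNC rate is closest to the measured one. */
        static const struct mode *guess_mode(double measured_khz)
        {
            const struct mode *best = &modes[0];
            for (int i = 1; i < (int)(sizeof modes / sizeof modes[0]); i++)
                if (fabs(modes[i].f_hsync_khz - measured_khz) <
                    fabs(best->f_hsync_khz - measured_khz))
                    best = &modes[i];
            return best;
        }

        int main(void)
        {
            const struct mode *m = guess_mode(48.4);  /* e.g. from a timer */
            printf("guess %s, so program PLLDIV = %d\n", m->name, m->htotal);
            return 0;
        }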

    I'm asking because of a personal project that's been on hold for a
    long time.
     
  4. Rich Grise

    Rich Grise Guest

    The guy that built the display told the PLL designer that that's how
    many pixels the display has. Actually, if you had 1280 pixels, you
    probably wouldn't use 1280 * Fh for a dot clock, because then you'd be
    displaying horizontal retrace at the edges, so you'd up it a little and
    gate the actual pixel address counter with the horizontal blanking pulse.
    See above. It doesn't have to "know" anything - the circuit is designed
    to work for however many pixels across the designer put into it.

    I think you're stuck on the idea that the signal has to somehow "know"
    something about what mode it's being displayed in, but it couldn't give
    a shit less - if you get a chance, take a look at a TV video signal with
    the horizontal sweep set to give about one line across: you get sort of
    a top view of the edge of the picture, where height = brightness. Sync
    it to a whole frame/field and you get something like an edge-on view.

    Cheers!
    Rich


     
  5. jasen

    jasen Guest

    The only problem is that'd give the wrong result:

    "1280x960" 108.00 1280 1376 1488 1800 960 961 964 1000 +hsync +vsync
    A B C D E F G H
    clock# action

    0 first pixel
    1279 last pixel
    1280 right-hand overscan (aka border)
    1376 overscan end/horizontal sync start (retrace period)
    1488 horizontal sync end/left overscan start
    1800/0 first pixel
    1279 last pixel

    There are more clocks per line than there are pixels.
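
    Working those numbers in C (XFree86 modeline field order: pixel clock
    in MHz, then hdisp hsyncstart hsyncend htotal, then vdisp vsyncstart
    vsyncend vtotal):

        #include <stdio.h>

        int main(void)
        {
            double clk  = 108.00e6;  /* pixel clock, Hz */
            int htot = 1800, hdisp = 1280, hss = 1376, hse = 1488;
            int vtot = 1000, vdisp = 960;

            printf("HSYNC rate : %.3f kHz\n", clk / htot / 1e3);   /* 60.000 */
            printf("refresh    : %.2f Hz\n",  clk / htot / vtot);  /* 60.00  */
            printf("line  : %d clocks, %d visible, %d-clock sync pulse\n",
                   htot, hdisp, hse - hss);
            printf("frame : %d lines, %d visible\n", vtot, vdisp);
            return 0;
        }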
     
  6. Guest

    Those are trivial details next to finding out what mode you're in. I'm
    thinking of counting the hsyncs per frame so at least I know the
    vertical resolution. From there I can guess what mode I'm in. Problem
    is I'm dealing with an entirely programmable video chip, and there's
    no fixed H-V relationship. Ugh.
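
    A hypothetical microcontroller sketch of that counting idea in C; the
    interrupt plumbing is omitted and the handler names are invented:

        #include <stdint.h>

        static volatile uint16_t hsync_count;      /* HSYNCs since last VSYNC */
        static volatile uint16_t lines_per_frame;  /* latched once per frame  */

        void hsync_isr(void)  /* hook to an edge interrupt on the HSYNC pin */
        {
            hsync_count++;
        }

        void vsync_isr(void)  /* hook to an edge interrupt on the VSYNC pin */
        {
            lines_per_frame = hsync_count;  /* ~= vtotal, blanking included */
            hsync_count = 0;
        }

    Note this measures vtotal (visible lines plus vertical blanking), not
    the visible resolution itself, so the mode guess still needs some slack.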
     
  7. Guest

    Not the signal, the monitor.
    So the monitor doesn't need to know that the incoming signal is
    640x480 VGA, and it magically appears on the RSDS lines to the gate
    drivers on the panel as 1280x1024 native resolution?

    Holy crap.
     
  8. jasen

    jasen Guest

    Computer video cards have been like that since forever. CGA didn't offer
    the opportunity to change the pixel clock, but all else could be tweaked
    (sometimes not so good for the monitor).

    I can dial up pretty much any mode I want by editing a text file for
    XFree86; Windows users can do the same by editing the registry.

    Bye.
    Jasen
     
  9. Guest

    I don't mean in software on the PC, I mean a separate piece of
    hardware connected to only the analog VGA port. Just from the signals
    present on the sub-D 15 connector, how do you figure out what the
    correct video mode is? I've got a microcontroller set up right now to
    give me a /1016 ratio on my PLL, but what can I do when the source
    video chip is fully programmable with respect to syncs?
     
  10. joseph2k

    joseph2k Guest

    LCDs have a physical native resolution. Just the same, most sync signals
    are set up for physical CRTs. CRTs require retrace time, both
    horizontally and vertically. Read up on NTSC signal composition,
    especially vertical blanking; it is the basis of all display timings.
    Next, when you have assimilated that, read up on X Window modelines.
    Then you will know for yourself.
     
  11. Gary Tait

    Gary Tait Guest

    1280 dots is hard-coded in the firmware. The firmware programs the PLL
    so that 1280 (or so) pixel clock cycles are generated between horizontal
    sync pulses.

    That is, if the display directly digitises the input signal to the
    display elements.
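
    A hedged sketch of what such firmware might do, in C: write the divider
    to the digitizer over I2C. The address and register numbers here are
    made up (the real AD9884A register map is in its data sheet), and
    i2c_write_reg() is assumed to be supplied by the platform:

        #include <stdint.h>

        extern void i2c_write_reg(uint8_t dev, uint8_t reg, uint8_t val);

        #define DIGITIZER_ADDR  0x4C  /* hypothetical I2C address      */
        #define REG_PLLDIV_MSB  0x01  /* hypothetical register numbers */
        #define REG_PLLDIV_LSB  0x02

        /* Program the clocks-per-line divider, e.g. htotal for the mode. */
        void set_clocks_per_line(uint16_t plldiv)
        {
            i2c_write_reg(DIGITIZER_ADDR, REG_PLLDIV_MSB, plldiv >> 8);
            i2c_write_reg(DIGITIZER_ADDR, REG_PLLDIV_LSB, plldiv & 0xFF);
        }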
     