
120hz versus 240hz

Discussion in 'Electronic Repair' started by Chris, Feb 25, 2010.

  1. FYI, Computer CRT screens refresh at 60 to 85 Hz.

    The main difference between CRTs and LEDs or LCDs is persistence.
    CRTs have long-persistence phosphors: once they are illuminated, they
    stay lit for a relatively long time. That's why the interlacing system
    works; the odd lines are still lit when the even ones are illuminated.

  2. Sylvia Else

    Sylvia Else Guest

    It's not that long, which is why photographs of television pictures look
    so awful. Interlacing is used to avoid flicker without having to
    transmit 50 or 60 full frames per second.

    LCDs don't flicker anyway, regardless of their framerate. The frame rate
    issue relates to addressing the judder you get as a result of the image
    consisting of a sequence of discrete images, rather than one that
    continuously varies.

    It doesn't help that much TV material that was recorded on film is
    transmitted with odd and even interlaced fields that are scans of
    the same underlying image (or some variation thereon), so that the
    effective refresh rate is considerably lower than the interlaced rate.
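
    Film's 24 fps is fitted into NTSC's roughly 60 fields per second by
    repeating film frames in a 3:2 cadence (telecine), which is exactly why
    successive interlaced fields are often scans of the same underlying image.
    A minimal sketch, with frames reduced to labels and a hypothetical
    `pulldown_32` helper:

```python
# Sketch of NTSC 3:2 pulldown: 4 film frames (24 fps) become 10 video
# fields (~59.94 fields/s), so consecutive interlaced fields often carry
# scans of the same underlying film frame.
def pulldown_32(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        # Alternate 3 fields, 2 fields, 3 fields, 2 fields, ...
        count = 3 if i % 2 == 0 else 2
        fields.extend([frame] * count)
    return fields

print(pulldown_32(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

    Four film frames yield ten fields, so the effective image rate stays at 24
    even though the field rate is ~60.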

  3. LCDs don't flicker anyway, regardless of their framerate.
    Not quite, otherwise the issue would occur with plasma displays. Indeed, it
    would with any moving-image recording system.

    The problem is that LCDs don't respond "instantaneously". They take a finite
    time to go from opaque to the desired transmission level, and then back
    again. The result is that the image can lag and "smear". (25 years ago, the
    first pocket LCD color TVs from Casio had terrible smear, which added an
    oddly "artistic" quality to sports.)

    For reasons not clear to me, adding interpolated images reduces the smear.
    This makes absolutely no sense whatever, as the LCD now has /less/ time to
    switch. I've never gotten an answer on this.

    Interlaced images can be de-interlaced. Note that most product reviews test
    displays for how well they do this.
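
    The two classic de-interlacing building blocks are "weave" (interleave the
    two fields back into one frame) and "bob" (line-double a single field).
    A toy sketch, with scanlines reduced to strings and function names of my
    own choosing; real de-interlacers blend or motion-adapt between these:

```python
# Two simple deinterlacing strategies, sketched on lists of scanlines.
def weave(odd_field, even_field):
    """Interleave the two fields' lines back into a full frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.extend([odd_line, even_line])
    return frame

def bob(field):
    """Line-double one field to fill a full frame."""
    frame = []
    for line in field:
        frame.extend([line, line])  # repeat each line
    return frame

print(weave(["o0", "o1"], ["e0", "e1"]))  # ['o0', 'e0', 'o1', 'e1']
print(bob(["o0", "o1"]))                  # ['o0', 'o0', 'o1', 'o1']
```

    Weave is sharp but combs on motion; bob halves vertical resolution but
    never combs, which is why reviewers test how sets choose between them.
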
  4. Dolby has a new thing -- HDR LCD that
    This is neither new, nor was it invented by Dolby.
  5. bob urz

    bob urz Guest

    It can get more complicated than that. Dolby has a new thing out, HDR LCD,
    that modulates the LED backlights on the fly for brightness in groups.
    That was not possible with CCFL LCD backlights.
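
    The "groups" idea can be sketched as zoned local dimming: drive each
    backlight zone at the peak level its pixels need, then rescale the LCD
    values to compensate. A toy one-zone version; the function name and the
    0.0-1.0 luminance scale are my assumptions:

```python
# Sketch of zoned local dimming: the backlight zone is driven at the
# peak luminance its pixels need, and the LCD values are rescaled so
# that pixel * backlight is unchanged. Zone boundaries are what can
# produce "halos" in dark scenes.
def local_dimming(zone_pixels):
    """zone_pixels: list of pixel luminances (0.0-1.0) in one zone."""
    backlight = max(zone_pixels) if zone_pixels else 0.0
    if backlight == 0.0:
        return 0.0, zone_pixels
    lcd = [p / backlight for p in zone_pixels]
    return backlight, lcd

bl, lcd = local_dimming([0.1, 0.2, 0.4])
print(bl)   # 0.4 -> dimmer backlight: less power, deeper black
print(lcd)  # [0.25, 0.5, 1.0]
```
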

  6. Chris

    Chris Guest

    Yes, that is how it was explained to me by a salesman, as well as what I
    gathered from online info. So, apparently, it is still an LCD screen. Also,
    somehow the refresh rate of the LEDs creates some sort of multiplier effect
    with the LCDs; thus the higher Hz. It sure would be nice to know if this is
    correct, and also why/how it enhances the picture.
    Although I am far from an expert in this area (hence my original post), I
    have the ability to understand just about anything that is explained
    correctly. When information is presented in an ambiguous way, which is what
    I have seen so far in my internet research, that is definitely a red flag
    that the author probably is not knowledgeable in the subject matter.
  7. Dolby has a new thing -- HDR LCD that on the fly
    I've seen at least one review that complained that local dimming produced
    "halos" around objects in darker scenes. I would never, ever buy a set with
    such a feature, unless it could be shut off.
  8. ** And if you put the remark back into its context --
    Agreed. It seemed unrelated, even out of left field. I suspect Sylvia didn't
    properly express what she wanted to say.
  9. I don't know how a 240Hz "scan rate" would be achieved.
    Actually, it's a frame rate. It can be done by interpolation, by inserting
    blank frames, or a combination of the two.
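
    Both approaches can be sketched in a few lines, treating a frame as a list
    of pixel values. Real sets estimate motion vectors rather than doing this
    naive blend, and the function names are mine:

```python
# Two ways to raise the frame rate, as named above: motion interpolation
# (here just an average of adjacent frames) and black-frame insertion,
# which alternates real frames with dark ones to shorten the hold time.
def interpolate(frames):
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append([(x + y) / 2 for x, y in zip(a, b)])  # in-between frame
    out.append(frames[-1])
    return out

def black_frame_insertion(frames):
    out = []
    for f in frames:
        out.append(f)
        out.append([0.0] * len(f))  # inserted dark frame
    return out

print(interpolate([[0.0, 1.0], [1.0, 0.0]]))
# [[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]]
```
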
  10. Why would you jump to that conclusion?

    Oh, forgot; that's what you do.
  11. I guess we should refer to all LCD sets by their backlight type. That makes
    the one on my wall a CCFL TV. And I guess all of those DLP, LCD, DiLA,
    SXRD, and LCoS projection sets should be called mercury vapor or whatever
    type of lamp they use. And the new projectors could be called LED
    projectors as well, even if they are DLP devices.

    The point is that referring to the set by the type of backlight it uses is
    very misleading and is causing much confusion in the marketplace.

  12. Phil Allison

    Phil Allison Guest

    ** Yes it does - you ASD FUCKED TENTH WIT !
  13. Sylvia Else

    Sylvia Else Guest

    Many years ago (using a Sinclair Spectrum no less) I noticed an effect
    whereby if a small character-sized square was moved across the screen in
    character-sized steps, the eye perceived phantom squares at intervening
    positions. Since the computer was not displaying these intermediate
    squares, their appearance must have been due to the eye. The likely
    explanation was that the eye was traversing the screen smoothly to
    follow the square, but the square itself was moving in discrete steps.
    So the eye was causing the image of the square to be smeared across the
    retina. I was seeing this effect on a CRT screen, but the longer the
    persistence of the image on the screen the worse the effect would be.
    Interpolating the position of the image on the screen would reduce that
    effect.

    However, I can't explain why this would be less pronounced on a plasma
    display.
    They have to be deinterlaced for display on any screen with significant
    persistence, but deinterlacing doesn't increase the underlying frame rate.

  14. Sylvia Else

    Sylvia Else Guest

    It seems to me that the effect would be visible on any display that has
    any degree of persistence. Even if LCDs switched instantaneously, they'd
    still be displaying the image for the full frame time and then
    instantaneously switching to the next image. This would produce the
    smearing effect in the way I've described. To avoid it, one needs a
    display that produces a short bright flash at, say, the beginning of the
    display period, and remains dark for the rest of the time. As I
    understand plasma displays, that's not how they work.
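
    This argument yields a simple back-of-envelope formula: on a sample-and-hold
    display, the eye-tracked image sweeps across the retina for the whole hold
    time, so the smear width is roughly tracking speed times hold time, and a
    short bright flash (a low duty cycle) shrinks it proportionally. A sketch,
    with names and example numbers of my own:

```python
# Back-of-envelope for sample-and-hold smear: blur width in pixels is
# roughly eye-tracking speed times the time each frame is held lit.
def hold_blur_px(speed_px_per_s, frame_rate_hz, duty_cycle=1.0):
    """duty_cycle < 1 models the short bright flash described above."""
    return speed_px_per_s * duty_cycle / frame_rate_hz

# Object tracked at 600 px/s on a 60 Hz full-hold display:
print(hold_blur_px(600, 60))        # 10.0 px of smear
# Same motion with a 20% duty-cycle strobed display:
print(hold_blur_px(600, 60, 0.2))   # 2.0 px
```

    Doubling the frame rate (with distinct images) halves the hold time, which
    is one way to see why 120 Hz and 240 Hz sets reduce this kind of blur.
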

  15. Mark Zenier

    Mark Zenier Guest

    At least in my area (Seattle), all the mainstream over the air stations
    are HD, now. (Each station has its own transmitter, so they don't have
    shared "multiplexes" like in the UK). The typical situation is an
    HD subchannel with the main program and one SD subchannel with some
    secondary service (sports, weather, old movies, old TV show, or an
    SD copy of the main signal to feed analog cable).

    There's a smaller number of stations that trade selection for resolution,
    running four or five SD subchannels. Christian broadcasting and
    speciality stuff like the secondary channels on the other stations.

    The only way you'd get stuck with standard definition on the national
    networks is to still be using analog cable. (I'm not familiar with
    what you get with the two (subscription only) national direct broadcast
    satellite providers).

    Mark Zenier
    Googleproofaddress(account:mzenier provider:eskimo domain:com)
  16. Sylvia Else

    Sylvia Else Guest

    The fact that a sequence of still images is perceived as a moving
    picture is clearly a consequence of visual persistence. And it's obvious
    that things will look bad if the images actually overlap. But that's not
    what we're discussing.

    We're discussing why certain types of display don't do such a good job
    despite having a reasonably sharp transition from one image to the next.

    The Wikipedia article you cited said that even LCD switching times of
    2ms are not good enough "because the pixel will still be switching while
    the frame is being displayed." I find this less than convincing as an
    explanation. So what if the pixel is switching while the frame is being
    displayed? It's not as if the eye has a shutter, and the transition time
    is much less than the eye's persistence time anyway.

  17. Short and long persistence are relative terms. Compared to the P1 phosphors
    of radar screens and oscilloscopes, P4 phosphors are relatively short
    persistence. Compared to an LED they are long persistence.

    Note that there is a lot of "wiggle room" in there; supposedly the human
    eye can only see at 24 frames per second, which is about 42 ms per frame.

    Also note that there are relatively few frame rates in source material:
    NTSC TV is 30000/1001 (about 29.97) frames per second, PAL TV is 25. Film
    is 24, which was sped up to 25 for PAL TV and slowed to 24000/1001 (about
    23.976) for NTSC TV.

    Film shot for direct TV distribution (MTV really did have some technological
    impact) was shot at 30000/1001 frames per second.

    Digital TV could be any frame rate, but they have stuck with the old
    standards: US digital TV is still the same frame rate as NTSC, and EU etc.
    digital TV is still 25 FPS.

    Lots of video files online are compressed at lower frame rates because of
    the way they are shown. The screens still operate at their regular frame
    rate, the computer decoding them just repeats them as necessary.
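
    The broadcast rates above all fall out of a 1000/1001 factor introduced
    with NTSC color; they can be checked exactly with Python's `fractions`:

```python
from fractions import Fraction

# Broadcast frame rates as exact ratios; NTSC-derived rates carry the
# 1000/1001 factor introduced with NTSC color.
rates = {
    "NTSC video": Fraction(30000, 1001),  # ~29.97 fps
    "NTSC film":  Fraction(24000, 1001),  # ~23.976 fps (telecined 24)
    "PAL video":  Fraction(25, 1),
    "Film":       Fraction(24, 1),
}

for name, fps in rates.items():
    print(f"{name}: {fps} = {float(fps):.3f} fps, "
          f"{1000 / float(fps):.2f} ms per frame")
```
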

  18. It's more complicated than that. You only see one image, which has been
    created in your brain from several sources. Most of the information comes
    from the rods in your eyes; they are light-level (monochrome) sensors,
    and they are the most numerous. This means most of what you see
    is the combination of two sets of monochrome images with slightly
    to wildly different information.

    Then there are the cones, or color sensors. There are far fewer of them,
    and they are less sensitive to light, which is why night vision is black
    and white.

    There is also a blind spot in each eye, where the optic nerve attaches to
    the retina.

    None of these show up on their own; they are all integrated into the one
    image you see. You never notice that you have two blind spots, you don't
    notice the lack of clarity in colors (due to the smaller number of cones),
    and rarely, if ever, do you notice the difference between your eyes.

    If you were, for example, to need glasses in one eye and not the other, or
    to have not quite properly prescribed lenses, your image will appear sharp
    overall, not blurred on one side and sharp on the other.

    Lots of tricks have been used over the years to take advantage of the
    limitations of the "equipment" and the process. For example, anything faster
    than 24 frames a second is not perceived as being discrete images, but one
    smooth image.

    The 50 and 60 fields per second (a field being half an interlaced frame) were
    chosen not because they needed to be that fast (48 would have done), but to
    eliminate interference effects from electric lighting.

    Color is another issue. The NTSC determined (and the approach was later
    adopted by the BBC for PAL) that a 4:1 color system was good enough, i.e.
    color information only needed to change (and be recorded) at 1/4 the rate
    of the light level.

    In modern terms, it means that for every 4 pixels, you only have to have
    color information once. Your eye can resolve the difference in light levels,
    but not in colors.

    This persists to this day, MPEG type encoding is based on that, it's not
    redgreenblue, redgreenblue, redgreenblue, redgreenblue of a still
    picture or a computer screen, it's the lightlevel, lightlevel,
    lightlevel, lightlevel colorforallfour encoding that was used by NTSC
    and PAL.
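
    That luma/chroma split can be sketched directly: keep every light-level
    sample, but share one colour sample per group of four pixels (the idea
    behind 4:1:1 and 4:2:0-style subsampling; the helper name is mine):

```python
# Sketch of luma/chroma subsampling: full-rate light-level samples,
# one shared colour sample per group of four pixels.
def subsample(pixels, group=4):
    """pixels: list of (luma, chroma) tuples."""
    lumas = [y for y, _ in pixels]
    chromas = []
    for i in range(0, len(pixels), group):
        block = [c for _, c in pixels[i:i + group]]
        chromas.append(sum(block) / len(block))  # one chroma per group
    return lumas, chromas

lumas, chromas = subsample([(10, 1), (20, 1), (30, 3), (40, 3)])
print(lumas)    # [10, 20, 30, 40]  -- full resolution
print(chromas)  # [2.0]             -- one colour value for all four
```
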

    In the end, IMHO, it's not frame rates or color encoding methods at all, as
    they were fixed around 1960 and not changed; it is display technology as
    your brain perceives it.

    No matter what anyone says here, it's the combination of the exact
    implementation of the display technology and your brain that matters. If
    the combination looks good and you are comfortable watching it, a 25 fps
    CRT, a 100 FPS LED screen, or even a 1000 FPS display, if there were such
    a thing, would look good; everything combined has to produce good images
    in YOUR brain, and it looks bad if some combination produces something
    "wrong".

    There is more to it, too. An LCD is like a shutter: it pivots on its
    axis and is either open or closed. Not really; there is a finite
    time from closed (black) to open (lit), and therefore a build-up of
    partially switched states between frames.
    Plasma displays are gas discharge devices, they only glow when there is enough
    voltage to "fire" them until it drops below the level needed to sustain
    the glow. That depends more upon the speed of the control electronics than
    any (other) laws of physics, viscosity of the medium the crystals are in,
    temperature, etc.

    That's the aim of LED-backlit TV screens (besides less power consumption,
    heat, etc.). They are only lit when the crystals are "open", so there is no
    time
    where you see partially lit "pixels".

  19. Lots of tricks have been used over the years to take advantage
    Actually, it's 16 frames a second. However, that rate is not fast enough to
    prevent flicker -- which is why silent films were sometimes called
    "flickers". This is one of the reasons the frame rate was increased to 24
    with the introduction of sound.

    That's new to me.


    NTSC is actually 4.2/1.5, or roughly 2.8 to 1. PAL is closer to 5:1.

    I hate to spoil things, Geoff, but liquid crystals are quite capable of
    taking intermediate positions -- that is, forming a continuous gray scale.
  20. Sylvia Else

    Sylvia Else Guest

    Well, the story I heard way back when is that it was to synchronise the
    picture's vertical frequency with the mains frequency, so that
    inadequacies in power smoothing produced static distortions in the
    picture rather than much more noticeable rolling distortions.
