
220 Volt LED?

Discussion in 'Lighting' started by Søren M, Sep 4, 2005.

  1. Søren M

    Søren M Guest

    Dear Group,

    Have they made 220 Volt LEDs without a transformer?

    If they are made, do they make as much heat as a normal 220 Volt
    halogen pin-socket bulb?

    How small a closed room, for example one made of glass, could such a
    light source be placed in without problems with overheating and
    damage to the light source, the socket, or the cords?

    Søren Momsen
  2. You can buy LED-based lamps that have an array of LEDs with either a
    capacitive dropper or an active electronic switcher to allow direct mains
    operation.
    No. Unfortunately they don't generate much light either.
    You could use a large array of the lights to produce a modest level of
    illumination, but if you used the white version then it would be a harsh
    cold light.

    The really huge LED panels do get hot and require ventilation, but they
    cost an absolute fortune and don't offer much in the way of real
    light output.
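    A quick sketch of the arithmetic behind the capacitive dropper mentioned
    above. A series capacitor's reactance limits the mains current without
    dissipating much heat; the 230 V / 50 Hz mains and 330 nF capacitor values
    below are my illustrative assumptions, not figures from the thread:

```python
import math

def dropper_current_ma(c_farads, v_rms=230.0, f_hz=50.0):
    """Approximate RMS current (mA) passed by a series 'dropper' capacitor.

    Valid when the LED string voltage is small compared to the mains,
    so the capacitor's reactance dominates: I ~ V * 2*pi*f*C.
    """
    return v_rms * 2 * math.pi * f_hz * c_farads * 1000.0

# A 330 nF X2-rated capacitor on 230 V / 50 Hz mains:
print(round(dropper_current_ma(330e-9), 1))  # ~23.8 mA, enough for a small LED string
```

    Because the capacitor drops voltage reactively rather than resistively, it
    stays cool; a resistor doing the same job would dissipate several watts.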
  3. Each LED has a very low operating voltage, typically about
    3.5 volts. They are wired in series, together with a passive
    or active current limiting device to match the available
    line voltage.
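    The series-string arithmetic works out roughly as follows; the 80-LED
    string, 3.5 V forward drop, and 20 mA target current are illustrative
    assumptions, not values from the post:

```python
# Series string of LEDs across rectified 230 V mains with a resistor
# as the passive current limiter (all values illustrative).
V_PEAK = 230 * 2 ** 0.5   # ~325 V after rectification and smoothing
VF = 3.5                  # assumed forward voltage per LED
N = 80                    # LEDs in series
I = 0.020                 # target current, 20 mA

v_string = N * VF                      # voltage across the LED string
r_ballast = (V_PEAK - v_string) / I    # resistor drops the remainder
p_resistor = I ** 2 * r_ballast        # power wasted in the resistor

print(round(v_string), round(r_ballast), round(p_resistor, 2))  # 280 2263 0.91
```

    Note that most of the supply voltage appears across the LEDs themselves,
    so the ballast resistor only wastes about a watt here.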
    Yes, for currently available LEDs if they produce the same
    amount of light. LEDs available now are just about as
    efficient as halogen lamps, so for the same amount of light,
    you would get the same amount of heat. Though, as Clive has
    said, many LEDs produce less light than the halogen lamps
    they are designed to replace, so in those cases they would
    produce less heat.
    This all depends upon how much light you want to generate.
    Also, LEDs are more sensitive to damage from their own waste
    heat than halogen lamps are. Incandescent lamps, including
    halogen incandescent lamps, are designed to operate at high
    temperature and also radiate much of their heat away as long
    wavelength IR.

    LEDs on the other hand, operate best when they are cool, or
    even cold. The performance given on LED spec sheets is
    almost always for a junction temperature of 25C and their
    efficacy and life degrades as the junction temperature
    rises. LEDs have almost no IR radiation, so the waste heat
    must be removed by conduction. High brightness LEDs, those
    operating at about 1 watt and above, will get hot enough to
    destroy themselves even in open air if they are not attached
    to a properly designed heat sink.
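    The heat-sink point can be put in numbers with the usual thermal model,
    Tj = Ta + P x Rth. The thermal resistances below are my illustrative
    guesses for the two cases, not datasheet figures:

```python
def junction_temp(t_ambient_c, p_watts, rth_j_to_a):
    """Steady-state junction temperature: ambient plus power times
    total junction-to-ambient thermal resistance (K/W)."""
    return t_ambient_c + p_watts * rth_j_to_a

# 1 W LED with no heat sink: a bare high-power package might see ~300 K/W
print(junction_temp(25, 1.0, 300))  # 325.0 C - far beyond any junction rating
# The same LED on a reasonable heat sink, ~30 K/W total
print(junction_temp(25, 1.0, 30))   # 55.0 C - survivable
```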

    Vic Roberts
    To reply via e-mail:
    replace xxx with vdr in the Reply to: address
    or use e-mail address listed at the Web site.

    This information is provided for educational purposes only.
    It may not be used in any publication or posted on any Web
    site without written permission.
  4. The part about meeting specs only when cooled enough to achieve a
    junction temperature of 25 C appears to me to be mainly a Lumileds thing.

    Cree XLamp supposedly meets spec when the LED's heatsink slug
    temperature is 25 C (still optimistic and, I believe, a little
    unreasonable to expect to be achieved in typical use).

    Most 5 mm LEDs with leads supposedly meet spec "in typical mounting
    situations", maybe soldered onto a PCB away from heat sources (my words
    and speculation) in air temperature of 25 C. LEDs generally get warmer in
    cluster lamps than they do in whatever the "standard situation" is (my
    words), however.

    I believe this business of making the "standard conditions" include a
    temperature of 25 C came from semiconductors normally being characterized
    at 25 C (with either junction or heatsinking surface being 25 C for
    heatsinkable parts, and ambient temperature 25 C for low power parts).

    - Don Klipstein ()
  5. How much difference is there between junction and ambient
    temperature for an LED that is dissipating less than 80mW?
    Based on your comments I was about to give Cree my "truth in
    advertising" award :) but when I looked at the data sheet
    for the Cree 7090 series XL LEDs I can't find any such
    claim. I do see that the electrical specs are given for an
    ambient temperature of 25C instead of a junction temperature
    of 25C, but the data sheet is silent about the temperature
    that is used for the lumen output specification, which by
    the way is given only as "typical". I've been involved in
    enough litigation to know that, based on this data sheet, it
    would be hard to claim that Cree specified the flux at an
    ambient temperature of 25C just because they used that
    specification for the electrical characteristics.

    So, I took a look at the binning sheet for the XL 7079. On
    this data sheet Cree does give minimum and maximum flux for
    each bin, but they don't even show the forward current and
    also fail to list the temperature spec. It is reasonable to
    assume that the forward current for these bin specs is
    350ma, based on the other data sheet, but it sure would be
    nice if they had shown the current on the binning data sheet
    and even better if the temperature had been listed on at
    least one, and preferably both of the data sheets.

    The "truth in advertising" award is thereby rescinded. :)
    Not a problem when the power dissipation is < 80mW.
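    For the sub-80mW case the junction rise depends on the junction-to-ambient
    thermal resistance; the ~300 K/W figure below is a typical assumption for
    a leaded 5 mm LED in still air, not a number from the thread:

```python
rth_ja = 300            # junction-to-ambient, K/W (assumed, 5 mm leaded LED)
p = 0.080               # dissipation in watts
delta_t = round(p * rth_ja, 1)
print(delta_t)          # 24.0 K rise, i.e. Tj ~ 49 C in a 25 C ambient
```

    A rise of a couple of dozen kelvin is small compared to the self-heating
    of a 1 W part, which is presumably why it is "not a problem" here.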
    And, most other lamps are specified in an ambient of 25C

    Vic Roberts

  6. Just so we are all working from the same datasheet:

    It would appear on page five they have relative output vs. junction
    temperature graphs. It appears the relative output reaches 100% at 25 deg.
    C junction temperature. This suggests to me that 25 deg. C junction (as
    opposed to case or ambient) temperature is the standard (as is 350mA), at
    least in this case.

    I can see how this might be irritating to some designers, but in fairness to
    Cree, most electronics component manufacturers usually use 25 deg. C junction
    temperature as a somewhat unrealistic baseline for performance.

    As for LED luminous efficacy vs. temperature... It is interesting to study
    the bottom two graphs on the Cree datasheet. It would appear the white LED
    loses about 20% current to optical efficiency at a maximum die temperature
    of 125 deg. C, while the blue one loses only around 5% and the amber one a
    horrendous 82%.

    Presumably as the die heats up, the forward voltage decreases. This effect
    is apparently fairly significant for these LEDs, at around -2.9mV/deg. C for
    the blue and white ones. This is roughly an 8% or 9%
    improvement between 25C and 125C junction temperature. It seems to me that,
    in the case of the blue LED, as the temperature rises from 25 deg. C to 125
    deg. C the voltage to optical efficiency increases about 9% while the
    current to optical efficiency drops by around 5%. This should mean the
    power to optical efficiency in the LED (IE: luminous efficacy) actually
    increases by a few percent when running at the higher temperature, even
    though optical output decreases. Of course, unless you are using a switch
    mode current source, this voltage to optical efficiency gain will not be
    realised since the ballast would waste any improvements in forward voltage.
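    The "few percent" efficacy gain can be checked with the numbers quoted
    above (-2.9 mV/deg. C tempco, 100 deg. C rise, ~5% optical loss for the
    blue LED); the 3.5 V forward voltage at 25 C is an assumption:

```python
# Efficacy = light out / power in. At constant current the input power
# tracks the forward voltage, which falls as the die heats up.
dvf = 0.0029 * 100                        # forward voltage drops 0.29 V over 100 C
vf_25c = 3.5                              # assumed Vf at 25 C
power_ratio = (vf_25c - dvf) / vf_25c     # input power ratio, hot vs. cold
light_ratio = 0.95                        # blue LED loses ~5% optical output
efficacy_ratio = light_ratio / power_ratio
print(round(efficacy_ratio, 3))           # ~1.036, about a 3.6% efficacy gain
```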

    Of more importance, the white LED still loses luminous efficacy (though not
    necessarily as bad as it appears at first glance), and in all cases you
    still get less optical output at the higher temperature. Still, a human
    would have a hard time noticing a 10-20% decrease in intensity. As the
    lighting designer you get the choice how much you want to spend on "excess"
    heatsinking above and beyond the absolute minimum.

    The amber and red LEDs blow hard chunks at high junction temperature.
  7. Yes, same data sheet.
    Yes, that "suggests" that the photometric data is taken at a
    junction temperature of 25C, but then look at the graph just
    above, it gives intensity vs current at an AMBIENT
    TEMPERATURE of 25C and this graph seems to go through 100%
    at a current of 350ma, which is the rated current of the
    device. This curve "suggests" that ambient temperature is
    the standard. Pretty confusing I would say. But all this
    could be resolved if Cree had just stated the temperature
    and which temperature in the table on page 2 where they give
    the "Typical Luminous or radiant Flux @ 350mA." How about
    "... @ 350mA and 25C junction temperature" ?

    Then we have the problem that Cree uses ambient temperature
    at all, for example for the Electrical Characteristics on
    page 4 and the Relative Intensity vs. Current on page 5.
    These devices cannot run at rated current without a heat
    sink. That is the clear implication of the fact that the
    data sheet does not give any value for the case to ambient
    thermal resistance in the absence of a heat sink and states
    on page 4 the importance of designing the end product (the
    application or luminaire) to minimize the thermal resistance
    between the solder point and the ambient. So, how are we to
    use any of the "ambient temperature" specifications? What
    heat sink was used? Certainly they didn't run these devices
    at rated current without a heat sink for anything other than
    a short transient experiment.

    One might assume that the use of "ambient temperature" on
    the data sheet is an error. Usually the electrical and
    photometric characteristics are measured at the same time.
    It's hard to imagine any lab using one setup to measure
    some characteristics at a fixed junction temperature and
    other characteristics at a fixed ambient temperature - even
    if the device could survive without an additional heat sink.
    Well, a standard test is needed. All that I am asking is
    that the test conditions be listed in the same table as the
    data.
    I get 7.25% based on a 100C rise in junction temperature and
    a forward voltage of 4 volts for the blue LEDs. (See page 2)
    Of course, we still don't know if the electrical data on
    page 2 is based on 25C junction or 25C ambient - since, as
    stated above, the other electrical data seems to be given at
    the elusive 25C ambient temperature.

    I think you mean to say that the photometric output at rated
    current appears to drop by only 5%, while the input power at
    rated current appears to drop by 9% (or 7% by my
    calculation) when the junction temperature rises from 25C to
    125C, which would imply a slight increase in efficacy of the
    blue LED.
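    The 7.25% and "8% or 9%" figures are the same tempco arithmetic applied
    with different assumed forward voltages:

```python
dvf = 0.0029 * 100                  # -2.9 mV/C over a 100 C rise = 0.29 V
print(round(dvf / 4.0 * 100, 2))    # 7.25 - Vic's figure, Vf = 4 V (page 2)
print(round(dvf / 3.5 * 100, 1))    # 8.3  - the "8% or 9%", Vf = 3.5 V assumed
```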
    Only for the blue LED, since the efficacy of the other colors
    decreases faster than the power drops as the temperature rises.
    If this increase is real and not just the result of small
    measurement errors, then it should be preserved by a good
    switch mode power supply.
    Well, this "hard time noticing" argument can be made for any
    light source, and the problem is that a human may not notice
    the first 10%, or perhaps the next 10%, but when people
    continue to decrease light levels by 10% eventually even the
    least observant human will realize there is too little light.
    Vic Roberts