I hope this is not a stupid question...
I have read through the LED info thread and didn't see this issue covered.
When working with LEDs over the last several months, I've noticed that sometimes (but not always) the current-limiting resistor wired in series with the LEDs gets VERY hot, even though I sized it with a resistor/LED calculator. I was advised to use a component with a higher wattage rating, but even after doing that, heat was still a big problem some of the time.
Could this be down to the type of resistor, or something else? The amount of heat in some of these builds is totally unacceptable, since the components often end up mounted in or near heat-sensitive material.
Is a series resistor really the best way to power LEDs? So much heat seems odd, and it suggests a lot of wasted energy to me.
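For what it's worth, here's the arithmetic behind the heat: whatever voltage the resistor drops, times the LED current, is burned off as heat regardless of the resistor's wattage rating. The supply voltage, forward voltage, and currents below are made-up example numbers, not from any particular build:

```python
# Sanity check: how much power must a series resistor dissipate?
# Example values (assumed, not from a specific build): a 12 V supply
# driving a 3.2 V LED, first at 20 mA, then at 350 mA.

def series_resistor(v_supply, v_led, i_led):
    """Return (resistance in ohms, power dissipated in watts)."""
    v_drop = v_supply - v_led   # voltage the resistor must absorb
    r = v_drop / i_led          # Ohm's law: R = V / I
    p = v_drop * i_led          # power burned as heat: P = V * I
    return r, p

# A small indicator LED: only ~0.18 W of heat -- barely warm.
r1, p1 = series_resistor(12.0, 3.2, 0.020)
print(f"20 mA LED:  {r1:.0f} ohms, {p1:.2f} W")

# A high-power 350 mA LED on the same 12 V rail: ~3 W of heat.
# A bigger wattage rating lets the resistor survive that, but the
# same energy is still dumped into the surroundings.
r2, p2 = series_resistor(12.0, 3.2, 0.350)
print(f"350 mA LED: {r2:.0f} ohms, {p2:.2f} W")
```

The 350 mA case shows why higher-wattage resistors didn't help: the heat is inherent to dropping that much voltage at that current, which is why constant-current drivers are usually preferred for high-power LEDs.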
Thanks,
Jim