qwertyface
- Apr 24, 2013
Hi all,
I'm trying to hack together a rudimentary circuit to drive a high-power LED. I have only a basic understanding of electronics and electrical theory, and I'm turning to you brighter chaps for support with my calculations, as I don't want to switch my circuit on and watch it smoke! These high-power LEDs aren't cheap...
I waffle a bit to begin with, but only because I want to demonstrate that I've tried, and am not just posting here for a quick answer without giving it a go first!
I hacked apart an old 300 W computer PSU, putting a dummy load between the PS_ON and GND wires so that I can use the 5 V/12 V rails.
My LED has a Vf of 3 to 3.7 V and a forward current of 700 to 750 mA. I would like it to draw 730 mA (to be persnickety!), but because I expect to use a number of these at a later date I have opted for the 12 V rail, and figure I'll therefore need a substantial resistor in the interim.
On the back of some paper my circuit is simple: Vcc --- LED --- R --- GND.
To my understanding, KVL states that the sum of all voltage drops around the loop equals Vcc. Taking 3 V as my LED's Vf, the voltage drop across the resistor is 12 - 3 = 9 V. I'd like to operate the LED at 730 mA, so the power dissipated in the resistor will be 9 * 0.73 = 6.57 W, but doubling up as per convention means I'm going to need a 12 to 15 W resistor. AFAIK its value will need to be (12 - 3) / 0.73 = 12.33R, so ~13R. A 13R 12 W resistor... a ceramic one comes to mind. I do have one, but it's 10R 20W.
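The arithmetic above can be double-checked with a few lines of Python (a minimal sketch, assuming the 12 V rail, the 3 V Vf estimate, and the 730 mA target from the post):

```python
# Series-resistor sizing for an LED from a fixed rail (values assumed from the post).
vcc = 12.0    # supply rail voltage (V)
vf = 3.0      # assumed LED forward voltage (V)
i_led = 0.73  # target LED current (A)

r = (vcc - vf) / i_led  # required series resistance (ohms)
p = (vcc - vf) * i_led  # power dissipated in the resistor (W)

print(f"R = {r:.2f} ohms")  # R = 12.33 ohms
print(f"P = {p:.2f} W")     # P = 6.57 W
```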
The problem:
LEDs are sensitive, as you well know, so I need to determine what effect this resistor value will have on both the new voltage across the LED (since the resistor I have is a lower value than what the previous calculation calls for, I'm guessing it won't drop as much voltage as I need it to, and therefore the voltage across the LED will increase?) and the current through the LED (note: my DMM measures the resistor at 10.14 ohms).
- Am I correct in calculating the current through the LED as (12 - 3) / 10.14 = 0.888 A, or ~890 mA?
- I don't know how to determine the voltage; how would I do this?
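Without the LED's datasheet V-I curve, the operating point can only be bracketed using the quoted Vf range. A minimal sketch of that bracketing, using the 10.14 ohm measured value from the post:

```python
# Bracketing the LED current with the 10.14-ohm resistor on hand.
# A real LED's Vf rises slightly with current; without the datasheet
# V-I curve we can only bound the answer using the quoted Vf range.
vcc = 12.0  # supply rail (V)
r = 10.14   # resistor value as measured with the DMM (ohms)

currents = {vf: (vcc - vf) / r for vf in (3.0, 3.7)}
for vf, i in currents.items():
    print(f"Vf = {vf} V -> I = {i * 1000:.0f} mA")
# Vf = 3.0 V -> I = 888 mA
# Vf = 3.7 V -> I = 819 mA
```

Either way the current lands above the 700 to 750 mA rating, which is why the 10R resistor on hand is risky for this LED.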
General questions:
- Is it more appropriate to place the resistor before or after an LED, e.g. Vcc --- R --- LED --- GND or Vcc --- LED --- R --- GND?
- Is the shunt resistor described in the tutorial (here: https://www.electronicspoint.com/current-divider-circuits-t222484.html) "Current Divider Circuits" more appropriate for what I'm trying to do?
QF.