# LED array, quick question

Discussion in 'LEDs and Optoelectronics' started by xgabrielx, Oct 16, 2016.

1. ### xgabrielx

So, assume I have a battery of a given voltage and, for example, 8 LEDs wired in series, dropping exactly 75% of the voltage, with an appropriate resistor dropping the rest. (Assume the resistor can have any value at all, not just standard values.)
Now... I understand that if, instead of wiring the 8 in series, I were to wire them as 2 parallel strings of 4 LEDs each off the SAME power supply, efficiency would drop, since I'd have to drop far more voltage across the resistors.
BUT, suppose I were to replace the original battery with one of exactly half the voltage and re-calculate the resistors. Now each resistor should drop the same 25% as in the original series example. So I have two circuits consuming the same amount of power at the same efficiency (I THINK), except one draws half the current at double the voltage, and the other draws double the current at half the voltage. I guess with an ideal power supply it doesn't matter, but with batteries, would one be preferred over the other? When power consumption is the same, is it better to use a battery with double the voltage and get half the current draw by wiring all the LEDs in series, is the reverse true, or does it not matter either way?
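The two configurations can be sketched numerically. All the specific values below are assumptions chosen only to make the 75%/25% split work out (a 24 V battery, LEDs with a 2.25 V forward drop, a 20 mA string current); the point is just that the resistor values differ but the total power does not:

```python
# Sketch of the two configurations described above.
# Assumed values for illustration: 24 V battery, 2.25 V forward
# drop per LED, 20 mA target current per string.
V_F = 2.25      # assumed forward voltage per LED (V)
I_LED = 0.020   # assumed target current per string (A)

def string_resistor(v_supply, n_leds, i):
    """Resistor needed for one string of n_leds LEDs at current i."""
    return (v_supply - n_leds * V_F) / i

# Case 1: 24 V battery, one series string of 8 LEDs (LEDs drop 18 V = 75%)
v1 = 24.0
r1 = string_resistor(v1, 8, I_LED)   # resistor drops the remaining 6 V
p1 = v1 * I_LED                      # total power drawn from the battery

# Case 2: 12 V battery, two parallel strings of 4 LEDs (LEDs drop 9 V = 75%)
v2 = 12.0
r2 = string_resistor(v2, 4, I_LED)   # per-string resistor, drops 3 V
p2 = v2 * (2 * I_LED)                # total power: double the current

print(r1, p1)  # 300.0 0.48
print(r2, p2)  # 150.0 0.48
```

With these assumed numbers, both circuits draw 0.48 W and waste the same 25% in the resistors, matching the reasoning in the question.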

2. ### BobK

Assuming the resistors were chosen to give the same current through the LEDs, the two are equally efficient.

Let's say I is the current through the LEDs and V is the voltage of one battery.

For the one battery case (one battery of voltage V driving the two parallel strings) you can compute the power as follows:

P = (2 * I) * V

For the two battery case (two such batteries in series driving the single string of 8) you have

P = I * (2 * V)
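The two expressions are algebraically identical, which a quick numeric check confirms; the values here (V = 12 V, I = 20 mA) are arbitrary assumptions:

```python
# Numeric check of the two power expressions above.
# V and I are arbitrary assumed values for illustration.
V = 12.0   # voltage of one battery (V)
I = 0.020  # LED string current (A)

p_one_battery = (2 * I) * V   # one battery, two parallel strings
p_two_battery = I * (2 * V)   # two batteries in series, one string

print(p_one_battery, p_two_battery)  # 0.48 0.48
```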

Of course the two battery version will last twice as long, but only because the 2 batteries have twice the energy.

Bob