# Circuits in Parallel and Series... Breaking it down!

Discussion in 'General Electronics Discussion' started by Moha99, Apr 3, 2012.

1. ### Moha99

Hello everyone!

I was really curious about circuits in parallel and series...

I understand how things work in series..

But here is a demonstration of a circuit in parallel:

Now as you can see, there are two working light bulbs in the circuit. What I want to get my mind around is the "power input". If there is an input of 6 volts at 1 amp, that is 6 watts total...

In parallel, will the light bulbs each get 6 watts of power? In the diagram there are only 2 light bulbs; what if I had 10 light bulbs? Would all 10 still get 6 watts?

If they all do, that's really remarkable!

2. ### (*steve*) ¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd (Moderator)

Yes, but if there are 2 bulbs, that 6 W is 3 W each. If there are 10 bulbs, that 6 W becomes 0.6 W each.
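A quick sketch of that arithmetic in Python (bulb counts and the 6 W figure taken from the thread; assumes identical, ideal bulbs sharing the load equally):

```python
# Power splitting among N identical bulbs wired in parallel.
# With identical bulbs, the total power divides equally.
def power_per_bulb(total_watts: float, n_bulbs: int) -> float:
    """Return the share of total power each identical bulb receives."""
    return total_watts / n_bulbs

print(power_per_bulb(6.0, 2))   # 3.0 W each
print(power_per_bulb(6.0, 10))  # 0.6 W each
```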

3. ### Moha99

Wow... so they all get the power they need, huh?

Is that one of the reasons why parallel is preferred over series, too?

The thing I wonder about in series: for example, the 1st light bulb will be brighter and the 2nd less bright...

But in parallel they all seem bright enough.

4. ### Harald Kapp (Moderator)

In parallel, all components "see" the same voltage. For a resistive load, the current through each component is given by I = U/R, so each load "takes" the current it needs.
For other, not purely resistive loads, the details are more complicated, but the same principle applies: assume the load is rated for the voltage Vnom used in the circuit and has a rated power Pnom (at this point we don't care whether it is resistive or not). The current will then be Inom = Pnom/Vnom.
This applies as long as the voltage source can supply all the current drawn by the loads.

A common misunderstanding: the power used in the circuit is not defined by the power source but by the load. The power stated on a power source is just the limit of what that source can supply without degradation.
So if a power source is rated at, e.g., 5 V/1 A, that means it can supply up to 1 A while still regulating 5 V. It does not mean the power supply will force 1 A through any circuit attached to it. If the circuit (e.g. a charger for a cellphone) requires only 0.5 A, then 0.5 A will flow.

The basic rule is:
Parallel connection: same voltage, different currents.
Series connection: same current, different voltages.

Harald
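Harald's rule Inom = Pnom/Vnom can be sketched in a few lines of Python (the two load wattages below are hypothetical, chosen only to illustrate "same voltage, different currents"):

```python
# Rated current of a load: Inom = Pnom / Vnom, per Harald's post.
def rated_current(p_nom: float, v_nom: float) -> float:
    """Current drawn by a load of rated power p_nom at rated voltage v_nom."""
    return p_nom / v_nom

# Two hypothetical loads on the same 5 V rail.
# Parallel connection: same voltage across each, currents add.
loads_w = [2.5, 0.5]
currents = [rated_current(p, 5.0) for p in loads_w]
print(currents)                    # [0.5, 0.1] amps
print(round(sum(currents), 3))     # 0.6 A total drawn from the supply
```

The supply must be able to deliver that 0.6 A total, but no more than that will actually flow.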

5. ### Harald Kapp (Moderator)

This will happen if the lamps are of different wattages. If you put lamps of the same type in series, they will all glow with approximately the same brightness (differences in brightness are only due to manufacturing tolerances of the individual bulbs).

This applies to tungsten light bulbs, because they are used at a fixed voltage.
You can't do this with LEDs, for example. LEDs are current controlled. If you put them in parallel, each LED requires its own current control (a resistor in the simplest case). If you put them in series, just one current control circuit (again a resistor in the simplest case) suffices for all LEDs.

You see: There is no general rule saying series is better than parallel or vice versa. It depends on the circumstances.

Harald
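The "one resistor for a series LED chain" point can be sketched numerically. The supply voltage, LED forward voltage, and current below are hypothetical example values, not from the thread:

```python
# Single series resistor for a chain of LEDs, per Harald's point that
# one current control suffices when LEDs are in series.
# Standard rule of thumb: R = (Vsupply - n * Vf) / I.
def series_resistor(v_supply: float, v_forward: float,
                    n_leds: int, i_led: float) -> float:
    """Resistor value that sets i_led through n_leds LEDs in series."""
    headroom = v_supply - n_leds * v_forward
    if headroom <= 0:
        raise ValueError("supply voltage too low for this many LEDs")
    return headroom / i_led

# Three hypothetical 2.0 V LEDs at 20 mA from a 9 V supply:
print(series_resistor(9.0, 2.0, 3, 0.020))  # about 150 ohms
```

Note that the same chain in parallel would need one such resistor per LED, which is exactly why the choice between series and parallel depends on the circumstances.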

6. ### Moha99

So can I make a circuit that consists of both series and parallel paths? Say the power supply is 5 watts; I wire up a bulb that requires the full 5 watts and then add 2 bulbs in parallel after it... Will the 1st bulb in series get that 5 W completely, and will the other 2 bulbs then divide it so the 2nd and 3rd bulbs get 2.5 watts each?

7. ### timothy48342

First of all, it's a little more complex than that. It is much easier to calculate if you work with voltages and currents instead of just the wattage. A schematic would help too, so we know how you are hooking them up.

Secondly... no. Even without being sure how you have it hooked up: if the power supply is capable of supplying 5 watts, then you're not going to get 5 W + 2.5 W + 2.5 W through the lights, no matter how they are hooked up.

--tim
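Working the question with voltages and currents, as tim suggests, makes the point concrete. The sketch below models one bulb in series with a parallel pair, all as equal fixed resistors (the supply voltage and resistance are hypothetical, and real bulbs are nonlinear, so this is only an illustration):

```python
# Series-parallel sketch of Moha99's circuit: one bulb in series with
# a parallel pair, each bulb modeled as an equal fixed resistor.
V = 5.0    # hypothetical supply voltage
R = 10.0   # hypothetical resistance of each bulb

r_total = R + R / 2.0                      # series bulb + parallel pair
i_total = V / r_total                      # current through the series bulb
p_series = i_total**2 * R                  # power in the series bulb
p_each_parallel = (i_total / 2.0)**2 * R   # each parallel bulb carries half the current

print(round(p_series, 3))         # 1.111 W
print(round(p_each_parallel, 3))  # 0.278 W
```

The series bulb gets four times the power of each parallel bulb, and the total dissipated power is set by the load network, not by any wattage printed on the supply.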

8. ### Moha99


Ah ok...Thanks!

9. ### jackorocko

Yes, but as timothy has stated, you will not get more than 5 W out of a 5 W power supply.

This is why sizing of components is so vital to the functionality of the circuit. Every aspect of a circuit needs to be sized properly, right down to the power supply that is supplying the load.

10. ### Moha99

What if the power supply is 30 watts and there is a source that only requires, for example, 27 watts... And there are 3 watts additional in the circuit; what would happen to them? Should I add a source that requires 3 watts; would it work?

11. ### jackorocko

No, a load will only sink as much current as it needs from the power supply. For the most part, when it comes to power supplies, it is better to have more current available than the load consumes.

12. ### Moha99

Yeah, the available current will definitely be more than the load needs, not the exact same but higher by a few amps.

13. ### Harald Kapp (Moderator)

Surely you don't mean source but load.
No, you don't need to add an additional load. If the power supply is rated at 30 W, that means it can deliver up to 30 W, never more, and it may well deliver less.

Think of the power outlets in your home. They deliver 115 V (or 230 V, or whatever the mains voltage in your country is) and are protected by a fuse, e.g. 16 A. So the max power you can draw from an outlet is 115 V * 16 A = 1840 W.
Do you always plug in appliances that constantly draw 1840 W? You sure don't. You plug in whatever you need (up to 1840 W).

Harald

15. ### zamilac

Exactly. So can the supply deliver more or less power than the load consumes, rather than exactly the same amount?

16. ### (*steve*) ¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd (Moderator)

No.

The amount of power supplied equals the amount of power consumed.

The amount of power supplied may be less than the rating for a power supply.

If the amount of power supplied exceeds (or attempts to exceed) the rating for the power supply then a fuse might blow, the voltage might drop, smoke and flames could erupt from the PSU, and maybe other things.

17. ### jackorocko

It might help to think of batteries: if a load always consumed as much current as was available, then a battery would never last very long.

But we know from experience that, depending on the load, the same battery will last longer with a smaller load current.
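The battery point can be put in numbers too. The capacity and load currents below are hypothetical, and the formula ignores real-world effects like capacity derating at high currents:

```python
# jackorocko's battery point in numbers: runtime depends on the load
# current, not on what the battery *could* supply.
def runtime_hours(capacity_mah: float, load_ma: float) -> float:
    """Ideal runtime = capacity / load current (no derating modeled)."""
    return capacity_mah / load_ma

print(runtime_hours(2000.0, 200.0))  # 10.0 h at a 200 mA load
print(runtime_hours(2000.0, 50.0))   # 40.0 h at a lighter 50 mA load
```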