# capacitors instead of rechargeable batteries

Discussion in 'Power Electronics' started by pharaon, Dec 6, 2014.

1. ### pharaon

Is it possible to use capacitors instead of rechargeable batteries?
If so, how can it be done?

2. ### Bluejets

Yes, to a degree, but why would you need to?
Capacitors store a charge (the value can be calculated), but they also "leak" that charge over a period of time.
The time is dependent on many things.
As to how: the cap (usually an electrolytic type) is basically just connected to a source of DC supply.
It will charge to near the supply voltage.
The energy stored depends on the capacitance value.
It would be suited only to devices drawing minimal amounts of current, like chip memory etc.
Then again, it depends on what you have in mind.

3. ### (*steve*)¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥdModerator

Bluejets has covered most of the points.

From my perspective the only two things I would add are:
1. Capacitors are less energy dense than batteries (so a capacitor bank is larger for a given amount of energy)
2. Capacitors have a voltage which decreases as you discharge them. Batteries tend to maintain a more constant voltage for longer. This means that it becomes more difficult to provide a stable voltage from them under load and over time.

4. ### oz93666

A big problem with capacitors for energy storage is that they lose half of the energy put into them during charging, and that's before leakage and problems with voltage compatibility at discharge...

Where did half of the capacitor charging energy go?
The problem of the "energy stored on a capacitor" is a classic one because it has some counterintuitive elements. To be sure, the battery puts out energy QVb in the process of charging the capacitor to equilibrium at battery voltage Vb. But half of that energy is dissipated in heat in the resistance of the charging pathway, and only QVb/2 is finally stored on the capacitor at equilibrium. The counterintuitive part starts when you say "That's too much loss to tolerate. I'm just going to lower the resistance of the charging pathway so I will get more energy on the capacitor." This doesn't work, because the energy loss rate in the resistance I²R increases dramatically, even though you do charge the capacitor more rapidly. It's not at all intuitive in this exponential charging process that you will still lose half the energy into heat...
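The 50% figure can be checked numerically (a sketch, not from the thread; the component values are arbitrary): step an RC charge forward in time from an ideal voltage source and tally where the energy ends up. The split stays 50/50 no matter what resistance you pick.

```python
def charge_through_resistor(Vb=5.0, C=1e-3, R=10.0, tau_count=10, steps=200_000):
    """Euler-integrate charging C through R from a fixed source Vb.

    Returns (energy stored on the cap, energy dissipated in R), in joules.
    """
    dt = tau_count * R * C / steps   # simulate ~10 time constants
    v = 0.0                          # capacitor voltage, starts discharged
    dissipated = 0.0                 # heat in R: sum of i^2 * R * dt
    for _ in range(steps):
        i = (Vb - v) / R             # charging current at this instant
        dissipated += i * i * R * dt
        v += i * dt / C              # dv = i*dt/C
    stored = 0.5 * C * v * v         # energy left on the capacitor
    return stored, dissipated

# The dissipated fraction is ~0.5 regardless of R:
for R in (1.0, 10.0, 1000.0):
    stored, dissipated = charge_through_resistor(R=R)
    print(f"R={R:7.1f}  loss fraction = {dissipated / (stored + dissipated):.4f}")
```

Lowering R only makes the same loss happen faster, which is exactly the counterintuitive point quoted above.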

5. ### (*steve*)¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥdModerator

Unfortunately this post is completely wrong.

7. ### (*steve*)¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥdModerator

But you'll notice that there's no loss in the capacitor. The loss results from connecting a voltage source to a capacitor. Energy is lost in the series resistance.

So the problem is how you transfer the energy to the capacitor. Using a current source will reduce the losses to zero in theory. In practice you can get pretty close.
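A quick way to see why (a sketch with made-up numbers): with an ideal constant-current charge, the resistive loss is I²R over the whole charge time T = C·Vb/I, so slowing the charge down shrinks the loss fraction toward zero.

```python
def constant_current_loss_fraction(Vb, C, R, I):
    """Fraction of input energy lost in series resistance R when charging
    capacitor C to Vb at a constant current I (ideal current source assumed)."""
    t_charge = C * Vb / I                # time to reach Vb
    stored = 0.5 * C * Vb ** 2           # energy ending up on the capacitor
    dissipated = I ** 2 * R * t_charge   # heat in R, equals I*R*C*Vb
    return dissipated / (stored + dissipated)

# Same cap and resistance, two different charge currents:
print(constant_current_loss_fraction(5.0, 1e-3, 10.0, 1.0))    # fast charge, lossy
print(constant_current_loss_fraction(5.0, 1e-3, 10.0, 0.01))   # slow charge, efficient
```

Because the dissipation term scales with I while the stored energy doesn't, the loss can be made arbitrarily small in theory, matching the point about current-source charging above.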

8. ### davennModerator

Yes indeed, and that was well clarified in the discussions on The Physics Forum

Dave

9. ### wdariusw

Very interesting topic. Is there a formula to convert capacitance into mAh or something? Let's say I have a 180 mAh battery: what capacitor (super-capacitor) would I need?

10. ### Bluejets

Capacitor energy is measured in joules (if my memory is correct).
This is not normally expressed as an amount of current for a particular time.
Driving circuitry from capacitors is a by-product of the fact that there is an amount of energy available; they were never really designed with this sole purpose in mind.
Others more knowledgeable will comment further, I imagine.

One way would be to get one, charge it up, drain it with whatever load you have in mind, and see for yourself.
Sort of a "suck it and see" approach.
Just take care, as those super caps can be dangerous due to the amount of energy they can hold.
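For the 180 mAh question above, a rough conversion can be sketched (hypothetical numbers, not from the thread): a capacitor holds charge Q = C·ΔV, and 1 mAh is 3.6 coulombs, so only the charge released between your maximum and minimum usable voltages counts.

```python
def capacitance_for_mah(mah, v_full, v_empty):
    """Capacitance (farads) needed to deliver the same charge as a battery of
    the given mAh rating, over the usable swing from v_full down to v_empty."""
    q = mah * 3.6                  # charge in coulombs (1 mAh = 3.6 C)
    return q / (v_full - v_empty)

# e.g. standing in for a 180 mAh cell, discharging the cap from 4.2 V to 3.0 V:
print(capacitance_for_mah(180, 4.2, 3.0))   # hundreds of farads
```

That lands in super-capacitor territory, and as noted earlier in the thread the voltage falls steadily as it discharges, so a regulator would be needed to hold the load voltage steady.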

