Alexander Avtanski
- Jan 1, 1970
Hello,
I'm working on a hobby project (something like a really fancy timer).
The thing will be powered mostly from the grid, but when necessary I
should be able to unplug it and use it on internal power. My plan is
to use a rechargeable 9V NiMH battery, kept topped off by a
trickle charger whenever the thingy is plugged in (which would be,
like, over 95% of the time).
I don't have any practical experience with building battery chargers.
So, if I'm making wrong assumptions somewhere, please let me know.
Here is my plan and line of reasoning:
Let's say I use a 12V DC external power supply and charge the battery
through the simplest possible trickle charger I could come up with: a
one-diode, one-resistor circuit (no need for a diagram here). A NiMH
battery, according to Wikipedia, should have about a 30%/month self-
discharge rate. For an arithmetically-challenged person like me this
means that a 150mAh battery (seems pretty typical for a 9V NiMH) will
have a self-discharge current of about 0.06mA (45mAh lost per month,
spread over roughly 720 hours). Leaving some margin, a trickle
charging current of something like 0.3 to 0.5mA should keep the
battery nicely topped off. With 12V in, a fully charged 9V battery,
and a 0.7V voltage drop over the diode, I'll have 2.3V across the
resistor. This means the resistor should be in the 4.7-7.5k range.
That is ignoring the internal resistance of the battery (which I
don't have a clue about, but it should be insignificant compared to
the resistor).
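To double-check my arithmetic, here is a quick Python sketch of the sizing math. All the input numbers (capacity, self-discharge rate, nominal voltages, diode drop) are my own assumptions from above, not measured values:

```python
# Trickle-charger sizing sketch for a 9V NiMH pack.
# Every constant here is an assumption from the reasoning above.

CAPACITY_MAH = 150.0             # assumed typical 9V NiMH capacity
SELF_DISCHARGE_PER_MONTH = 0.30  # ~30%/month, per Wikipedia
HOURS_PER_MONTH = 30 * 24        # ~720 hours

# Average current needed just to cancel self-discharge
self_discharge_ma = CAPACITY_MAH * SELF_DISCHARGE_PER_MONTH / HOURS_PER_MONTH
print(f"Self-discharge current: {self_discharge_ma:.3f} mA")  # ~0.06 mA

SUPPLY_V = 12.0
BATTERY_V = 9.0                  # nominal; a fully charged pack sits higher
DIODE_DROP_V = 0.7               # typical silicon diode forward drop

# Voltage left across the resistor
headroom_v = SUPPLY_V - BATTERY_V - DIODE_DROP_V
print(f"Headroom across resistor: {headroom_v:.1f} V")        # 2.3 V

# Resistor value for each candidate trickle current
for target_ma in (0.3, 0.5):
    r_kohm = headroom_v / target_ma  # V / mA gives kilo-ohms directly
    print(f"{target_ma} mA -> {r_kohm:.1f} k")
```

So 0.3mA needs about 7.7k and 0.5mA about 4.6k, which is where my resistor range comes from.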
Do you see any problems with this? Will the thing be safe, or will
it explode in a spectacular fireball after a couple of hours? Do you
have a better idea for building the charger that is not too complex?
Thanks,
- Alex