The "impossible" question is how to determine, with a brief test, the SOC of a
NiCd or NiMH battery presented for such a test. The continuous
discharge curve is of **no use** as a predictor when a battery is used
intermittently - as is usually the case.
The URL linked to by RH shows that it is possible to assess the state
of charge of at least one cell chemistry (ie lead-acid). Cadex use a
proprietary algorithm to measure and evaluate the frequency dependence
of a cell's internal impedance.
As for intermittent usage, I would think that a smart chip could
become confused under such conditions. For example, how could a
coulomb counter possibly know how much capacity has been lost to
self-discharge, such as when a cell remains unused for long periods? In
such a case, ESR may be a better predictor of remaining capacity.
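To illustrate the point, here is a minimal sketch (all numbers are assumed for
illustration, not measured data): a naive coulomb counter only integrates the
charge it sees flowing in and out, so charge lost to self-discharge while the
cell sits idle is invisible to it.

```python
CAPACITY_MAH = 2000            # rated NiMH cell capacity (assumed)
SELF_DISCHARGE_PER_DAY = 0.01  # ~1%/day self-discharge rate (assumed)

def counter_soc(charged_mah, discharged_mah, capacity_mah=CAPACITY_MAH):
    """SOC as a naive coulomb counter reports it: charge in minus charge out."""
    return max(0.0, min(1.0, (charged_mah - discharged_mah) / capacity_mah))

def true_soc(charged_mah, discharged_mah, idle_days,
             capacity_mah=CAPACITY_MAH, sd_rate=SELF_DISCHARGE_PER_DAY):
    """True SOC also loses a fraction of the remaining charge each idle day."""
    soc = counter_soc(charged_mah, discharged_mah, capacity_mah)
    return soc * (1.0 - sd_rate) ** idle_days

# Fully charged, then left on the shelf for 60 days:
print(counter_soc(2000, 0))             # counter still reports 1.0 (100%)
print(round(true_soc(2000, 0, 60), 2))  # actual SOC has decayed to ~0.55
```

The counter's estimate and reality diverge further the longer the cell rests.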
Cells recover when at rest for some time - voltages rise and ESRs fall -
and this *voids* the accuracy of tests that rely on those two parameters. The
temperature and age of the cells also have a major effect on those same
parameters.
An old battery with reduced capacity can also confuse a simple coulomb
counter.
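A quick sketch of that failure mode (again with assumed numbers): a counter
calibrated to the cell's rated capacity overstates the charge remaining once
ageing has shrunk the real capacity.

```python
RATED_MAH = 2000  # capacity the counter was calibrated for (assumed)
AGED_MAH = 1200   # actual capacity of the worn cell (assumed)

def remaining_reported(discharged_mah, rated=RATED_MAH):
    """What the simple counter believes is left after some discharge."""
    return max(0.0, rated - discharged_mah)

def remaining_actual(discharged_mah, aged=AGED_MAH):
    """What the aged cell can actually still deliver."""
    return max(0.0, aged - discharged_mah)

print(remaining_reported(1000))  # counter: 1000 mAh left
print(remaining_actual(1000))    # reality: only 200 mAh left
```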
As for temperature effects, just wait for the battery to cool to room
temperature before testing it.
The upshot is that a NiCd or NiMH battery placed on a brief test will show
an ESR and voltage reading consistent with near-full charge when in fact the
true SOC is as little as 10% of capacity.
- Franc Zabkar