How do chargers determine what battery pack is installed?

How do multi-voltage chargers (specifically, cordless power tool battery
chargers) determine whether the battery pack installed is a 7.2, 9.6, 12,
14.4, 18, or 24 volt pack before charging it? If any one of these is quite
depleted, it won't register its nominal voltage until it's partially charged.
Several of the lesser-voltage packs don't have identifying terminals on the
pack. How does the charger determine what voltage to apply?
Thanks,
Reply to
DaveC
There must be some type of key, plastic or otherwise, that activates certain switches depending on the pack type. Otherwise, there's no way of telling automatically.
Reply to
John Tserkezis
It doesn't know, and it doesn't need to. The typical peak-charge algorithm for NiCd and NiMH batteries doesn't care about absolute pack voltage, it applies a current and only cares about seeing the voltage go up and then decrease slightly (for NiCd) or hit a plateau (for NiMHs).
Reply to
Tim Wescott
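[A minimal sketch of the -dV termination Tim describes, in illustrative Python. The function name, sample data, and the 10 mV threshold are assumptions for illustration, not from any particular charger:]

```python
def negative_delta_v_cutoff(samples_mv, drop_mv=10):
    """Return the index at which charging should stop: the first sample
    that has fallen drop_mv below the running peak voltage (the NiCd-style
    -dV termination). Returns None if no such drop has been seen yet."""
    peak = float("-inf")
    for i, v in enumerate(samples_mv):
        if v > peak:
            peak = v                      # still climbing: track the peak
        elif peak - v >= drop_mv:
            return i                      # fell off the peak: terminate
    return None
```

Note that nothing here depends on the absolute pack voltage, which is the point: the same loop works whether the pack is 7.2 V or 24 V.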
I read in sci.electronics.design that DaveC wrote (in ) about 'How do chargers determine what battery pack is installed?', on Sat, 17 Jul 2004:
I expect they apply the lowest voltage first and monitor the charging current. If it drops to near zero in a few minutes, they increase the voltage to the next value, and repeat the process. A maximum current detector and limiter would detect 'a step too far' or a battery with a shorted cell.
Reply to
John Woodgate
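[John's guessed step-up scheme, sketched in Python. `measure_current` stands in for the charger hardware; the voltage list and current thresholds are illustrative assumptions:]

```python
NOMINAL_VOLTS = [7.2, 9.6, 12.0, 14.4, 18.0, 24.0]

def identify_pack(measure_current, max_amps=3.0, min_amps=0.05):
    """Step the charger output from the lowest voltage upward.
    measure_current(volts) returns the charge current drawn at that
    output voltage. Return the first voltage that draws a sensible
    current; raise on over-current ('a step too far' or a shorted cell)."""
    for v in NOMINAL_VOLTS:
        amps = measure_current(v)
        if amps > max_amps:
            raise ValueError("over-current: shorted cell or wrong pack")
        if amps >= min_amps:
            return v          # pack accepts charge at this voltage
    return None               # nothing drew current at any step
```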
On Sat, 17 Jul 2004 23:09:05 -0700, Tim Wescott wrote (in article ):
How much reduction in V (for NiCd) or how long a time at plateau V (for NiMH)?
Thanks,
Reply to
DaveC
Yes, but that leaves no room for safety. Provided all cells are good, it'll work as expected; all it takes is one cell shorted or otherwise bad, and it'll either over- or undercharge the pack.
What about temperature monitoring? A bit flaky to rely on delta-V only.
Reply to
John Tserkezis
Many use the voltage of the discharged battery to identify the battery. If you run the battery stone cold flat, it won't charge it.
Reply to
Ken Smith
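[Ken's identify-by-resting-voltage approach, sketched in Python. The 1.2 V nominal and 0.9 V "merely flat" per-cell figures are rough NiCd/NiMH assumptions; a pack below that reads as "stone cold flat" and is rejected:]

```python
NOMINALS = [7.2, 9.6, 12.0, 14.4, 18.0, 24.0]

def guess_pack(open_circuit_v, min_per_cell=0.9):
    """Guess the nominal pack voltage from its resting terminal voltage.
    A merely-flat NiCd/NiMH cell still sits near 1.0-1.1 V, so a usable
    pack reads at least cells * min_per_cell. Returns None for a pack
    run completely flat (which this scheme refuses to charge)."""
    for nominal in reversed(NOMINALS):       # try the biggest pack first
        cells = round(nominal / 1.2)         # nominal 1.2 V per cell
        if open_circuit_v >= cells * min_per_cell:
            return nominal
    return None
```

The weakness Ken implies is visible here: a deeply discharged pack falls below every threshold and the charger simply won't touch it.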
Smart Battery System (SBS) specifications - which are not the kind of charging situation the poster is talking about - specify a wake-up charge cycle to handle cells discharged to zero, if only to wake up the discharged battery's communication interface.
When Bosch makes a battery charger, their primary concern is charging compatible Bosch tools and batteries. This cuts down the possible combinations and permutations considerably.
Supposedly you can hit any NiCd or NiMH battery with a hefty current, below the charger's output terminal voltage limits, for a short period at any time, just to see the resulting terminal voltage. This tells the charger quite a bit about the battery before it has to make any decisions about charging parameters or charging safety.
Charging a battery with a shorted cell using the simplest constant-current dV termination need not result in a safety hazard. A charger doesn't need to detect bad batteries to operate safely. It doesn't need to fix those batteries either.
There are many smart things that chargers can be made to do, but there has to be a reason for the feature.
RL
Reply to
legg
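[legg's current-pulse probe, reduced to the arithmetic a charger might do with it. Illustrative Python; the 1.25 V/cell resting estimate and all names are assumptions:]

```python
def probe_pack(v_rest, v_under_load, i_amps):
    """Estimate cell count and internal resistance from one brief
    current pulse: v_rest is the open-circuit terminal voltage,
    v_under_load the voltage while i_amps flows. Returns
    (cells, ohms). Ohm's law on the sag gives internal resistance;
    resting voltage / ~1.25 V gives a rough cell count."""
    r_internal = abs(v_under_load - v_rest) / i_amps
    cells = round(v_rest / 1.25)
    return cells, r_internal
```

An abnormally low resting voltage for the apparent cell count, or an abnormally high internal resistance, would flag a pack the charger shouldn't fast-charge.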
I don't know, and I strongly suspect that it depends on who's in 'charge' :) of the algorithm. It's on the order of 10-100mV per cell for NiCd and a couple of seconds for NiMH -- and I think you could go cheap and use the same algorithm for both: just detect the top of the peak and call it a plateau.
Gates used to put out a very nice battery book on Pb-acid and NiCd batteries, I don't know if it's still in print.
Reply to
Tim Wescott
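[Tim's "go cheap" variant - treat the top of the peak as a plateau - might look like this. Illustrative Python; the tolerance and hold count are assumptions standing in for "a couple of seconds":]

```python
def plateau_cutoff(samples_mv, hold_samples=3, tol_mv=2):
    """Terminate when the voltage stops rising: the running peak has not
    grown by more than tol_mv for hold_samples consecutive samples.
    One cheap rule covering both NiCd (peak) and NiMH (plateau)."""
    peak, flat = float("-inf"), 0
    for i, v in enumerate(samples_mv):
        if v > peak + tol_mv:
            peak, flat = v, 0     # still rising meaningfully
        else:
            flat += 1             # flat (or falling) sample
            if flat >= hold_samples:
                return i
    return None
```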
This is what the industry seems to do, and it seems to work well. Apparently for a good pack the delta-V is a very good indication, and for a crapped-out pack -- well, it's crap anyway, right?
Reply to
Tim Wescott
I designed a bunch of these while at "Fluke." I used several methods:
1: Keyed terminals - different packs used very slightly different terminal arrangements.
2: Extra terminal connected to circuitry, sometimes just a resistor. This circuit could signal the main device or just be a programming resistor to set charging parameters. All packs used the same terminals, but the insides were different.
3: Smart battery IC on the terminals, usually containing E^2 for memory.
4: Keys on the packaging.
5: Then the obvious: put in a constant current and measure the voltage in the main device.
Depended completely on the goals of the project. What was more fun was determining different types of batteries with the same terminal voltages (like NiCd vs NiMH).
Reply to
Steven Swift
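[Method 2 above - a programming resistor on an extra terminal - amounts to a lookup with some tolerance allowance. Illustrative Python; the resistor values and parameters are invented for the example, not from any real Fluke design:]

```python
# Hypothetical ID-resistor table: the resistance measured on the extra
# pack terminal selects the charge parameters for that pack family.
ID_TABLE = [
    (10_000, {"volts": 7.2, "fast_amps": 1.5}),
    (22_000, {"volts": 12.0, "fast_amps": 2.0}),
    (47_000, {"volts": 18.0, "fast_amps": 2.5}),
]

def params_from_id_resistor(measured_ohms, tolerance=0.10):
    """Match the measured ID resistance against the table, allowing
    +/-10% for resistor tolerance. None means an unrecognized pack,
    which the charger should refuse to fast-charge."""
    for nominal, params in ID_TABLE:
        if abs(measured_ohms - nominal) <= nominal * tolerance:
            return params
    return None
```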
On Sun, 18 Jul 2004 17:17:24 -0700, Steven Swift wrote (in article ):
How did you do that?
And does it matter? If you charge either NiCD or NiMH at a rate that won't damage either (for
Reply to
DaveC
I believe this method will either leave both under-charged or destroy a NiMH cell. The generally accepted rapid-charge method for NiMH is a constant current with a temperature cut-off. During charge, perhaps 10% of the energy goes into heating the cell; after charge, essentially 100% goes into heat, so it's rather easy to sense the difference. This method works well for both technologies. V or delta-V sensing works for NiCd, but is highly "not recommended" for NiMH.
Yes, the "Rechargeable Batteries Applications Handbook", originally published (and given away[*] to customers) by Gates Energy, is/was a very good reference manual. I believe the rights were sold (Stoneham?) and Gates itself bought (?). The book is available on Amazon:
formatting link

[*] who swiped my copy?!
Reply to
Keith Williams
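[The temperature cut-off Keith describes is often done as a rate-of-rise check rather than an absolute limit. A minimal sketch in Python; the window size and 2 C rise threshold are illustrative assumptions:]

```python
def temp_rise_cutoff(temps_c, window=3, max_rise_c=2.0):
    """Terminate fast charge when the pack temperature climbs more than
    max_rise_c over the last `window` samples (a crude dT/dt rule:
    once the cell is full, nearly all input power becomes heat)."""
    for i in range(window, len(temps_c)):
        if temps_c[i] - temps_c[i - window] >= max_rise_c:
            return i          # temperature rising fast: stop charging
    return None               # still charging normally
```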


PolyTech Forum website is not affiliated with any of the manufacturers or service providers discussed here. All logos and trade names are the property of their respective owners.