
The quintessential Insight NiMH voltage thread

#1 ·
I know, such optimism...

Normally I should probably have all my ducks in a row and actually have something substantial with which to start this thread. But I don't really. I was just thinking about how pack management probably revolves a great deal around working voltages specific to the stock Panasonic NiMH cells; about how important those voltages are; yet also how voltage always got short shrift around IC and still remains such a haphazard affair around here...

I imagined how great it would be if we could have a single thread that was all about Insight NiMH voltages - pack voltages, cell voltages, stick voltages, tap voltages, full voltages, empty voltages, typical full self-discharge voltages, normal 'working' voltages, 75% voltages, 'bad' voltages, etc etc...

One working theory I have at the moment involves the importance of the 1.20 to 1.25V range: I think good cells will really stick to this tight range, even at pretty high discharge rates. That's one major difference I've been seeing with my latest good rebuilt pack versus my other couple or so crummy packs. We all know that the nominal voltage of the cells is 1.2V - though I think it's really supposed to be 1.25V. That means something. Cells should be working at that voltage (during discharge); it's what they're meant to do...

I think the Insight battery management probably relies a lot on that 'middle' voltage...

1.34V is also a key value; it's the idealized equilibrium potential for the main reaction on the positive electrode. In practice, one thing it means to 'us' is that resting, open circuit pack voltage is often hovering around 160-161V (120 cells X 1.34V)...

Full voltages? The Panasonic spec sheet I have for cells that have often been said to be 'our' cells, the HHR650D, shows a charge curve at a 1C rate peaking at 1.53V... Over the years of grid charging, I've honed in on full charge values of about 174-176V, charging at typical room temp and typical grid charge rates of 300-500mA... 174-176V = 1.45V to 1.467V were it just a single cell... I made a chart a while back for temp adjustments; I'll add that later...

0.78V is supposedly the equilibrium potential for the unwanted reaction that causes 'voltage depression' or 'memory effect' - analogous to how 1.34V is the equilibrium potential for the wanted, good and normal reaction...

0.68V - That's the voltage I've seen many times on cells that have self-discharged for a long time; self-discharged cells will hover around that voltage for a long time...

I'll try to come back and add 'voltage' stuff when it comes to mind. If you have any "voltage stuff" please consider adding it to the thread - questions, comments, values to report (cell-level or otherwise), and stuff related to voltage, like battery management. I guess that's kind of my main concern, fleshing-out how the Insight BMS uses voltages to do its thing, and probably mainly that 1.25-1.20V range...

Basically, it's really looking to me like, if your pack is NOT holding tight to the 1.2-1.25V range (144V to 150V), it's a sign that something's amiss - cells are drifting out of balance (most likely and/or most frequent issue), maybe voltage depression is setting in, maybe cells are just getting old and internal resistance is getting high, a combination of these, etc. And the thing is, you can really spot when your pack is in top form (or not) if you just watch how voltage is holding up.

For instance, when the pack is cool it won't hold that voltage so well or it won't be able to hold it at relatively high currents. Likewise, if cells are out of balance, it won't hold that range so well, warm or not.

And really, it's not the whole range: in really top form your (stock) pack will be holding 1.25V (150V), not 1.2V (144V), at rates of at least about 3C (19.5A). And it will do this even at relatively low charge states - say 50%; it doesn't have to be at a 'full' 75%...

This is all stuff I'm grappling with - trying to juggle charge state, temperature, balance state, overall cell condition, and voltage - to get a good handle on pack condition/state. I've generally been seeing something like 144V to 150V at up to about a 3C rate between about 45% and 75% charge state, pack warmed up to at least 70F. There's wiggle room in these numbers depending on the exact values, like the exact temp, SoC, and current rate. I'd like to tighten this up...

What voltages under various conditions should I be seeing if my pack is 'good'? If my pack is at a nominal 50% charge state, and I'm discharging at a 3C rate, pack temp 75F, what voltage should I see with a good pack? Or a bad pack? It's all pretty subtle - at this point I can pretty much tell when the cells have drifted about 20% out of balance. I'd like to lower that figure to some degree, plus be a bit more sure about it...

Right now, pack voltage sag at moderate charge states (maybe 50-60%) is my main realtime, tell-tale sign of imbalance. And I'm talking about only a few volts difference, like seeing about 139V instead of about 142V, warmed pack. Or maybe it's seeing that sag at slightly higher charge states; like instead of seeing it down around 45% SoC, it might be 60% SoC... I'd like to tighten-up these kinds of observations, like I'd like to be able to tell just what 'state' my pack is at with just a few charge/assist runs - like get the pack warmed up, do a couple assist runs, see the numbers, know the state... I'm almost there...
 
#2 ·
I was just thinking about how pack management probably revolves a great deal around working voltages specific to the stock Panasonic NiMH cells; about how important those voltages are; yet also how voltage always got short shrift around IC and still remains such a haphazard affair around here....
FWIW I think you are right and the stock BCM is very attuned to those specific cells.

It does not always like or work 100% correctly with replacements, including the BB cells we all use.

The differences and voltage nuances are very difficult to pin down though.

Your post is a good crack at the idea.

The fact that we can't modify the BCM software does hamper the way it works with non-OEM cells.
 
#3 · (Edited)
From my very limited experience BCM 010 works fine with the BB aftermarket cells, while the 020 is a bit shaky. Just two cars and two BCMs - anecdotal, and not much data. The two combinations were 010/HybridRevolt and 020/Bumblebee, though I strongly suspect the cells were identical and came from the same manufacturer, so the BCMs would thus be the explanation.

I'll furnish one data point on the voltage issue, though it is rather specific to the way I check out sticks. The 70A load voltage of good cells at the 2000mAh discharge point is very close to 1.08 +/- .02V. That seems very repeatable. Cells with lower voltages will usually deliver somewhat limited capacity.
 
#4 · (Edited)
^hmm, your post/'data point' got me thinking a bit and looking at an older chart I made of voltage drop vs. current and IR. I'm reminded of an idea I had a while back on how to combine a bunch of different info into a single graphical representation. If I find some time I'll experiment with that...

Meanwhile, looking at that one older chart, all else being equal: seeing about 1.1V per cell, 132V for the pack, at a 70A discharge - and assuming about 160V at the start of the discharge (i.e. assuming voltage would bounce back to about 160V were you to stop the discharge after that 2000mAh drain and then re-initiate the 70A discharge) - that's a 28 volt drop, which puts the cell-level internal resistance at about 3.3 milliohms... The Panasonic spec sheet I have reports IR as about 2 milliohms...

IF we had 2mΩ cells, same other assumptions as above, we'd be seeing about a 17 volt drop - so about 143 volts for the pack at 70A discharge, about a 10kW load... Of course there's some resistance in all the connections; my guess is something like 2.5mΩ to 3.5mΩ per cell is the real-world acceptable/workable/doable range...

Let's assume 3mΩ at cell level, including connections, is good, normal. And let's also continue to assume that 160-161V, 1.34V per cell, is 'the' starting voltage no matter the charge state (or at least between say 25% and 75%). Let's assume normal working temperature - say about 75F - and perfectly balanced cells:

At 20A discharge we should see a pack voltage of about 153.5V, or 1.28V per cell.
At 50A discharge we should see a pack voltage of about 142.5V, or 1.19V per cell.
At 70A discharge we should see a pack voltage of about 135.5V, or 1.13V per cell.
And at 100A discharge we should see a pack voltage of about 124.5V, or 1.04V per cell...
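If anyone wants to play with these numbers, here's a quick Python sketch of the arithmetic - just my back-of-the-envelope math from above, not anything the car actually does. The function names are mine, and the 1.34V, 3mΩ, and 120-cell figures are the same assumptions as in the list:

```python
# Back-of-the-envelope Insight pack voltage vs. discharge current.
# Assumptions (same as the list above): 120 cells in series, ~1.34 V/cell
# resting around mid-SoC, and ~3 mOhm per cell including connections.

CELLS = 120
V_EQ = 1.34      # per-cell equilibrium / resting voltage, volts
R_CELL = 0.003   # per-cell internal resistance, ohms (incl. connections)

def pack_voltage(current_a, r_cell=R_CELL):
    """Predicted pack voltage under a given discharge current (amps)."""
    v_cell = V_EQ - current_a * r_cell
    return v_cell * CELLS

def cell_resistance(v_rest, v_loaded, current_a):
    """Estimate per-cell resistance (ohms) from an observed pack voltage sag."""
    return (v_rest - v_loaded) / current_a / CELLS

for amps in (20, 50, 70, 100):
    print(f"{amps:>3} A -> {pack_voltage(amps):.1f} V pack")
# -> roughly 153.6, 142.8, 135.6, 124.8 V

# Working the other way, from the ~132 V at 70 A observation above,
# assuming the pack would rest back at ~160 V:
print(f"{cell_resistance(160, 132, 70) * 1000:.1f} mOhm per cell")  # ~3.3
```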

One should be able to see how these values match fairly well what the Insight demands of the pack. And you can imagine how different conditions, such as cool cells, low charge state, cell imbalance, aging cells, will start to push things to the 'edges'...

Here's that chart. I find it pretty useful for thinking about this kind of stuff. I think originally I was more interested in observing real-world voltage drops on the OBDIIC&C and then finding out what internal resistance my cells were at:

Oh, one thing: I think I used 164V for this chart to determine the "OK", "Iffy", etc. colors and labels, i.e. if the voltage drop were 42 volts and you started at 164V, 164-42=122V, you'd still be above the 120V minimum (1V per cell). I think 160-161V is a better starting point though, it's more central...
 
#5 ·
Probably not what you're looking for, but focusing on "NiMH" and "voltage," I have the following to offer:

Tap resting voltages of a healthy pack will be < 0.2V apart between min/max

Heavily loaded tap voltages of a healthy pack will be < 0.6V between min/max

TRULY healthy sticks WILL NOT self-discharge below nominal over MANY MONTHS.
 
#6 ·
Probably not what you're looking for .... Tap resting voltages of a healthy pack will be < 0.2V apart between min/max. Heavily loaded tap voltages of a healthy pack will be < 0.6V between min/max.
No, or yeah, this is the kind of stuff I'm interested in. I think I'm looking to get more specific, like semi-quantify "healthy pack" and "heavily loaded," and I guess those types of voltage spreads as well...

And before I forget, there's a handful of subtle 'voltage-related' issues I can't get a handle on. For example, one thing I've noticed that I always forget is that 'good' cells seem to actually have a lower resting voltage than 'bad' cells after being fully charged (or whenever, maybe not necessarily fully charged)... In general I'm usually thinking that higher resting voltage is better, but I think that might not be true; I think that higher resting voltage might be an indicator of badness... No real idea why though, just seems like I've seen this at the cell and pack level, where, when I know this or that cell is better or good, or I know my pack is fully balanced and running strong, the resting voltages are lower - like instead of sticking around 1.40V for a long time, voltage falls to something like 1.34-1.37 fairly quickly but just hangs out there for a long time... Pack voltage - it might be 160-163V instead of maybe 165V or more... I forget the other "subtle" voltage issues at the moment...
 
#7 · (Edited)
One working theory I have at the moment involves the importance of the 1.20 to 1.25V range: I think good cells will really stick to this tight range, even at pretty high discharge rates....We all know that the nominal voltage of the cells is 1.2V - though I think it's really supposed to be 1.25V. That means something. Cells should be working at that voltage (during discharge); it's what they're meant to do...

I think the Insight battery management probably relies a lot on that 'middle' voltage...

1.34V is also a key value; it's the idealized equilibrium potential for the main reaction on the positive electrode. In practice, one thing it means to 'us' is that resting, open circuit pack voltage is often hovering around 160-161V (120 cells X 1.34V)....
These I think are really one and the same thing, or similar and/or related - this 1.20V to 1.25V range and this 1.34 'equilibrium' voltage. The equilibrium potential for the positive electrode is 1.34V, but then take into account internal resistance and operation under 'non-equilibrium' conditions (i.e. under loads, charging and discharging, rather than resting).

When you take that reported 2mΩ cell-level internal resistance value and apply it to 1.34V under various discharge loads, you approach the nominal voltage. For instance, with an IR of 2mΩ, a cell with a nominal 1.34V will see a voltage drop equal to 0.1V at a 50 amp discharge, which drops the cell voltage to -- 1.24V...

So, this 'equilibrium' voltage and the nominal voltage are in essence the same thing. I think it's more cut and dried, too, when it comes to NiMH electrochemistry - since the way that electrochemistry works produces flat charge and discharge voltage curves, i.e. the nominal voltage is less of an average over the entire curve, like it might be for some lithium chemistries, and more of a real working value across the whole capacity... It has to do with the type of reaction, something like a two-phase insertion reaction, which produces charge and discharge voltages that are independent of the charge state - one of the good things about NiMH...

Interestingly, when you think about it, to some degree the extent to which charge and discharge voltages vary from the equilibrium reflects the extent to which cells are being pushed closer toward their limits, at least based on what they're supposed to do, based on the electrochemistry, assuming a perfect cell and/or reactions... I guess you take internal resistance into account and then you're left with this working voltage range, like 1V to ... let's call it 1.6V, as it's been said that 192V is the Insight pack voltage max and 192V/120 cells=1.6V. I've read 1.50V is a thermodynamic max or something of that ilk, above which overcharge reactions take over or at least happen more; and then 1.53V is the peak voltage at full under a 6.5A 1C charge according to Panasonic; and then I think that Panasonic spec sheet also reports 1.8V as being some critical max voltage... I would probably pick 1.53V as a conservative max voltage under any conditions - like if I designed the BMS I don't think I'd let voltage go beyond 1.53V ever...

Let's take that 2mΩ resistance value into account again and see what kind of current we can charge at to keep voltage at or below that 1.53V max: With an equilibrium voltage of 1.34V, we have headroom equal to 0.19V; 0.19V/0.002Ω=95 amps...

How about 3mΩ?: 0.19V/0.003Ω=63.3 amps...

Pretty sure this is the way you do it... Of course, these are just back-of-the-envelope; cell voltage isn't always 1.34V. For instance, if you've been hitting regen a lot you're seeing cell voltages up to something like 1.416V (no load), which leaves us with headroom equal to only 0.114V. At 3mΩ, our max charge current could only be something like 38 amps (again, assuming we want to keep cell voltage at or below 1.53V)... And then we're not working with perfectly balanced and matched cells, and not just 1 but 120 of them. Etc etc...
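Same arithmetic as a little Python sketch, if that helps. The 1.53V cap is just my own conservative pick from above, and the function is only illustrative:

```python
# Rough max charge current to keep a cell at or under a chosen voltage cap,
# given its present resting voltage and internal resistance.
# The 1.53 V cap is my conservative pick from above, not a Honda number.

V_CAP = 1.53  # per-cell max under charge, volts

def max_charge_current(v_rest, r_cell, v_cap=V_CAP):
    """Headroom (v_cap - v_rest) divided by resistance = max charge amps."""
    return (v_cap - v_rest) / r_cell

print(max_charge_current(1.34, 0.002))   # ~95 A
print(max_charge_current(1.34, 0.003))   # ~63 A
print(max_charge_current(1.416, 0.003))  # ~38 A, cell pumped up by regen
```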
 
#9 · (Edited)
....Over the years of grid charging, I've honed-in on full charge values of about 174-176V, charging at typical room temp and typical grid charge rates 300-500mA... 174-176V=1.45V to 1.467V were it just a single cell... I made a chart a while back for temp adjustments; I'll add that later....
Here's that chart (table) I made for adjusting voltages for temperature differences. It's based on only a few pieces of data from the Panasonic spec sheet for the HHR650D cell. The lone values in the far right column are the values you'd use in a pinch to adjust voltages.

For example, for temperatures between freezing and 19 Celsius, you'd adjust voltages plus 0.360V per degree C below 20 degrees C. For temperatures between 20 and 40 Celsius you'd adjust voltages minus 0.312V per degree C above 20 degrees C...

For an Insight pack, the idea is that, if you're grid charging at, say, 10 degrees C, you'd expect to see a higher peak voltage because the ambient temp is colder and voltages are higher when colder. Instead of the normal 174-176V you might expect to see plus 10 X 0.360V = +3.6V, or about 177.6V to 179.6V...

If you're charging at say 30 degrees C (86F), you'd adjust the expected peak voltage downward - by 10 X 0.312V, or 3.12V. Instead of the normal 174V to 176V, you might expect to see 170.9V to 172.9V...
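If it helps, here's that quick-adjustment rule as a small Python sketch. The 0.360 and 0.312 volts-per-degree figures are the pack-level values from the far right column of the table; the function itself is just an approximation of the table, nothing official:

```python
# Approximate temperature correction for expected peak grid-charge voltage,
# using the per-degree pack-level values from the table above.
# Reference point is 20 C; "normal" full-charge range is ~174-176 V there.

def adjust_peak_voltage(v_at_20c, temp_c):
    """Expected peak pack voltage at temp_c, given the 20 C value."""
    if temp_c < 20:
        return v_at_20c + (20 - temp_c) * 0.360  # colder -> higher voltage
    else:
        return v_at_20c - (temp_c - 20) * 0.312  # warmer -> lower voltage

print(adjust_peak_voltage(174, 10), adjust_peak_voltage(176, 10))  # ~177.6 to 179.6
print(adjust_peak_voltage(174, 30), adjust_peak_voltage(176, 30))  # ~170.9 to 172.9
```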

I can't remember exactly what I used for this table and how, so take it as an approximation. The other thing is, I'm not sure what "ambient" temperature means when it comes to charging the pack in the car. The pack will heat up and heat the air around the pack - so what's 'ambient temp' at that point? Is it the temp outside the car or the temp in the battery box? I suppose if you're running the fan on high you could probably use outside air temp...



A more exact method to calculate an adjustment value is to look up the temperature you're charging at, read the voltage value for that row under the 'pack V' heading, and then subtract the voltage value in the same column for the 20 degree C row. For example, if you're charging at 30 degrees C, the Pack V value is 180.4, the Pack V value for 20 degrees C is 183.6, and the difference is 3.2V -- subtract 3.2V from the expected peak voltage. If you take 174-176V to be normal, then the adjusted peak voltage will be 170.8V to 172.8V...

[edit] I just 'captured' a bit of rough data while charging a stick and putting a fan on it, and the values for temp change vs. voltage change more or less match the values derived from the table above. So a bit of external validation for this table... Basically, I saw a temp drop of about 9C degrees between 37C and 28C and the voltage increase was 22.5mV. Looking up that temp change in the table and calculating values, I should have seen 23.3mV (per cell) - pretty darn close...

So, around room temp up to about 40C, you should expect to see about a 2.6mV change in cell voltage per degree C...

Here's a post with a graph illustrating the impact of cooling on the end of a stick charge; the CV charge allows us to see the impact of cooling reflected in the current curve. As the cells cool, upward voltage pressure increases, current has to drop to maintain the set constant voltage level: http://www.insightcentral.net/forum...putzers-even-ness-cell-temps.html#post1219066
 


#277 ·
Here's that chart (table) I made for adjusting voltages for temperature differences.... So, around room temp up to about 40C, you should expect to see about a 2.6mV change in cell voltage per degree C...
Here's another chart. Might help, Sean.
 
#10 · (Edited)
Voltage Taps

As many times as I've suggested to people having IMA troubles that they check their voltage tap voltages, I never actually checked them myself -- until today. I thought I should jot down what I found...

First, for the uninitiated, the IMA pack is made up of 20 'sticks' or 'sub-packs', each of which is made up of 6 "D"-sized cells. The car taps into pairs of sticks and monitors voltages for those pairs - so there's 10 pairs, 10 voltage taps. The battery condition monitor, or BCM, is responsible for those taps. The BCM is one of the square silver computer boxes on top of the battery pack, on the left side if you're facing the pack from the hatch. The taps connect to the BCM at the back of the computer; when you open the battery compartment, the tap connector is right in front of your face... You can 'back probe' each wire ensconced in the grey plastic connector and check the voltages of stick pairs...

BCM 'Connector C' - Voltage Taps (left connector)
image credit: Bumblebee Batteries


The BCM seems pretty concerned about tap voltages - so it's probably a good idea that we be at least a little concerned. A few IC'ers, including myself, tend to believe that tap voltages that are too far apart are one if not THE leading cause of IMA trouble codes. A couple people have put a value on some of that - like if the tap voltages are more than 1.2V apart under load, you'll get a code. It says something like that in the Troubleshooting manual, though it's a little more nuanced than just this...

Anyhow, I checked my tap voltages. I have a self-rebuilt pack that's been working very well for maybe half a year now. It's actually working the best it ever has at the moment. I grid charged it about a week ago, have been using it regularly, and I let it 'positively recal' to a nominal 75% before I let it rest over night and then checked the tap voltages. So basically, these are tap voltages for a known, good-working pack under these circumstances, and "these circumstances" generally mean fully car-charged, balanced, at roughly 75 degrees F, after sitting over night:

16.04, 16.07, 16.09, 16.05, 16.08, 16.02, 16.01, 16.09, 16.04, 16.05
So a range of 16.01V to 16.09V
Average = 16.054V, ±0.04V
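As a quick way to sanity-check numbers like these, here's a tiny Python sketch that summarizes a set of tap readings; the 0.2V resting-spread figure is just the rule of thumb mentioned a few posts up, not something the BCM is known to use:

```python
# Summarize a set of resting tap voltage readings: average, spread, verdict.
# The 0.2 V resting-spread figure is a forum rule of thumb from earlier in
# this thread, not a documented BCM threshold.

def summarize_taps(taps, max_spread=0.2):
    spread = max(taps) - min(taps)
    avg = sum(taps) / len(taps)
    verdict = "looks balanced" if spread <= max_spread else "spread is wide"
    return avg, spread, verdict

taps = [16.04, 16.07, 16.09, 16.05, 16.08, 16.02, 16.01, 16.09, 16.04, 16.05]
avg, spread, verdict = summarize_taps(taps)
print(f"avg {avg:.3f} V, spread {spread:.2f} V -> {verdict}")
# -> avg 16.054 V, spread 0.08 V -> looks balanced
```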


Here's a graphic depicting the BCM voltage tap connector as it looks from the back plugged-in, the wires that you'd probe with your DMM to read the voltages, and how the sticks corresponding to the taps are arranged in the pack:

fyi, "thrm" is shorthand for "thermistor," indicates to which sticks the 4 temp probes attach...

If you're having IMA troubles, that'd be a great time to check voltage taps. I think chances are high that you'll see a much greater range of values than I posted above...
 


#11 · (Edited)
On another voltage-related note, I swapped back in my 010/030 BCM/MCM, replacing the 305 combo I've been running for a while -- and discovered that my 305 BCM reads pack voltage about 5V higher than the 010 BCM. I posted about that here:
http://www.insightcentral.net/forum.../20218-bcm-mcm-ecm-revisions.html#post1006041

So keep in mind that different BCMs don't necessarily measure and/or report the same pack voltages. 5V is a big difference...

EDIT: turns out I had been reading Mvo on the 305 BCM and comparing to Bvo on the 010 BCM. Both BCMs read Mvo and Bvo the same. Mvo is +6V higher than Bvo. Mvo is +2.8V vs. actual and Bvo is -3.2V vs. actual...
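In code form, those offsets (as I measured them, so treat the exact numbers as rough) would look something like this:

```python
# Rough conversion between reported and actual pack voltage, based on my
# comparison above: Mvo reads about 2.8 V high, Bvo about 3.2 V low,
# which is consistent with the ~6 V gap between the two readings.

def actual_from_mvo(mvo_volts):
    return mvo_volts - 2.8

def actual_from_bvo(bvo_volts):
    return bvo_volts + 3.2

print(actual_from_mvo(166.2))  # ~163.4 V actual
print(actual_from_bvo(160.2))  # ~163.4 V actual
```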
 
#12 · (Edited)
Voltage Taps Part 2

I stumbled upon a graph I had saved, that Eli Bumblebee Master must have posted in one of the threads at IC (couldn't find it), that goes a long way toward illustrating what's going on in terms of battery management at the tap level.

Eli must have logged tap voltages during a long assist event. The upper blue curve shows assist current, on the right hand Y-axis, while the 10 tap voltages are on the left hand Y-axis. Time in seconds is on the bottom.



The general sweep of the graph goes like this: Start with full assist. Full assist is quickly throttled back to acceptable levels. Assist continues until, most likely, the car's 'empty' threshold. I think we can reasonably take the values at the key transitions as approximate Insight BCM/MCM management thresholds. Plus, the values generally fit with what we know about NiMH electrochemistry/use.

You can see how assist current initially peaks at about -75 amps. Voltage for the lowest 2 taps drops to about 10.5V (average of 0.875V per cell), while other tap voltages range from about 12V to 13V at this point (avg. of 1V to 1.08V per cell).

This is an old, weak pack - so current is quickly throttled back. Time from peak current to throttled-current looks like it takes no more than about 5 seconds. Assist current is throttled to about -25 amps. Voltage for the lowest taps rises to about 14V (avg. 1.17V per cell), while voltage for the others ranges from about 14.5V to 15V (avg. 1.21V to 1.25V per cell).

The assist discharge continues for another 500 seconds or so. Over this period, current gradually decreases to a low of about 15 amps at about 540 seconds. Voltage for the lowest taps at the end of this period is about 13.0V (1.08V per cell avg.), while voltage for the others is about 13.5V (1.13V per cell avg.)...

Overall, the BCM and MCM adjust assist current lower and that keeps voltage for the lowest taps between about 13V and 14V, or roughly 1.1V to 1.2V per cell average...

 
#13 ·
It is a very interesting graph which I guess I missed somewhere along the road. The specific values of the taps are useful in confirming my "severe" sorting routine - as Keith calls it. I have found that "good" sticks will support 6 volts @ 35A for pretty long periods of time, and that becomes my measure of quality when separating the good from the bad. Thank you for the refresher.
 
#14 · (Edited)
Strictly speaking, "6 volts at 35A" wouldn't be so good. The worst sticks in the pack graphed above are at about 7 volts after the current gets throttled to about 25 amps. The others hit about 7 volts at 20 amps midway through the discharge; they don't fall below 6 volts even at the initial 75 amp discharge. And this is for a bad pack...

Plus, at 6 volts you're already below the level the car allows - and that's at only a 35 amp rate, yet the car demands more like 80 amps+ at max, 45 amps typical max sustained in 1st, 4th, and 5th gears...

Maybe your 6 volt figure is a typo?, or maybe you're measuring voltage through the loaded wires (kind of doubt that though since you're an expert)...

Some posts up I listed some discharge rates and voltages we might expect for good cells/sticks, assuming they were all at about 3mΩ; it'd be more like 7.35V, 1.22V per cell, at 35 amps... (permalink to post #4): http://www.insightcentral.net/forum...l-insight-nimh-voltage-thread.html#post995466

In fact, you posted just above post #4 that you were seeing 1.08V per cell at a 70 amp rate. That's more like it... 6V at 35 amps isn't good, basically. I want to make sure people get that straight.
 
#15 ·
Strictly speaking, "6 volts at 35A" wouldn't be so good. The worst sticks in the pack graphed above are at about 7 volts after the current gets throttled to about 25 amps. The others hit about 7 volts at 20 amps midway through the discharge; they don't fall below 6 volts even at the initial 75 amp discharge. And this is for a bad pack...
It is a little difficult to make direct comparisons since my loads are fixed at approximately 70A and 35A, mid discharge cycle, and there is no BCM/MCM management involved. I was just making the observation that the data for good sticks seems to generally agree with what I was seeing by entirely different methods. I define good sticks as those which can reach 5500 mAh capacity under my test conditions. I won't go into methods because it just leads to interminable arguments, sorry, but suffice it to say that I'm just trying to find good sticks. I am doing nothing to try to heal sticks. I have accumulated enough sticks that I have that "luxury."

I just don't have the time right now to respond to this kind of very detailed discussion. But one quick point, the 6 V is right at the end of the exhaustion of the capacity test. I'll try to take some evening time to respond if I think I can do it adequately, but the long winded discussions are generally the reason I avoid discussing the topic at all. Plus it seldom leads to any agreement at all:(
 
#17 ·
Has anyone installed a voltage logger/monitor on these taps? It might be interesting. Also, can you recondition pairs of sticks through these taps (keeping in mind pack reconditioning probably does as good a job as working with individual sticks, unless you want to get philosophical)?
 
#18 ·
Yes. Search for JimIsBell posts. I think Retepsnikrep just did one recently.

Conceivably, yes, but the wire is very fine gauge, so currents would be severely limited.
 
#22 · (Edited)
I posted this in another, current thread [http://www.insightcentral.net/forum...-what-does-full-assist-mean.html#post1147770], but thought it should go here too...


Here are a couple of graphs pulled from logged OBDIIC&C data, illustrating short duration 'full assist' on my rebuilt OEM pack. It's been a 'good' pack for the year+ I've been using it, though it's not top-notch. The cells are probably around 4mΩ, compared to new at something like 2-3mΩ - so this pack sees more voltage drop under assist load and more voltage rise under regen load than a good, new pack would, and definitely more than a new aftermarket pack... (Also, my apologies if these pics disappear in March, as Dropbox might be changing the public links and I likely won't have the patience to deal with changing anything).

The red curves are amps, blue = pack voltage - or at least MDM voltage ('Mvo'; 'Bvo', the real pack voltage, has usually been a little lower than Mvo, but I've read different things at different times so haven't been able to nail that down)... Green is nominal state of charge and black is pack temp...


Looking at the red curve, the full-blown full assist event happens between about 40:44 and 40:48. You can see that I actually only eke out about 1 1/2 seconds at the full value. The output at that max is about 90 amps at 125V, which equals 11.25kW. I'm pretty sure that if I were watching the 'Ikw' parameter on the OBDIIC&C, it would have displayed "10kW" for 4 seconds, before dropping to the lower 'full assist' command value... At about 40:48 the output current drops to about 45 amps and pack voltage rises to about 142V, for an output of about 6.4kW. This of course is that second stage 'full assist' level. I let that go for only a couple seconds...

All this started with the pack at about 170V (again, Mvo value, which is higher than Bvo), 70% nominal charge state, and the temp at about 78F - i.e. a fairly car-full pack, warmed-up...

Here's another 'full assist' event that happened shortly after the one above. There's more 'noise' here - stuff other than just a clear-cut full assist event. But figured I'd post this too for you to look over:


For comparison, here's a graph Eli posted a long time ago, showing full assist commanded with MIMA - on a 'betterbattery' pack. This pack puts out about 2 minutes and 30 seconds of 'full-blown full assist', here about 70-75 amps at about 132-139 volts (call it 72.5A at 135V, for ~9.8kW). You can see how the red amps curve 'gets throttled' at the nominal 38% charge state, after which the pack still has a lot of juice left, but at the lower discharge rate (though still 30 amps)... That's some impressive stuff: This pack is putting out about 30 amps at 140V even after 2 1/2 minutes of 'full-blown full assist'. My pack in the graphs above was at about 140V and 45 amps after only a short blip of 'full-blown full assist'...
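For what it's worth, the kW figures above are just volts times amps read off the logs; a trivial Python check (the voltage/current pairs are approximate values from the graphs, and Mvo isn't exactly true pack voltage):

```python
# Assist power is just pack voltage times current, in kilowatts.
def assist_kw(volts, amps):
    return volts * amps / 1000.0

print(assist_kw(125, 90))    # ~11.25 kW, full-blown full assist on my pack
print(assist_kw(142, 45))    # ~6.4 kW, second-stage 'full assist'
print(assist_kw(135, 72.5))  # ~9.8 kW, the BetterBattery pack in Eli's graph
```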
 
#23 · (Edited)
Here's a link to some tap voltage musings I posted in another thread, addressing someone's questions about his tap voltage measurements: http://www.insightcentral.net/forum...trange-ima-issue-please-help.html#post1167921

Here's an excerpt: I'm under the impression that functioning sticks with 'some' charge should usually be between about 1.32 to 1.37V per cell, or 15.84V to 16.44V for a tap. You've got 4 or 5 taps that fall just below or quite below this range. The problem is it's hard to know how the sticks were left/charged prior to measurement, hard to tell what's causing a low value, etc. For instance, if all sticks were equally charged at time A, a low tap voltage at time B would suggest relatively high self discharge - maybe for 1 cell in the tap, maybe for a few, who knows... On the other hand, one or more cells may never have been charged to the degree others were charged, because those one or more cells are bunk...

I think I might grid charge to what appears to be full, log tap voltages maybe 5 minutes after the grid charge, and then check tap voltages again in a few days. Do any taps pop out? This seems easy, doable, and might reveal more about the problem...

If one or more cells are truly 'bunk', then I'd expect one or more taps to be quite lower than the others at the 5 minutes after grid charge measurement; I'm not really sure of the absolute values though, I think 'normal' might be something like 16.8-17V... If one or more cells has high self discharge, then the tap to which it belongs will be lower than the others at the 3 day or so measurement point... I'd treat as suspect any tap that falls below 15.84V for sure; I'd be suspicious of any tap that falls below 16.08V...
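To make those cutoffs concrete, here's a tiny Python sketch that flags taps against them. The 15.84V and 16.08V numbers are only my own suspicion thresholds from above, nothing official, and the example readings are made up:

```python
# Flag tap voltages measured a few days after a full grid charge.
# Thresholds are the ones suggested above: below 15.84 V (1.32 V/cell)
# is suspect for sure; below 16.08 V (1.34 V/cell) is worth a closer look.

def flag_taps(tap_voltages):
    flags = {}
    for name, volts in tap_voltages.items():
        if volts < 15.84:
            flags[name] = "suspect"
        elif volts < 16.08:
            flags[name] = "watch"
        else:
            flags[name] = "ok"
    return flags

example = {"tap1": 16.3, "tap2": 15.9, "tap3": 15.6}  # made-up readings
print(flag_taps(example))  # {'tap1': 'ok', 'tap2': 'watch', 'tap3': 'suspect'}
```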
 
#27 ·
....I'm under the impression that functioning sticks with 'some' charge should usually be between about 1.32 to 1.37V per cell, or 15.84V to 16.44V for a tap.
As it stands, this is definitely not right... It'd at least need a bunch of qualifiers to make it true... For example, at the moment I'm working with a few used sticks and their voltages weren't at least 15.84V/2=7.92V when I started (the two I've worked over a bit were 7.67V and 7.91V) - yet they're certainly functional and fairly charged (I pulled about 4Ah out of each at 6.5A)...

There's a concept I've come across that I can't really grasp, that I think might apply. A couple research papers I read talked about "modes" - the "charge mode" and the "discharge mode". The authors mention in passing how this or that voltage curve 'falls to the discharge mode' or 'approaches the charge mode'... All I could really gather is that there's a tendency at one time or another for voltages to gravitate toward the charge or discharge mode. Normally, or ideally or theoretically, the charge and discharge curves are generally flat and the charge and discharge reactions happen within a narrow voltage range. Yet at any given time it seems like a given cell's voltage can be closer to the discharge mode voltage - say around 1.20-1.25V, or the charge mode voltage, say around 1.35-1.40V...

Yesterday I briefly charged my hobby charger battery power supply (4 Insight sticks), which were fully discharged. I only put like 1000mAh or so into them, yet resting voltage at the end was about 1.34V per cell... On the other hand, I worked with some sticks that most likely had been 75% charged some months ago and then sat. Their voltages were as mentioned above, call it 1.28V per cell. Yet these sticks were at maybe 60% charge, given that I pulled about 4Ah out of them.

So, my power supply sticks were barely charged with that 1000mAh, yet the cell voltage was 1.34; these sticks that sat for a while were at 1.28V - yet they were way more charged than the power supply sticks...

This discrepancy, this circumstance, is something I just can't understand, and I can never seem to find anything that discusses why and how this is the case. But I think it has to do with this 'mode' concept - that, for whatever reason, a cell can be more charged than another yet still have a lower voltage...

So anyway, this idea that a 'functional' stick has to have 1.32 to 1.37V per cell needs a lot of work. I might rather say that good cells, ones operating the way they're supposed to, are likely to hit those voltages after a brief charge, when they're easily pushed to the 'charge mode'. I might like to say that cells with some funkiness are more likely to gravitate toward the discharge mode voltages - voltages down toward 1.25V, I think...
 
#24 ·
I'm measuring tap voltages on the spare tap I got, measuring self-discharge. Here's my latest readings, maybe you guys can help me interpret it:

a: 7.35   b: 7.32   ab: 14.67
c: 7.20   d: 6.66   cd: 13.87
e: 7.28   f: 6.62   ef: 13.90
g: 7.48   h: 7.37   gh: 14.86
i: 7.20   j: 7.18   ij: 14.38
k: 6.24   l: 6.13   kl: 12.38
m: 5.10   n: 7.44   mn: 12.54
o: 7.30   p: 7.05   op: 14.34
q: 5.63   r: 6.58   qr: 12.21
s: 7.18   t: 7.40   st: 14.59
pack: 138.2

This is a pack that hasn't been in a car for 4-5 months. It's self-discharged from 140V to 138.2V in the last 10 days. Looking like sticks m and q are good candidates for replacement, either due to high SD or a shorted cell. Sticks d, f, k, l, and r are worth monitoring as slightly less healthy, but the rest look pretty good.
 
#25 ·
Generally speaking, the line is basically 7.2V. Below that, you likely have excessive SD. At or above it, you may be okay. Good sticks don't self-discharge very much at all. Some, none or all may improve with reconditioning cycles.

Short answer: 12 are above, 8 are below. 12 likely are in a reasonable state of health. 8 need reconditioning of some sort or replacement.

The fact that we're looking at a snapshot 4-5 months after last use means it's really impossible to say anything conclusive, except that the ones at or above 7.2V are probably okay. I'm surprised that it has dropped 1.8V over the last 10 days given the depletion over the last 4-5 months. If your 140V and 138.2V measurements are not taken identically - i.e., direct full-pack measurement vs. sum of taps - you can't compare one to the other.

Basically, you're checking in an unknown state. All you can say is that anything < 7.2V is discharged. It would be beneficial to establish a state and assess. The easy answer is to grid charge to 100% SoC @ 350mA for 30 hours - near guarantee of full charge. Check taps before termination. Check taps 24 hours after termination, discharge and check taps as you approach 144V or 120V.

Cells that have excessive SD may be very usable with periodic grid charging. I have a pack of those in my Insight. It depends on what you're trying to accomplish - a trouble-free pack with no maintenance, or one that works fine with periodic grid charging/discharging.
 
#26 ·
#28 ·
Not sure. Again, there is nothing conclusive beyond the data itself, and much of the data is derived from an unknown or abnormal state.

....This discrepancy, this circumstance, is something I just can't understand... But I think it has to do with this 'mode' concept - that, for whatever reason, a cell can be more charged than another yet still have a lower voltage...
I can't tell you the exact number of times I've seen evidence of the bolded portion, but it's in the 100s.
 
#30 · (Edited)
Well, it's one thing to read everywhere that NiMH voltage doesn't correspond to state of charge; it's another to understand, or even just grasp, why and how that would be so... And, there is a "correlation," it's just not the typical, linear and/or positive relationship you see with household batteries, car batteries, and lithium, for instance... The type of reaction dictates that voltage will be independent of the state of charge (can't remember what it's called, 'two-phase something-or-other').* Yet that reaction/those reactions do happen within a specific voltage range. And then there's other factors that play a part in influencing the voltage you actually see, like internal resistance, resistances in your equipment, and stuff...

Higher voltage should mean higher charge state: what's going on in the NiMH cell that makes it so this isn't true? In the past I pictured a 'floating window' within the cell, like a cell within a cell: the voltage you read is a reflection of the cell within the cell, the conditions immediately surrounding that window, as well as the circumstances leading to that window. Yet that window doesn't reflect the conditions in the entire cell, the charge state of the entire cell... I'm trying to think of other good analogies, but I can't. I keep picturing sails and feathers - in the wind. The 'cell within the cell' is like a sail in the wind, maybe: the wind represents electrical current, the cell's recent usage, stuff like that, while the sail itself is the ... location in the cell at which the two-phase reaction interface is situated. When the wind blows one way, the sail catches the wind and moves in that direction, akin to say a charge current moving the reaction interface and the voltage increasing. But the wind isn't perfectly steady and always in one direction; the sail can kind of flutter around, ebb and flow, back to where it was originally...

I don't know, I'd have to think about it some more, and maybe re-read some stuff...

[*Later...] A couple things I came across in some pages of a book I've relied on for a while for this type of info:

-there are different ways to describe the type of reaction on/in the positive nickel electrode, involving Ni(OH)2 (nickel hydroxide) and NiOOH (nickel oxy-hydroxide). It's an "insertion reaction" at a "two-phase interface." It's also a "solid-solution" reaction. And I think the term "intercalation" is thrown in there, too. And if that's not enough, it's also called a "pseudo-binary" reaction. I think the "pseudo-binary" aspect and "two-phase" aspect are the same thing, though I'm not positive. One or both of these, though, are what makes NiMH voltage independent of the state of charge.

Here's a few sentences from this Huggins book that speak to this idea: "When the electrode is fully reduced, its structure consists of only H2NiO2 [aka Ni(OH)2], whereas oxidation causes the interface to translate in the opposite direction until only HNiO2 [aka NiOOH] is present. Although these are both ternary phases, the only compositional change involves the amount of hydrogen present, and the structure of the host 'NiO2' does not change. Thus, this is a pseudo-binary reaction, although it takes place in a ternary system, and the potential is independent of the overall composition; i.e. state of charge."

-Now here's a passage that I think might have to do with either the 'mode' idea above or just voltage variation in general. This is in the same book, where it is briefly describing "insertion" reactions: "...the insertion of guest species into normally unoccupied interstitial sites in the crystal structure of an existing stable host material. Though the chemical composition of the host phase initially present can be substantially changed, this reaction does not result in a change in its identity, the basic crystal structure, or the amounts of the phases in the microstructure. In most cases, however, the addition of interstitial species to previously unoccupied locations in the structure causes a change in volume. This involves mechanical stress, and mechanical energy. The mechanical energy related to insertion and extraction of interstitial species plays a significant role in the hysteresis, and subsequent energy loss, observed in a number of reversible battery electrode reactions."

I've seen the term "hysteresis" applied to NiMH voltage variation. Not positive that's always what it means - 'voltage variation' - when it comes to batteries, but it might be... Basically, it's possible that inserting and extracting stuff from the nickel electrode in our cells can result in location-specific differences along the electrode, so that when we use the cells, voltage can vary depending on how hard it is to insert and extract stuff... There's another passage or two, though, that does say that the volume changes in cells with nickel electrodes are very small, and that that's probably why they have good cycle life... Yet again, there's also passages that talk about a "gamma modification" and "alpha modification" that result in larger inter-slab spacing, i.e. larger volume changes, species of the electrolyte getting into the electrode, etc. So here, basically, there's other stuff that happens that can muck with the voltage, I think...

What it boils down to, perhaps, is that there's sort of two things we're looking at:

1) the two-phase binary stuff means we've got cells where voltage is independent of the state of charge. That's just how it is. I'd call this natural variation... That's one thing.
2) an insertion reaction means we've got stuff being inserted and extracted into a material, and perhaps there's some normal voltage variation associated with that. But then there's abnormal stuff going on, where, for example, the gamma and alpha mods crack open the electrode in places, making it harder to insert and extract, making voltage vary more than it should were stuff normal...
 
#31 ·
Just a few general comments

On new sticks I see IR of about 10-12 milliohms - about 2 milliohms per cell.
On old OEM sticks after cycling I see IR consistently around double that: 18-22 milliohms.
I bin any old sticks that have an IR higher than 25 milliohms after cycling.

Old OEM long-term stored sticks can take a lot of cycling to recover capacity with aggressive parameters. They are all usually completely empty when first charged after hibernation (voltage will generally still be nominal-ish, but available capacity will be zero).

Using back-to-back 6.5A charge and 16A discharge cycles, it can take seven cycles to recover full capacity. I bin old OEM sticks if they can't give at least 6Ah at a 16A discharge rate down to 0.8V per cell. Most are around 6100mAh+ after cycling.
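A tiny Python sketch of that keep-or-bin logic, just to make the cutoffs explicit (the numbers are the ones above; the function itself is only illustrative):

```python
# Keep-or-bin decision for a cycled old OEM stick, using the cutoffs above:
# more than 25 mOhm after cycling, or less than 6 Ah delivered at a 16 A
# discharge down to 0.8 V/cell, and the stick goes in the bin.

def keep_stick(ir_mohm, capacity_mah):
    return ir_mohm <= 25 and capacity_mah >= 6000

print(keep_stick(20, 6100))  # True  - typical good old stick
print(keep_stick(27, 6200))  # False - resistance too high
print(keep_stick(22, 5400))  # False - capacity too low
```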

I'm not that bothered about SD (within reason) because as SKeith says if I supply a recon pack I will supply a cycler/charger setup as well.

Currently I can do 4 sticks a day at around 6-7 cycles.

I store cycled and quantified old sticks discharged for several reasons.

1) It helps them bottom balance naturally through self discharge even more during storage.
2) If you store them charged then inevitably individual cell SoC will be all over the place, requiring a long slow grid charge to correct before use.
3) It's safer. If you have a short circuit in your stick pile/racks/storage etc
then they don't contain any significant energy with which to get things going....
 
#41 ·
On new sticks I see IR of about 10-12 milliohms. About 2 per cell. On old Oem sticks after cycling I see IR consistently around double that. 18-22 milliohms. I bin any old sticks that have an IR higher than 25 after cycling.
Your numbers made me realize that I often report resistance values based on what I measure 'at the main leads' on my charger device, which has a push-button IR measurement. But the main leads have resistance (about 2ft of 12 AWG wire, round trip)... The device also has cell leads; when I measure through those, the IR drops a lot compared to the main lead measurement. I think the other day I measured something like 12mΩ at the cells (sum of) and 20mΩ at the main leads, a huge difference... Not that I fully trust the cell leads either; it's just that my numbers based on the main leads include the resistance in that circuit, and I've often forgotten to adjust for that...
 
#34 ·
If I were the BCM...

Knowing what I think I know about Insight NiMH, voltage behavior, and stuff, something like this is how I'd be figuring out state of charge for the battery pack, managing it - if I were the BCM Master...

First, the main principle is that we want to keep the cells away from the top and bottom charge state; we want to use the pack in the middle (mainly). Second, we know that open circuit voltage can't tell us what the charge state is at any given moment. BUT, it seems pretty clear that voltage under charge and discharge loads can, maybe not at 'any given moment', but certainly under certain circumstances...

For example, I think most of us understand how quite low and quite high charge states show up in the voltages, again, under load. Voltage plummets at the bottom, voltage rises fairly quickly toward the top. AND, if we're discharging from full to empty, or charging from empty to full, it's easy to point to the graph and say, 'See, the voltage is going up very gradually' or 'it's going down very gradually', as the cell is getting charged and discharged, respectively...

I think this characteristic plays out even when we're charging or discharging in the middle charge state range. This is key.

So here's the program: The BMS calculates a moving average of the voltages under charge loads and voltages under discharge loads. Based on test data from when the cells and BMS were designed/engineered, the BMS knows where the middle voltage should be at middle state of charge, under various loads and temperatures, and probably condition states. At any given time, the charge state reflects the median value between the moving average of peak voltages under charge loads and trough voltages under discharge loads - relative to the known middle charge state data. I guess full and empty voltages might play a role as well, or instead, not just a middle benchmark...

So, over multiple assist events in a row, the BMS develops this moving average of voltage - let's say it's 132V. And it develops a moving average for multiple regen events in a row - let's say that's 179V. OK, let's just take the average of these instead of doing multiple iterations and taking the median. The average voltage is (132+179)/2 = 155.5V. The BMS compares that 155.5V value to its lookup table and sees that 155.5V is, say, 50% state of charge...
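Purely to illustrate the scheme (this is my speculation, not how the real BCM works), here's a toy Python sketch; the lookup table values and the window size are invented, and only the 132V/179V/155.5V arithmetic comes from the example above:

```python
# Toy version of the "if I were the BCM" idea: keep moving averages of
# loaded voltages during assist and regen, take the midpoint, and map it
# to a charge state via a lookup table. The table below is invented for
# illustration -- the real BCM's data (if it even works this way) is unknown.

from collections import deque

class ToySocEstimator:
    def __init__(self, window=5):
        self.assist_v = deque(maxlen=window)  # pack V under discharge loads
        self.regen_v = deque(maxlen=window)   # pack V under charge loads
        # invented midpoint-voltage -> nominal SoC points, low to high
        self.table = [(140.0, 20), (150.0, 40), (155.5, 50), (162.0, 65), (170.0, 80)]

    def log_assist(self, volts):
        self.assist_v.append(volts)

    def log_regen(self, volts):
        self.regen_v.append(volts)

    def soc(self):
        if not self.assist_v or not self.regen_v:
            return None
        mid = (sum(self.assist_v) / len(self.assist_v)
               + sum(self.regen_v) / len(self.regen_v)) / 2
        # nearest table entry; a real scheme would interpolate and fold in
        # temperature, current rate, and per-tap values
        return min(self.table, key=lambda point: abs(point[0] - mid))[1]

est = ToySocEstimator()
for v in (131, 133, 132):
    est.log_assist(v)
for v in (178, 180, 179):
    est.log_regen(v)
print(est.soc())  # midpoint ~155.5 V -> 50 (per the invented table)
```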

Grossly simplified, I know, but...

In the real deal, I don't imagine the BMS is looking up voltages, but rather, some kind of index value, where voltage is one variable, one part of what goes into the index value... Another thing is that this plays out at the tap level, not the pack level...

The main 'management' objective would be to keep the voltage in the middle of the two extremes, between what we know to be the bottom and what we know to be the top...

Anyhow, the main sweep of this pontificating is simply that a moving average of loaded voltages would almost certainly reveal the true state of charge - if you had some baseline data/information.
 
#36 ·
The OBDIIC&C reports the nominal state of charge value as the BCM has calculated it. Most of the time the movement of that value is simply an accounting procedure, where 1%=65mAh. The IMA electronics include current sensors that 'count' the current (or coulombs or whatever), so the BCM knows how much current is moving into and out of the pack... The OBDIIC&C reports that info, too...

As far as 'my method' goes, there really isn't a method here: I'm just imagining partially what's going on and partially what I think should be going on. I don't know how the car's BCM does what it does (and half the time I don't even know what it does or is doing). I know bits and pieces with a fair bit of certainty, like that 1%=65mAh bit, '72% hanging', '3 bar hanging', etc., but not the whole scheme of things...

I do think the scenario I describe, though, might be akin to what the BCM does to determine when to implement background charging, at least when the 12V system load is low. Otherwise it seems to background charge at around 65% nominal, trying to keep the IMA pack well-charged. That particular aspect, BTW, seems really dumb to me... I once wrote that whether or not you drive at night, like if you have a night job or day job, makes a big difference in how the BCM manages your pack, probably with different longevity outcomes as a result... The difference would be constant short cycling between about 65% and 70% vs. deeper cycling between maybe 40-50% and 70%... Not sure what that would mean for the pack though. One school of thought would say that the shallower cycles would mean greater longevity; another might say that the extremely short cycling would cause a build-up of 'crud' in the unused portions of the cells (wherever those could possibly be), and that would lead to premature bad-ness...
 