

Registered · 6,004 Posts · Discussion Starter #1
I know, such optimism...

Normally I'd have all my ducks in a row and actually have something substantial with which to start this thread. But I don't really. I was just thinking about how pack management probably revolves a great deal around working voltages specific to the stock Panasonic NiMH cells; about how important those voltages are; yet also how voltage always got short shrift around IC (InsightCentral) and still remains such a haphazard affair around here...

I imagined how great it would be if we could have a single thread that was all about Insight NiMH voltages - pack voltages, cell voltages, stick voltages, tap voltages, full voltages, empty voltages, typical full self-discharge voltages, normal 'working' voltages, 75% voltages, 'bad' voltages, etc etc...

One working theory I have at the moment involves the importance of the 1.20 to 1.25V range: I think good cells will really stick to this tight range, even at pretty high discharge rates. That's one major difference I've been seeing with my latest good rebuilt pack versus my other couple of crummy packs. We all know that the nominal voltage of the cells is 1.2V - though I think it's really supposed to be 1.25V. That means something. Cells should be working at that voltage (during discharge); it's what they're meant to do...

I think the Insight battery management probably relies a lot on that 'middle' voltage...

1.34V is also a key value; it's the idealized equilibrium potential for the main reaction on the positive electrode. In practice, one thing it means to 'us' is that resting, open circuit pack voltage is often hovering around 160-161V (120 cells X 1.34V)...

Full voltages? The Panasonic spec sheet I have for cells that have often been said to be 'our' cells, the HHR650D, shows a charge curve at a 1C rate peaking at 1.53V... Over the years of grid charging, I've homed in on full-charge values of about 174-176V, charging at typical room temp and typical grid charge rates of 300-500mA... 174-176V works out to 1.45V to 1.467V were it just a single cell... I made a chart a while back for temp adjustments; I'll add that later...

0.78V is supposedly the equilibrium potential for the unwanted reaction that causes 'voltage depression' or 'memory effect' - analogous to how 1.34V is the equilibrium potential for the wanted, good and normal reaction...

0.68V - That's the voltage I've seen many times on cells that have self-discharged for a long time; self-discharged cells will hover around that voltage for a long time...

I'll try to come back and add 'voltage' stuff when it comes to mind. If you have any "voltage stuff" please consider adding it to the thread - questions, comments, values to report (cell-level or otherwise), and stuff related to voltage, like battery management. I guess that's kind of my main concern, fleshing out how the Insight BMS uses voltages to do its thing, and probably mainly that 1.25-1.20V range...

Basically, it's really looking to me like, if your pack is NOT holding tight to the 1.2-1.25V range (144V to 150V), it's a sign that something's amiss - cells are drifting out of balance (most likely and/or most frequent issue), maybe voltage depression is setting in, maybe cells are just getting old and internal resistance is getting high, a combination of these, etc. And the thing is, you can really spot when your pack is in top form (or not) if you just watch how voltage is holding up.

For instance, when the pack is cool it won't hold that voltage so well or it won't be able to hold it at relatively high currents. Likewise, if cells are out of balance, it won't hold that range so well, warm or not.

And really, it's not the whole range: in really top form your (stock) pack will be holding 1.25V (150V), not 1.2V (144V), at a rate of at least about 3C (19.5A). And it will do this even at relatively low charge states - say 50%; it doesn't have to be at a 'full' 75%...

This is all stuff I'm grappling with - trying to juggle charge state, temperature, balance state, overall cell condition, and voltage - to get a good handle on pack condition/state. I've generally been seeing something like 144V to 150V at up to about a 3C rate between about 45% and 75% charge state, pack warmed up to at least 70F. There's wiggle room in these numbers depending on the exact values, like the exact temp, SoC, and current rate. I'd like to tighten this up...
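To make that rule of thumb concrete while I'm still tightening it up, here's a quick Python sketch - purely illustrative, with my rough numbers from above baked in (the function name and structure are just made up for this post):

```python
# Rough pack-health check using the working numbers above: a good stock pack
# holds 144-150V (1.20-1.25V/cell) at up to ~3C, between ~45% and ~75% SoC,
# warmed to at least 70F. All thresholds are observations, not spec values.

def pack_holding_range(pack_v, rate_c, soc_pct, temp_f):
    """Return True/False if the rule of thumb applies, else None."""
    if temp_f < 70 or not (45 <= soc_pct <= 75) or rate_c > 3.0:
        return None  # outside the conditions the rule of thumb covers
    return 144.0 <= pack_v <= 150.0

print(pack_holding_range(147.5, 3.0, 55, 75))  # True - holding the range
print(pack_holding_range(139.0, 3.0, 55, 75))  # False - sagging, something's amiss
```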

What voltages under various conditions should I be seeing if my pack is 'good'? If my pack is at a nominal 50% charge state, and I'm discharging at a 3C rate, pack temp 75F, what voltage should I see with a good pack? Or a bad pack? It's all pretty subtle - at this point I can pretty much tell when the cells have drifted about 20% out of balance. I'd like to lower that figure to some degree, plus be a bit more sure about it...

Right now, pack voltage sag at moderate charge states (maybe 50-60%) is my main realtime, tell-tale sign of imbalance. And I'm talking about only a few volts difference, like seeing about 139V instead of about 142V, warmed pack. Or maybe it's seeing that sag at slightly higher charge states; like instead of seeing it down around 45% SoC, it might be 60% SoC... I'd like to tighten up these kinds of observations - like I'd like to be able to tell just what 'state' my pack is at with just a few charge/assist runs: get the pack warmed up, do a couple assist runs, see the numbers, know the state... I'm almost there...
 

Administrator · 10,698 Posts
I was just thinking about how pack management probably revolves a great deal around working voltages specific to the stock Panasonic NiMH cells; about how important those voltages are; yet also how voltage always got short shrift around IC and still remains such a haphazard affair around here....
FWIW I think you are right and the stock BCM is very attuned to those specific cells.

It does not always like or work 100% correctly with replacements, including the BB cells we all use.

The differences and voltage nuances are very difficult to pin down though.

Your post is a good crack at the idea.

The fact that we can't modify the BCM software does hamper the way it works with non-OEM cells.
 

Super Moderator · 6,889 Posts
From my very limited experience, BCM 010 works fine with the BB aftermarket cells, while the 020 is a bit shaky. Just two cars and two BCMs - anecdotal, and not much data. The two combinations were 010/HybridRevolt and 020/Bumblebee, though I strongly suspect the cells were identical and came from the same manufacturer, so the BCMs would be the explanation.

I'll furnish one data point on the voltage issue, though it is rather specific to the way I check out sticks. The 70A load voltage of good cells at the 2000mAh discharge point is very close to 1.08 +/- .02V. That seems very repeatable. Cells with lower voltages will usually deliver somewhat limited capacity.
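In code form, that screen might look like this (a minimal sketch only; the 1.08 +/- 0.02V figure and conditions are the ones above, the function names are made up for illustration):

```python
# Screen based on the data point above: good cells read about 1.08 +/- 0.02V
# under a 70A load at the 2000mAh discharge point; lower usually means
# somewhat limited capacity.

def cell_ok_at_70a(loaded_cell_v):
    return loaded_cell_v >= 1.06

def stick_ok_at_70a(loaded_stick_v):
    return cell_ok_at_70a(loaded_stick_v / 6.0)  # 6 cells per stick

print(stick_ok_at_70a(6.48))  # True - right at 1.08V/cell
print(stick_ok_at_70a(6.20))  # False - ~1.03V/cell, likely limited capacity
```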
 

Registered · 6,004 Posts · Discussion Starter #4 (Edited)
^hmm, your post/'data point' got me thinking a bit and looking at an older chart I made of voltage drop vs. current and IR. I'm reminded of an idea I had a while back on how to combine a bunch of different info into a single graphical representation. If I find some time I'll experiment with that...

Meanwhile, looking at that one older chart, all else being equal: seeing about 1.1V per cell - 132V for a pack - at a 70A discharge, and assuming about 160V at the start of the discharge (i.e. assuming voltage would bounce back to about 160V were you to stop the discharge after that 2000mAh drain and then re-initiate the 70A discharge), that's a 28 volt drop, which puts the cell-level internal resistance at about 3.3 milliohms... The Panasonic spec sheet I have reports IR as about 2 milliohms...

IF we had 2mΩ cells, same other assumptions as above, we'd be seeing about a 17 volt drop - so about 143 volts for the pack at 70A discharge, about a 10kW load... Of course there's some resistance in all the connections; my guess is something like 2.5mΩ to 3.5mΩ per cell is the real-world acceptable/workable/doable range...

Let's assume 3mΩ at cell level, including connections, is good, normal. And let's also continue to assume that 160-161V, 1.34V per cell, is 'the' starting voltage no matter the charge state (or at least between say 25% and 75%). Let's assume normal working temperature - say about 75F - and perfectly balanced cells:

At 20A discharge we should see a pack voltage of about 153.5V, or 1.28V per cell.
At 50A discharge we should see a pack voltage of about 142.5V, or 1.19V per cell.
At 70A discharge we should see a pack voltage of about 135.5V, or 1.13V per cell.
And at 100A discharge we should see a pack voltage of about 124.5V, or 1.04V per cell...

One should be able to see how these values match fairly well what the Insight demands of the pack. And you can imagine how different conditions, such as cool cells, low charge state, cell imbalance, aging cells, will start to push things to the 'edges'...
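Here's that back-of-the-envelope model in Python, for anyone who wants to plug in their own numbers (the function names are mine; the 160.8V resting value and per-cell IR figures are the assumptions stated above):

```python
# Simple IR model from above: pack voltage under load = resting voltage
# (120 cells x 1.34V = ~160.8V, assumed roughly flat from ~25% to 75% SoC)
# minus I x R, where R = per-cell resistance (connections included) x 120.

CELLS = 120
REST_V = CELLS * 1.34  # ~160.8V

def expected_pack_v(amps, cell_mohm=3.0):
    return REST_V - amps * (cell_mohm / 1000.0) * CELLS

def estimate_cell_mohm(rest_v, loaded_v, amps):
    """Work backward from an observed sag to per-cell IR, in milliohms."""
    return (rest_v - loaded_v) / amps / CELLS * 1000.0

for amps in (20, 50, 70, 100):
    v = expected_pack_v(amps)
    print(f"{amps}A -> {v:.1f}V pack, {v / CELLS:.2f}V per cell")

print(estimate_cell_mohm(160.0, 132.0, 70))  # ~3.33 mOhm - the example above
```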

Here's that chart. I find it pretty useful for thinking about this kind of stuff. I think originally I was more interested in observing real-world voltage drops on the OBDIIC&C and then finding out what internal resistance my cells were at:

Oh, one thing: I think I used 164V for this chart to determine the "OK", "Iffy", etc. colors and labels, i.e. if the voltage drop were 42 volts and you started at 164V, 164-42=122V, you'd still be above the 120V minimum (1V per cell). I think 160-161V is a better starting point though, it's more central...
 

Premium Member · 3,421 Posts
Probably not what you're looking for, but focusing on "NiMH" and "voltage," I have the following to offer:

Tap resting voltages of a healthy pack will be < 0.2V between min/max

Heavily loaded tap voltages of a healthy pack will be < 0.6V between min/max

TRULY healthy sticks WILL NOT self-discharge below nominal over MANY MONTHS.
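Or, put as checks (illustrative only - the thresholds are the three rules above, the function names are made up, and 'nominal' for a 6-cell stick is taken as 6 x 1.2V = 7.2V):

```python
# The three rules of thumb above as simple checks.

def resting_taps_healthy(tap_volts):
    return max(tap_volts) - min(tap_volts) < 0.2

def loaded_taps_healthy(tap_volts):
    return max(tap_volts) - min(tap_volts) < 0.6

def stick_holds_charge(resting_stick_v_after_months):
    return resting_stick_v_after_months > 7.2  # 6 cells x 1.2V nominal
```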
 

Registered · 6,004 Posts · Discussion Starter #6
Probably not what you're looking for .... Tap resting voltages of a healthy pack will be < 0.2V between min/max. Heavily loaded tap voltages of a healthy pack will be < 0.6V between min/max.
No, or yeah, this is the kind of stuff I'm interested in. I think I'm looking to get more specific, like semi-quantify "healthy pack" and "heavily loaded," and I guess those types of voltage spreads as well...

And before I forget, there's a handful of subtle 'voltage-related' issues I can't get a handle on. For example, one thing I've noticed and always forget is that 'good' cells seem to actually have a lower resting voltage than 'bad' cells after being fully charged (or whenever, maybe not necessarily fully charged)... In general I'm usually thinking that higher resting voltage is better, but I think that might not be true; higher resting voltage might actually be an indicator of badness... No real idea why, though; it just seems like I've seen this at the cell and pack level: when I know this or that cell is better or good, or I know my pack is fully balanced and running strong, the resting voltages are lower - like instead of sticking around 1.40V for a long time, voltage falls to something like 1.34-1.37V fairly quickly but then just hangs out there for a long time... Pack voltage might be 160-163V instead of maybe 165V or more... I forget the other "subtle" voltage issues at the moment...
 

Registered · 6,004 Posts · Discussion Starter #7 (Edited)
One working theory I have at the moment involves the importance of the 1.20 to 1.25V range: I think good cells will really stick to this tight range, even at pretty high discharge rates....We all know that the nominal voltage of the cells is 1.2V - though I think it's really supposed to be 1.25V. That means something. Cells should be working at that voltage (during discharge); it's what they're meant to do...

I think the Insight battery management probably relies a lot on that 'middle' voltage...

1.34V is also a key value; it's the idealized equilibrium potential for the main reaction on the positive electrode. In practice, one thing it means to 'us' is that resting, open circuit pack voltage is often hovering around 160-161V (120 cells X 1.34V)....
These I think are really one and the same thing, or similar and/or related - this 1.20V to 1.25V range and this 1.34 'equilibrium' voltage. The equilibrium potential for the positive electrode is 1.34V, but then take into account internal resistance and operation under 'non-equilibrium' conditions (i.e. under loads, charging and discharging, rather than resting).

When you take that reported 2mΩ cell-level internal resistance value and apply it to 1.34V under various discharge loads, you approach the nominal voltage. For instance, with an IR of 2mΩ, a cell with a nominal 1.34V will see a voltage drop equal to 0.1V at a 50 amp discharge, which drops the cell voltage to -- 1.24V...

So, this 'equilibrium' voltage and the nominal voltage are in essence the same thing. I think it's more cut and dried, too, when it comes to NiMH electrochemistry - since the way that electrochemistry works produces flat charge and discharge voltage curves, i.e. the nominal voltage is less of an average over the entire curve, like it might be for some lithium chemistries, and more of a real working value across the whole capacity... It has to do with the type of reaction, something like a two-phase insertion reaction, which produces charge and discharge voltages that are largely independent of the charge state - one of the good things about NiMH...

Interestingly, when you think about it, to some degree the extent to which charge and discharge voltages vary from the equilibrium reflects the extent to which cells are being pushed toward their limits - at least based on what they're supposed to do, based on the electrochemistry, assuming a perfect cell and/or reactions... I guess you take internal resistance into account and then you're left with this working voltage range, like 1V to ... let's call it 1.6V, as it's been said that 192V is the Insight pack voltage max and 192V/120 cells = 1.6V. I've read 1.50V is a thermodynamic max or something of that ilk, above which overcharge reactions take over or at least happen more; 1.53V is the peak voltage at full under a 6.5A 1C charge according to Panasonic; and I think that Panasonic spec sheet also reports 1.8V as being some critical max voltage... I would probably pick 1.53V as a conservative max voltage under any conditions - like if I designed the BMS I don't think I'd ever let voltage go beyond 1.53V...

Let's take that 2mΩ resistance value into account again and see what kind of current we can charge at to keep voltage at or below that 1.53V max: With an equilibrium voltage of 1.34V, we have headroom equal to 0.19V; 0.19V/0.002Ω = 95 amps...

How about 3mΩ?: 0.19V/0.003Ω=63.3 amps...

Pretty sure this is the way you do it... Of course, these are just back-of-the-envelope; cell voltage isn't always 1.34V. For instance, if you've been hitting regen a lot you're seeing cell voltages up to something like 1.416V (no load), which leaves us with headroom equal to only 0.114V. At 3mΩ, our max charge current could only be something like 38 amps (again, assuming we want to keep cell voltage below 1.54V)... And then we're not working with perfectly balanced and matched cells, and not just 1 but 120 of them. Etc etc...
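Here's that headroom arithmetic in one place, as a Python sketch (the numbers are the back-of-the-envelope values above; the 1.53V ceiling is just my conservative pick, not a published BMS limit):

```python
# Max charge current that keeps cell voltage at or below a ceiling, given
# the present no-load cell voltage and internal resistance (back-of-envelope).

def max_charge_amps(rest_v, cell_mohm, ceiling_v=1.53):
    headroom_v = ceiling_v - rest_v
    return headroom_v / (cell_mohm / 1000.0)

print(max_charge_amps(1.34, 2.0))   # ~95A
print(max_charge_amps(1.34, 3.0))   # ~63A
print(max_charge_amps(1.416, 3.0))  # ~38A - cell sitting high after lots of regen
```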
 

Registered · 6,004 Posts · Discussion Starter #8 (Edited)
....I forget the other "subtle" voltage issues at the moment.
I remembered another "subtle" voltage issue. It's not really subtle, though you'd probably never notice it unless you were doing something to test... It's partly related to the mythology around IC that says 'voltage doesn't tell you anything about state of charge'. In general, it's simply about how voltage changes during pack use - how it doesn't change linearly with state of charge the way it does with, say, lead-acid...

For the longest time, running crappy packs, it seemed like voltage fell when state of charge got lower. But it turned out, I think, that most of that change simply reflected the extent to which my cells had deteriorated capacity and voltage depression. When I got my good rebuilt pack, one thing I noticed is that voltage didn't drop in the same way...

I can discharge the pack from say 75% to 60%, so -15%. Initially voltage might be 164V at 75%, and when I discharge to 60% it might settle in at something like 156V. If I keep the charge/discharge cycles shallow (like about 5%), pack voltage will remain around 156V or so. But if I extend a discharge lower, say drop state of charge by 10-15% more instead of by only 5%, when I charge back just that 10-15% amount, pack voltage will be high again and settle in at the initial high level, the level not 10-15% up but 30% up - possibly 164V, usually at least 160V...

Basically, despite having a net loss of about 1000mAh, pack voltage ends up back where it was before any drain - if I discharge even lower and then charge back up. Knowing a little about the electrochemistry and how the cells work, and based on what people commonly believe - that 'voltage doesn't tell you anything about state of charge' - it doesn't buck any trend. But it's weird and I don't understand what makes the voltage do that.

Why would voltage be low after the initial 15% discharge (that 156V level), but then, after discharging more, say an additional 10%, and then cycling back up only 10%, voltage rises back to the initial level - 160-164V?

I surmised one time that it was like the cells have a 'moving charge state window'. It's like there are localized effects, like having a cell within a cell.

The voltage one reads tends to reflect the recent usage history, like the main reaction is happening somewhere fairly specific in the cell, not everywhere equally, and when you charge or discharge, other reactions are more likely to happen in that area. That usage then shapes the 'chemical composition' or what-not of that area...

If you discharge X amount and then keep the cycles shallow, subsequent reactions will tend to happen in that 'usage space' you've primed/created - and that space is primed for discharge and is small (or something like that). But if you cycle lower and then cycle back up, you've increased the 'primed' 'usage space' - and voltage will be more normal, higher...

I don't know, it's a weird thing, hard to understand, grasp... I have a research paper that includes a similar idea, but I never really read it very closely. I'll have to dig that out and see what it says... If anyone knows why this happens I'm all ears...

fyi, here's a snip from that research paper. It does a pretty thorough job of describing the problem but says nothing about what causes it. The authors call it "voltaic hysteresis," and I think that's what such behavior is called in general... This snip is from the introduction, the paper's titled "Implications of NiMH Hysteresis on HEV Battery Testing and Performance," 2002:

"Unlike other high-power batteries such as Lithium-Ion, NiMH batteries exhibit a strong voltaic hysteresis between charge and discharge. Hysteresis has a profound impact on the ability to monitor and control state-of-charge (SOC) and measure battery performance. As a consequence, previously developed calendar life-tests may not be applicable to this technology and no easy method (short of completely discharging the battery and measuring its residual capacity) presently exists for determining the energy remaining in the battery during use.

Srinivasan et al. state that two oxidation states (i.e., SOCs) can exist at the same potential depending only on the previous history of the electrode. Consequently, the potential of nickel-based batteries cannot be used as an indication of the SOC of the cell...."
 

Registered · 6,004 Posts · Discussion Starter #9 (Edited)
....Over the years of grid charging, I've honed-in on full charge values of about 174-176V, charging at typical room temp and typical grid charge rates 300-500mA... 174-176V=1.45V to 1.467V were it just a single cell... I made a chart a while back for temp adjustments; I'll add that later....
Here's that chart (table) I made for adjusting voltages for temperature differences. It's based on only a few pieces of data from the Panasonic spec sheet for the HHR650D cell. The lone values in the far-right column are the values you'd use in a pinch to adjust voltages.

For example, for temperatures between freezing and 19 Celsius, you'd adjust pack-level voltages by plus 0.360V per degree C below 20 degrees C. For temperatures between 20 and 40 Celsius, you'd adjust by minus 0.312V per degree C above 20 degrees C...

For an Insight pack, the idea is that, if you're grid charging at say 10 degrees C, you'd expect to see a higher peak voltage, because voltages run higher when the pack is colder. Instead of the normal 174-176V, you might expect to see plus 10 × 0.360V = +3.6V, or about 177.6V to 179.6V...

If you're charging at say 30 degrees C (86F), you'd adjust the expected peak voltage downward - by 10 X 0.312V, or 3.12V. Instead of the normal 174V to 176V, you might expect to see 170.9V to 172.9V...

I can't remember exactly what I used for this table and how, so take it as an approximation. The other thing is, I'm not sure what "ambient" temperature means when it comes to charging the pack in the car. The pack will heat up and heat the air around the pack - so what's 'ambient temp' at that point? Is it the temp outside the car or the temp in the battery box? I suppose if you're running the fan on high you could probably use outside air temp...



A more exact method to calculate an adjustment value is to look up the temperature you're charging at, read the voltage value for that row under the 'pack V' heading, and then subtract the voltage value in the same column for the 20 degree C row. For example, if you're charging at 30 degrees C, the Pack V value is 180.4, the Pack V value for 20 degrees C is 183.6, and the difference is 3.2V -- subtract 3.2V from the expected peak voltage. If you take 174-176V to be normal, then the adjusted peak voltage will be 170.8V to 172.8V...

[edit] I just 'captured' a bit of rough data while charging a stick and putting a fan on it, and the values for temp change vs. voltage change more or less match the values derived from the table above - so a bit of external validation for this table... Basically, I saw a temp drop of about 9 C degrees, between 37C and 28C, and the voltage increase was 22.5mV per cell. Looking up that temp change in the table and calculating values, I should have seen 23.3mV per cell - pretty darn close...

So, around room temp up to about 40C, you should expect to see about a 2.6mV change in cell voltage per degree C...
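And here's the adjustment as a little Python function, using the two pack-level slopes derived from the table (all approximations, as noted; the 174-176V 'normal' band is my room-temp observation, not a spec value):

```python
# Temperature-adjusted expected grid-charge peak voltage (pack level), using
# the slopes from the table: +0.360V/C below 20C, -0.312V/C above.

def adjusted_peak_v(ambient_c, lo=174.0, hi=176.0):
    if ambient_c < 20:
        delta = (20 - ambient_c) * 0.360   # colder -> higher peak
    else:
        delta = -(ambient_c - 20) * 0.312  # warmer -> lower peak
    return lo + delta, hi + delta

print(adjusted_peak_v(10))  # (177.6, 179.6) - the 10C example above
print(adjusted_peak_v(30))  # (170.88, 172.88) - the 30C example above
```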

Here's a post with a graph illustrating the impact of cooling on the end of a stick charge; the CV charge allows us to see the impact of cooling reflected in the current curve. As the cells cool, upward voltage pressure increases, current has to drop to maintain the set constant voltage level: http://www.insightcentral.net/forums/modifications-technical-issues/106322-q-battery-putzers-even-ness-cell-temps.html#post1219066
 

Attachments

Registered · 6,004 Posts · Discussion Starter #10 (Edited)
Voltage Taps

As many times as I've suggested to people having IMA troubles that they check their tap voltages, I never actually checked them myself -- until today. I thought I should jot down what I found...

First, for the uninitiated, the IMA pack is made up of 20 'sticks' or 'sub-packs', each of which is made up of 6 "D"-sized cells. The car taps into pairs of sticks and monitors voltages for those pairs - so there's 10 pairs, 10 voltage taps. The battery condition monitor, or BCM, is responsible for those taps. The BCM is one of the square silver computer boxes on top of the battery pack, on the left side if you're facing the pack from the hatch. The taps connect to the BCM at the back of the computer; when you open the battery compartment, the tap connector is right in front of your face... You can 'back probe' each wire ensconced in the grey plastic connector and check the voltages of stick pairs...

BCM 'Connector C' - Voltage Taps (left connector)
image credit: Bumblebee Batteries


The BCM seems pretty concerned about tap voltages - so it's probably a good idea that we be at least a little concerned. A few IC'ers, including myself, tend to believe that tap voltages that are too far apart are one if not THE leading cause of IMA trouble codes. A couple people have put a value on some of that - like if the tap voltages are more than 1.2V apart under load, you'll get a code. It says something like that in the Troubleshooting manual, though it's a little more nuanced than just this...

Anyhow, I checked my tap voltages. I have a self-rebuilt pack that's been working very well for maybe half a year now. It's actually working the best it ever has at the moment. I grid charged it about a week ago, have been using it regularly, and I let it 'positively recal' to a nominal 75% before I let it rest overnight and then checked the tap voltages. So basically, these are tap voltages for a known, good-working pack under these circumstances, and "these circumstances" generally mean fully car-charged, balanced, at roughly 75 degrees F, after sitting overnight:

16.04, 16.07, 16.09, 16.05, 16.08, 16.02, 16.01, 16.09, 16.04, 16.05
So a range of 16.01V to 16.09V
Average = 16.054V, ±0.04V
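If you want to boil your own readings down the same way, here's a quick Python sketch (the < 0.2V healthy resting spread is the figure posted earlier in the thread; the readings are mine from above):

```python
# Summarize a set of resting tap readings: range, spread, average.

taps = [16.04, 16.07, 16.09, 16.05, 16.08, 16.02, 16.01, 16.09, 16.04, 16.05]

spread = max(taps) - min(taps)
avg = sum(taps) / len(taps)
print(f"range {min(taps):.2f}-{max(taps):.2f}V, spread {spread:.2f}V, avg {avg:.3f}V")
print("resting spread OK" if spread < 0.2 else "taps drifting - investigate")
```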


Here's a graphic depicting the BCM voltage tap connector as it looks from the back plugged-in, the wires that you'd probe with your DMM to read the voltages, and how the sticks corresponding to the taps are arranged in the pack:

fyi, "thrm" is shorthand for "thermistor," indicates to which sticks the 4 temp probes attach...

If you're having IMA troubles, that'd be a great time to check voltage taps. I think chances are high that you'll see a much greater range of values than I posted above...
 

Attachments

Registered · 6,004 Posts · Discussion Starter #11 (Edited)
On another voltage-related note, I swapped back in my 010/030 BCM/MCM, replacing the 305 combo I've been running for a while -- and discovered that my 305 BCM reads pack voltage about 5V higher than the 010 BCM. I posted about that here:
http://www.insightcentral.net/forums/modifications-technical-issues/20218-bcm-mcm-ecm-revisions.html#post1006041

So keep in mind that different BCMs don't necessarily measure and/or report the same pack voltages. 5V is a big difference...

EDIT: turns out I had been reading Mvo on the 305 BCM and comparing to Bvo on the 010 BCM. Both BCMs read Mvo and Bvo the same. Mvo is +6V higher than Bvo. Mvo is +2.8V vs. actual and Bvo is -3.2V vs. actual...
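In other words, if you're converting readings (this is just those measured offsets wrapped in a hypothetical helper function for convenience):

```python
# Offsets measured above: Mvo reads +2.8V vs. actual, Bvo reads -3.2V vs.
# actual, so Mvo sits 6V above Bvo.

def actual_pack_v(mvo=None, bvo=None):
    if mvo is not None:
        return mvo - 2.8
    return bvo + 3.2

print(actual_pack_v(mvo=166.0))  # 163.2
print(actual_pack_v(bvo=160.0))  # 163.2
```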
 

Registered · 6,004 Posts · Discussion Starter #12
Voltage Taps Part 2

I stumbled upon a graph I had saved - one that Eli (Bumblebee Master) must have posted in one of the threads at IC (couldn't find it) - that goes a long way toward illustrating what's going on in terms of battery management at the tap level.

Eli must have logged tap voltages during a long assist event. The upper blue curve shows assist current, on the right hand Y-axis, while the 10 tap voltages are on the left hand Y-axis. Time in seconds is on the bottom.

The general sweep of the graph goes like this: Start with full assist. Full assist is quickly throttled back to acceptable levels. Assist continues until, most likely, the car's 'empty' threshold. I think we can reasonably take the values at the key transitions as approximate Insight BCM/MCM management thresholds. Plus, the values generally fit with what we know about NiMH electrochemistry/use.

You can see how assist current initially peaks at about -75 amps. Voltage for the lowest 2 taps drops to about 10.5V (average of 0.875V per cell), while other tap voltages range from about 12V to 13V at this point (avg. of 1V to 1.08V per cell).

This is an old, weak pack - so current is quickly throttled back. Time from peak current to throttled-current looks like it takes no more than about 5 seconds. Assist current is throttled to about -25 amps. Voltage for the lowest taps rises to about 14V (avg. 1.17V per cell), while voltage for the others ranges from about 14.5V to 15V (avg. 1.21V to 1.25V per cell).

The assist discharge continues for another 500 seconds or so. Over this period, current gradually decreases to a low of about 15 amps at about 540 seconds. Voltage for the lowest taps at the end of this period is about 13.0V (1.08V per cell avg.), while voltage for the others is about 13.5V (1.13V per cell avg.)...

Overall, the BCM and MCM adjust assist current lower and that keeps voltage for the lowest taps between about 13V and 14V, or roughly 1.1V to 1.2V per cell average...
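Just to capture the apparent logic, here's a toy Python sketch of that behavior - purely my inference from the graph, not the actual BCM/MCM algorithm:

```python
# Toy model of the throttling seen in the graph: back assist current off
# until the weakest tap holds above a floor (~13V, i.e. ~1.08V/cell average).
# NOT the real BCM/MCM logic - just the pattern Eli's data suggests.

TAP_FLOOR_V = 13.0

def min_tap_v(amps, rest_v=16.0, tap_mohm=73.0):
    # crude IR model; ~73 mOhm per tap pair is what the weak pack's lowest
    # taps imply: (16V - 10.5V) / 75A, i.e. roughly 6 mOhm per cell
    return rest_v - amps * tap_mohm / 1000.0

def throttled_assist(requested_amps, step=5.0):
    amps = requested_amps
    while amps > 0 and min_tap_v(amps) < TAP_FLOOR_V:
        amps -= step
    return amps

print(throttled_assist(75))  # ~40A with this crude model; the graph shows
                             # ~25A, so the real logic is more conservative
```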

 

Super Moderator · 6,889 Posts
It is a very interesting graph, which I guess I missed somewhere along the road. The specific values of the taps are useful in confirming my "severe" sorting routine - as Keith calls it. I have found that "good" sticks will support 6 volts @ 35A for pretty long periods of time, and that becomes my measure of quality when separating the good from the bad. Thank you for the refresher.
 

Registered · 6,004 Posts · Discussion Starter #14 (Edited)
Strictly speaking, "6 volts at 35A" wouldn't be so good. The worst sticks in the pack graphed above are at about 7 volts after the current gets throttled to about 25 amps. The others hit about 7 volts at 20 amps midway through the discharge; they don't fall below 6 volts even at the initial 75 amp discharge. And this is for a bad pack...

Plus, at 6 volts you're already below the level the car allows - and that's at only a 35 amp rate, yet the car demands more like 80 amps+ at max, 45 amps typical max sustained in 1st, 4th, and 5th gears...

Maybe your 6 volt figure is a typo? Or maybe you're measuring voltage through the loaded wires (kind of doubt that, though, since you're an expert)...

Some posts up I listed some discharge rates and voltages we might expect for good cells/sticks, assuming they were all at about 3mΩ; it'd be more like 7.35V, 1.22V per cell, at 35 amps... (permalink to post #4): http://www.insightcentral.net/forums/modifications-technical-issues/89298-quintessential-insight-nimh-voltage-thread.html#post995466

In fact, you posted just above post #4 that you were seeing 1.08V per cell at a 70 amp rate. That's more like it... 6V at 35 amps isn't good, basically. I want to make sure people get that straight.
 

Super Moderator · 6,889 Posts
Strictly speaking, "6 volts at 35A" wouldn't be so good. The worst sticks in the pack graphed above are at about 7 volts after the current gets throttled to about 25 amps. The others hit about 7 volts at 20 amps midway through the discharge; they don't fall below 6 volts even at the initial 75 amp discharge. And this is for a bad pack...
It is a little difficult to make direct comparisons, since my loads are fixed at approximately 70A and 35A, mid discharge cycle, and there is no BCM/MCM management involved. I was just making the observation that the data for good sticks seems to generally agree with what I was seeing by entirely different methods. I define good sticks as those which can reach 5500mAh capacity under my test conditions. I won't go into methods because it just leads to interminable arguments, sorry, but suffice it to say that I'm just trying to find good sticks. I am doing nothing to try to heal sticks. I have accumulated enough sticks that I have that "luxury."

I just don't have the time right now to respond to this kind of very detailed discussion. But one quick point: the 6V reading is right at the end of the capacity test, at exhaustion. I'll try to take some evening time to respond if I think I can do it adequately, but the long-winded discussions are generally the reason I avoid discussing the topic at all. Plus it seldom leads to any agreement at all :(
 

Super Moderator · 6,889 Posts
Strictly speaking, "6 volts at 35A" wouldn't be so good. The worst sticks in the pack graphed above are at about 7 volts after the current gets throttled to about 25 amps. The others hit about 7 volts at 20 amps midway through the discharge; they don't fall below 6 volts even at the initial 75 amp discharge. And this is for a bad pack...
I got curious about my previous 6 volt exhaustion number, so I went back and checked a so-so stick at half discharge. It read 7.05V at 35A. It would obviously read higher at a lower load. It is not a number that I usually document within my procedure, so I didn't have data.

Later I want to make some observations about this specific pack example and discuss/question some technical points, but right now the lawn mower calls.

Thank you for reposting Eli's data. It is instructional!
 

Registered · 39 Posts
Has anyone installed a voltage logger/monitor on these taps? It might be interesting. Also, can you recondition pairs of sticks through these taps? (Keeping in mind that pack reconditioning probably does as good a job as working with individual sticks, unless you want to get philosophical.)
 

Premium Member · 3,421 Posts
Has anyone installed a voltage logger/monitor on these taps? It might be interesting. Also, can you recondition pairs of sticks through these taps? (Keeping in mind that pack reconditioning probably does as good a job as working with individual sticks, unless you want to get philosophical.)
Yes. Search for JimIsBell posts. I think Retepsnikrep just did one recently.

Conceivably, yes, but the wire is very fine gauge, so currents would be severely limited.
 