The quintessential Insight NiMH voltage thread - Insight Central: Honda Insight Forum
 
Old 05-13-2016, 05:40 AM   #1 (permalink)
eq1
Senior Member
 
 
Join Date: Jan 2012
Location: PNW
Posts: 4,318
Default The quintessential Insight NiMH voltage thread

I know, such optimism...

Normally I should probably have all my ducks in a row and actually have something substantial with which to start this thread. But I don't really. I was just thinking about how pack management probably revolves a great deal around working voltages specific to the stock Panasonic NiMH cells; about how important those voltages are; yet also how voltage always got short shrift around IC and still remains such a haphazard affair around here...

I imagined how great it would be if we could have a single thread that was all about Insight NiMH voltages - pack voltages, cell voltages, stick voltages, tap voltages, full voltages, empty voltages, typical full self-discharge voltages, normal 'working' voltages, 75% voltages, 'bad' voltages, etc etc...

One working theory I have at the moment involves the importance of the 1.20 to 1.25V range: I think good cells will really stick to this tight range, even at pretty high discharge rates. That's one major difference I've been seeing between my latest good rebuilt pack and my other couple of crummy packs. We all know that the nominal voltage of the cells is 1.2V - though I think it's really supposed to be 1.25V. That means something. Cells should be working at that voltage (during discharge); it's what they're meant to do...

I think the Insight battery management probably relies a lot on that 'middle' voltage...

1.34V is also a key value; it's the idealized equilibrium potential for the main reaction on the positive electrode. In practice, one thing it means to 'us' is that resting, open circuit pack voltage is often hovering around 160-161V (120 cells X 1.34V)...

Full voltages? The Panasonic spec sheet I have for cells that have often been said to be 'our' cells, the HHR650D, shows a charge curve at a 1C rate peaking at 1.53V... Over the years of grid charging, I've honed in on full-charge values of about 174-176V, charging at typical room temp and typical grid charge rates of 300-500mA... 174-176V works out to 1.45V to 1.467V per cell... I made a chart a while back for temp adjustments; I'll add that later...

0.78V is supposedly the equilibrium potential for the unwanted reaction that causes 'voltage depression' or 'memory effect' - analogous to how 1.34V is the equilibrium potential for the wanted, good and normal reaction...

0.68V - That's the voltage I've seen many times on cells that have self-discharged for a long time; self-discharged cells will hover around that voltage for a long time...

I'll try to come back and add 'voltage' stuff when it comes to mind. If you have any "voltage stuff" please consider adding it to the thread - questions, comments, values to report (cell-level or otherwise), and stuff related to voltage, like battery management. I guess that's kind of my main concern, fleshing-out how the Insight BMS uses voltages to do its thing, and probably mainly that 1.25-1.20V range...

Basically, it's really looking to me like, if your pack is NOT holding tight to the 1.2-1.25V range (144V to 150V), it's a sign that something's amiss - cells are drifting out of balance (most likely and/or most frequent issue), maybe voltage depression is setting in, maybe cells are just getting old and internal resistance is getting high, a combination of these, etc. And the thing is, you can really spot when your pack is in top form (or not) if you just watch how voltage is holding up.

For instance, when the pack is cool it won't hold that voltage so well or it won't be able to hold it at relatively high currents. Likewise, if cells are out of balance, it won't hold that range so well, warm or not.

And really, it's not the whole range: in really top form your (stock) pack will be holding 1.25V (150V), not 1.2V (144V), at rates of at least about 3C (19.5A). And it will do this even at relatively low charge states - say 50%; it doesn't have to be at a 'full' 75%...

This is all stuff I'm grappling with - trying to juggle charge state, temperature, balance state, overall cell condition, and voltage - to get a good handle on pack condition/state. I've generally been seeing something like 144V to 150V at up to about a 3C rate between about 45% and 75% charge state, pack warmed up to at least 70F. There's wiggle room in these numbers depending on the exact values, like the exact temp, SoC, and current rate. I'd like to tighten this up...

What voltages under various conditions should I be seeing if my pack is 'good'? If my pack is at a nominal 50% charge state, and I'm discharging at a 3C rate, pack temp 75F, what voltage should I see with a good pack? Or a bad pack? It's all pretty subtle - at this point I can pretty much tell when the cells have drifted about 20% out of balance. I'd like to lower that figure to some degree, plus be a bit more sure about it...

Right now, pack voltage sag at moderate charge states (maybe 50-60%) is my main realtime, tell-tale sign of imbalance. And I'm talking about only a few volts difference, like seeing about 139V instead of about 142V, warmed pack. Or maybe it's seeing that sag at slightly higher charge states; like instead of seeing it down around 45% SoC, it might be 60% SoC... I'd like to tighten-up these kinds of observations, like I'd like to be able to tell just what 'state' my pack is at with just a few charge/assist runs - like get the pack warmed up, do a couple assist runs, see the numbers, know the state... I'm almost there...
__________________
2000MT, CAN, ~168K miles
Old 05-13-2016, 12:42 PM   #2 (permalink)
Administrator
 
Join Date: Dec 2005
Location: Beverley. East Riding of Yorkshire.
Posts: 8,460
Default

Quote:
Originally Posted by eq1 View Post
I was just thinking about how pack management probably revolves a great deal around working voltages specific to the stock Panasonic NiMH cells; about how important those voltages are; yet also how voltage always got short shrift around IC and still remains such a haphazard affair around here....
FWIW I think you are right and the stock BCM is very attuned to those specific cells.

It does not always like or work 100% correctly with replacements including the BB cells we all use.

The differences and voltage nuances are very difficult to pin down though.

Your post is a good crack at the idea.

The fact that we can't modify the BCM software hampers how well it works with non-OEM cells.
__________________
Pcb's/Built Units for ObdIIc&c, Imac&c, ImaBoost, Current Hacks available.
Enqs to 150mpg[at]gmail.com


retepsnikrep is online
Old 05-13-2016, 01:44 PM   #3 (permalink)
Moderator
 
Join Date: May 2009
Location: Richmond VA
Posts: 4,308
Default

From my very limited experience, BCM 010 works fine with the BB aftermarket cells, while the 020 is a bit shaky. Just two cars and two BCMs - anecdotal, and not much data. The two combinations were 010/HybridRevolt and 020/Bumblebee, though I strongly suspect the cells were identical and came from the same manufacturer, so the BCMs would be the explanation.

I'll furnish one data point on the voltage issue, though it is rather specific to the way I check out sticks. The 70A load voltage of good cells at the 2000mAh discharge point is very close to 1.08 ±0.02V. That seems very repeatable. Cells with lower voltages will usually deliver somewhat limited capacity.
__________________
2000 MT Citrus(DogBite); 2000 MT Citrus; 2001 MT(Aeromodded Hypermiler - 134.4MPG); 2006 MT

Last edited by jime; 05-13-2016 at 06:22 PM. Reason: addition
 
Old 05-13-2016, 11:35 PM   #4 (permalink)
eq1
Senior Member
 
 
Join Date: Jan 2012
Location: PNW
Posts: 4,318
Default

^hmm, your post/'data point' got me thinking a bit and looking at an older chart I made of voltage drop vs. current and IR. I'm reminded of an idea I had a while back on how to combine a bunch of different info into a single graphical representation. If I find some time I'll experiment with that...

Meanwhile, looking at that one older chart, all else being equal, seeing about 1.1V, 132V for a pack, at a 70A discharge, assuming about 160V at the start of the discharge (i.e. assuming voltage would probably bounce back to about 160V were you to stop the discharge after that 2000mAh drain and then re-initiate the 70A discharge) and a 28 volt drop, puts the cell-level internal resistance at about 3.3 milliohms... The Panasonic spec sheet I have reports IR as about 2 milliohms...

IF we had 2mΩ cells, same other assumptions as above, we'd be seeing about a 17 volt drop - so about 143 volts for the pack at 70A discharge, about a 10kW load... Of course there's some resistance in all the connections; my guess is something like 2.5mΩ to 3.5mΩ per cell is the real-world acceptable/workable/doable range...

Let's assume 3mΩ at cell level, including connections, is good, normal. And let's also continue to assume that 160-161V, 1.34V per cell, is 'the' starting voltage no matter the charge state (or at least between say 25% and 75%). Let's assume normal working temperature - say about 75F - and perfectly balanced cells:

At 20A discharge we should see a pack voltage of about 153.5V, or 1.28V per cell.
At 50A discharge we should see a pack voltage of about 142.5V, or 1.19V per cell.
At 70A discharge we should see a pack voltage of about 135.5V, or 1.13V per cell.
And at 100A discharge we should see a pack voltage of about 124.5V, or 1.04V per cell...

One should be able to see how these values match fairly well what the Insight demands of the pack. And you can imagine how different conditions, such as cool cells, low charge state, cell imbalance, aging cells, will start to push things to the 'edges'...
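The four discharge figures follow from the same simple V = OCV − I·R model; here's a quick sketch (the 1.34 V/cell and 3 mΩ/cell figures are the assumptions stated above, not measured values):

```python
# Sanity-check the 20/50/70/100 A figures above with V = OCV - I*R,
# using the post's assumptions: 120 cells, 1.34 V/cell at rest,
# 3 mOhm per cell including connections.
CELLS = 120
OCV_CELL = 1.34
R_CELL = 0.003

def pack_v(current_a):
    """Predicted (pack_volts, cell_volts) at a given discharge current."""
    cell_v = OCV_CELL - current_a * R_CELL
    return cell_v * CELLS, cell_v

for amps in (20, 50, 70, 100):
    pv, cv = pack_v(amps)
    print(f"{amps:>3} A -> {pv:.1f} V pack, {cv:.2f} V/cell")
```

This lands within a few tenths of a volt of the values quoted above (153.6 vs 153.5, and so on), so the small differences look like rounding in the original figures.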

Here's that chart. I find it pretty useful for thinking about this kind of stuff. I think originally I was more interested in observing real-world voltage drops on the OBDIIC&C and then finding out what internal resistance my cells were at:

Oh, one thing: I think I used 164V for this chart to determine the "OK", "Iffy", etc. colors and labels, i.e. if the voltage drop were 42 volts and you started at 164V, 164-42=122V, you'd still be above the 120V minimum (1V per cell). I think 160-161V is a better starting point though, it's more central...
__________________
2000MT, CAN, ~168K miles

Last edited by eq1; 05-13-2016 at 11:44 PM.
Old 05-14-2016, 12:04 AM   #5 (permalink)
Lifetime Member
 
 
Join Date: Oct 2014
Location: Phoenix, AZ area
Posts: 1,898
Default

Probably not what you're looking for, but focusing on "NiMH" and "voltage," I have the following to offer:

Tap resting voltages of a healthy pack will be < 0.2V between min/max

Heavily loaded tap voltages of a healthy pack will be < 0.6V between min/max

TRULY healthy sticks WILL NOT self-discharge below nominal over MANY MONTHS.
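Those two rules of thumb could be wrapped in a tiny checker - a sketch only; the 0.2 V resting and 0.6 V loaded limits are the poster's figures, not a Honda spec, and the sample readings below are made up:

```python
# Encode the two tap-spread rules of thumb above: resting spread < 0.2 V,
# heavily-loaded spread < 0.6 V. Thresholds are rules of thumb from the
# post, not an official spec; the readings below are hypothetical.
def tap_spread_ok(tap_voltages, loaded=False):
    """Return (spread, healthy?) for a set of tap voltages."""
    spread = max(tap_voltages) - min(tap_voltages)
    return spread, spread < (0.6 if loaded else 0.2)

loaded_taps = [14.2, 14.0, 14.3, 13.9, 14.1, 14.2, 14.0, 14.1, 14.3, 13.8]
spread, ok = tap_spread_ok(loaded_taps, loaded=True)
print(f"spread {spread:.1f} V under load, healthy: {ok}")
```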
__________________
'05 G1 CVT w/192K mi as of 7/2016; '02 G1 CVT w/182K mi as of 7/2016 (P1449 HR pack); '06 HCH2 #1 w/145K mi as of 7/2016 (SOLD!), '06 HCH2 #2 w/223k mi as of 2/2016 (SOLD!); '03 Prius w/166K mi - Pack rebuilt with Gen2 modules; '08 Prius w/135k mi as of 7/2016; '07 Prius w/185k mi as of 7/2016

Old 05-14-2016, 04:45 AM   #6 (permalink)
eq1
Senior Member
 
 
Join Date: Jan 2012
Location: PNW
Posts: 4,318
Default

Quote:
Originally Posted by S Keith View Post
Probably not what you're looking for .... Tap resting voltages of a healthy pack will be less < 0.2V between min/max. Heavily loaded tap voltages of a healthy pack will be < 0.6V between min/max.
No, or yeah, this is the kind of stuff I'm interested in. I think I'm looking to get more specific, like semi-quantify "healthy pack" and "heavily loaded," and I guess those types of voltage spreads as well...

And before I forget, there's a handful of subtle 'voltage-related' issues I can't get a handle on. For example, one thing I've noticed that I always forget is that 'good' cells seem to actually have a lower resting voltage than 'bad' cells after being fully charged (or whenever, maybe not necessarily fully charged)... In general I'm usually thinking that higher resting voltage is better, but I think that might not be true; I think that higher resting voltage might be an indicator of badness... No real idea why though, just seems like I've seen this at the cell and pack level, where, when I know this or that cell is better or good, or I know my pack is fully balanced and running strong, the resting voltages are lower - like instead of sticking around 1.40V for a long time, voltage falls to something like 1.34-1.37 fairly quickly but just hangs out there for a long time... Pack voltage - it might be 160-163V instead of maybe 165V or more... I forget the other "subtle" voltage issues at the moment...
__________________
2000MT, CAN, ~168K miles
Old 05-14-2016, 05:07 AM   #7 (permalink)
eq1
Senior Member
 
 
Join Date: Jan 2012
Location: PNW
Posts: 4,318
Default

Quote:
Originally Posted by eq1 View Post
One working theory I have at the moment involves the importance of the 1.20 to 1.25V range: I think good cells will really stick to this tight range, even at pretty high discharge rates....We all know that the nominal voltage of the cells is 1.2V - though I think it's really supposed to be 1.25V. That means something. Cells should be working at that voltage (during discharge); it's what they're meant to do...

I think the Insight battery management probably relies a lot on that 'middle' voltage...

1.34V is also a key value; it's the idealized equilibrium potential for the main reaction on the positive electrode. In practice, one thing it means to 'us' is that resting, open circuit pack voltage is often hovering around 160-161V (120 cells X 1.34V)....
These I think are really one and the same thing, or similar and/or related - this 1.20V to 1.25V range and this 1.34 'equilibrium' voltage. The equilibrium potential for the positive electrode is 1.34V, but then take into account internal resistance and operation under 'non-equilibrium' conditions (i.e. under loads, charging and discharging, rather than resting).

When you take that reported 2mΩ cell-level internal resistance value and apply it to 1.34V under various discharge loads, you approach the nominal voltage. For instance, with an IR of 2mΩ, a cell with a nominal 1.34V will see a voltage drop equal to 0.1V at a 50 amp discharge, which drops the cell voltage to -- 1.24V...

So, this 'equilibrium' voltage and the nominal voltage are in essence the same thing. I think it's more cut and dried, too, when it comes to NiMH electrochemistry - since the way that electrochemistry works produces flat charge and discharge voltage curves, i.e. the nominal voltage is less of an average over the entire curve, as it might be for some lithium chemistries, and more of a real working value across the whole capacity... It has to do with the type of reaction, something like a two-phase insertion reaction, which produces charge and discharge voltages that are independent of the charge state, one of the good things about NiMH...

Interestingly, when you think about it, to some degree the extent to which charge and discharge voltages vary from the equilibrium reflects the extent to which cells are being pushed closer toward their limits, at least based on what they're supposed to do, based on the electrochemistry, assuming a perfect cell and/or reactions... I guess you take internal resistance into account and then you're left with this working voltage range, like 1V to ... let's call it 1.6V, as it's been said that 192V is the Insight pack voltage max and 192V/120 cells=1.6V. I've read 1.50V is a thermodynamic max or something of that ilk, above which overcharge reactions take over or at least happen more; and then 1.53V is the peak voltage at full under a 6.5A 1C charge according to Panasonic; and then I think that Panasonic spec sheet also reports 1.8V as being some critical max voltage... I would probably pick 1.53V as a conservative max voltage under any conditions - like if I designed the BMS I don't think I'd let voltage go beyond 1.53V ever...

Let's take that 2mΩ resistance value into account again and see what kind of current we can charge at to keep voltage below 1.54V: With an equilibrium voltage of 1.34V, we have headroom equal to 0.19V; 0.19V/0.002Ω=95 amps...

How about 3mΩ?: 0.19V/0.003Ω=63.3 amps...

Pretty sure this is the way you do it... Of course, these are just back-of-the-envelope; cell voltage isn't always 1.34V. For instance, if you've been hitting regen a lot you're seeing cell voltages up to something like 1.416V (no load), which leaves us with headroom equal to only 0.114V. At 3mΩ, our max charge current could only be something like 38 amps (again, assuming we want to keep cell voltage below 1.54V)... And then we're not working with perfectly balanced and matched cells, and not just 1 but 120 of them. Etc etc...
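The headroom arithmetic in the last few paragraphs fits in a one-line function - a sketch; the 1.53 V ceiling and the resistance values are the assumptions discussed above, not measured limits:

```python
# Max charge current that keeps a cell under a chosen voltage ceiling,
# per the headroom arithmetic above: I_max = (ceiling - V_now) / R.
# The 1.53 V ceiling and the IR values are the post's assumptions.
def max_charge_amps(v_cell_now, r_cell_ohms, v_ceiling=1.53):
    return (v_ceiling - v_cell_now) / r_cell_ohms

print(f"{max_charge_amps(1.34, 0.002):.0f} A")    # ~95 A at 2 mOhm
print(f"{max_charge_amps(1.34, 0.003):.0f} A")    # ~63 A at 3 mOhm
print(f"{max_charge_amps(1.416, 0.003):.0f} A")   # ~38 A after heavy regen
```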
__________________
2000MT, CAN, ~168K miles

Last edited by eq1; 05-14-2016 at 05:37 AM.
Old 05-15-2016, 05:19 AM   #8 (permalink)
eq1
Senior Member
 
 
Join Date: Jan 2012
Location: PNW
Posts: 4,318
Default

Quote:
Originally Posted by eq1 View Post
....I forget the other "subtle" voltage issues at the moment.
I remembered another "subtle" voltage issue. It's not really subtle, though you'd probably never notice it unless you were doing something to test for it... It's partly related to the mythology around IC that says 'voltage doesn't tell you anything about state of charge'. In general, it's simply about how voltage changes during pack use - how it doesn't change linearly with state of charge the way it does with, say, lead acid...

For the longest time, running crappy packs, it seemed like voltage fell when state of charge got lower. But it turned out, I think, that most of that change simply reflected the extent to which my cells had lost capacity and developed voltage depression. When I got my good rebuilt pack, one thing I noticed is that voltage didn't drop in the same way...

I can discharge the pack from say 75% to 60%, so -15%. Initially voltage might be 164V at 75%, and when I discharge to 60% it might settle in at something like 156V. If I keep the charge/discharge cycles shallow (like about 5%), pack voltage will remain around 156V or so. But if I extend a discharge lower, say drop state of charge by 10-15% more instead of by only 5%, when I charge back just that 10-15% amount, pack voltage will be high again and settle in at the initial high level, the level not 10-15% up but 30% up - possibly 164V, usually at least 160V...

Basically, despite having a net loss of about 1000mAh, pack voltage ends up back where it was before any drain - if I discharge even lower and then charge back up. Knowing a little about the electrochemistry, how the cells work, and also based on what people commonly believe - that 'voltage doesn't tell you anything about state of charge', it doesn't buck any trend. But it's weird and I don't understand what makes the voltage do that.

Why would voltage be low after the initial 15% discharge (that 156V level), but then, after discharging more, say an additional 10%, and then cycling back up only 10%, voltage rises back to the initial level - 160-164V?

I surmised one time that it was like the cells have a 'moving charge state window'. It's like there are localized effects, like having a cell within a cell.

The voltage one reads tends to reflect the recent usage history, like the main reaction is happening somewhere fairly specific in the cell, not everywhere equally, and when you charge or discharge, other reactions are more likely to happen in that area. That usage then shapes the 'chemical composition' or what-not of that area...

If you discharge X amount and then keep the cycles shallow, subsequent reactions will tend to happen in that 'usage space' you've primed/created - and that space is primed for discharge and is small (or something like that). But if you cycle lower and then cycle back up, you've increased the 'primed' 'usage space' - and voltage will be more normal, higher...

I don't know, it's a weird thing, hard to understand, grasp... I have a research paper that includes a similar idea, but I never really read it very closely. I'll have to dig that out and see what it says... If anyone knows why this happens I'm all ears...

fyi, here's a snip from that research paper. It does a pretty thorough job of describing the problem but says nothing about what causes it. The authors call it "voltaic hysteresis," and I think that's what such behavior is called in general... This snip is from the introduction, the paper's titled "Implications of NiMH Hysteresis on HEV Battery Testing and Performance," 2002:

"Unlike other high-power batteries such as Lithium-Ion, NiMH batteries exhibit a strong voltaic hysteresis between charge and discharge. Hysteresis has a profound impact on the ability to monitor and control state-of-charge (SOC) and measure battery performance. As a consequence, previously developed calendar life-tests may not be applicable to this technology and no easy method (short of completely discharging the battery and measuring its residual capacity) presently exists for determining the energy remaining in the battery during use.

Srinivasan et al. state that two oxidation states (i.e., SOCs) can exist at the same potential depending only on the previous history of the electrode. Consequently, the potential of nickel-based batteries cannot be used as an indication of the SOC of the cell...."
__________________
2000MT, CAN, ~168K miles

Last edited by eq1; 05-17-2016 at 05:05 AM.
Old 05-19-2016, 04:41 AM   #9 (permalink)
eq1
Senior Member
 
 
Join Date: Jan 2012
Location: PNW
Posts: 4,318
Default

Quote:
Originally Posted by eq1 View Post
....Over the years of grid charging, I've honed-in on full charge values of about 174-176V, charging at typical room temp and typical grid charge rates 300-500mA... 174-176V=1.45V to 1.467V were it just a single cell... I made a chart a while back for temp adjustments; I'll add that later....
Here's that chart (table) I made for adjusting voltages for temperature differences. It's based on only a few pieces of data from the Panasonic spec sheet for the HHR650D cell. The lone values in the far-right column are the values you'd use in a pinch to adjust voltages.

For example, for temperatures between freezing and 19 Celsius, you'd adjust voltages plus 0.360V per degree C below 20 degrees C. For temperatures between 20 and 40 Celsius you'd adjust voltages minus 0.312V per degree C above 20 degrees C...

For an Insight pack, the idea is that, if you're grid charging at, say, 10 degrees C, you'd expect to see a higher peak voltage, because the ambient temp is colder and voltages are higher when colder. Instead of the normal 174-176V, you might expect to see plus 10 X 0.360V = +3.6V, or about 177.6V to 179.6V...

If you're charging at say 30 degrees C (86F), you'd adjust the expected peak voltage downward - by 10 X 0.312V, or 3.12V. Instead of the normal 174V to 176V, you might expect to see 170.9V to 172.9V...
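Those two worked examples can be captured in a small helper - a sketch; the 0.360 and 0.312 V/°C factors are the table's approximations derived from the Panasonic sheet, not published constants:

```python
# Shift an expected grid-charge peak voltage for ambient temperature,
# using the per-degree factors from the table above (the poster's
# approximations from the Panasonic HHR650D sheet, not exact constants).
def adjust_peak_v(expected_v, temp_c, ref_c=20.0):
    if temp_c < ref_c:
        return expected_v + (ref_c - temp_c) * 0.360   # colder -> higher peak
    return expected_v - (temp_c - ref_c) * 0.312       # warmer -> lower peak

print(f"{adjust_peak_v(174.0, 10):.1f} V")   # at 10 C: prints 177.6 V
print(f"{adjust_peak_v(176.0, 30):.1f} V")   # at 30 C: prints 172.9 V
```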

I can't remember exactly what I used for this table and how, so take it as an approximation. The other thing is, I'm not sure what "ambient" temperature means when it comes to charging the pack in the car. The pack will heat up and heat the air around the pack - so what's 'ambient temp' at that point? Is it the temp outside the car or the temp in the battery box? I suppose if you're running the fan on high you could probably use outside air temp...



A more exact method to calculate an adjustment value is to look up the temperature you're charging at, read the voltage value for that row under the 'pack V' heading, and then subtract the voltage value in the same column for the 20 degree C row. For example, if you're charging at 30 degrees C, the Pack V value is 180.4, the Pack V value for 20 degrees C is 183.6, and the difference is 3.2V -- subtract 3.2V from the expected peak voltage. If you take 174-176V to be normal, then the adjusted peak voltage will be 170.8V to 172.8V...
__________________
2000MT, CAN, ~168K miles

Last edited by eq1; 05-19-2016 at 04:52 AM.
Old 06-01-2016, 05:21 AM   #10 (permalink)
eq1
Senior Member
 
 
Join Date: Jan 2012
Location: PNW
Posts: 4,318
Default Voltage Taps

As many times as I've suggested to people having IMA troubles that they check their voltage tap voltages, I never actually checked them myself -- until today. I thought I should jot down what I found...

First, for the uninitiated, the IMA pack is made up of 20 'sticks' or 'sub-packs', each of which is made up of 6 "D"-sized cells. The car taps into pairs of sticks and monitors voltages for those pairs - so there's 10 pairs, 10 voltage taps. The battery condition monitor, or BCM, is responsible for those taps. The BCM is one of the square silver computer boxes on top of the battery pack, on the left side if you're facing the pack from the hatch. The taps connect to the BCM at the back of the computer; when you open the battery compartment, the tap connector is right in front of your face... You can 'back probe' each wire ensconced in the grey plastic connector and check the voltages of stick pairs...

BCM 'Connector C' - Voltage Taps (left connector)
image credit: Bumblebee Batteries


The BCM seems pretty concerned about tap voltages - so it's probably a good idea that we be at least a little concerned. A few IC'ers, including myself, tend to believe that tap voltages that are too far apart are one if not THE leading cause of IMA trouble codes. A couple people have put a value on some of that - like if the tap voltages are more than 1.2V apart under load, you'll get a code. It says something like that in the Troubleshooting manual, though it's a little more nuanced than just this...

Anyhow, I checked my tap voltages. I have a self-rebuilt pack that's been working very well for maybe half a year now. It's actually working the best it ever has at the moment. I grid charged it about a week ago, have been using it regularly, and I let it 'positively recal' to a nominal 75% before I let it rest over night and then checked the tap voltages. So basically, these are tap voltages for a known, good-working pack under these circumstances, and "these circumstances" generally mean fully car-charged, balanced, at roughly 75 degrees F, after sitting over night:

16.04, 16.07, 16.09, 16.05, 16.08, 16.02, 16.01, 16.09, 16.04, 16.05
So a range of 16.01V to 16.09V
Average = 16.054V, ±0.04V
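For what it's worth, those summary statistics reproduce from the ten readings with plain arithmetic (nothing Insight-specific here):

```python
# Recompute the range/average quoted above from the ten tap readings.
import statistics

taps = [16.04, 16.07, 16.09, 16.05, 16.08, 16.02, 16.01, 16.09, 16.04, 16.05]
avg = statistics.mean(taps)
print(f"range {min(taps)} to {max(taps)} V")                     # 16.01 to 16.09
print(f"average {avg:.3f} V")                                    # 16.054
print(f"max deviation {max(abs(v - avg) for v in taps):.3f} V")  # 0.044
```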


Here's a graphic depicting the BCM voltage tap connector as it looks from the back plugged-in, the wires that you'd probe with your DMM to read the voltages, and how the sticks corresponding to the taps are arranged in the pack:

fyi, "thrm" is shorthand for "thermistor," indicates to which sticks the 4 temp probes attach...

If you're having IMA troubles, that'd be a great time to check voltage taps. I think chances are high that you'll see a much greater range of values than I posted above...
__________________
2000MT, CAN, ~168K miles