Honda Insight Forum banner

21 - 26 of 26 Posts

Discussion Starter #21
Just a quick follow-up on what I was writing about above, and then on to something else I want to mention...

I ended up taking a closer look at the self discharge data I mentioned above, and overall it's a bit too inconclusive, not good enough. I posted a couple graphs based on those data in another thread and explained it a bit there: The quintessential Insight NiMH voltage thread

The gist of it is this: Discharging a bit off the top will mitigate the impact of uneven self discharge rates on cell-to-cell balance only if the faster self discharge cells tend to self discharge even faster at higher charge states/voltages. I just can't conclude that that's the case from the weak dataset I have, and I don't care enough about it to create better data.
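To make that logic concrete, here's a toy simulation of two cells with different self discharge (SD) rates; all the numbers are invented for illustration, not my measured data. If daily SD is flat with charge state, starting 20 points lower changes nothing about the drift; if SD scales with charge state, starting lower does shrink the gap:

```python
# Toy model: two cells in one pack, one with slow and one with fast self
# discharge (SD). All rates are invented. Returns the SoC gap (in points)
# between the two cells after `days` of sitting.
def soc_gap(days, start_soc, sd_slow, sd_fast, soc_dependent):
    soc_a = soc_b = start_soc
    for _ in range(days):
        # daily SD, optionally scaled by the cell's current charge state
        rate_a = sd_slow * (soc_a / 100.0) if soc_dependent else sd_slow
        rate_b = sd_fast * (soc_b / 100.0) if soc_dependent else sd_fast
        soc_a -= rate_a
        soc_b -= rate_b
    return soc_a - soc_b

# Flat SD: the gap after 30 days is the same whether you start at 80% or 60%,
# so knocking charge off the top buys nothing.
gap_flat_80 = soc_gap(30, 80.0, 0.1, 0.3, soc_dependent=False)
gap_flat_60 = soc_gap(30, 60.0, 0.1, 0.3, soc_dependent=False)

# SoC-dependent SD: starting lower shrinks the cell-to-cell gap.
gap_dep_80 = soc_gap(30, 80.0, 0.1, 0.3, soc_dependent=True)
gap_dep_60 = soc_gap(30, 60.0, 0.1, 0.3, soc_dependent=True)
```

Whether real Insight cells behave like the second case is exactly what my dataset is too weak to show.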

* * *

So, since I got back from the Danville Insight meet I've more or less switched to using my pack in the top charge state range, so I've been looking closer at how the car deals with the top. There's really a lot of interesting stuff going on, a lot of things to talk about. For now I just want to mention a couple things about the high charge state cutoff 'algorithms' in play. Mainly, there must be a quite refined program going on in one or more of the computers that makes sure you're not overcharging a cell, stick-pair, pack - whatever...

I think it was here or maybe in another thread where I talked a lot about the low-end slope detection algorithm in play: the BCM is able to detect when a cell is empty by calculating the slope of stick-pair voltage discharge curves - a steep slope reflects an empty cell and the BCM (or MCM) throttles discharge current and then throws a neg recal on the second detection of steep slope. I mentioned at the time that I thought it was likely something similar plays out at the top end, to determine full or too-full... Can't say I know that's what happens, but definitely something similar and very iterative does play out.
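As a rough illustration of that kind of slope detection (not the BCM's actual algorithm - the window size, sample rate and threshold here are all invented), a least-squares slope over recent tap-voltage samples could flag an empty cell like this:

```python
# Hypothetical sketch: fit a line to recent (time, tap voltage) samples taken
# under a roughly steady discharge load and flag a "steep" downward slope,
# the signature of a tap containing an empty cell. Threshold is invented.
def steep_slope(samples, threshold=-0.05):
    """samples: list of (t_seconds, tap_volts). True if the fitted V/t slope
    is steeper (more negative) than threshold."""
    n = len(samples)
    t_mean = sum(t for t, _ in samples) / n
    v_mean = sum(v for _, v in samples) / n
    num = sum((t - t_mean) * (v - v_mean) for t, v in samples)
    den = sum((t - t_mean) ** 2 for t, _ in samples)
    return (num / den) < threshold

flat = [(t, 14.4 - 0.002 * t) for t in range(10)]  # healthy tap: shallow slope
knee = [(t, 14.4 - 0.080 * t) for t in range(10)]  # empty cell: falling fast
```

On the first detection the real system throttles discharge current; on the second it throws a neg recal, per the behavior described above.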

I've been resetting state of charge with the OBDIIC&C and more or less trying to stuff the pack, teetering at the edge of when this high cutoff algorithm kicks in. I noticed a handful of things today that I've either never noticed or never really thought too much about. Here's a couple of them off the top of my head:

-at some juncture during the sequence of 'full detection' parameters, one of the computers commands a discharge: the 12V load is sourced directly from the IMA pack rather than from the motor.

This is a really weird little program. I can't tell exactly what triggers it, and then it usually only lasts until I 'press the clutch pedal', i.e. I can trigger my calpod switch, ON then OFF quickly, and that will disable this drain.

Sometimes when this drain is happening, subsequent regen will trigger the dash CHRG lights - but OBDIIC&C shows no current. So, it's like the BCM or ECM triggers that discharge/drain, but perhaps the MCM doesn't get the message(?) - whatever drives the dash regen lights, that's still acting like everything's normal...

This drain seems to be an 'afterthought': it's not the main/first high cutoff behavior, it seems to happen only after 'something else' happens first. For instance, maybe an initial high tap voltage or steep slope is detected and regen current is throttled/limited. But then, perhaps a high tap resting voltage is detected, perhaps for a set duration, and then the drain will kick in... I know that the drain will normally kick in once the nominal charge state reaches its normal set max, such as 81%. But if you're manipulating the system, such as by resetting nominal charge state from 80% to 75%, this normal trigger is defeated, and the other real-time monitored parameters or what-not come into play, are revealed, etc...

-It looks like the absolute cutoff is a resting voltage of 17.4V at the tap level (1.45V per cell), or equivalent.

I can watch total pack voltage and current during regen and see that the pack itself isn't quite full; typically I'm keeping an eye out for about 186V at about 6.5 amps as an indicator of truly full. I've gotten closer, but still quite far away from that. I think I've seen maybe 180V at maybe 10 amps. But the highest resting voltage I've seen is about 174V, and usually I haven't been able to get that to 'stick' for long; 174V for maybe 30 seconds, and then a more stable 172-173V. At this point the 'car' is not allowing any more charge.
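The arithmetic behind those figures checks out, assuming the stock 144V pack of 120 cells monitored in tap groups of 12 cells (one stick pair, i.e. 2 x 6 cells - my reading of the setup, not a verified spec):

```python
# Sanity check of the cutoff figures, under the assumptions above.
CELLS_PER_TAP = 12   # assumption: each tap spans a stick pair (2 x 6 cells)
TAPS = 10

cells = CELLS_PER_TAP * TAPS             # 120 cells in the stock pack
per_cell_cutoff = 17.4 / CELLS_PER_TAP   # tap cutoff expressed per cell
pack_equiv = per_cell_cutoff * cells     # same cutoff scaled to pack level

print(per_cell_cutoff, pack_equiv)       # ~1.45 V/cell, ~174 V pack
```

So a 174V resting pack voltage is exactly the pack-level equivalent of one tap sitting at the 17.4V cutoff, consistent with the highest resting voltage I've been able to hold.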

You can charge substantially more than the car would normally allow if you just work the system a little. For example, with one of the BCMs that pos recal to 75%, you can charge from 75% to 80%, reset with OBDIIC&C to 75%, charge another 5%, reset again to 75%, and so forth. But then, once you start seeing the 'automatic drain', you can discharge just a little to bring voltage down, yet then charge even more than you discharged without triggering the absolute top end cutoffs.

It all seems precariously designed around tap voltages - voltages that suffer a ton of hysteresis. I'm pretty sure OEM Insight cells, and probably Civic cells, can exhibit a fairly large degree of voltage hysteresis at the top end (i.e. the voltage can vary a lot), but that variation is due to short-lived, temporary, reversible electro-chemical phenomena. Does the BCM adequately deal with this? I don't think it does.

It seems like the BCM must have fixed voltage thresholds (probably adjusted for current and temperature), and once a tap hits the threshold, or once a steep slope is detected, charging is done. But subsequent usage around that charge state can 'loosen' things up: similar to how low charge state usage can raise sagging voltages, high charge state usage can lower peaky voltages... After this 'loosening', you can charge more while staying under the absolute cutoffs...

Personally, with my pack in its current state, I'm thinking a lot of this extra charge I'm able to do probably stems from me having used the pack at rock bottom charge state for the last month or so. I did cycle up a few times during that low end usage, but most usage has remained low. I don't really have the greatest data, but after that low charge state usage, the first time I cycled up I was able to charge the pack to an adjusted, estimated real charge state of only about 30%, i.e. the car pos recal-ed at what I estimate to be only about 30% true charge state - not the '75%' you'd expect. That was like two weeks ago. Since then I've concentrated usage toward the 'high' end (above this real 30%, and usually as high as I could go) and now my estimated adjusted true charge state figure is at 67% (that's probably a slight under-estimate, though)...

In other words, two weeks ago I could charge the pack to 30%, now I can charge the pack to 67%. Pretty sure this would never happen were I not juking the system. I'm not sure if grid charging and discharging would accomplish the same thing... It wouldn't if the treatment intervals were too far apart - more than 6 months? 3 months? I imagine that would depend on the condition of the cells.
 

Discussion Starter #22
Was looking a little more at the top-end cutoff behavior today, particularly that automatic discharge/drain. It seems like it has to be, or at least can be, triggered by something transient and short-lived, and/or the 'drain command' itself is just a one-shot deal - as if the command says 'discharge until you see XXX', where "XXX" is, for example, a clutch pedal press or probably any subsequent IMA usage, like assist or regen. After that it resets and looks for the trigger parameter again, whatever it is...

If I had to guess I'd say it's probably tap voltage slope: when I do coasting regen and it's a modest rate, say 7-12 amps, total voltage might be at like 175-180V, but voltage on a single tap is probably increasing faster than others (i.e. it's more charged, or at least one cell in the stick pair is). I can do this repeatedly and the drain is invoked almost in lock-step with whatever I'm doing. Hard to explain.

I can gauge just how close the pack is getting to 'full' and I can see just how much I'm able to input, and I get a feel for when the drain is going to trigger; the pace and rhythm have the feel of a cell reaching full, as if I were charging a cell on the bench. Charge slope near full gets steeper and steeper (that is, until it peaks). So, the sense I get is that slope is being measured under the slight charge load, and once it reaches the set threshold, the drain kicks in. But, if I disable the drain, such as by hitting the calpod, and try to trigger it again, it will happen again - sooner. If I discharge a little and repeat the process, the drain takes a bit to kick in. I disable the drain, try to trigger again, it happens sooner. And again and it happens sooner, etc etc...
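Purely as a way to organize those observations, the latch-and-cancel behavior could be sketched as a tiny state machine. Everything here is speculative - the state names, events and threshold are illustrative, not anything known about Honda's firmware:

```python
# Speculative sketch: the drain command as a one-shot latch. It latches when
# the tap-voltage charge slope crosses a threshold, and clears (re-arms) on
# any cancelling event - clutch/calpod, assist, or regen.
class DrainController:
    CANCEL_EVENTS = {"clutch", "assist", "regen"}

    def __init__(self, slope_threshold):
        self.threshold = slope_threshold
        self.draining = False

    def on_charge_slope(self, slope):
        # trigger: slope under charge load reaches the set threshold
        if not self.draining and slope >= self.threshold:
            self.draining = True

    def on_event(self, event):
        # a cancelling event clears the latch; the controller then
        # watches for the trigger parameter again
        if event in self.CANCEL_EVENTS:
            self.draining = False

dc = DrainController(slope_threshold=0.05)
dc.on_charge_slope(0.08)   # steep charge slope -> drain latches on
dc.on_event("clutch")      # calpod ON/OFF -> drain cancelled, re-armed
```

The "happens sooner each time" behavior would then just fall out of the cell sitting closer to the slope threshold after each aborted drain.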
 

Discussion Starter #23
....Charge slope near full gets steeper and steeper (that is, until it peaks). So, the sense I get is that slope is being measured under the slight charge load, and once it reaches the set threshold, the drain kicks-in. But....
This isn't exactly true. Charge slope gets steeper and steeper, but only up to a point, after which it gets shallower and shallower until the cell is full, voltage peaks and flattens out, and then falls if charging continues. IF the BCM uses slope detection at the top, I imagine it would use the steeper and steeper part of the curve, but it seems like it could use the shallower and shallower part - either instead of, or in addition to, the steeper and steeper part. For example, perhaps it detects when the voltage curve gets steeper and steeper and implements some controlling behavior, such as regen throttling. But after that, perhaps under the right conditions, it also detects when the voltage curve gets shallower and shallower - and implements more aggressive regen throttling or outright disabling... I haven't really seen absolute disabling, not when nominal charge state hasn't reached 81%...
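For what it's worth, that steeper-then-shallower-then-falling shape is the same curve that bench NiMH fast chargers exploit for -dV termination. A toy classifier of which region a few evenly spaced voltage samples sit in (voltages and the logic's granularity are invented for illustration) might look like:

```python
# Toy classifier for where a NiMH charge curve is, given three or more evenly
# spaced tap voltages sampled under steady charge current. Illustrative only.
def charge_phase(v):
    d1 = v[-2] - v[-3]   # earlier slope
    d2 = v[-1] - v[-2]   # latest slope
    if v[-1] < v[-2]:
        return "past peak"    # -dV: cell is full and voltage is falling
    if d2 > d1:
        return "steepening"   # approaching the knee
    return "shallowing"       # nearing the voltage peak

charge_phase([16.9, 17.0, 17.2])   # slope growing -> "steepening"
charge_phase([17.0, 17.2, 17.3])   # slope shrinking -> "shallowing"
charge_phase([17.2, 17.3, 17.25])  # voltage dropped -> "past peak"
```

If the BCM does something like this, the "steepening" region would map to initial regen throttling and the "shallowing" region to the more aggressive behavior speculated above.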

In any event, was looking a bit at regen limit flag "Rlf" on OBDIIC&C today, while doing stuff similar to what's described in the previous post. In general, Rlf reads 0 when regen isn't throttled and 1 when it is. I said above that if I had to guess I'd say slope was being detected and triggers throttling and/or drain. But watching Rlf it almost seems like there could be multiple criteria at play. For example, most often I would see Rlf trigger 1 only after the regen event was over - do some coasting or braking regen and when back on throttle Rlf flips to 1. You'd think that if slope were the controlling parameter, Rlf would flip to 1 during the regen, not after, wouldn't you?... Flipping to 1 after the regen makes it seem more like open circuit voltage/resting voltage is in play. But, I also saw Rlf flip to 1 during regen, so perhaps both slope and resting voltage, or even max voltage loaded - they all could be in play...

Rlf doesn't stay at 1 (under these circumstances; as I recall it does if you let nominal SoC max-out and auto drain is locked in). It only flips to 1 for brief periods. However, throttling/drain behavior sticks around even though Rlf isn't 1. That makes me think indeed a timer is in play - time and/or a cancelling behavior, such as assist or depressing the clutch (turning 'calpod' on and off)...

If you have a BCM that pos recals to 75% rather than ~81%, it seems like you have a lot more control of how much more you can charge the pack at the top. Resetting SoC from say 80% to 75% and then stuffing it some more allows way more charge than resetting low repeatedly and letting the car pos recal on its own. I can reset low and get pos recal say a couple times, but after that additional resets low don't allow any more charge, pos recal happens right away. But, continuing to regen above 75%, taking it up to 80%, resetting back down to 75% and repeating the process, juking the BCM's top-end algorithms, can stuff a ton more into the pack. I've added something like 20% just over the past two drives...

I have one tap that's about 50-100mV higher than the others, when loaded at around 6 amps (say 17.10V vs. 17-17.05V). I think that's due to persistent, 'hard-core' high IR (rather than high resistance due to electrochemical stuff). I imagine that tap might cause 'premature pos recal' or 'premature full'. Now, I think it would cause premature full or whatever if the high cutoff parameter were high loaded voltage, possibly high resting voltage. But I don't think it would if the parameter were slope, steep or otherwise: I'm pretty sure 'high IR' just shifts the curve higher (or lower on discharge), but it doesn't change the contour, in general. Normally I'd think this tap with a high IR cell (or 2) would have a higher voltage spike at the top under charge load, but the voltage would drop lower than other taps when the load was removed. Yet, these NiMH cells, the ones that have persistent high IR, seem like their voltages can kind of get stuck up there, they don't fall lower... I don't really know why this happens, how that works... So, persistent high IR = premature full if trigger parameters are loaded voltage and resting voltage, I guess, but probably not steep slope...
 

Administrator
Maybe not the right thread but you are the NiMH guru.

Modelling packs..


Could we model a pack's behaviour in a sophisticated spreadsheet with nice graphs as we step through data or time?

I'm rubbish with spreadsheets so this is out of my league.

But I'm thinking: build a spreadsheet with data for 20 sticks, each assigned a value for capacity, voltage, IR and, most importantly, self discharge rate.

The spreadsheet lets you input nominal SoC as a start point, then it plots daily stick SoC etc. and calculates pack imbalance and whatever else we fancy looking at, for however many days/weeks/months we want...

A bit like weather modeling. We could predict how far out of balance a pack would be after 20/30/90 days etc

If you quantified a set of 20 sticks accurately and input that data into the model we could see how it would react to charging/discharging/cycling etc etc.

Depending on how clever the spreadsheet is we could add the Peukert effect and natural balancing due to efficiencies etc etc etc.

A shared google sheet might be best as several people could work on it.
If you think this should have another thread that's fine.

We might be able to determine if self discharge is linked closely and predictably with Internal resistance or other factors like terminal voltage etc.

Meaning that easy-to-determine IR could be used to predict or model SD without having to actually measure the stick SD rate...

Just lockdown thinking..
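For what it's worth, the core of that model is small enough to sketch in a few lines of code rather than a spreadsheet. All the numbers below are invented placeholders, not measured stick data - it just shows the shape of the thing: per-stick self discharge rates, stepped one day at a time, reporting the pack imbalance:

```python
# Minimal sketch of the proposed pack model: 20 sticks, each with its own
# self discharge rate, stepped daily; track max-min SoC spread (imbalance).
# Rates and starting SoC are invented placeholders.
import random

random.seed(1)

def simulate(days, n_sticks=20, start_soc=75.0):
    # per-stick daily self discharge, in SoC points/day (invented spread)
    sd_rates = [random.uniform(0.05, 0.25) for _ in range(n_sticks)]
    socs = [start_soc] * n_sticks
    history = []
    for _ in range(days):
        socs = [max(0.0, s - r) for s, r in zip(socs, sd_rates)]
        history.append(max(socs) - min(socs))  # pack imbalance, SoC points
    return history

imbalance = simulate(90)
# imbalance[29], imbalance[89] -> predicted spread after 30 and 90 days
```

Columns for capacity, IR, Peukert and charge/discharge events would hang off the same loop; whether the extra realism is worth the complexity is the open question the reply below gets at.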
 

Discussion Starter #25
^ hmm, I think I understand everything, and it makes sense, up to the last couple lines. Pretty sure self discharge isn't correlated with 'IR', or if it is it's a negative correlation ('high IR' correlated with slower self discharge)...

In general, are you talking about a theoretical model, where you plug in fictitious values and the overall model is simply used to inform and educate, or a real one, where you plug in real values and try to, say, diagnose a pack? Sounds like you're talking mainly about the former... I think a very simple model might be doable, and might be useful, though I doubt anyone would be interested.

I think anything that tries to get too 'real-world' would get too complicated and would probably end up missing the forest for the trees...

Even just modelling 20 sticks (perhaps 10 taps) with a couple variables, like capacity and self discharge, within the confines of management that has only a couple parameters, like top and bottom voltage cutoffs, over time, would require a lot of rows and columns and calcs... hmm, I don't think I know enough to 'model' even a single cell...

I don't think I'd be up for it. I barely enjoy 'modelling' my whole pack with the simplest of data and calcs, let alone trying to do 20 sticks with added complexity. I can't think of anyone else around here who would care enough to participate... I don't think anyone even looks at the simple charts I've done, which aim to elucidate relatively simple (though fundamental) concepts; I imagine it'd be even more...oppressive for people to 'look at' a 20 stick model...
 

Discussion Starter #26
Took my pack back down to the bottom today and watched assist limit flag ('Alf'), still on 010 BCM. A couple things I noticed:

-assist gets throttled when nominal charge state gets low, I think it was around 30%, but I wasn't seeing the assist limit flag.

-I reset nominal SoC from around 28% to 40%. As mentioned earlier, with 305 BCM I can do that and defeat throttling, but with 010 BCM I wasn't able to do that, last time. Now I was able to do that. But, it only lasted until Alf triggered, which wasn't too long after...

It looks like Alf is triggered by tap voltage slope detection (near empty cell), like every time a steep slope is detected Alf flips to 1 (and back to zero). And every time Alf flips to 1 assist is immediately throttled. With the 010 BCM I can charge back up a little, like 5-10%, and then use assist freely as long as nominal SoC is set artificially high (such as 40% rather than 28%) and Alf isn't triggered. With 305 BCM I'd be allowed two steep slope detections, the first one at whatever current, the second one low current, and then a neg recal.

* * *
One other thing I've been noticing in general, and something I'm starting to believe, is that my pack doesn't perform as well when it's charged high and used high versus when I use it low. I had it charged to probably a true 75%, but it didn't take much assist, maybe 10-20% of capacity, for total voltage to look pretty saggy. Like, after only about 10-20% usage, total voltage was below 144V at about 20 amps. When I've used the pack low, I've seen higher voltages longer and typically not below about 144V until close to the bitter end. Of course, if I'm using it low I'm closer to the bitter end to begin with. But, I know I can get more than say 10-20% of usable capacity at loftier voltages; I'm thinking it's more like 35-45%...

This is pretty hard to explain. Earlier I had been thinking that low-end usage restores 'curves', restores performance, across a whole absolute charge state range. I can use the pack low and see performance down low improve, and I've generally assumed that improvement at the low end meant improvement at the top as well, at the same time, that they go hand in hand. But over time I've come to see that that's not the case.

I'm pretty sure that low end works better than high end, and I'm not sure why that's the case. But even so, a trade-off is happening: low-end performance comes at the expense of high end performance, and vice versa...

Basically, there's like two things happening:

1. When you use low end it's like you drag active materials from the top down to the low end. You see performance gains down low. If you could miraculously, instantaneously transport your usage to the top, however, you would NOT see great performance - there really is no top end any more. You've dragged stuff from the top down low, so as long as you're seeing good performance down low you can't see good performance up high, it's a trade-off.

Of course, you can't just instantaneously go from low usage to high usage; you have to charge back up. The first few charges you won't see good performance up high, it takes several cycles to start seeing good performance up high after you've been using the pack low. Why? Because you're now dragging active materials back up to the 'top', the cycling up high is re-concentrating active materials 'up there'... The same or similar things happen when you go from top usage to bottom usage...

2. When you use low end you probably recondition 'stuff' down there. I think this is a different, distinct process. For example, you burn bad stuff - 'crud', shrink crystals, probably achieve some kind of balancing among cells in the process, etc etc. This isn't the same thing as the 'dragging stuff from high' type of recondition or what have you...

From what I can tell, this number 2 happens early-on, and then you're stuck with the finite, stunted capacity of used, old cells, and all you can do is drag top end down or drag bottom end up, achieving good performance within that window, but never achieving that good performance across the entire 6.5Ah capacity (or maybe 5.2 Ah capacity, as I imagine even new cells can't do this "good" performance, such as 90 amps for 4 seconds or 45 amps indefinitely, across the entire 6.5Ah range)...

So, the question now is, despite being a trade-off, where you can have good performance low or high but not both at the same time, the low end still seems to work better -- Why? Just not something I understand.

Falling back on old, boiler-plate ideas, it seems possible that charging at high currents at relatively high charge states can quickly 'crud-up' the cells -- inducing 'voltage depression'... In general it seems like high charge state is a stressed-state by default, the cell is wound-up. That would be bad for charges, but you'd think it'd be good for discharges. It just doesn't seem to work out that way... Maybe it's a combination: high charge state is stressed, so charging stresses even more, crudding things up, and then discharges end up weak - because the 'crud' gets in the way. But when you charge down low, the cell isn't in such a stressed state, so crud creation is minimal - discharges end up good...
 