Honda Insight Forum


Registered · 6,201 Posts · Discussion Starter #1 (Edited)
Some months ago I started toying with the stock pack management a lot using the OBDIIC&C SoC reset function and generally just by paying attention to IMA info and acting on it. This was in preparation for trying to implement a lithium LTO pack at some point, without a BMS. I've kept track of various OBDIIC&C IMA parameters for years, for every trip. I log nominal state of charge, pack temp, etc., and reset amp-hour counting at the beginning of each trip and log these at the end of each trip. So I have a massive spreadsheet of all these data...

I've also calculated a few things for each trip and, over the years, have honed in on a couple of simple metrics to keep track of what's going on, what the true state of charge of my pack is or should be, etc. For example, I add net amp-hours from trip to trip, so I have a running total of how many amp-hours have coursed through my pack...

After I started doing the 'toying', I noticed something interesting when I graphed these various data. If I let the car do its own thing, efficiency looks bad. If I deliberately do some things, deliberately thwart stock pack management, efficiency is better. That's what's illustrated in the graph below... Letting the car do its own thing is the left half, about 3 1/2 months; 'toying' with stuff is the right half, also about 3 1/2 months...


Over the first half the blue cumulative count SoC curve has a steep slope, while the black nominal SoC curve is relatively flat. This demonstrates that the number of amp-hours needed to maintain the pack state of charge is continually more than 100% of what comes back out. For instance, if the pack were 100% efficient, the blue curve would be flat and would track the black curve - I'd put in say 500mAh, I'd pull out 500mAh, and state of charge would be at exactly the same place where it started. Less than 100% ('Coulombic') efficiency is normal and expected. But by how much? The steepness of the slope of the blue curve reflects the degree to which inefficiency exists...

Inefficiency has two main parts: self-discharge and throughput losses. The green curve attempts to capture the throughput (in)efficiency part. When we calculate state of charge we adjust it down slightly because we know that not all the current contributes to charge or discharge energy...

In a nutshell - since this is taking me longer than I had hoped - after I started messing with stuff, the blue curve stopped rising as fast and as much, and the green curve started going down. Both of these reflect better efficiency: I'm using fewer amp-hours to maintain roughly the same states of charge. Since the green curve is supposed to account for throughput efficiency, most of the gain in efficiency is probably coming from less self discharge. That's more of a guess at this point, but probably the case...

black curve - the nominal state of charge as seen on the OBDIIC&C SoC parameter

blue curve - state of charge based on the cumulative count of net amp-hours. For example, cumulative count amp-hours might be 8500mAh after a few weeks' worth of trips, whereas the nominal capacity is 6500 - so the cumulative count SoC would be 8500/6500 X 100 = 131%.

green curve - state of charge based on net amp-hours cumulative count but adjusted for throughput (in)efficiency. Throughput inefficiency is the fraction of current that gets lost on the way, the current that doesn't end up actually charging the pack or contributing to output energy. I've calculated this a few times based on longer drives and on the car's own state of charge determination. The loss is about 2% - so for every 100mAh that go through the pack, only about 98 of them do something useful...
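
To make those curve definitions concrete, here's a rough sketch in Python of how the blue and green values could be computed from the per-trip net amp-hour logs - not my actual spreadsheet formulas, just the idea (the names, and derating only the charging side by the ~2% figure, are illustrative simplifications):

```python
# Sketch of the blue/green curve math; not the actual spreadsheet.
NOMINAL_CAPACITY_MAH = 6500.0   # nominal pack capacity
THROUGHPUT_LOSS = 0.02          # ~2% of charge does nothing useful

def cumulative_soc_curves(net_mah_per_trip):
    """Return (blue, green) cumulative-count SoC series, in percent."""
    blue, green = [], []
    raw = 0.0   # plain running total of net mAh
    adj = 0.0   # running total with charge derated for throughput loss
    for net in net_mah_per_trip:
        raw += net
        adj += net * (1.0 - THROUGHPUT_LOSS) if net > 0 else net
        blue.append(100.0 * raw / NOMINAL_CAPACITY_MAH)
        green.append(100.0 * adj / NOMINAL_CAPACITY_MAH)
    return blue, green

# e.g. a cumulative net 8500 mAh after a few weeks -> blue reads ~131%
```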

I also graphed average pack temp - I log pack temp at the beginning and end of each trip and the white curve reflects an average of these two values. I added this curve recently, because I was wondering if maybe higher temps over the first half and cooler temps over the second were responsible for the difference. But it doesn't look like that's the case. Average pack temps start to fall at about the middle of the left half of the graph, but we don't see a change in the slopes of the blue and green curves...

The red triangles show when positive recals happened. These are junctures at which the nominal state of charge curve accurately reflects what the car thinks the state of charge is. Since one of my toying methods involves resetting the nominal state of charge with the OBDIIC&C, the black curve doesn't always reflect this 'true' nominal. For example, one of my methods involves resetting SoC high when it's really low, typically by 10 percentage points. At those junctures, the nominal state of charge will read say 75% but it's really 65%. I don't go too far in this direction, though, so the black nominal curve generally reflects the 'true nominal'... The red triangles show when indeed we're looking at true nominal values...

Each data point represents a single day. Usually there's just one trip per day, but sometimes there are no trips and a few times there are multiple trips. I've condensed the multiple-trip logs into one log per day, and I've added days where there were no trips. This makes the x-axis consistent with respect to time. The labels are divided into weeks, so about a week's time per vertical grid line... The whole graph is about 7 months. The first half, up until about the end of October, reflects a period when I was letting the car do its own thing. The second half reflects a period when I was toying with the pack management...
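
If you wanted to do that condensing and padding programmatically rather than by hand, it could look something like this (a sketch only, assuming a list of (date, net mAh) trip logs):

```python
from collections import defaultdict
from datetime import timedelta

def one_log_per_day(trips):
    """trips: iterable of (date, net_mah), possibly several per day.
    Returns one (date, net_mah) point per calendar day, with zero
    entries padding the days that had no trips."""
    daily = defaultdict(float)
    for day, net_mah in trips:
        daily[day] += net_mah                      # condense multiple trips
    first, last = min(daily), max(daily)
    points, day = [], first
    while day <= last:
        points.append((day, daily.get(day, 0.0)))  # pad no-trip days
        day += timedelta(days=1)
    return points
```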
 

Registered · 151 Posts
That's a good read. I wonder if it's possible to make an intelligent device to do what you have done manually all this time. I am interested: has MPG also gone up?
 

Registered · 6,201 Posts · Discussion Starter #3 (Edited)
MPG hasn't gone up, I wouldn't expect it to. The significance of the 'efficiency gains' as I see it is more about pack longevity. My hunch is that the car's imbalanced use of the pack is at least partially why packs go bad sooner rather than later. But I'd have no way to prove that, or even explain it...

On 'making an intelligent device' - I'd probably have to figure out what exactly it is that I'm doing that produces the gains, plus I'd probably have to look even closer, with tighter testing, to verify that the gains are indeed real... I might be missing something, I don't think I am, but maybe...

In any event, in general, we could probably program the OBDIIC&C to reset state of charge to 75% from like 70%, to avoid the hanging pos recal and extra charge that's allowed to 'top up' the pack, and that would do most of what I do. That's the biggest thing: I don't top the pack up as high and as often as the car would do itself... I think that's probably the number one cause of inefficiency: the pack loses more charge when it's charged higher, via self discharge. It also is slightly less efficient charging 'up there' than it is down lower...
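
The rule itself is trivial - something like the following, where write_nominal_soc is a hypothetical stand-in for however the OBDIIC&C's manual SoC reset would actually be invoked:

```python
# Hedged sketch of the reset rule: once nominal SoC climbs to ~70%,
# bump it to 75% so the car never runs the hanging pos recal / top-up.
RESET_TRIGGER = 70.0   # % nominal at which to intervene
RESET_TARGET = 75.0    # % nominal to write back

def thwart_topup(nominal_soc_pct, write_nominal_soc):
    """write_nominal_soc: hypothetical hook, not a real OBDIIC&C API."""
    if nominal_soc_pct >= RESET_TRIGGER:
        write_nominal_soc(RESET_TARGET)   # car now sees 75%, skips top-up
        return RESET_TARGET
    return nominal_soc_pct
```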
 

Premium Member · 3,421 Posts
MPG hasn't gone up, I wouldn't expect it to. The significance of the 'efficiency gains' as I see it is more about pack longevity. My hunch is that the car's imbalanced use of the pack is at least partially why packs go bad sooner rather than later. But I'd have no way to prove that, or even explain it...

In any event, in general, we could probably program the OBDIIC&C to reset state of charge to 75% from like 70%, to avoid the hanging pos recal and extra current that's allowed to 'top up' the pack, and that would do most of what I do. That's the biggest thing: I don't top the pack up as high and as often as the car would do itself... I think that's probably the number one cause of inefficiency: the pack loses more charge when it's charged higher, via self discharge. It also is slightly less efficient charging 'up there' than it is down lower...
Damn nice presentation.

Food for thought: The "top up" of the pack actually contributes to improved balance. As SoC increases, charge efficiency decreases. Lower SoC cells accept more current as charge vs. heat. This enables lower SoC cells to "catch up" with the higher ones a tiny bit with every regen or background charge.

The healthier the pack, the better the above works. IMHO, imbalance occurs and accelerates when self-discharge and internal resistance overcome the above mechanism.

Your method thwarts the above and may actually worsen the threshold at which pack health overcomes the "balancing" obtained by pushing to higher SoC.

Was grid charging conducted at any time during the logged data period?
 

Registered · 6,201 Posts · Discussion Starter #5
Food for thought: The "top up" of the pack actually contributes to improved balance. As SoC increases, charge efficiency decreases. Lower SoC cells accept more current as charge vs. heat. This enables lower SoC cells to "catch up" with the higher ones a tiny bit with every regen or background charge... Your method thwarts the above and may actually worsen the threshold at which pack health overcomes the "balancing" obtained by pushing to higher SoC.
I should first point out that when I wrote "extra current," it really should have been extra "charge." I edited that in my post... Likewise, I think technically your response would have to change as well: my understanding is that all the cells in series pass the same amount of current; it's just that the same current can result in a different change in charge state, such as when one cell has a lower internal resistance than another, or when one cell is less charged than another as the charge approaches the top end... When you say "accept more current as charge," that does kind of capture this idea...

I understand the balance-effect you describe. Not sure how important it is. Based on what I've seen I'd have to say it's probably not that important -- as a frequent, regular routine -- though I would think it'd be good to 'top-up/balance' occasionally... For instance, I have original, 16-year-old cells, and voltages from cell to cell are extremely close. Makes me think the extent to which imbalance occurs might have more to do with the extent to which damage happens over the course of using them in the car, for whatever reason; that, basically, good cells are a lot more balanced than I would have thought some years ago... I haven't seen any signs of imbalance issues with this pack over the ~7 months graphed in the chart. I have pos recal-ed a handful of times; I imagine that's all that would be needed (with 16-year-old cells in a reconditioned pack, even). I would think new cells would have even less need for the hanging pos recal top-up balancing thing...

If, as I'm suggesting, efficiency might suffer mainly due to charging higher than what's necessary to operate the pack fully vis-a-vis OEM programming (i.e. putting out all the assist and taking in all the regen that's thrown at it), I'd say the trade-off would be well worth it: lose some slight balancing effect by forgoing hanging pos recals, but gain a ton of efficiency... I'll have to go back to the data and calculate some summary measure of what a "ton" means, but looking at the graph, I think you can see how much is gained: if you mentally project the blue curve from the left half into/over the right half, the gap between the existing blue curve and that projected blue curve is the gain. It's probably something like two full packs' worth of charging that didn't need to happen over the last 3 1/2 months. That seems like a ton of potentially unnecessary charging... Since the pack continues to work fine, I can't see that I've lost anything...

Anyway, yeah, there's a balancing effect that takes place 'at the top', and it should probably be invoked occasionally. But as a regular, frequent routine I think it probably does more harm than good...

* * *

That top-up aspect is an important thing to forgo in 'my scheme', but there are also other parts. In general, 'my scheme' is mostly about using the pack in a more central state of charge range, not concentrated near the top. That's another thing I've seen: the car's 75% nominal is really more like 85-92% actual. If OEM management typically operates between a nominal 65% and 70%, or perhaps 65% to 75%, that's really more like 77% to 87%. There are potential problems with this, such as high voltage spikes under high charge current, not to mention the reduction of charge rate needed to stay under the voltage limit (you'd really like to maximize regen at all times, not have to reduce it to keep the cells under a voltage ceiling)... Etc...

I'm not sure why Honda decided to do it that way. I think there are a couple of reasons. The main one is the way they do the state of charge determination at the top: the voltage increase when cells approach full is a surer sign of the cells being near full. If you drop the threshold, the nature of NiMH voltage change isn't as cut and dried; it doesn't as clearly indicate that the cells are at such and such charge state. Another might be that balancing effect. And another might be the perceived need to keep the pack more charged than not, simply so there's always a big buffer of juice to be tapped... Kind of like how Honda implements background charge at around 65% - if you've got your headlights ON. If your lights are OFF, background charge doesn't kick in for ages... Just about the dumbest thing I've ever seen...

I once thought that the upper charge state limit had to do with keeping the pack charged enough so that it could put out full assist. But since then I've seen that the pack doesn't need to be charged very high to put out full assist...

The high voltage spikes and operating the pack in a high charge state range as a matter of routine dovetails with my understanding of one of the major problems in terms of the electrochemistry: I guess I'll just call it 'voltage depression'. "Huggins" argues that overcharge is the cause of voltage depression, for instance, and then there's other things that happen when imposing a higher current than the cells can easily handle - 'side reactions', creating preferred/less preferred pathways, the increase in alpha mod NiOOH, the creation of larger nickel oxide crystals and subsequent decreased ability to handle high power, etc etc... Meanwhile, if you restrict pack usage to that narrow, high band of charge state, portions of the cells that correspond to the lower charge states go unused - and they can get 'crudded up'... I don't like to get too much into this stuff because it's hard to understand, explain, etc. Yet, I think it's worthwhile to at least mention it...

If we use a more central charge state range, and at least occasionally expand that range, I think it probably has a palliative effect on cell health. They'll probably work better and work longer... If we let the car use the cells as programmed, I think it concentrates usage too high and too narrow... That's what I'm calling "imbalanced pack usage": the OEM programming results in very imbalanced cell usage. Intuitively, conceptually, based on what I think I know about NiMH chemistry, they beg for balanced usage...

Was grid charging conducted at any time during the logged data period?
The graph starts with a grid charged pack, but no grid charging the rest of the way.

* * *

Sooo... Sorry, 'all', for rambling on so much. I think this stuff is fascinating. Please don't let my relative verbosity scare you away. If there's something about any of this you want to comment on, question, etc., please do, big or small...
 

Registered · 640 Posts
I also graphed average pack temp - I log pack temp at the beginning and end of each trip and the white curve reflects an average of these two values. I added this curve recently, because I was wondering if maybe higher temps over the first half and cooler temps over the second were responsible for the difference. But it doesn't look like that's the case. Average pack temps start to fall at about the middle of the left half of the graph, but we don't see a change in the slopes of the blue and green curves...
I love your chart. Before I read the above sentence (because I was busy staring at the chart) I did see a possible correlation between changes in temperature and the slope of the blue and green lines. I've taken the liberty to mark up your beautiful chart - I hope you don't mind. (I'm sorry that I don't know how to make it appear actual size.)

Screenshot from 2018-02-19 21-32-26.jpg

In the first section the pack temperature averages 75F... in the second it drops gradually from 75F to 50F. (Does that correspond to falling autumn temperatures?) The third is pretty constant near 50F, as is the fourth. I separated the third and fourth because in the fourth the black line shows a lot more deviation from center than in the third, and the slope of the blue line has changed as well. (What changed in mid-January?)

I thought that 25F was a fairly large difference (it will make me grab a coat), but a quick Google for battery performance at these two temperatures did not suggest much difference. But each data point is an average of highs and lows. What would the plot of the median temperature look like?

I have been an Insight owner for a whole month and have hardly scratched the surface of the forums, and am not a statistics wonk either, so I'll stop pretending to be able to interpret this.

Looking forward to your continued analysis!
 

Registered · 6,201 Posts · Discussion Starter #7 (Edited)
Yeah, I kind of see/saw a bit of a possible relationship between temp and the blue curve, mainly between the first and second half of the left half of the chart, where temp is high over the first half and the blue curve perhaps looks a tad steeper, then temp declines and the curve looks a tad shallower. But it's hard to say. It should be more obvious, and it should show up just as clearly in the green curve - but I don't really see it in the green curve. There's a lot of 'noise' in these graphs, so we're really mainly interested in the broad sweeps; those are all we can get a read on...

I don't see any relationship over the right half of the chart - unless, by that time, when the temp is consistently lower than it was over the left half of the chart, the cooler temps affect all those data points equally, and so they're all pulled down (i.e. the slopes of the blue and green curves are set on their new trajectories and remain there as long as the average temp is the same)... Really hard to say. The key to me is the transition, when temps start declining, and over that interval I don't see a strong relationship. But, we know temp has to make a difference, it must play some part in all this...

Keep in mind that the curves are going to look jagged simply based on the nature of the trips being logged and the 'stuff' I'm doing. Sometimes, most of the time, I'll start the trip with nominal state of charge at say 70% and I'll end it with state of charge at about the same level, mainly because that's what happens when I'm driving, naturally (assist and regen tend to even out based on the type of driving I'm doing). But sometimes I'll end the trip at a lower nominal state of charge - and that's what gets logged. The day-to-day variation, well, there's going to be quite a bit. But over longer stretches it will all tend to even out. Really, we're not interested in the individual 'data points'; we're interested in the trend lines, the longer-term trend lines...

What I'm trying to say is, the two periods, roughly 3 1/2 months long each, left half vs. right half, are about as fine a 'resolution' as we can focus in on. Each interval should be long enough to establish the broad impact of the two different management regimes - OEM vs. 'manual'. Other than that we can't say much - because the finer temporal variation can be caused by various things... Hope that helps more than hurts...

Oh, here's a thought. Really, when it comes down to it, there are only 3 key data points - the beginning, the middle, and the end. The beginning and the middle form the start and end points for the first management scheme: OEM. The middle and end mark the start and end of the second management scheme: MANUAL.

Over the OEM management interval, the amount of 'excess' current needed to maintain state of charge was 12,346mAh.

Over the 'MANUAL' management interval, the right half of the chart, the amount of 'excess' current needed to maintain state of charge was only 1,440mAh (roughly, both of these figures).

12,346mAh/1,440mAh=8.6 -- the OEM management required 8.6 times more current to maintain state of charge than when I did things manually... That seems really big to me...
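
Spelled out:

```python
# The comparison above as plain arithmetic:
oem_excess_mah = 12_346     # excess charge to hold SoC, OEM half
manual_excess_mah = 1_440   # same thing, manual-management half
print(oem_excess_mah / manual_excess_mah)   # -> ~8.6x more under OEM
```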
 

Registered · 1,454 Posts
I'd like to point out that in a manual Insight, charge of the HV battery can vary considerably depending on the driver's behavior. My brother's Insight was consistently at 3 bars or lower, because he had a lead foot. Despite the car's best effort, he consistently ran it flat. I suggested he try shifting a little higher and easing off the pedal, and he was more consistently able to keep the battery at half (or higher) charge.

For me, I have to make an effort to run it down, or I consistently have full or full-1 bars. When I lend the car to my roommate, I often come out to a near-empty battery, so it's not the car.
 

Registered · 6,201 Posts · Discussion Starter #9 (Edited)
Posted something in my 'background charge' thread and ran into this. Thought I'd just add a few updates/thoughts here on this OEM efficiency topic.

I've been doing this 'manual management' stuff for quite a while now. I've continued with it since the first post in this thread. I don't try to use the middle SoC range now, though; I've been using below 50%, usually around 40%. I occasionally charge to pos recal.

In general my pack seems to have gotten better - puts out full assist with less voltage sag and at lower charge states and temps. It seems to have also gotten more efficient. For instance, I manually calculate state of charge based on cumulative amp-hour count and an efficiency adjustment factor; the adjustment factor used to be around 2% but now it's only 1% (actually I think it was around 2.4% when I started). This decrease could be due to better input and output efficiency and less self-discharge... It just means that, in order to maintain charge state I need to put in 101 mAh for every 100 I take out...

Anyhow, maybe I'll add more later - I just noticed that it's been a year to the day since I started my contemporary 'efficiency' graph, kind of like the one in my first post above. It's equally divided, too, into two 6-month halves (I did another tap UDD/grid charge 6 months ago, and the same 6 months before that)...

edit: oh wow, I just noticed a glaring improvement: in that first graph above, left half, the blue curve peaks at about 275% after 3 1/2 months of pack usage; in my contemporary graph I'm not even up to 200% and it's been about 6 1/2 months - nearly double the time, considerably less charge needed to maintain charge state. Over that first 3 1/2 months I needed to charge the pack an equivalent of 2.75 times the nominal pack capacity in order to maintain charge state; now, that figure is only 2 times over nearly double the time span.
 

Registered · 6,201 Posts · Discussion Starter #10
I've been using/testing a later model BCM for the past few weeks, discussed a bit here: https://www.insightcentral.net/threads/ima-battery-will-not-charge-on-level-road-at-constant-speed.123899/page-2#post-1442410

Thought I'd mention here that this BCM - an A03 from something like 2005 and/or 2006 - seems to implement a lot of what I describe in this thread: it doesn't charge nearly as high and appears to concentrate usage below 50%. Its pos recal level appears to charge to only about 50% actual. At pos recal the nominal state of charge jumps directly to 81%, then 6% of the charge is bled off via 12V load -- the top end is very circumscribed, much lower and much more the same over and over again relative to the other, older BCMs I've tried.

What's more, though, is that, since the nominal SoC is high yet the actual is middling and low, the BCM forces you to use that low half of the charge state range (i.e. in other BCMs, a low nominal state of charge will trigger compensating behavior, such as background charging and throttled assist; here the nominal is high so these triggers aren't triggered). Over the couple weeks I've been testing it, I've basically operated below 50% all the way down to near empty (a true empty, too) -- which is almost precisely what I've been doing manually. But here, this BCM seems to do it on its own; I don't have to do anything...

I don't know, it's really weird how drastically different this BCM is compared to those others. I'm not sure what to think of the top-end threshold, for instance. In some sense it's too low... I also wonder if there's something different about my cells that causes a lower top-end. I don't think there is, but maybe there were slight differences in the cells used on later Insights that called for different programming, kind of like the purported differences between Insight cells/BCMs and Civic cells/BCMs... I did try to make graphs for Civic and Insight cells at one time, and those graphs do show that the Insight cells need to reach a higher voltage to reach a given high state of charge relative to the Civic cells. But I can't be sure that my test cells were truly representative... Plus, we're talking Civic vs. Insight here, not early Insight vs. later Insight...

In any event, seeing the differences with this A03 makes me wonder whether Honda in its later Insight years started to catch on to some of the things I've mentioned earlier in this thread...
 

Registered · 121 Posts
I have to admit I have difficulty following the details, but what I get out of your post is you're developing a management scheme that favors charge efficiency. I've read these types of cells are about 60% efficient but you say you're getting 99%, which is quite impressive. Also, you said the A03 BCM discharged down to 1 bar, and that is what I saw with my vehicle at constant speed on level ground with this BCM. Your conclusion is the BCM is working as it should, although it manages the battery very differently than previous versions, with the goal being to increase battery life. I can certainly see why that is desirable, but I'd have to say that was not what I experienced. I'm relatively sure all G1 Insights used the same battery.


 

Registered · 6,201 Posts · Discussion Starter #12 (Edited)
I have to admit I have difficulty following the details, but what I get out of your post is you're developing a management scheme that favors charge efficiency. I've read these types of cells are about 60% efficient but you say you're getting 99%, which is quite impressive.
I don't think I'd say I'm 'developing a management scheme'. I just stumbled into this stuff. Originally I needed to test the IMA at lower than normal voltages because the LTO cells I wanted to use would need to be used at slightly lower voltages. But since then, actually, this experimentation revived my faith in the stock cells and I haven't had the need to make the switch.

As far as efficiency goes, when people talk about 'efficiency' they're usually talking about overall energy efficiency. When you say I'm getting 99% efficiency, that's coulombic efficiency or something like that - amp-hour efficiency, basically... You can have high/good coulombic efficiency, like 99%, but when you charge the voltage is high and when you discharge the voltage sags, and voltage is the other half of the equation that makes up total energy efficiency. For example, say average voltage during charge is 164V and average voltage during discharge is 144V, while the current rate is the same, call it 6.5 amps. 164V X 6.5 amps = 1066 watts; 144V X 6.5 amps = 936 watts. 936W/1066W = 87.8% efficient...
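
That example, spelled out in code:

```python
# Same current in and out (good coulombic efficiency) can still mean
# mediocre energy efficiency, because the charge voltage is higher
# than the discharge voltage.
charge_v, discharge_v, amps = 164.0, 144.0, 6.5
power_in = charge_v * amps        # 1066 W while charging
power_out = discharge_v * amps    # 936 W while discharging
print(power_out / power_in)       # ~0.878 -> about 87.8% energy efficient
```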

Also, you said the A03 BCM discharged down to 1 bar, and that is what I saw with my vehicle at constant speed on level ground with this BCM.
No, what I was describing is different from what you must have seen. When I say it discharged to empty I'm looking at the OBDIIC&C and the data I collect/keep track of. The dash bar meter wasn't actually at 1 bar and never* went that low - because it's based on the nominal state of charge, and as I said, this BCM recals up to 81%. In other words, this BCM is showing a nominal of about, say, 80% when the actual state of charge is only around 50%, which means I have only around 40% of usable capacity, give or take. So, when I discharge 40%, the nominal will read 40% (80% minus 40% = 40%), yet the pack is really almost empty...
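
To spell out that offset (the 80%/50% figures are my rough estimates from above, not known constants):

```python
# Sketch of the nominal-vs-actual gap under this A03 BCM.
RECAL_NOMINAL_PCT = 80.0   # roughly what the gauge shows after pos recal
RECAL_ACTUAL_PCT = 50.0    # roughly the true charge state at that moment

def actual_soc(nominal_pct):
    """Estimate true SoC from the displayed nominal."""
    return nominal_pct - (RECAL_NOMINAL_PCT - RECAL_ACTUAL_PCT)

print(actual_soc(40.0))   # -> 10.0: gauge says 40%, pack is nearly empty
```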

Your conclusion is the BCM is working as it should, although it manages the battery very differently than previous versions, with the goal being to increase battery life. I can certainly see why that is desirable, but I'd have to say that was not what I experienced. I'm relatively sure all G1 Insights used the same battery.
My guess is that, yeah, the BCM is working as it was intended to, the programming is very different than my older BCMs, and it was probably programmed differently to ... probably increase battery life, or something - the changes were probably meant to circumvent what Honda thought were deficiencies in the earlier models. They had a lot of warranty issues, so that would make sense to me... But, then again, the pos recal threshold does seem pretty low. I don't know if there's anything inside the BCM that could affect voltage measurement; it seems possible that it's a fluke and this BCM is measuring voltage too low... I'd have to test another A03 to really know whether these differences are model-specific. But they probably are...

In your case, using this BCM, if your pack isn't up to snuff, if it's even a little out of balance, if it's crudded-up at all -- I'm pretty sure this BCM would be a nightmare. You'd have barely any usable capacity. Probably most packs out there can hardly function under 50%. Mine can because I've done ultra-deep discharges and stuff. But any of my previous working-but-not-great packs would have barely functioned with this BCM. I'm pretty sure you just had very little usable capacity in your pack using this BCM - and it neg recal-ed and the BATT gauge dropped to 1 bar. On the other hand, you had an aftermarket pack, and the voltage profile for those cells is slightly different, people have had issues, etc. So, it could be related to that as well...

*It has gone that low, when I drained the pack and got a neg recal. It's just that that's not what I'm talking about here.
 

Registered · 121 Posts
Thank you for the clarification. I should have reiterated I was using Bumble Bee sticks I removed from my 2002 vehicle that were about a year old, but I see you remembered that. The Bumble Bee is rated 8.0 Ah versus the 6.5 Ah for the OEM, and maybe that is the root of the problem. My 2006 has the OEM battery installed running a 2002 MCM/BCM pair. After two weeks, no issues, but I think these Tennessee hills give the battery quite a workout.
 

Registered · 6,201 Posts · Discussion Starter #14 (Edited)
I was just reviewing some of these threads and posts in relation to my use of the A03 BCM and what I've seen, and a post I made in the BCM versions thread. Some of it has made me wonder about something that seems really important yet is a very basic concept. Given what I wrote and quote at the bottom of this post, about the "excess" charge (current, really) that the OEM management scheme seems to impose versus if I do some things manually - basically prevent high charge state charging and use low charge states - a question comes up: what happens to all that excess charge/current? What electro-chemical processes - what possible chemical and physical changes - might be happening in the cells?

Think about it. If one management scheme has me charging 8.6 times more than another just to maintain the same state of charge, over a 3 1/2 month period, what's that charging (passing current) doing to the cells? It's obviously not contributing to anything useful - it doesn't lead to more power or energy at my disposal. It just 'disappears'. BUT - does it really disappear?

I'm no chemistry or battery expert, so I'm just throwing this out here... But I got this vague idea in my head.

True overcharge results in heat and possibly gases and stuff, but I don't think that's exactly what we're dealing with here. Maybe. But either way, I'm thinking there must be side reactions or something like that, probably during both the actual, active charging and discharging, and perhaps also during self-discharging. The current goes through the cell and does something - but something that doesn't actually contribute to charge state. On the contrary - I'm thinking it probably does something harmful, like perhaps it contributes to the pernicious 'gamma mod NiOOH' (1) rather than the good and normal beta mod. It almost seems like it has to do something harmful, almost by definition - for if it's not contributing to charge state it's either heating up the cell and/or feeding another electro-chemical process that doesn't contribute to the useful energy the cell stores, i.e. both harmful...

In other words, I'm starting to think there's a much more direct relationship than I have imagined before between excess charging (current) - charging (current) that doesn't contribute to charge state - and potentially harmful, degrading electrochemical processes. When I say "excess charging" I'm talking about the long term, overall extra current needed to maintain charge state - but this also encompasses discharging and self-discharging as you can lose energy in those processes as well; you simply need to make it up during the charge and that's why we're calling it 'excess charge'...

Anyway, I guess that's about it, for now.

Over the OEM management interval, the amount of 'excess' current needed to maintain state of charge was 12,346mAh. Over the 'MANUAL' management interval, the right half of the chart, the amount of 'excess' current was only 1,440mAh.
12,346mAh/1,440mAh=8.6 -- OEM management required 8.6 times more current to maintain state of charge than when I did things manually. That seems really big to me.
I just noticed a glaring improvement: In that first graph above, left half, the blue curve peaks at about 275% after 3 1/2 months of pack usage; in my contemporary graph I'm not even up to 200% and it's been about 6 1/2 months - double the time, quite less charge needed to maintain charge state. Over that first 3 1/2 months I needed to charge the pack an equivalent of 2.75 times the nominal pack capacity in order to maintain charge state; now, that figure is only 2 times plus over nearly double the time span.
(1) gamma mod NiOOH, number two in a list of the three major contributors to Insight NiMH degradation posed by sser2 some years ago. Robert Huggins also mentions it (along with many other battery folks), in the chapter on NiMH and memory effect, in his book Advanced Batteries. Here's what sser2 had to say about it:

"Formation of γ-NiOOH. The normal isoform of the NiOOH in the cell is β, which is readily converted into the Ni(OH)2 during discharge. γ-NiOOH, which slowly accumulates as battery ages, is converted to Ni(OH)2 with more difficulty, and only after all the β isoform is gone. γ-NiOOH accumulates because the battery is never fully discharged in the car.
γ-NiOOH is bad in two more ways. It crystallizes with considerable amount of water, which sequesters water from the electrolyte, making the electrolyte less conductive. The consequence is increased internal resistance. When γ-NiOOH is eliminated following deep discharge, this sequestered water returns back to the electrolyte, and internal resistance decreases. Sequestration of water in γ-NiOOH dramatically increases its volume. Whereas the β-NiOOH<=>Ni(OH)2 transition is associated with only 1-2% volumetric change, the γ-NiOOH<=>Ni(OH)2 transition alters the volume by a whopping 40%. It is therefore much more destructive to the active layer of the electrode."
 

Registered · 6,201 Posts · Discussion Starter #15
I'm re-posting here the image/graph that was posted in the first post. It's kind of pissing me off that I've revisited a couple threads of mine and I keep finding images gone, and they're images that I've uploaded to IC itself. And then you can't edit the original post to put the image back.

 

Registered · 1,210 Posts
I love your chart. Before I read the above sentence (because I was busy staring at the chart) I did see a possible correlation between changes in temperature and the slope of the blue and green lines. I've taken the liberty to mark up your beautiful chart - I hope you don't mind. (I'm sorry that I don't know how to make it appear actual size.)

View attachment 70153

In the first section the pack temperature averages 75F... in the second it drops gradually from 75F to 50F. (Does that correspond to falling autumn temperatures?) The third is pretty constant near 50F, as is the fourth. I separated the third and fourth because in the fourth the black line shows a lot more deviation from center than in the third, and the slope of the blue line has changed as well. (What changed in mid-January?)

I thought that 25F was a fairly large difference (it will make me grab a coat), but a quick Google for battery performance at these two temperatures did not suggest much difference. But each data point is an average of highs and lows. What would the plot of the median temperature look like?

I have been an Insight owner for a whole month and have hardly scratched the surface of the forums, and am not a statistics wonk either, so I'll stop pretending to be able to interpret this.

Looking forward to your continued analysis!
He's good, Sean. I really enjoy his reads, although I know I'm on a big learning curve. He thinks out of the box a lot.
 

Registered · 6,201 Posts · Discussion Starter #17 (Edited)
Lately I've been dealing with a rebuild of an old failed rebuild, plus a single stick in my in-car pack that's always had a bit faster self-discharge in at least one cell. This rebuilt rebuild ended up having quite a few fast self discharge cells too. These are making me reassess the significance, the impact, of thwarting OEM management and using the pack at very low charge states like I've been doing. 'Crud' build-up and the resulting voltage sag remains an issue, a problem that low charge state usage seems to help. But also, I think relatively fast self discharge and the imbalance that entails is also a problem that low charge state usage would help mitigate. I'm thinking that it's possible the A03 BCM program changes I mentioned above - where it forces low charge state usage - might've been implemented to address these two big, debilitating problems.

When it comes to uneven self discharge, the problem becomes one or more cells self-discharging low while the others remain high. When the car goes to charge the pack, it becomes difficult to charge the low cells enough without overcharging the high cells (which the BCM won't do). But, if the pack were used at a lower charge state on a routine basis, there would always be a much larger charge window.

For example, if OEM management routinely charges the pack to a high of 80%, like most BCMs do, and a single cell discharges to zero, then the most you can charge the pack, the most usable capacity you can have, is only 20% (assuming the BCM allows extra charge in such situations, allowing charge to just under 100%). Once the pack charges 20% the high cells are full and any more charge will overcharge them. Meanwhile, the cell that dropped to zero will have only charged to 20%.

But, if OEM management routinely charges to a high of only 50%, like the A03 BCM appeared to do when I was messing with it, your charge window grows to 50%. The high cells will be at 50% charge state, the high self discharge cell drops to zero -- but you can still charge 50% before the high cells reach full and will begin to over charge. That means your high self discharge cells can charge up to 50% charge state.
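
The window arithmetic from those two examples, spelled out (assuming the BCM stops charging when the highest cells reach full):

```python
# Charge window left for a cell that has self-discharged to empty.
def charge_window_pct(routine_top_soc_pct):
    """Headroom above the routine top-of-charge; with one cell
    self-discharged to empty, this is all it can ever get back."""
    return 100.0 - routine_top_soc_pct

print(charge_window_pct(80.0))   # most BCMs:  only a 20% window
print(charge_window_pct(50.0))   # A03-style:  a 50% window
```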

All in all, this 'use low charge state' tactic seems like a pretty powerful way to combat two of the biggest problems for aging packs...
 

Registered · 640 Posts
Lately I've been dealing with a rebuild of an old failed rebuild, plus a single stick in my in-car pack that's always had a bit faster self-discharge in at least one cell. This rebuilt rebuild ended up having quite a few fast self discharge cells too. These are making me reassess the significance, the impact, of thwarting OEM management and using the pack at very low charge states like I've been doing. 'Crud' build-up and the resulting voltage sag remains an issue, a problem that low charge state usage seems to help. But also, I think relatively fast self discharge and the imbalance that entails is also a problem that low charge state usage would help mitigate. I'm thinking that it's possible the A03 BCM program changes I mentioned above - where it forces low charge state usage - might've been implemented to address these two big, debilitating problems.

When it comes to uneven self discharge, the problem becomes one or more cells self-discharging low while the others remain high. When the car goes to charge the pack, it becomes difficult to charge the low cells enough without overcharging the high cells. But, if the pack were used at a lower charge state on a routine basis, there would always be a much larger charge window.

For example, if OEM management routinely charges the pack to a high of 80%, like most BCMs do, and a single cell discharges to zero, then the most you can charge the pack, the most usable capacity you can have, is only 20%. Once the pack charges 20% the high cells are full and any more charge will overcharge them. Meanwhile, the cell that dropped to zero will have only charged to 20%.

But, if OEM management routinely charges to a high of only 50%, like the A03 BCM appeared to do when I was messing with it, your charge window grows to 50%. The high cells will be at 50% charge state, the high self discharge cell drops to zero -- but you can still charge 50% before the high cells reach full and will begin to over charge. That means your high self discharge cells can charge up to 50% charge state.

All in all, this 'use low charge state' tactic seems like a pretty powerful way to combat two of the biggest problems for aging packs...
I wonder if the first cell to "go bad" keeps the next weak cell from "going bad" because the first cell triggers the stop of discharge before the next weak cell can be driven to a level that damages it. I also wonder if the bottom of every discharge imparts a tiny but permanent insult to that weak cell.

I also am finding the pack rebuild, where one replaces the bad stick, may not be sufficient. I think the above is happening, and when the next weak cell comes up to bat, that low discharge, for it, is like taking a wild pitch in the gut.

I am wondering if out-of-pack cell-level stick conditioning is in order, but yeah, that's not something you want to do manually.
 

Registered · 6,201 Posts · Discussion Starter #19 (Edited)
Discussion Starter #19 (Edited)
I wonder if the first cell to "go bad" keeps the next weak cell from "going bad" because the first cell triggers the stop of discharge before the next weak cell can be driven to a level that damages it. I also wonder if the bottom of every discharge imparts a tiny but permanent insult to that weak cell.
I don't think so. I actually think it's the opposite: cells that get driven low seem to benefit from the deeper discharge, consistent with the stuff I've posted around IC about 'deep discharge'. As I mentioned above, I'm working with a stick that has one cell with faster self discharge, and seemingly ironically, this cell has the loftiest, least-saggy voltage discharge curve -- because it's been completely discharging in the car while the others haven't...

Here, take a look at these two graphs. I pulled from my pack the stick with the faster self discharge cell and discharged all the cells separately. The top pane with the fast self discharge cell (C2) only discharges 600 mAh while the bottom pane with one of the other cells (C3) discharges about 3200 mAh. But look how lofty C2 is compared to C3: If C3 were discharged starting at the last ~600mAh, to the right of the yellow line, that would correspond to the portion of the charge state range that C2 experiences. C2's discharge starts at a little above 1.2V; the last 600 mAh of C3 starts at about 1.1V. C2 has a healthier curve, C3 is depressed.

[Attachment 83675: individual-cell discharge curves - fast self-discharge cell C2 (top pane) vs. normal cell C3 (bottom pane)]


Also, one thing I forgot to mention: The car throttles assist and/or you get a 'neg recal' before a single cell is driven too low. I'm like 99.9% sure the BCM has a slope detection algorithm for tap voltages. If you look at the graphs above you can see how voltage falls off a cliff when a cell is nearly depleted. If this cell were in a string of 12, i.e. the tap configuration in the car, the tap voltage would fall much faster than usual when a single cell goes off this cliff.
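
Conceptually it would be something like this - the real BCM algorithm and threshold are unknown to me, and the 0.5 V/s figure is made up:

```python
# Hedged guess at a tap-voltage slope check. A 12-cell tap's voltage
# falls off a cliff when one cell in it empties, so an abnormally fast
# negative dV/dt on one tap would be a cheap neg-recal trigger.
def tap_off_cliff(v_now, v_prev, dt_s, max_fall_v_per_s=0.5):
    """True if this tap's voltage is falling abnormally fast."""
    slope = (v_now - v_prev) / dt_s   # V/s; negative means falling
    return slope < -max_fall_v_per_s
```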
 

Registered · 6,201 Posts · Discussion Starter #20 (Edited)
Here's a better graphic depicting the concept described above, same data. It's really quite striking and, I think, a good illustration of the potential impact of the whole 'use low charge state' idea, as well as of the benefits of deep discharging in general.

Cell C2 (yellow curve) was a relatively fast self-discharge cell, and so in the car it was perpetually used at the lowest charge state - basically always driven to empty. Cell C3 (black curve) was more or less normal - no fast self discharge, no major deterioration, etc. Since the car's battery management disables discharging (assist) when C2 is empty, all other cells never get close to empty - and so they suffer from that short cycling, frequent high charge state charging, etc. They become voltage depressed. This is also what happens in general due to the way the BCM manages the pack, regardless of having a fast self-discharge cell or not, and to the way most people drive (i.e. letting the BCM do whatever it wants, which of course is completely reasonable). C2 benefits, though, from the deeper discharge and/or the frequent, regular low charge state usage.

[Attachment 83679: C2 (yellow) and C3 (black) discharge curves overlaid, same data as above]


Here's another way to think about this concept (low charge state usage/deep discharge) and the benefits. The voltage just after the start of the yellow curve discharge is about 1.25V; the voltage on the black curve underneath that point is about 1.1V, so a 0.15V difference. Now, multiply that by the 120 cells in the Insight pack: 120 X 0.15V = 18 volts. IF low charge state usage helps to preserve the higher low-charge-state voltages you get from an initial ultra-deep discharge, it's like your pack grew by an additional 18 volts! Not only that, but that additional voltage is happening at the lowest charge states, i.e. you're not getting premature neg recal, you're able to use the full pack capacity.
 