Yes, those are the values displayed during discharge.
I posted the math we use in this post:
Using the Universal grid charger
which I realize is pretty cryptic if one does not understand C. It is even difficult for me to read, as we worked out the algorithm a long time ago and my brain has since been filled with too many other things. To the best of my recollection, it works like this:
The slope is determined by how many samples elapse between bit changes; each sample is 5 seconds. During the initial change from charge to discharge, the voltage drops fast, so there are sometimes several bit changes per sample. As the discharge gets into the flatter part of the curve, we look for the maximum samples per bit change, i.e. the slowest rate of change. You can tell when the discharge minimum rate of change has been determined, as you will hear a short blip on the speaker on each sample. This minimum rate of change establishes our slope baseline, and at that point the number of samples per bit can be anywhere in the 5-60 range, depending on the battery condition. This number is divided by the DischSmpFac* to determine the DischgSmplTarget value you see in the discharge data field.
If the division yields a value less than 1, we force it to 1.
The sample timer is the count of how many samples have transpired since the last bit change.
Say the DischgSmplTarget is 3: if the count does not reach 3 before the next bit change (the voltage is falling fast), we set R2 and stop the discharge.
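To make that concrete, the logic above boils down to something like the following C sketch. The variable names follow the post, but the entry point, the example divisor value of 4, and the overall structure are my own illustration, not the charger's actual firmware; the real code also arms detection only after the baseline stops growing (the speaker blip), which I have simplified here.

```c
/* Sketch of the slope-based end-of-discharge detection described above.
 * Names follow the post; the function, the example DISCH_SMP_FAC value,
 * and the structure are illustrative assumptions, not the real firmware. */

#define DISCH_SMP_FAC 4               /* example divisor (the post's DischSmpFac) */

static int sample_timer = 0;          /* samples since the last bit change */
static int max_samples_per_bit = 0;   /* slope baseline: slowest rate of change seen */
static int dischg_smpl_target = 1;    /* the DischgSmplTarget shown in the data field */

/* Call once per 5-second sample; bit_changed is 1 when the voltage reading
 * moved by a bit since the last sample. Returns 1 when the R2
 * end-of-discharge condition trips. */
int discharge_sample(int bit_changed)
{
    if (!bit_changed) {
        sample_timer++;               /* voltage still flat; keep counting */
        return 0;
    }

    if (sample_timer > max_samples_per_bit) {
        /* Slowest rate of change yet: update the baseline and the target,
         * forcing the target to at least 1 as described above. */
        max_samples_per_bit = sample_timer;
        dischg_smpl_target = max_samples_per_bit / DISCH_SMP_FAC;
        if (dischg_smpl_target < 1)
            dischg_smpl_target = 1;
    } else if (max_samples_per_bit > 0 && sample_timer < dischg_smpl_target) {
        /* Bit changed before the count reached the target: the voltage
         * is falling fast, so flag R2 to stop the discharge. */
        sample_timer = 0;
        return 1;
    }

    sample_timer = 0;
    return 0;
}
```

For example, with a flat-region interval of 12 samples per bit, the target works out to 12/4 = 3, so a later interval of only 2 samples between bit changes would trip the stop.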
The main concern we had was that we did not want to miss the slope fall-off at the end of the discharge under any condition, so there are several end-of-discharge detection systems at play. The ideal and most accurate method is for a human to watch the datalog graph; since we are much better than our low-tech algorithms at spotting trends, we can easily pick out a cell dropping out and the rapid increase in the rate of change at the end of the discharge when the pack starts dropping out.
This is why we have the tech edit discharge detection test mode, where you can disable all but the MinDiscVolts limit for the end of discharge, and you become the limit detection.
If a smarter programmer were doing the math, I expect that the code could be much better, but what we have works, avoids cell reversals, and one can always disable the detection and DIY for the best and most accurate end-of-discharge detection.
If the charger were controlled by the LabVIEW program, where there is a vast library of statistical analysis possibilities, as well as the whole history of the discharge in the charge memory, I expect that we could do a much better job of the discharge detection. But as we have found, only a couple of brave souls have ventured into even using the LabVIEW datalogger. The controls for the charger from LabVIEW are in place; we can start and stop the charger from the PC. Passing the detection to the PC is not in place, however, as my weak math skills for statistical analysis are still not up to the task. Even though my LabVIEW is equipped with the full advanced analysis software, I just don't know enough to take advantage of what it could do.
Of course, if any of you users of the system feel that you can contribute to improving this, I am always open to advice and would welcome it.