Originally Posted by linger
Sorry, I didn't elaborate. I was referring to the method of testing a cell with the intent of determining the discharge C-rate. You start with a low enough C-rate discharge that you know the cell will pass, and then you keep ramping up from there until the cell fails (usually temp, voltage cutoff, or capacity). For example, you would start somewhere around 10C and go up in 2C increments. You gotta start low. There are "20C" labeled cells that can't handle 15C.
"time of discharge was used to calculate a C-rating" - no I think you are confusing C-rating with capacity, which ROAR does record for every pack that it tests.
Right - I did confuse capacity (C) and the C-rate of discharge up there. So for proper C-rate testing, one would take the labeled capacity from the pack maker, charge the pack, then discharge it at a constant 10C. Measure:
A. maximum pack temperature (what is the failure temperature?)
B. minimum pack voltage (what is the failure cutoff voltage?)
C. discharge time (the expected discharge time follows from the current C-rate: at a constant nC discharge, a pack delivering its rated capacity should last 1/n of an hour, i.e. 3600/n seconds)
Repeat this test in 2C increments until the pack goes into overtemperature, or hits the minimum cutoff voltage before reaching the rated discharge time.
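To make that loop concrete, here's a minimal Python sketch of the ramp-up test under assumed example limits (70 °C failure temperature, 3.0 V cutoff). The discharge_at function is a crude fake pack standing in for a real test rig, and none of the numbers are ROAR's - they're just placeholders.

```python
from dataclasses import dataclass

# Example limits -- illustrative stand-ins, not ROAR's actual numbers.
MAX_TEMP_C = 70.0   # failure temperature
CUTOFF_V = 3.0      # minimum cutoff voltage

@dataclass
class DischargeLog:
    max_temp_c: float         # A. maximum pack temperature seen during the run
    min_voltage_v: float      # B. minimum pack voltage seen during the run
    discharge_seconds: float  # C. time until the pack was empty or cut off

def rated_discharge_seconds(c_rate: float) -> float:
    # At a constant nC discharge, a pack delivering its rated capacity lasts 1/n hours.
    return 3600.0 / c_rate

def discharge_at(c_rate: float) -> DischargeLog:
    # Placeholder for the real test rig. This fake pack simply runs hotter,
    # sags lower, and falls further short of rated time as the C-rate climbs.
    return DischargeLog(
        max_temp_c=35.0 + 2.0 * c_rate,
        min_voltage_v=3.7 - 0.04 * c_rate,
        discharge_seconds=rated_discharge_seconds(c_rate) * (1.2 - 0.01 * c_rate),
    )

def ramp_test(start_c: float = 10.0, step_c: float = 2.0, max_c: float = 40.0) -> float:
    """Ramp the discharge rate in 2C steps; return the last C-rate the pack passed."""
    last_passed = 0.0
    c_rate = start_c
    while c_rate <= max_c:
        log = discharge_at(c_rate)
        failed = (log.max_temp_c > MAX_TEMP_C
                  or log.min_voltage_v < CUTOFF_V
                  or log.discharge_seconds < rated_discharge_seconds(c_rate))
        if failed:
            break
        last_passed = c_rate
        c_rate += step_c
    return last_passed

if __name__ == "__main__":
    print(f"Highest C-rate passed: {ramp_test():.0f}C")
```

The pass/fail logic is the part that matters; with a real rig you'd swap discharge_at for actual measurements and keep the same three checks.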
That would give a process by which published C-rates are valid comparables, since every pack would be tested against the same temperature and cutoff-voltage limits, no?