Mechanisms that decrease the Lifespan of Lithium-Ion batteries and how to avoid them
-
I LOVE my Fluke 289! In the warranty lab, it is far and away the most-often used tool we have, and we're not just using it to take simple measurements. We use the data recording feature all the time, and the low-Ohms range regularly as well--it's great for measuring the small resistances in wiring, motor windings, etc. that can wreak so much havoc in this field. There's also a 287, which is a bit cheaper and still data logs, but it's missing the low Ohms range and a couple other features. You don't have to have a Fluke, but avoid $10 meters that can't yield the same measurement twice (like my Harbor Freight special). To me, those meters are good for confirming the presence of voltage, continuity, what have you--and little else. You can't hang your hat on the numbers you get from such a meter for battery work. [EDIT: Having said that, I think you would need an oscilloscope to see the switching waveform we're talking about here. The 289 doesn't collect data fast enough to capture high frequency waveforms.]
... I kind of knew I shouldn't have brought it up, since self-balance is a total misnomer and gets everyone excited. That's why I prefer the term drift. I should say that I know (and have proven to myself) that just dumping a solar CC onto a misbalanced pack is not the cheapskate's way of balancing!
Also note that my solar usage is for relatively low-voltage (typically 12 V / 4S) configs, not mobile, and not critical. No bleeder boards, just common-sense HVC, LVC, and a dose of monitoring, since I like to do battery maintenance. Wouldn't hand it over to my neighbor, though!
-
I'll let the others chime in, but I think we are all familiar with the ol' charge until current drops to C/20, and then call it quits. Taking absorb down to zero is indeed unnecessary and stressful.
But the question is always: great, but at what voltage? It really depends upon the application. And should it be measured after hours of rest or during charge? 3.45 V *at rest* is considered a fully charged cell. There is some variance; a GBS prismatic can be anywhere from about 3.38 to 3.45 or so...
Probably the best thing I've seen is an actual formula for it! It goes something like this for charging:
3.45 + (IR * A)
Where 3.45 is considered the full voltage at rest
IR is your cell internal resistance
A is the charge amperage
Maybe the big guns can comment.
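For illustration, the formula above can be sketched in code. This is a hedged example: the 0.5 mOhm internal resistance and 50 A charge current are assumed values for a hypothetical large prismatic, not data for any specific cell, and `charge_target_voltage` is a made-up name.

```python
# Sketch of the charge-voltage formula discussed above:
#   V_target = V_full_rest + (IR * A)
# All numeric values are illustrative assumptions, not cell-specific data.

def charge_target_voltage(v_full_rest, internal_resistance_ohm, charge_amps):
    """Per-cell CV setpoint that compensates for the IR rise under charge current."""
    return v_full_rest + internal_resistance_ohm * charge_amps

# Example: 3.45 V rested-full cell, assumed 0.5 mOhm IR, charging at 50 A
v = charge_target_voltage(3.45, 0.0005, 50)
print(f"{v:.3f} V")  # 3.475 V
```

Note how small the IR correction is at these numbers (25 mV at 50 A), which is why a poorly calibrated meter can swamp it entirely.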
Those values can come from a prismatic manufacturer if you ask, or perhaps you can special-order cells that are individually matched for capacity and internal resistance; the manufacturer will then supply a document sheet matching each one's barcode. Not sure if you can special-order small cylindricals this way, or, as WB9K points out, it may be impractical for commercial projects.
The C/20 bit (did you get this from me?) comes from the procedure for the A123 self-discharge test for cells. This procedure ensures that the cell is high enough on the charge curve to yield the resolution needed for an accurate test. I see this as the minimum charge required for that test, not an absolute upper limit on charging. The cell engineers have taught me that as long as you don't exceed 3.60 Volts (let's say while charging, just to be safe, even though it's common practice to charge at 3.625 per cell during CV, or even higher) you'll never plate Li, and you're doing no real harm unless you count the slightly increased formation rate of the SEI layer, which is just a bit higher than if you cut charge off at, say, 3.5 Volts. You must get above 3.4 to do any real balancing at all; the higher you go, the more accurate the balance. Soaking until current falls to minimum (while also having automatic balancing) ensures you are getting beyond the difference effects imposed by minor IR variation. The different stated "100%" rest voltages are the result of the lack of a truly universal standard for ending charge current based on cell capacity.
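The charge window described here (never exceed 3.60 V while charging; balancing only becomes meaningful above roughly 3.4 V) can be sketched as a simple check. The thresholds restate the numbers from the post; the helper itself and its state names are my own invention for illustration.

```python
# Hedged sketch of the per-cell charge window discussed above.
# Thresholds come from the post; the function and labels are assumed.

CHARGE_CEILING_V = 3.60   # never exceed while charging, per the post
BALANCE_FLOOR_V = 3.40    # below this, balancing accomplishes little

def charge_state(cell_v):
    if cell_v > CHARGE_CEILING_V:
        return "stop"      # risk of Li plating above the ceiling
    if cell_v >= BALANCE_FLOOR_V:
        return "balance"   # steep part of the curve: balancing is effective
    return "charge"        # too low on the curve to balance meaningfully

print(charge_state(3.35))  # charge
print(charge_state(3.50))  # balance
print(charge_state(3.65))  # stop
```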
The formula is interesting, but I think it must be intended for a particular model of cell. The concept makes some sense, but it would also have to consider the capacity of the cell to really work, since voltage rise during charge is related to both the absolute charge current AND the capacity of the cell(s) being charged. If you knew what capacity cell this was aimed at, you could probably work your way back to defining that ratio and then try applying it to larger or smaller cells and see how well it transfers. The other issue here is that IR is going to be a tiny number, and depending on conditions (mostly temperature) it is going to be a moving target that can shift well over 100% from the stated nominal value. A voltage-based system for upper charge limits thus makes the most sense in my mind. It's voltage that tells us where the critical electrochemical "knee points" are located, if that makes sense.
As far as manufacturers (whether they are cell mfgs or OEMs) matching cells by hand-selecting them, I know of no company that does this. I doubt that A123 has ever done this for anybody, and if you asked, they would probably say no, citing existing test routines as being good enough to ensure a good match. Select (new) cells that are as close together in age as possible if you are really concerned (all within a 6-month manufacturing window is good enough), and that should be all that is required.
-
The BMS and Rudman bus (BMS monitor) show the voltage of the battery suite (48 V nominal), the voltage of each of the 16 cells, the maximum and minimum voltages ever recorded, the charge/discharge rate in amps, and W-hrs from maximum SOC.
The TriStar MPPT-45 shows the charging rate in W, the total harvest of the day in W, the voltage of the battery suite (48 V nominal), and which charge mode is active (MPPT, float, etc.). There are a number of other conditions that can be measured if desired.
The Magnum monitor gives the voltage of the battery suite and numerous other statuses (statūs if one goes back to high school Latin) if desired.
Happily, all three monitors give the same total voltage (48 V nominal) to within 0.1 V.
The voltages of individual cells have never gone above 3.45 V.
The charge rate approaches 0 W as the battery suite approaches 54.4 V. Parasitic charges still occur.
-
The one area I have interest in is the saturation phase of charging. Since charging is a leading voltage and the battery voltage is a lagging voltage, how long should the saturation phase be, or to what level? Is it measured by, say, ending amps? The higher the charge rate, the more disparity I see with shunt-counted amp-hours returned. By only using a termination voltage on the charge controller, there is an undercharge that accumulates cycle by cycle.
This is the major difference between charging with a constant current source and the sun. With constant current you can charge to a predictable SOC by charging with constant current to a set voltage cutoff and then terminating the charge. As the amount of sun reaching our solar panels is variable, we do not get the same amount of charge into the battery every time it is charged. I have found that when charging to around 3.4 volts/cell and ending the charge at a charge rate of C/20, the final SOC achieved can be anything between around 80% and 90+%. If it is sunny for the whole period that the battery is being charged, the final SOC will be around 80%; if it is cloudy, or it is nearly the end of the day when the charging is nearly finished, the final SOC can be greater than 90%. I have reduced this problem by ending the charge at C/50 rather than C/20.
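The ending-amps termination described above reduces to a simple comparison. A hedged sketch, assuming a hypothetical 100 Ah pack; the function name and structure are mine, not from any charge controller's firmware:

```python
# Sketch of an ending-amps (tail-current) termination check.
# Capacity and current values are illustrative assumptions.

def absorb_done(charge_current_a, capacity_ah, divisor=20):
    """True once tail current falls below C/divisor (e.g. C/20 or C/50)."""
    return charge_current_a <= capacity_ah / divisor

capacity = 100  # Ah (assumed pack)
print(absorb_done(6.0, capacity))              # False: 6 A is above C/20 = 5 A
print(absorb_done(4.0, capacity))              # True at C/20
print(absorb_done(4.0, capacity, divisor=50))  # False: C/50 = 2 A, keep absorbing
```

Moving the divisor from 20 to 50 holds the pack in absorb longer, which is exactly the tighter cutoff Simon describes for narrowing the 80-90% SOC spread.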
Simon
-
Getting back to degradation itself, I have always wondered whether any degradation studies have been done on LiFePO4 prismatics (or A123 cells, if you prefer) regarding the fact that, as solar users, our charge controllers use PWM in the "absorb" phase (what little of it there is when fed by decent current!).
In other words, we don't REALLY use CC/CV, but CC/PWM. Typically the PWM is done at about 300 Hz or so. Looked at as a waveform, this simply means that our controllers just close the circuit during bulk, but once a setpoint has been reached, instead of CV, PWM is actually used. I.e., the voltage can actually shoot up to 4.5 volts per cell! BUT of course at 300 Hz, the averaging takes place.
What I noticed when using both prismatics and my prized A123s from Braille and Antigravity brand batteries was that unlike CV, which stops current when the first cell is fully charged, with PWM they tend to "drift together" into a rough, though not exact, balance. We've covered balance enough, but my main interest was how LiFePO4 reacts to PWM, since that is what we use in the field (be it a low-end PWM controller, or an MPPT controller, which actually uses PWM during absorb too).
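The averaging effect mentioned above can be made concrete with a duty-cycle calculation. This is a rough sketch under stated assumptions: the 20% duty cycle and the 3.3 V "switch-off" cell voltage are invented illustration values, not measurements from any controller.

```python
# Sketch of why 300 Hz PWM "absorb" still yields a moderate average voltage.
# Numbers are illustrative assumptions, not from any controller datasheet.

def pwm_average_voltage(v_on, v_off, duty_cycle):
    """Time-averaged cell voltage over one PWM period."""
    return duty_cycle * v_on + (1 - duty_cycle) * v_off

# Cell momentarily spikes toward 4.5 V while switched on, and relaxes
# near 3.3 V while off. At an assumed 20% duty, the average stays
# close to an ordinary absorb voltage.
avg = pwm_average_voltage(4.5, 3.3, 0.20)
print(f"{avg:.2f} V")  # 3.54 V
```

A real controller varies the duty cycle continuously to hold the averaged setpoint, so this fixed-duty example only shows the arithmetic, not the control loop.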
Simon
-
A side note about degradation --
We throw voltage settings around like so much candy (here and elsewhere), and while user LiFePO4 projects are valuable for data, I take them with a grain of salt and consider them anecdotal when I can't determine whether they are using a quality standard for measurement.
Far too many times I've seen guys build mega-buck battery systems and then calibrate and monitor them with a shirt-pocket or throw-away meter that hasn't been vetted for accuracy. Or not even take the time to check that the cheapo Junsi cell monitor is even in the ballpark!
I use Fluke for out-of-box trust, but this isn't really a multimeter thread and I don't care what one uses, as long as they TRUST it or have calibrated it - and THEN use that calibrated meter as the standard for everything else.
I just wonder how many systems dutifully follow the experience of others, only to be bagged by using a throw-away meter, and giving us false data in the forums?
Ok - meter rant about the degradation issue over ...
-
This is an interesting question. I don't believe I've ever seen data on pwm-based charging, but I have no reason to believe it makes much if any difference at all to the cells. Many automotive applications drain the cells with high-current PWM at frequencies in the kHz range and nobody considers that a problem. Charging should be no different.
I think - disclaimer - the above comes from an amateur like myself.
The worst part about this is now you have given me a justification to actually look at something like a Fluke 289! (I'm a fluke nut) Dang it - my wallet is bleeding now...
What you are really seeing with cells "drifting together" is the expression of the different DC resistances of each series cell-busbar combination. The higher the current, the greater the spread. When you switch from CC to CV (or PWM), the main difference, as far as the cell is concerned, is that charge current goes down. The voltage rise (vs. rested voltage) across the whole pack decreases, and so do the differences between cell voltages. Less current, less "Peukert-type" loss, less voltage delta.
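The IR-driven spread described here is easy to sketch numerically. A hedged example: the rested voltages and the mismatched cell-plus-busbar resistances below are assumed values chosen to show the effect, not measurements of any real pack.

```python
# Sketch of the drift explanation above: under charge current, each cell's
# terminal voltage rises by I * R over its rested voltage, so mismatched
# resistances look like "drift". All values are illustrative assumptions.

def terminal_voltages(rested_v, resistances_ohm, current_a):
    """Per-cell terminal voltage while charging at the given current."""
    return [v + current_a * r for v, r in zip(rested_v, resistances_ohm)]

rested = [3.35, 3.35, 3.35, 3.35]     # identical rested voltages (4S pack)
r = [0.0005, 0.0006, 0.0008, 0.0005]  # mismatched cell+busbar resistances

high = terminal_voltages(rested, r, 50)  # 50 A charge: wide spread
low = terminal_voltages(rested, r, 5)    # 5 A tail current: spread collapses
print(f"{max(high) - min(high):.4f} V")  # 0.0150 V
print(f"{max(low) - min(low):.4f} V")    # 0.0015 V
```

Ten times less current, ten times less apparent "imbalance": the cells did not balance themselves; the IR term simply shrank.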
In my case, with the A123 cells and also my GBS prismatics, the cells have to be SANELY close to each other to begin with, and the drift takes MANY cycles, not just a single day's charging. Of course, quality cells make a world of difference. Other NEW cells I pulled from other powersport batteries, which were a total JOKE inside, were a waste of time; they just drifted immediately to the recycling center.
Thanks for the feedback. Anything that makes me want to study deeper is all right!
-
Not able to edit my replies, and that is not what I meant; I couldn't fix it. It should have read: "If floated to less than 100%, re-balance should not be a problem even though lithium cells are not self-balancing. With no parasitic loads, the only lithium imbalance of any significance is self-discharge, which is extremely low, so any differences are so small and insignificant they can be ignored."
Couple all that with LFP batteries that are extremely safe by default, operated in kind conditions at low voltages of 12, 24, and 48 V, and you do not need such stringent controls. I know you are talking about the cell level, but 4, 8, or maybe 16S packs at low discharge rates do not really need it.
-
Li cells don't self-balance...you can say it til you're blue in the face but it still won't be true. As I explained above, the illusion of "self balance" is in fact an expression of cell variation--the very thing that should be telling you why you DO need to balance, and the more often the better.
-
Geez dude, you just agreed with me. What the hell do you think I have been talking about all this damn time? Run below 100% and the cells self-balance, and if they should ever become unbalanced, it is really simple to re-balance them. I personally have not had to in 8 months, and if you go to the DIY EV forum or EVTV you will find hundreds of people who have not had any balance problems in 3 years. I grant you that on a large EV high-voltage pack, cell-level monitors have benefit. But that is only because the number of series cells is so much greater that being 3 to 6 volts low is not noticeable. But trust me, 3 volts low on a 12 volt system is going to get your attention real damn quick when nothing turns on and you notice your battery is at 8 to 11 volts. You do not need a cell monitor to tell you something is wrong. Your inverter already caught it and shut down.
-
A 14.4 Volt charger is fine too. If the setpoint of the charger cannot overcharge a balanced pack, you're good, unless you've got severe imbalance prior to charge, which would indicate that something else is wrong [that's why you need cell-level monitoring to shut things down if the situation becomes dangerous for any single cell]. As you approach 100% SOC, the voltage delta between the cells and the charger goes down, so current becomes self-limiting. Balancers should never have a problem dealing with the small imbalance that develops between charge cycles in such a system. 14.4 Volts = 3.6 Volts per cell; when your charger current falls to zero (or very near to it), you should have a perfectly balanced pack at 100% SOC, every time. With no automatic balancers, I'd rather turn that charger down to 14.0-14.2 and manually balance periodically. I've used this technique with no problems on several of my smaller packs for years now.
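The per-cell arithmetic behind these charger setpoints is just the pack voltage divided across the series string. A minimal sketch; the helper name is mine and the values restate the post's numbers:

```python
# Sketch of the pack-to-cell setpoint arithmetic discussed above.

def per_cell_setpoint(pack_volts, series_cells):
    """Average per-cell voltage implied by a pack-level charger setpoint."""
    return pack_volts / series_cells

print(per_cell_setpoint(14.4, 4))  # 3.6 V/cell on a 4S pack, full balancing charge
print(per_cell_setpoint(14.0, 4))  # 3.5 V/cell, the gentler manual-balance setpoint
```

Note this is the average: only cell-level monitoring reveals whether one cell in the string is actually sitting well above or below that figure.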