Mechanisms That Decrease the Lifespan of Lithium-Ion Batteries and How to Avoid Them


  • wb9k
    replied
    I should add that the Fluke 289 does not have the resolution to show waveforms at high frequencies. I think you'll need an oscilloscope for that....

    dh



  • wb9k
    replied
    Originally posted by PNjunction
    .....I think - disclaimer - the above comes from an amateur like myself.

    The worst part about this is now you have given me a justification to actually look at something like a Fluke 289! (I'm a fluke nut) Dang it - my wallet is bleeding now...
    You may be an amateur, but you are an exceptionally well-informed one. I have no complaints at all about your processes--real or mental.

    I LOVE my Fluke 289! In the warranty lab, it is far and away the most-often used tool we have, and we're not just using it to take simple measurements. We use the data recording feature all the time, and the low-Ohms range regularly as well--it's great for measuring the small resistances in wiring, motor windings, etc. that can wreak so much havoc in this field. There's also a 287, which is a bit cheaper and still data logs, but it's missing the low Ohms range and a couple other features. You don't have to have a Fluke, but avoid $10 meters that can't yield the same measurement twice (like my Harbor Freight special). To me, those meters are good for confirming the presence of voltage, continuity, what have you--and little else. You can't hang your hat on the numbers you get from such a meter for battery work. [EDIT: Having said that, I think you would need an oscilloscope to see the switching waveform we're talking about here. The 289 doesn't collect data fast enough to capture high frequency waveforms.]

    Originally posted by PNjunction
    ... I kind of knew I shouldn't have brought it up, since self-balance is a total misnomer and gets everyone excited. That's why I prefer drift. I should say that I know (and have proven to myself) that just dumping a solar CC onto a misbalanced pack is not the cheapskates way of balancing!
    Yes, a misnomer. These discussions are hard enough to keep properly specific without introducing misnomers and analogous descriptions of cell functions that wander far away from what is really happening in order to make things simple for novices. I try to banish that stuff to the fullest extent possible--absolute clarity is important here.

    Originally posted by PNjunction
    In my case with the A123 cells, and also my GBS prismatics, the cells have to be SANELY close to each other to begin with, and the drift takes MANY cycles, not just a single day's charging.
    This is how things should go normally when all cells are very similar in starting behavior and conditions in the pack design expose all cells to roughly equal operating conditions. But it's not always like that. Resetting balance every charge cycle arrests the drift altogether unless the problem is really gross, and that has real value...especially as a pack ages and differences between cells increase in intensity.

    Originally posted by PNjunction
    Also note that my solar usage is for relatively low-voltage (typically 12v / 4S configs), not mobile, and not critical. No bleeder boards, just common-sense HVC, LVC and a dose of monitoring since I like to do battery maintenance. Wouldn't hand it over to my neighbor though!
    I do similar stuff on many of my small batteries, but that last sentence is the really important part. I try to be very careful when describing such practices so that it is always perfectly clear just how vigilant you have to be to avoid potential disaster. Most people should never even attempt to use these techniques.

    dh



  • wb9k
    replied
    Originally posted by PNjunction
    I'll let the others chime in, but I think we are all familiar with the ol' charge until current drops to C/20, and then call it quits. Taking absorb down to zero is indeed unnecessary and stressful.

    But the question is always, great, but at what voltage? since it really depends upon the application. And should it be measured after hours of rest or during charge? 3.45v *at rest* is considered a fully charged cell. There is some variance, as a GBS prismatic can be from 3.38 to 3.45 or so...

    Probably the best thing I've seen is an actual formula for it! It goes something like this for charging:

    3.45 + (IR * A)

    Where 3.45 is considered the full voltage at rest
    IR is your cell internal resistance
    A is the charge amperage

    Maybe the big guns can comment.

    Those values can come from a prismatic manufacturer if you ask, or perhaps can be special-ordered, so that your cells are individually matched for capacity and internal resistance; the manufacturer will supply a document sheet matching each one's barcode. Not sure if you can special order small cylindricals this way, or, as WB9K points out, whether it's impractical to do for commercial projects.
    Some great posts here this morning, nice!

    The C/20 bit (did you get this from me?) comes from the procedure for the A123 self-discharge test for cells. This procedure ensures that the cell is high enough on the charge curve to yield the needed resolution to make an accurate test. I see this as the minimum charge required for that test, not an absolute upper limit on charging. The cell engineers have taught me that as long as you don't exceed 3.60 Volts (let's say while charging, just to be safe, even though it's common practice to charge at 3.625 per cell during CV or even higher) you'll never plate Li, and you're doing no real harm unless you consider the slightly increased formation rate of the SEI layer, which is just a bit higher than if you cut charge off at, say, 3.5 Volts. You must get above 3.4 to do any real balancing at all; the higher you go, the more accurate the balance. Soaking until current falls to minimum (while also having automatic balancing) ensures you are getting beyond the difference effects imposed by minor IR variation. The different stated "100%" rest voltages are the result of the lack of a truly universal standard for ending charge current based on cell capacity.

    The formula is interesting, but I think it must be intended to be specific to a particular model of cell. The concept makes some sense, but it would have to also consider the capacity of the cell to really work since V rise during charge is related to both absolute charge current AND the capacity of the cell(s) being charged. If you knew what capacity cell this was aimed at, you could probably try working your way back to defining that ratio and then try applying it to larger or smaller cells and seeing how well it transfers. The other issue here is that IR is going to be a tiny number, and depending on conditions (mostly temperature), this is going to be a moving target that can shift well over 100% from the stated nominal value. A voltage-based system for upper charge limits thus makes the most sense in my mind....it's voltage that tells us where the critical electrochemical "knee points" are located, if that makes sense.
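    As a rough illustration only (not from any manufacturer's datasheet), the quoted formula can be sketched in Python. The IR and current figures below are made-up examples, and wb9k's caveat that IR drifts with temperature applies:

    ```python
    # Sketch of the quoted setpoint formula: V_charge = V_rest_full + IR * A.
    # The 3.45 V "full at rest" figure and the IR/current values are
    # illustrative assumptions, not manufacturer data.

    def charge_setpoint(rest_full_v=3.45, internal_resistance_ohm=0.001,
                        charge_current_a=20.0):
        """Per-cell CV setpoint compensating for the IR drop while charging."""
        return rest_full_v + internal_resistance_ohm * charge_current_a

    # A hypothetical prismatic with ~1 milliohm IR, charged at 20 A:
    print(round(charge_setpoint(), 3))                               # 3.47

    # wb9k's caveat: IR can shift well over 100% with temperature, so the
    # same formula with doubled IR moves the target:
    print(round(charge_setpoint(internal_resistance_ohm=0.002), 3))  # 3.49
    ```

    Which is exactly why a plain voltage limit is attractive: the IR term is a moving target, so the "correction" can wander by as much as it adds.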

    As far as manufacturers (whether they are cell mfg's or OEM's) matching cells by hand-selecting them, I know of no company that does this. I doubt that A123 has ever done this for anybody, and if you asked they would probably say no, citing existing test routines as being good enough to ensure a good match. Select (new) cells that are as close together in age as possible if you are really concerned (all within a 6-month mfg window is good enough), and that should be all that is required.

    dh



  • reed cundiff
    replied
    The BMS and Rudman bus (BMS monitor) shows the voltage of battery suite (48 V nominal), voltage of each cell (16), maximum and minimum voltage ever, charge/discharge rate in amps (48 V nominal), and W-hrs from maximum SOC

    TriStar MPPT-45 shows charging rate in W, total harvest of day in W, voltage of battery suite (48 V nominal), what mode charge is in (MPPT, float, etc). There are a number of other conditions that can be measured if desired.

    Magnum monitor gives voltage of battery suite and numerous other statuses (statūs if one goes back to high school Latin) if desired.

    Happily, all three monitors give total voltage (48 V nominal) to the same value within 0.1 V.

    The voltages of individual cells have never gone above 3.45 V

    Charge rate approaches 0 W as the battery suite approaches 54.4 V. Parasitic charges still occur.



  • karrak
    replied
    Originally posted by Willy T
    The one area I have interest in is the saturation phase of charging. Since charging is a leading voltage and the battery voltage is a lagging voltage, how long should the saturation phase be, or to what level, if it is measured by, say, ending amps? The higher the charge rate, the more disparity I see with shunt-counted amp-hrs returned. By only using a termination voltage on the charge controller, there is an undercharge that accumulates cycle by cycle.
    Hi Willy,
    This is the major difference between charging with a constant current source and the sun. With constant current you can charge to a predictable SOC by charging at constant current to a set voltage cutoff and then terminating the charge. As the amount of sun reaching our solar panels is variable, we do not get the same amount of charge into the battery every time it is charged. I have found that when charging to around 3.4 volts/cell and ending the charge at a charge rate of C/20, the final SOC achieved can be anything between around 80% and 90+%. If it is sunny for the whole period that the battery is being charged, the final SOC will be around 80%; if it is cloudy, or it is nearly the end of the day when the charging is nearly finished, the final SOC can be greater than 90%. I have reduced this problem by ending the charge at C/50 rather than C/20.
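    The end-amps rule above reduces to simple arithmetic; the 200 Ah bank capacity here is a made-up example:

    ```python
    # Termination current for a given C-rate fraction: C/20 means capacity/20 amps.
    def end_amps(capacity_ah, c_divisor):
        """Charge-termination current in amps for a C/c_divisor cutoff."""
        return capacity_ah / c_divisor

    capacity = 200.0  # hypothetical 200 Ah bank

    print(end_amps(capacity, 20))  # C/20 -> 10.0 A
    print(end_amps(capacity, 50))  # C/50 -> 4.0 A, a tighter cutoff
    ```

    The tighter C/50 cutoff narrows the SOC band reached before termination, which is the effect described above.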

    Simon



  • karrak
    replied
    Originally posted by PNjunction
    Getting back to degradation itself, I have always wondered if any degradation studies have been done for LiFePo4 prismatics (or A123 cells if you prefer) in regards to the fact that as solar users, our charge controllers use PWM in the "absorb" phase (what little there is when fed by decent current!).
    This light bedtime reading might be of interest http://dataspace.princeton.edu/jspui...181D_10531.pdf. Go to page 124 for the summary.

    In other words, we don't REALLY use CC/CV, but CC/PWM. Typically the PWM is done at about 300 Hz or so. If looked at on a waveform, this simply means that our controllers just close the circuit during bulk, but once a setpoint has been reached, instead of CV, PWM is actually used. I.e., the voltage can actually shoot up to 4.5 volts per cell! - BUT of course at 300 Hz, the averaging takes place.

    What I noticed when using both prismatics, and my prized A123's from Braille and Antigravity brand batteries was that unlike CV which stops current when the first cell is fully charged, with pwm, they tend to "drift together" - and not an exact balance. We've covered balance enough, but my main interest was how lifepo4 reacts to pwm, since that is what we use in the field. (be it a low-end pwm controller, or an mppt which uses pwm during absorb too actually).
    I would be surprised if the commercial MPPT controllers are using PWM for the CV/absorb phase of charging, because it generates EMI (radio interference), unless there is a good reason to do so. I would think that most of the MPPT controllers work with switching frequencies > 10 kHz (I use 20 kHz) and would filter the switching currents with capacitors. This is an educated guess on my part.
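    The averaging PNjunction describes can be sketched numerically. The 18 V panel OCV and 14.4 V setpoint are the illustrative figures from this discussion, not measurements:

    ```python
    # With PWM "absorb", the battery sees panel voltage or nothing, switched
    # fast; the effective (time-averaged) voltage is panel_v * duty_cycle.

    def pwm_average_v(panel_v, duty_cycle):
        """Time-averaged voltage of a switched connection to the panel."""
        return panel_v * duty_cycle

    def duty_for_setpoint(panel_v, setpoint_v):
        """Duty cycle that makes the average match a CV setpoint."""
        return setpoint_v / panel_v

    panel_ocv = 18.0  # nominal "12 V" panel open-circuit voltage
    setpoint = 14.4   # 4S LiFePO4 absorb setpoint (3.6 V/cell)

    d = duty_for_setpoint(panel_ocv, setpoint)
    print(round(d, 2))                            # 0.8
    print(round(pwm_average_v(panel_ocv, d), 1))  # 14.4
    ```

    So each switching instant does expose the cells to full panel voltage, but the 80% duty cycle averages it back to the setpoint, which is the crux of the question about whether the brief peaks matter to the chemistry.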

    Simon



  • PNjunction
    replied
    A side note about degradation --

    We throw voltage settings around like so much candy (here and elsewhere), and while user lifepo4 projects are valuable for data, I take them with a grain of salt and consider them anecdotal when I can't determine if they are using a quality standard for measurement.

    Far too many times I've seen guys building mega-buck battery systems that are then calibrated and monitored by a shirt-pocket or throw-away meter that hasn't been vetted for accuracy. Or not even taking the time to check that the cheapo Junsi cell monitor is even in the ballpark!

    I use Fluke for out-of-box trust, but this isn't really a multimeter thread and I don't care what one uses, as long as they TRUST it or have calibrated it. And THEN, using that calibrated meter as the standard for everything else.

    I just wonder how many systems dutifully follow the experience of others, only to be bagged by using a throw-away meter, and giving us false data in the forums?

    Ok - degradation issue about meter rant over ...



  • PNjunction
    replied
    Originally posted by wb9k
    This is an interesting question. I don't believe I've ever seen data on pwm-based charging, but I have no reason to believe it makes much if any difference at all to the cells. Many automotive applications drain the cells with high-current PWM at frequencies in the kHz range and nobody considers that a problem. Charging should be no different.
    Got it - I didn't think so but wanted to check to see if there were any smoking guns lying in wait. As for the solar charge controller using pwm during absorb, I don't have any screenshots or captures. I was going on the generic assumption that really what we are dealing with is just a mosfet doing very fast on-off connections to the load. In the case of a nominal 12v solar panel, which has an OCV of around 18v, during absorb pwm can operate fast enough and vary its duty cycle to simulate a linear CV setpoint. But still, you have the very minimal amount of time during the mosfet switch that a voltage can get up to the panel total.

    I think - disclaimer - the above comes from an amateur like myself.

    The worst part about this is now you have given me a justification to actually look at something like a Fluke 289! (I'm a fluke nut) Dang it - my wallet is bleeding now...

    What you are really seeing with cells "drifting together" is the expression of different DC resistances of the series cell-busbar combination. The higher the current, the greater the spread. When you switch from CC to CV (or PWM), the main difference as far as the cell is concerned is that charge current goes down. The voltage rise (vs. rested voltage) across the whole pack decreases, and so do the differences between cell voltages. Less current, less "Peukert-type" losses, less voltage delta.
    Awesome - I'm going to digest that a bit. I kind of knew I shouldn't have brought it up, since self-balance is a total misnomer and gets everyone excited. That's why I prefer drift. I should say that I know (and have proven to myself) that just dumping a solar CC onto a misbalanced pack is not the cheapskates way of balancing!

    In my case with the A123 cells, and also my GBS prismatics, the cells have to be SANELY close to each other to begin with, and the drift takes MANY cycles, not just a single day's charging. Of course quality cells make a world of difference. Other NEW cells I pulled from other powersport batteries, which were a total JOKE inside, were a waste of time and just drifted immediately to the recycling center.

    Also note that my solar usage is for relatively low-voltage (typically 12v / 4S configs), not mobile, and not critical. No bleeder boards, just common-sense HVC, LVC and a dose of monitoring since I like to do battery maintenance. Wouldn't hand it over to my neighbor though.

    Thanks for the feedback. Anything that makes me want to study deeper is all right!
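    The IR-spread effect wb9k describes above can be put into a quick numeric sketch; the OCV and milliohm figures are invented for illustration, not measured from any real cell:

    ```python
    # Each series cell under charge sits at roughly OCV + I*R, so mismatched
    # DC resistances produce a voltage spread that grows with charge current.

    def cell_v_under_charge(ocv_v, resistance_ohm, current_a):
        """Apparent cell voltage while charging at current_a amps."""
        return ocv_v + resistance_ohm * current_a

    ocv = 3.35                                      # hypothetical resting voltage
    resistances = [0.0010, 0.0012, 0.0015, 0.0020]  # four mismatched cells

    for current in (5.0, 50.0):
        volts = [cell_v_under_charge(ocv, r, current) for r in resistances]
        spread_mv = (max(volts) - min(volts)) * 1000
        print(f"{current:.0f} A charge: spread {spread_mv:.0f} mV")
    ```

    As charge tapers toward CV, the current term shrinks and the cells appear to "drift together", matching the explanation quoted above.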



  • PNjunction
    replied
    Originally posted by Willy T
    The one area I have interest in is the saturation phase of charging. Since charging is a leading voltage and the battery voltage is a lagging voltage, how long should the saturation phase be, or to what level, if it is measured by, say, ending amps? The higher the charge rate, the more disparity I see with shunt-counted amp-hrs returned. By only using a termination voltage on the charge controller, there is an undercharge that accumulates cycle by cycle.
    I'll let the others chime in, but I think we are all familiar with the ol' charge until current drops to C/20, and then call it quits. Taking absorb down to zero is indeed unnecessary and stressful.

    But the question is always, great, but at what voltage? since it really depends upon the application. And should it be measured after hours of rest or during charge? 3.45v *at rest* is considered a fully charged cell. There is some variance, as a GBS prismatic can be from 3.38 to 3.45 or so...

    Probably the best thing I've seen is an actual formula for it! It goes something like this for charging:

    3.45 + (IR * A)

    Where 3.45 is considered the full voltage at rest
    IR is your cell internal resistance
    A is the charge amperage

    Maybe the big guns can comment.

    Those values can come from a prismatic manufacturer if you ask, or perhaps can be special-ordered, so that your cells are individually matched for capacity and internal resistance; the manufacturer will supply a document sheet matching each one's barcode. Not sure if you can special order small cylindricals this way, or, as WB9K points out, whether it's impractical to do for commercial projects.



  • wb9k
    replied
    Originally posted by Sunking
    Not able to edit my replies and not what I meant. I couldn't fix it. It should have read: "If floated to less than 100%, re-balance should not be a problem even though lithium cells are not self balancing. With no parasitic loads, the only lithium imbalance of any significance is self discharge, which is extremely low, so any differences are so small and insignificant they can be ignored."
    OK, this is good, but the last statement assumes that cells never fail. This is not a safe assumption. Please consider that I see virtually every failure the company puts out into the field. It's actually possible that I know the actual field failure modes of LFP battery packs--including HV EV and HEV packs, starter batteries, and commercial grid storage--better than anybody else on earth. I don't know if that's true, but people who know me say it's probably the case somewhat regularly. Why can you not accept that the possibility of cell failure is real?

    Originally posted by Sunking
    Look you may not like it or agree, but Solar is a completely different set of parameters radically different than motive.
    It doesn't matter if I like it, and I happen to agree that this is pretty much always true, but alone this doesn't really mean much, because there WILL in fact be concerns that carry over across all lines. Energy storage is ALWAYS inherently dangerous, and needs to be treated with elevated respect, especially when catastrophic system failure can burn your house down. Even if it's 1 in 100,000, why would you take that kind of risk when it's so easily avoidable?

    Originally posted by Sunking
    Solar does not have the extremes unless you are talking RV where the engine alternator will do most of the heavy work.
    You assume...while making an exception for one of the most common solar install scenarios out there.

    Originally posted by Sunking
    Max charge rate is on the order of C/10 to C/6, and only for a few brief hours in a day. Batteries are in open air at room temps and not sealed up in an oven and coffin. Discharge rates are typically C/20 with an occasional burst of maybe 1C, but on average very slow discharge rates.
    Again, all assumptions that any end user can easily overthrow. I'm getting ready to set my uncle up with a solar rig for his remote cabin that violates a few of your fundamental assumptions. Add that it will be completely unsupervised 95% of the time, and a belt-and-suspenders approach is mandatory--even though the pack is only 12 Volts, about 3-4 kW.

    Originally posted by Sunking
    Couple all that together with LFP batteries that are extremely safe by default, operated in kind conditions, and operating at low voltages of 12, 24, and 48, and they do not need as stringent controls. I know you are talking about cell level, but 4, 8, and maybe 16S at low discharge rates do not really need it.
    Based on extensive real-world experience in many areas, including solar, I strenuously disagree. LFP is indeed extremely safe....maybe the safest energy storage medium of its capability on earth. It's still not totally foolproof. Use your FMEA skills....and remember the Titanic.



  • Sunking
    replied
    Originally posted by wb9k
    Li cells don't self-balance...you can say it til you're blue in the face but it still won't be true. As I explained above, the illusion of "self balance" is in fact an expression of cell variation--the very thing that should be telling you why you DO need to balance, and the more often the better.
    Not able to edit my replies and not what I meant. I couldn't fix it. It should have read: "If floated to less than 100%, re-balance should not be a problem even though lithium cells are not self balancing. With no parasitic loads, the only lithium imbalance of any significance is self discharge, which is extremely low, so any differences are so small and insignificant they can be ignored." Look, you may not like it or agree, but solar is a completely different set of parameters, radically different than motive. Solar does not have the extremes unless you are talking RV, where the engine alternator will do most of the heavy work. Max charge rate is on the order of C/10 to C/6, and only for a few brief hours in a day. Batteries are in open air at room temps and not sealed up in an oven and coffin. Discharge rates are typically C/20 with an occasional burst of maybe 1C, but on average very slow discharge rates. Couple all that together with LFP batteries that are extremely safe by default, operated in kind conditions, and operating at low voltages of 12, 24, and 48, and they do not need as stringent controls. I know you are talking about cell level, but 4, 8, and maybe 16S at low discharge rates do not really need it.



  • Living Large
    replied
    Originally posted by wb9k
    Li cells don't self-balance...you can say it til you're blue in the face but it still won't be true. As I explained above, the illusion of "self balance" is in fact an expression of cell variation--the very thing that should be telling you why you DO need to balance, and the more often the better.
    Dereck never stated in my time here that LFP cells "self balance". There was an extended discussion here 3-4 months ago, in which it was discussed by multiple users that one can monitor the cells and rebalance manually when needed, but for some period of time the cells will tend to stay in balance. That was good enough for me. I knew going in with eyes wide open that I would have to monitor all my cells, and rebalance occasionally. This is for solar. We never discussed motorcycles or cars or any of that, because it isn't what the application is. You guys are now having an argument over balancing that as a novice observer I can tell will never end. You might consider agreeing to disagree. Dereck said months ago that a solar user could choose any balancing method they wanted to - it's whatever "floats" your boat. Everyone's philosophy is a bit different.



  • wb9k
    replied
    Originally posted by Sunking
    Geez dude, you just agreed with me. What the hell do you think I have been talking about all this damn time? Run below 100% and the cells self balance, and if they should ever become unbalanced it is really simple to re-balance. I personally have not had to in 8 months, and if you go to DIY EV Forum or EVTV you will find hundreds of people who have not had any balance problems in 3 years. I grant you that on a large EV high voltage pack cell-level monitors have benefit. But that is only because the number of series cells is so much greater that 3 to 6 volts is not noticeable. But trust me, 3 volts low on a 12 volt system is going to get your attention real damn quick when nothing turns on and you notice your battery is at 8 to 11 volts. You do not need a cell monitor to tell you something is wrong. Your inverter already caught it and shut down.
    Unless something's wrong with your inverter. I had a solar charge controller fail years ago such that it allowed full panel voltage to the battery all the time. Had I been running unsupervised Li, it could have resulted in a fire. Or say you've got an LFP 12 Volt battery that you're pulling out of storage and see it's at 10 Volts, but you have no way of knowing individual cell voltages and assume everyone is balanced, just low. So you put it on the charger and walk away....but what you really have is three cells at near 100% and one near 0%. Another fire waiting to happen. The problem with these approaches you keep suggesting is that you presume to know all circumstances at all times, but that's not the reality--for anybody. These systems should have some redundancy for robust safety and reliability.
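    The storage scenario described above can be put in numbers to show why a pack reading alone is not enough; the cell voltages are illustrative, not measured:

    ```python
    # Two very different 4S packs that read the same at the terminals.
    balanced_low = [2.5, 2.5, 2.5, 2.5]  # uniformly low, safe to charge
    one_dead = [3.3, 3.3, 3.3, 0.1]      # three near full, one near 0%

    print(round(sum(balanced_low), 1), round(sum(one_dead), 1))  # 10.0 10.0

    # A cell-level check catches what the pack voltage hides:
    def safe_to_charge(cells, lvc=2.0):
        """True only if every cell is above a low-voltage cutoff."""
        return min(cells) > lvc

    print(safe_to_charge(balanced_low))  # True
    print(safe_to_charge(one_dead))      # False -> do not charge unattended
    ```

    Both packs read 10.0 V at the terminals, so only per-cell monitoring distinguishes the safe one from the fire waiting to happen.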

    Li cells don't self-balance...you can say it til you're blue in the face but it still won't be true. As I explained above, the illusion of "self balance" is in fact an expression of cell variation--the very thing that should be telling you why you DO need to balance, and the more often the better.



  • Sunking
    replied
    Originally posted by wb9k
    A 14.4 Volt charger is fine too. If the setpoint of the charger cannot overcharge a balanced pack, you're good unless you've got severe imbalance prior to charge, which would indicate that something else is wrong [that's why you need cell-level monitoring to shut things down if the situation becomes dangerous for any single cell]. As you approach 100% SOC, the voltage delta between the cells and the charger goes down, so current becomes self-limiting. Balancers should never have a problem dealing with the small imbalance that develops between charge cycles in such a system. 14.4 Volts = 3.6 Volts per cell; when your charger current falls to zero (or very near to it), you should have a perfectly balanced pack at 100% SOC, every time. With no automatic balancers, I'd rather turn that charger down to 14.0-14.2 and manually balance periodically. I've used this technique with no problems on several of my smaller packs for years now.
    Geez dude, you just agreed with me. What the hell do you think I have been talking about all this damn time? Run below 100% and the cells self balance, and if they should ever become unbalanced it is really simple to re-balance. I personally have not had to in 8 months, and if you go to DIY EV Forum or EVTV you will find hundreds of people who have not had any balance problems in 3 years. I grant you that on a large EV high voltage pack cell-level monitors have benefit. But that is only because the number of series cells is so much greater that 3 to 6 volts is not noticeable. But trust me, 3 volts low on a 12 volt system is going to get your attention real damn quick when nothing turns on and you notice your battery is at 8 to 11 volts. You do not need a cell monitor to tell you something is wrong. Your inverter already caught it and shut down.

