starky · a year ago
Having worked a bit in the industry, I'm a bit skeptical about this study; I've definitely seen studies and experiments that used different initial charging conditions and would have shown better fade performance if this were true.

Not to mention: how much does the increased SEI change the impedance of the cell (thus reducing the subsequent charge speed), and how much does it reduce the capacity available?

Joel_Mckay · a year ago
Agreed, the study summary needs better explanation to justify the contradiction with dozens of other lab tests. We have several boxes of 21700 cells from various manufacturers (Samsung/Sony/Panasonic) undergoing aging trials for over 2 years now.

All LiIon and LiPol chemistries have shown the following:

1. Deep-cycle discharges below 60% of full cut usable charge-cycle counts from around 8000 to under 2000.

2. High-current discharge or rapid charging accelerates capacity loss by about 15% a year.

3. Internal resistance goes up as shorts from dendrite defects damage the cell. Additionally, self-discharge rates increase as the cell degrades.

Very surprising if the technique works for all cell chemistries. =3
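
A minimal back-of-envelope sketch (Python) of how the figures quoted above compound; the 8000/2000 cycle counts and ~15%/year fade are taken from the comment as assumptions, and the 1-cycle-per-day usage pattern is a made-up example:

    # Back-of-envelope using the figures quoted above (assumed, not measured here):
    # gentle cycling ~8000 usable cycles, deep cycling below 60% ~2000 cycles,
    # and fast charge/discharge adding ~15% capacity loss per year.

    def years_of_service(cycles_per_day, usable_cycles):
        """Years until the quoted cycle count is used up."""
        return usable_cycles / (cycles_per_day * 365)

    def capacity_left(years, annual_fade=0.15):
        """Remaining capacity fraction if fade compounds at annual_fade per year."""
        return (1 - annual_fade) ** years

    for label, cycles in [("shallow cycling", 8000), ("deep cycling", 2000)]:
        yrs = years_of_service(cycles_per_day=1, usable_cycles=cycles)
        print(f"{label}: ~{yrs:.1f} years at 1 cycle/day, "
              f"~{capacity_left(yrs):.0%} capacity left if also fast-charged")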

gamblor956 · a year ago
There isn't a contradiction.

This study solely focuses on the very first charge. It doesn't claim that recharging at high currents benefits battery life, only that the first charge at high current forms a larger protective barrier than a first charge at a low current.

Other studies have shown that a larger protective barrier improves lifespan. (See other comments on this thread for more details on the science.)

starky · a year ago
Unless you actually work for a cell manufacturer, you aren't getting completely fresh cells though. They are talking about the first charge after the jelly roll is sealed into the can. When I built cells by hand, the standard procedure was to do the first couple of cycles at 0.01C, record the capacity, and then switch them to the charge rate for the experiment.
jve · a year ago
> deep-cycle discharges below 60% full cuts usable charge cycle counts from 8000 to under 2000 uses.

That is, if you do it a single time, are you down from 8k to 2k? Or does it decrease gradually, with 2k being the worst case?

Where can I read about this? Not a paper, but something more down-to-earth for consumers, so that a consumer knows how to properly maintain various devices (phone/car) for longevity?

algo_trader · a year ago
> Samsung/Sony/Panasonic

> 1. deep-cycle discharges below 60% full cuts usable charge cycle counts from 8000 to under 2000 uses.

Presumably these are NMC variants?

Major Chinese LFP brands come with 6000/10K cycle guarantees (though with specific operational parameters). Are these cycle predictions unrealistic?

CATL/EVE/etc.

m463 · a year ago
Wasn't supercharging EVs frowned upon a lot at first, but later found not to affect battery life as negatively as feared?

https://electrek.co/2023/08/29/tesla-battery-longevity-not-a...

Roark66 · a year ago
This may be so in Teslas, which have quite robust thermal management. It definitely doesn't apply to many other brands. Anyone who knows anything about lithium batteries and sees the temperatures at which fast charging is done will not believe any of the longevity claims. It is up to 55°C during charging, and during subsequent driving on a motorway it can take half an hour for this temperature to drop below 40°C (look up the BYD Seal 1000 mile challenge on YouTube for an example; all at around 9°C ambient).
starky · a year ago
How long does supercharging take? Even at 30 minutes, that is only a rate of 2C, which is not that extreme for some cell chemistries as long as temperature is controlled.
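
For context, C-rate is just current relative to cell capacity, so a full charge in t hours corresponds to roughly 1/t C; a quick sketch (the 30- and 60-minute times below are illustrative):

    # C-rate ~= 1 / (full-charge time in hours); charge times here are illustrative.
    def approx_c_rate(full_charge_minutes):
        return 60.0 / full_charge_minutes

    print(approx_c_rate(30))   # 30-minute full charge ~= 2C
    print(approx_c_rate(60))   # 1-hour full charge   ~= 1C
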
oska · a year ago
The analogy they use in the article is all sorts of dodgy too:

> Removing more lithium ions up front is a bit like scooping water out of a full bucket before carrying it, Cui said. The extra headspace in the bucket decreases the amount of water splashing out along the way. In similar fashion, deactivating more lithium ions during SEI formation frees up headspace in the positive electrode and allows the electrode to cycle in a more efficient way, improving subsequent performance.

Deleted Comment

mensetmanusman · a year ago
Such a cool finding if it pans out in production. A hidden process variable hiding in plain sight.
userbinator · a year ago
They'll never do it because it means decreased profits.

There are articles that appear here and elsewhere semi-frequently about how doing something simple extends battery lifetimes a huge amount, but those never get implemented in practice except perhaps for highly niche applications.

Instead, what usually happens is they'll find a way to make them last the same amount of time, but with higher energy density. The "high voltage" Li-ion cells (>4.2V end of charge) are an example of that process; they would last much longer than previous types if charged to only 4.2V, but manufacturers would rather advertise them at 4.3, 4.35, or even 4.4V(!) for the extra capacity that gives.

mort96 · a year ago
Hm, this doesn't seem to be panning out in practice. Loads of devices have grown "optimize charging" style features in the recent-ish past, and those features are explicitly there to extend battery longevity (at the expense of consumer convenience, even!). Clearly, the market forces are more complex than "short battery lifetime = more frequent device upgrades = profit" (although that effect is certainly *a part of* the equation).
vkou · a year ago
> They'll never do it because it means decreased profits.

This is a lazy dismissal of any process or efficiency improvements.

If buyers are willing to pay for efficiency improvements, products that have them will be more attractive. If they aren't, they won't be.

If your theory were true, we wouldn't have things like rechargeable batteries, low-energy appliances, or light bulbs that would last more than two months.

There's always some performance point beyond which most people largely stop differentiating products based on efficiency or longevity improvements, and I'm not sure consumer Li-ion batteries are at that point yet.

tecleandor · a year ago
I don't think so. You can do your marketing so you "precondition your cells" and "have better charge and longevity at the same size and weight as the competition".

I'm not into Apple, but I guess that if Apple could have chosen between that "lowering performance on iPhones when the battery capacity decreased" shit and "preconditioning the cells to make them last longer", they would have chosen the second and made it very public.

soulofmischief · a year ago
A lot of energy research is speculative and it can take decades for research to go from the lab to the consumer.

This finding, however, specifically integrates with existing infrastructure: no new, unproven technology is needed; we simply juice the batteries more during the initial charge. If it pans out after extensive testing, we could see this technique hitting the market within 2 years.

7speter · a year ago
This would seem to increase profits; for one thing, it would make electric vehicles much more viable for a whole lot more people.
crazygringo · a year ago
> They'll never do it because it means decreased profits.

That's only true under monopoly conditions.

Fortunately, in capitalism, when there are two or more companies making things like phones, those companies actually compete on features. And battery longevity is absolutely a feature consumers care about.

And there's certainly no kind of monopoly conditions in cell phones. Competition is thriving. As it is in most types of portable electronics generally -- Bluetooth speakers, laptops, and so forth.

If you're the company that does it first, that means increased profits because suddenly more people buy your product. And if you're the company that does it last, it means decreased profits because fewer people will buy your product compared to the competition. That's the invisible hand at work.

Deleted Comment

rkagerer · a year ago
TLDR: During a battery's initial "formation" charge, some of the lithium deactivates, forming a squishy, protective layer around the negative electrode called the solid electrolyte interphase (SEI). Today, manufacturers typically do a slow formation charge, during which about 9% of the lithium is lost to the SEI; it was thought this was needed to form a robust layer. But the researchers found that at the higher initial charge currents used in this study, 30% becomes SEI - so you lose some battery capacity (for a given amount of lithium), but wind up with a beefier protective layer on your electrode and better longevity across subsequent charge cycles.
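
A minimal sketch of that tradeoff as described in the TLDR above; the 9%/30% lithium-to-SEI figures come from the article's description, while the nominal cell capacity is a made-up example value:

    # Illustrative only: nominal_ah is hypothetical; the 9% / 30% lithium-to-SEI
    # fractions are from the article's slow vs. fast formation description.
    nominal_ah = 5.0  # example cell capacity before formation losses

    for method, sei_fraction in [("slow formation", 0.09), ("fast formation", 0.30)]:
        usable_ah = nominal_ah * (1 - sei_fraction)
        print(f"{method}: ~{usable_ah:.2f} Ah usable "
              f"({sei_fraction:.0%} of lithium lost to SEI)")
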
user_7832 · a year ago
> so you lose some battery capacity (for a given amount of lithium), but wind up with a beefier protective layer on your electrode and better longevity across subsequent charge cycles

If there's a capacity tradeoff, why not use a slightly modified chemistry (like LTO does, for example)? Though I guess this article was more about the existence of the phenomenon than about exploiting it.

bluSCALE4 · a year ago
And how long does it take to achieve this 9% layer?
rkagerer · a year ago
From the article it sounds like 10 hours, which is reduced to 20 minutes using the higher current.
sharpshadow · a year ago
I was able to revive lithium batteries that had been discharged too much and wouldn't charge, by connecting them to a fully charged one for a couple of seconds.
xxs · a year ago
That's all about the electronics inside the battery, rather than the chemistry. You can force feed them with any power supply, ignoring the 'standard' BMS.
Reubachi · a year ago
This is the equivalent of using a car and its alternator (generator) to jump another car with an alternator/generator you know is bad.

"Risking it for the biscuit."

mleonhard · a year ago
Since a good SEI layer on the electrode is important, couldn't they put the layer on the electrode before assembling the battery? Then they could make the layer's shape more even.
dzhiurgis · a year ago
What's a battery's lifespan? Is it capacity degradation or random failure?

If the discovery slows down capacity degradation, but your EV battery is now 100x more likely to spontaneously fail ($$$), it's not really an improvement. Maybe OK for a consumer device, though.

earleybird · a year ago
There are two lifespans: the shelf life and the number of charge cycles (less of a span, perhaps), where you charge to 100% and discharge to near 0. If you keep your charge/discharge within 80/20, then your battery life is limited primarily by the shelf life. E.g., keep your Nissan Leaf in the 20-80% state-of-charge range and it will probably last 20 years; DC fast charge it to 100% every time and you'll probably only get 2000 cycles (5-7 years) out of it.
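
A rough illustration of the cycle-limited case described above; all the numbers (a 2000-cycle budget, 5-7 charges per week) are the comment's assumptions, not measurements:

    # Cycle-limited lifespan: years until the quoted cycle budget runs out.
    def years_from_cycles(total_cycles, cycles_per_week):
        return total_cycles / (cycles_per_week * 52)

    # DC fast charging to 100% every time, ~2000-cycle budget:
    print(f"{years_from_cycles(2000, 5):.1f} years")  # ~7.7 years at 5 cycles/week
    print(f"{years_from_cycles(2000, 7):.1f} years")  # ~5.5 years at 7 cycles/week
    # Gentler 20-80% use is then bounded more by calendar (shelf) life than by cycles.
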
gibolt · a year ago
It isn't that black and white, plus the Leaf without active battery temperature management isn't a representative example.

Modern Teslas show fairly similar long-tail degradation that is nearly identical for cars that strictly charge at home and those that only supercharge (based on customer vehicle tracking). Most will level off at 85-90% of original capacity.

jostmey · a year ago
I’m confused… Is this just a prediction or has it been experimentally verified?
Euphorbium · a year ago
I remember a recent paper that found that charging at double the current, but with a 2 kHz square wave, basically eliminated battery degradation.
i80and · a year ago
There appear to be two recent papers on this phenomenon:

2021, low-frequency pulsed charging: https://vbn.aau.dk/ws/portalfiles/portal/451327786/C5.pdf

2024, high-frequency pulsed charging: https://onlinelibrary.wiley.com/doi/10.1002/aenm.202400190

Not up to really reading them right now, but this is a pretty neat area of research!