Lithium-ion batteries are falling in cost so rapidly that ramping up any new process is risky business. Form is much further along than this landing page, and it still has a long way to go.
What about from an environmental standpoint, given that these lithium-ion batteries will have to be replaced and recycled every <12 years (as the article says; not sure if that's true)? We have a history of not pricing in negative externalities. Did we do that this time?
> an environmental standpoint, given that these lithium-ion batteries will have to be replaced and recycled every <12 years
I am very interested in this question, but those who raise it never have answers about the negative impacts of mining lithium.
For example, the amount of lithium needed for an EV is an order of magnitude less than the amount of steel needed. What is so bad about lithium mining that it's 10x worse than iron mining, pound for pound?
Nobody has ever answered my request with a concrete environmental concern about lithium mining, such as the acidification that can sometimes happen with iron mining.
I've researched and researched and found nothing, which leaves me thinking that the worst-case scenario for lithium is no worse than the worst case for iron.
Meanwhile, we have immense, well-documented harms from fossil fuel extraction that nobody ever questions, certainly not with the intensity that's reserved for supposedly toxic lithium batteries.
The apparent benefit is massive, so any delay seems to cause massive harm to the environment.
I think we need to flip the question: where is the proof that coal/oil/iron is better for the environment than mining and recycling batteries? (BTW, it's at least 20 years now for grid batteries, with lifetime going up all the time...)
I think 12 years is an underestimate. Lithium-ion batteries will degrade, but they still have usable capacity. There are Tesla Roadsters still going strong, 15 years in. And the battery cell chemistry has since shifted to LFP, which has longer cycle life.
What do you think the negative externalities actually are? Off the top of my head: mining, landfill. Same as other metals.
If the processes to extract lithium from recycled cells become cheap enough to compete with the price of mined lithium, then recycling will happen.
Processes still need to be invented and scaled for that to happen: the only real way I know of to deal with damaged or charged cells is to deep-freeze them, shred them, and then defrost them slowly.
But in either case, lithium is going to end up as waste. Making it cheap enough that cars become affordable and the grid more stable also means that treating batteries as disposable becomes even cheaper.
I don't know how modern batteries fare in landfills. Most modern solar panels, for example, are relatively clean (mostly aluminum, silicon, copper, wee bits of lead). But I'm not a waste-management expert.
Some LFP batteries are now rated for 5,000 cycles or more. Even if you cycled them fully every day, that's about 14 years, and that much cycling is unlikely to be needed or to happen; these might last decades, at which point battery tech might be massively better. Even better chemistries might be on the way: sodium-ion, for example, would be a bit less energy dense but have a similarly long life, contains no lithium, and could be cheap to manufacture in a few years. The biggest drivers here are cost and other properties (like how quickly the battery can deliver its power, and at what capacity).
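A quick back-of-the-envelope on that cycle-count arithmetic (a minimal sketch; the 5,000-cycle figure is the one quoted above, and the cycling rates are just illustrative):

```python
# Rough calendar life implied by an LFP cycle rating.
# 5,000 cycles is the figure quoted above; the cycling rates are illustrative.
rated_cycles = 5000
for cycles_per_day in (1.0, 0.5):
    years = rated_cycles / (cycles_per_day * 365)
    print(f"{cycles_per_day} full cycles/day -> ~{years:.0f} years")
# 1.0/day -> ~14 years, 0.5/day -> ~27 years, and cells typically still
# retain most of their capacity when they reach the "rated" cycle count.
```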
What is the negative externality of recycling batteries? That is way better than having to mine minerals out of the ground. Eventually there won't need to be any significant mining, and all the battery minerals will be in a constant cycle of being used and then recycled.
You want to price in negative externalities for lithium because we didn't price in negative externalities for fossil fuels? Am I understanding you right?
I consider sodium-ion an extension of lithium-ion because the cells are made by the same folks with similar processes. They are still early, but very much in the process of being ramped up, and they will be incredibly useful for stationary storage: lower density, but better durability. Another real technology that Form will have to compete with.
Compressed gas storage scales better than any battery chemistry. Energy Dome claims >75% round-trip efficiency [0], and just inked a deal with Google [1].
Once R&D costs are covered, capacity scales with the size of the gas bag. Without competition from EVs or the volatility of resource extraction markets, there's a clear path to profit here for 10hr+ grid-scale energy storage.
[0] https://www.pv-magazine-australia.com/2025/03/12/energy-dome...
[1] https://energydome.com/energy-dome-inks-a-strategic-commerci...
Though they are also poised to get iron-ore refining to work. That alone could be worth a bunch: numbers assuming 20-year amortization and a 30% average duty cycle (using only summer surplus) suggest around 10 ct/kg of iron metal in capex, plus 3 kWh/kg of iron metal in electricity.
- Round-trip efficiency: how much electricity comes back out per unit of electricity going in
- $/kWh capacity: lower is better; how does the battery's cost scale as additional energy capacity is added?
- $/kW capacity: lower is better; how does the battery's cost scale as additional power capacity is added?
- Power-to-energy ratio: higher is better, up to a point, but not usually at the expense of $/kWh capacity. If your ratio is 1:100, then you're in the range of four days' duration, which means at most about 90 full discharges in a year, which sharply limits the revenue possible.
- Leakage of energy per hour, when charged: does a charged battery hold for hours? Days? Weeks?
These all add up to the $/kWh delivered back to the grid, which determines the ultimate economic potential of the battery tech.
Lithium ion is doing really great on all of these, and is getting cheaper at a tremendous rate, so to compete a new tech has to already be beating it on at least one metric, and have the hope of keeping up as lithium ion advances.
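To make the "adds up to $/kWh delivered" point concrete, here is a minimal levelized-cost-style sketch over those metrics. Every input number is an illustrative placeholder I've chosen, not a figure from the article or this thread:

```python
# Minimal levelized-cost-of-storage sketch over the metrics listed above.
# All inputs are illustrative placeholders (no financing costs, no O&M).

def lcos_per_kwh(capex_per_kwh, capex_per_kw, duration_h,
                 round_trip_eff, cycles_per_year, lifetime_years,
                 charging_cost_per_kwh=0.03):
    """Very rough $ per kWh delivered back to the grid."""
    # Fold the power-side ($/kW) equipment into a per-kWh-of-capacity figure.
    capex_per_kwh_cap = capex_per_kwh + capex_per_kw / duration_h
    kwh_delivered_per_kwh_cap = round_trip_eff * cycles_per_year * lifetime_years
    return (capex_per_kwh_cap / kwh_delivered_per_kwh_cap
            + charging_cost_per_kwh / round_trip_eff)

# A 4-hour battery cycled almost daily vs. a 100-hour store limited to ~90 cycles/year.
print(f"4 h, ~350 cycles/yr:  ~${lcos_per_kwh(150, 300, 4, 0.85, 350, 15):.3f}/kWh")
print(f"100 h, ~90 cycles/yr: ~${lcos_per_kwh(20, 1000, 100, 0.50, 90, 20):.3f}/kWh")
# Despite ~7x cheaper $/kWh capex, the long-duration store is no cheaper per kWh
# delivered here, because it cycles far less often and loses more per round trip.
```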
Some technologies notably have separate $/kW and $/kWh pricing.
For example, the oft-mentioned seasonal European setup: green hydrogen made in summer, injected into former methane caverns, and fed to gas turbines in winter.
Though I guess it's hard to pin down the $/kWh there, because it relies on natural formations.
Then there's the up-and-coming opportunity for green iron refining (ore to metal). It becomes financially practical when fed with curtailed summer surplus from integrated PV/battery deployments whose AC and grid side is undersized relative to PV generation capacity: day/night shifting with local storage, plus peak shaving into iron electrolyzers (which would use some of the day/night-shifting battery's capacity to raise the electrolyzers' over-the-year duty cycle).
For reference, we're looking at electrolyzer capex of around $0.10/kg of iron metal (assuming a 30% average duty cycle over the year and a zero discount rate over a 20-year expected lifespan), plus electricity usage of around 3 kWh/kg of iron metal.
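As a sanity check on those figures, here's the arithmetic they imply, under the assumption (mine, not the comment's) that the electrolyzer is characterized by a capex per kW of electrical input:

```python
# Back-of-the-envelope check of the quoted iron-electrolyzer figures.
# Assumption (mine): the electrolyzer is sized by its electrical input power.
hours_per_year = 8760
lifetime_years = 20        # zero discount rate, per the comment
duty_cycle = 0.30          # summer-surplus-only operation
kwh_per_kg_iron = 3.0      # electricity per kg of iron metal
capex_per_kg = 0.10        # $/kg of iron metal (the "10 ct/kg" figure)

lifetime_kwh_per_kw = hours_per_year * lifetime_years * duty_cycle   # 52,560 kWh
lifetime_kg_per_kw = lifetime_kwh_per_kw / kwh_per_kg_iron           # ~17,520 kg
implied_capex_per_kw = capex_per_kg * lifetime_kg_per_kw

print(f"iron produced per kW over 20 years: {lifetime_kg_per_kw:,.0f} kg")
print(f"implied electrolyzer capex: ${implied_capex_per_kw:,.0f}/kW")   # ~$1,750/kW
```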
I keep seeing comments that Li-ion is getting cheaper at an amazing rate but somehow the 18650 cells I seem to see online keep getting more expensive. Anyone have a source?
Might be the form factor; I think most of the big companies have moved away from 18650 cells. The cheapest full packs (not cells) in the US are about $800 for 5 kWh. Search "Server Rack Battery" on eBay, Amazon, or Alibaba. These things are way cheaper than they were 12 months ago. The raw cells can be had even cheaper, but they require more specialized knowledge and equipment to use.
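For what it's worth, that pack price already works out to a fairly low pack-level $/kWh (cells alone would be cheaper still):

```python
# Pack-level price implied by the "$800 for 5 kWh" server-rack figure above.
pack_price_usd = 800
pack_energy_kwh = 5
print(f"~${pack_price_usd / pack_energy_kwh:.0f}/kWh at the pack level")  # ~$160/kWh
```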
To see whether anything is getting cheaper over time, especially long term, it's useful to adjust for inflation: if everything is getting more expensive quickly, but Li-ion prices rise more slowly than other goods, then adjusted for inflation Li-ion is getting cheaper.
TFA says 75% round trip efficiency, compared to 85% for batteries.
While there is no leakage as such, the storage vessels might require continuous cooling, unless they are buried deep in the ground and very well insulated.
At large enough capacities, where the costs of the turbo-generator and the compressor become relatively small, the cost per stored kWh should be significantly lower than for batteries, especially considering the far longer lifetime.
For small capacities, batteries are certainly preferable, but for very large capacities this should be a very good solution.
A gas-based design seems like it would be better at a small scale - e.g. the facility in the link has a reservoir the better part of a mile away from the turbines, and has a max output of 600 MW or so.
CO2 may actually be a good working fluid for the purpose - cheap, non-toxic except for suffocation hazard, and liquid at room temperature at semi-reasonable pressures. I'm not an expert on that sort of thing, though.
Yeah, they have very little information. They say "20MW" once, but it's not clear what part of it is 20MW. They imply it can be scaled up or down but don't say much.
That is a rather large amount of land, and, more concerning to me, it implies a huge amount of equipment to get to that 200 MWh, which hints at very, very high cost. I wonder how cheap they can get it.
Can somebody versed in thermodynamics explain to me how it can work?
They say that they keep CO2 in liquid form at room temperature, then turn it into gas, and grab the energy so released.
- Won't the gas be very cold on expansion from a high-pressure, room-temperature liquid? It could grab some thermal energy from the environment, of course, even in winter, but isn't the efficiency going to depend significantly on ambient temperature?
- To turn the gas into the liquid, they need to compress it; this will produce large amounts of heat. It will need large radiators to dissipate (and lose), or some kind of storage to be reused when expanding the gas. What could that be?
- How can the whole thing have a 75% round-trip efficiency, if they use turbines that only have about 40% efficiency in thermal power plants? They must be using something else, not bound by the confines of the Carnot cycle. What might that be?
They store the heat from compression and use it during expansion.
You can see it in the little animation on their website. It's marked TES (thermal energy storage).
It looks like their RTE is based on a 10-hour storage time. The RTE is going to drop beyond that sweet spot, but if they're just looking to store excess energy from a solar farm for when the sun isn't shining, that's probably not a huge problem.
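A toy model of why storing the compression heat sidesteps the 40%-turbine intuition: the round trip is roughly a product of component efficiencies rather than a heat-engine efficiency, because the heat isn't converted to work from scratch, it's handed back to the gas. All the numbers below are my own illustrative assumptions, not Energy Dome's:

```python
# Toy model of a thermo-mechanical store with thermal energy storage (TES).
# Compression heat is stored and returned on expansion, so round-trip
# efficiency is roughly a product of component efficiencies, not a
# Carnot-limited heat-to-work conversion. All values are illustrative.

eta_compression = 0.85   # electricity -> compressed/liquefied gas (assumed)
eta_expansion   = 0.90   # expander + generator efficiency (assumed)
tes_return      = 0.97   # fraction of compression heat usefully returned (assumed)

rte = eta_compression * eta_expansion * tes_return
print(f"illustrative round-trip efficiency: {rte:.0%}")   # ~74%
```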
Storing the heat is the key part, I suppose, even though they are focusing on storing CO2.
I wonder if something like the paraffin phase transition could be used to limit the temperature of the heat reservoir, and thus the losses during storage.
My hunch is that they're doing this for three reasons.
1. Decompressing the gas can be used to do work, like turning a turbine. It's not particularly efficient, as you mention, but it can store some energy for a while. Also, the tech to do this is practically off-the-shelf right now and doesn't rely on a ton of R&D to ramp up; well, maybe the large storage tanks do, but that should be all. So it _does_ function, and nobody else is doing it this way, so perhaps all that is seen as a competitive edge of sorts.
2. The storage tech has viable side products, so the bottom line could be diversified so as not to be completely reliant on electricity generation. The compressed gas itself can be sold. Processed a little further, it can be sold as dry ice. Or maybe the facility can be dual-purposed for refrigeration of goods.
3. IMO, they're using CO2 as a working fluid in an attempt to sound carbon-sequestration-adjacent. Basically, doubling down on environmentally-sound keywords to attract investment. Yes, I'm saying they're greenwashing what should otherwise be a sand battery or something else that moves _heat_ around more efficiently.
This is more of a compressed-air battery than a sand battery, except that the "air" is CO2 and it's "compressed" enough to cause a phase change.
Heat-based energy storage is always going to be inefficient, since it's limited by the Carnot efficiency of turning heat back into electricity. It's always better to store energy mechanically (pumping water, lifting weights, compressing gas), since these are already low-entropy forms of energy, and aren't limited by Carnot's theorem.
I don't know much about this CO2 battery, but I'm guessing the liquid-gas transition occurs under favorable conditions (reasonable temperatures and pressures). The goal is to minimize the amount of heat involved in the process, since all heat is loss (even if they can re-capture it to some extent).
I suppose that liquid CO2 just requires much less volume to store, while keeping the pressure within reason (several dozen atm). For it to work though, the liquid should stay below 31°C (88°F), else it will turn into gas anyway.
So, in a hot climate, they need to store it deep enough underground, and cool the liquid somehow below ambient temperature.
> they're using CO2 as a working fluid in an attempt to sound carbon-sequestration-adjacent
Um no, that's unfair. CO2 is an easy engineering choice here. It's easy to compress and decompress, easy to contain, non-flammable, non-corrosive, non-toxic, and cheap. It's used in many applications for these reasons.
While CO2 is now a great evil among the laptop class, it has been a miracle substance in engineering for roughly 200 years now.
> They say that they keep CO2 in liquid form at room temperature, then turn it into gas, and grab the energy so released.
To evaporate something, you need to give it energy (heat). The energy flux through the dome walls is not huge, so CO2 boils away slowly.
> - To turn the gas into the liquid, they need to compress it; this will produce large amounts of heat. It will need large radiators to dissipate (and lose), or some kind of storage to be reused when expanding the gas. What could that be?
Well, you have this giant heatsink called "the atmosphere".
> - How can the whole thing have a 75% round-trip efficiency, if they use turbines that only have about 40% efficiency in thermal power plants?
A quirk of thermodynamics. CO2 is not the _hot_ part, it's the _cold_ part of the cycle.
To explain a bit more: if you confine CO2 and let it boil at room temperature, it will get up to around 70 atmospheres of pressure. You then allow it to expand through a turbine. This will actually _cool_ it to below room temperature; I don't have exact calculations, but it looks like the outlet will be at subzero temperatures.
This "bonus cold" can be re-used to improve the efficiency of storage or for other purposes.
Looking at the diagram on the web page, it seems the key is the water. When expanding, use heat stored in the water to heat the gas; likewise, when compressing CO2 into liquid, use the water to store the excess heat generated?
There are papers that do thermodynamic analysis of similar systems finding something like ~65% efficiency. So 75% might be a bit fluffed up, but not outrageously so.
E.g. if they can use the waste heat for district heating and count that as useful work.
https://www.latitudemedia.com/news/form-energy-brings-in-mor...
The scale of investment required makes it quite hard for new companies to compete on cost:
https://www.theinformation.com/articles/battery-industry-sca...
I worry that answering that question requires answering this question: whose negative externalities?
The major advantage over pumped hydro would be that you do not need very specific geography to make it happen (a 90-300+ m change in elevation).
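For a sense of why pumped hydro needs that kind of head, here's the basic E = m·g·h arithmetic (gravity only; pump/turbine losses ignored):

```python
# Gravitational potential energy stored per cubic metre of water at a given head.
# E = m * g * h; 1 kWh = 3.6e6 J. Losses ignored.
g = 9.81                 # m/s^2
rho_water = 1000         # kg/m^3

for head_m in (90, 300):
    kwh_per_m3 = rho_water * g * head_m / 3.6e6
    print(f"{head_m:>3} m head: ~{kwh_per_m3:.2f} kWh per m^3 of water")
# ~0.25 kWh/m^3 at 90 m and ~0.82 kWh/m^3 at 300 m, which is why pumped hydro
# needs both a big elevation change and a big reservoir.
```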
- What are the areal and volumetric energy densities (kWh/m2 and kWh/m3) of this storage?
- How did they derive their CapEx savings figures?
- What's the peak charge/discharge rate of an installation?
- Can this storage be scaled up or down in capacity and rate, and what are the limiting factors?
One of the few numbers I could find on their site was:
> Our standard frame 200MWh battery requires about 5 ha (12 acres) of land to be built.
They also refer to it as a "20MW/200MWh" plant.
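Taking the 5 ha and the 20 MW/200 MWh labels at face value, two of the questions above can at least be ballparked:

```python
# Rough areal energy density and duration from the numbers on their site.
energy_mwh = 200
power_mw = 20
land_m2 = 5 * 10_000      # 5 ha is about 12 acres

print(f"~{energy_mwh * 1000 / land_m2:.0f} kWh per m^2 of site area")   # ~4 kWh/m^2
print(f"{energy_mwh / power_mw:.0f} h duration at rated power")          # 10 h
```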
I am _very_ suspicious that the efficiency is anywhere close to 75%.