How long does the typical battery last before it needs significant new investment (major maintenance or replacement)? In other words, how long before we need to install another 30 GW not to increase capacity, but to maintain it?
I'm not a pessimist but I know that at this stage in an adoption cycle, it's all gain, but that's not the long-term reality. I'm wondering what the long-term looks like.
In part, if the capacity keeps doubling, replacing 30 GW will be trivial compared to the new capacity. But eventually that will flatten out.
Tesla Megapacks come with a 15-year "no defect" and "energy retention" warranty. A 10- or 20-year "performance guarantee" is available for an additional cost. (That's one example of Li-ion based storage; I assume it won't differ wildly for other producers.)
I think they can in principle last longer if used well. You can control the temperature, and you can use weather forecasts (as part of some network-wide forecast of future required capacity) to decide how much you should charge the batteries. You'd use the "unhealthy" charging/discharging only in extreme situations, while in the day-to-day cycles you'd keep the batteries within the healthy voltage range.
Two years ago, when I was considering installing a battery along with my solar panels, here are the estimates I made:
1. Additional cost of installing a 13.5kWh Tesla Powerwall was quoted as being around £12,000
2. Warranty was 80% capacity after 10 years
Now, my experience with batteries in general is that once they start to go, they deteriorate pretty rapidly; so "80% after 10 years" to me basically means treating 10 years as the lifetime of the battery.
£12k over 10 years is £1.2k per year, or £100 per month: that is, the battery would have to save me £100/month in electricity just to break even -- and that's before factoring in the cost of capital (i.e., if I took that £12k and invested it somewhere else, I'd have a lot more than £12k after 10 years).
Even if the batteries managed to eke out a reasonable capacity for 20 years -- which seems pretty unlikely to me -- I'd still have to somehow save £50/month to break even. All in all the battery just didn't make financial sense for me as an individual.
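For concreteness, here's the arithmetic above as a script (the 5% return is an assumed alternative investment, not a quote):

```python
# Break-even math for a home battery, using the figures above:
# £12,000 install cost, ~10-year useful life (the 80% warranty floor).
cost = 12_000.0          # GBP, quoted install cost
years = 10               # assumed useful lifetime

# Required monthly saving just to recover the purchase price
required_monthly_saving = cost / (years * 12)

# Opportunity cost: the same £12k invested at an assumed 5%/year instead
invested = cost * (1.05 ** years)

print(f"break-even saving: £{required_monthly_saving:.0f}/month")
print(f"£12k at 5% for 10 years: £{invested:,.0f}")
```

The opportunity-cost line is the kicker: at 5% the £12k grows to roughly £19.5k, so the real break-even saving is well above £100/month.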
One expects that the companies doing this on an industrial scale have done the math and managed to find some economies of scale that make it profitable; but it's not out of the realm of possibility that due to a combination of optimism, overselling on the part of battery companies, and FOMO, that they've drastically overestimated the performance of their batteries, and are going to find 15 years down the road that they're beginning to fail without having paid back the capital investment used to buy them.
Warranty or not, I don't think I'm relying on Tesla's performance claims as data. I know, it's warrantied - 'but wasn't it your fault? What humidity level did you store it in? The moisture sensor says ...'
If it keeps on doubling like this, a fraction of the installed base may be affected by this some 10-15 years down the line. That's kind of how the math works out. Most batteries around that time will be new ones. And that seems to be the time line that these things are sold for. They might have quite a bit of life in them beyond that of course. It's not like they just stop working.
In any case, we're talking about first and second generation products here adapted from batteries designed for the automotive industry. It works and it's great but there are better options coming to market that are more optimal for grid storage. Ten years from now, this market is going to be very different in terms of cost, performance, longevity, and volumes produced and deployed.
The world's electricity production is around 25 PWh (25,000 TWh) per year. So you can do some back-of-the-envelope math on that. We're going from hundreds of GWh to TWh of battery production per year in the next few years (overall, not just grid). Each of those batteries is good for thousands of charging cycles. So if you were to cycle each battery every day to supply 1/365th of 25 PWh, you end up needing around 70 TWh of battery. We don't actually need that much, of course, since there are other options on the grid. But it shows that it could be doable and might be what we have in the field in a few decades. A few TWh of battery on standby goes a long way and might just sit there fully charged most of the time.
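A quick sanity check on that figure, using the 25,000 TWh/year number from above:

```python
# If daily-cycled batteries had to supply one full day of world
# electricity demand, how much capacity would that be?
world_twh_per_year = 25_000          # ~25 PWh/year, as stated above
one_day_twh = world_twh_per_year / 365

print(f"one day of world demand: ~{one_day_twh:.0f} TWh")
```

That lands at roughly 68 TWh, consistent with the "around 70 TWh" ballpark.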
I don't know the actual numbers on that, but I do know that stationary storage lasts much longer than e.g. EV batteries, since there are no weight restrictions, so they're cycled more gently, temps are better, they can be kept in a more optimal window, you can use things like LFP that are heavier, and BMS has more latitude with more cells to work with.
The doubling of production is also what drives learning-curve cost decreases at a rate of about 20% per doubling, which is what has reduced battery costs by ~90% over the last two decades and is widely predicted to continue.
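The two numbers are consistent with each other under Wright's law: if each doubling of cumulative production cuts cost by 20%, a 90% total reduction takes about ten doublings. A sketch:

```python
import math

# Wright's law: cost after n doublings = initial_cost * (1 - learning_rate)^n.
# How many doublings does a 90% total cost reduction imply?
learning_rate = 0.20
doublings_for_90pct = math.log(0.10) / math.log(1 - learning_rate)

print(f"~{doublings_for_90pct:.1f} doublings for a 90% cost drop")
```

Ten-ish doublings over two decades is roughly one doubling every two years, which matches the growth rates being discussed here.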
It's easy to explain to finance people. It's a form of commodity speculation. Storage is a buy-low, sell-high business, with a daily cycle. This expansion will continue until peak and valley prices start to level out.
> easy to explain to finance people. It's a form of commodity speculation
Energy trades unlike any other commodity because of its perishability (and it's irregularly perishable at that). There is nothing easy about trading power, though if someone has a background in derivatives, that helps.
For the layperson, I'd recommend reading up on Fabozzi's Fixed Income before trying to explain anything to anyone about trading power.
Doubling in 2024 is great but the estimate for 2025 is nowhere near doubling again. How many doublings will we need to store enough energy that solar can replace all non-renewables? I imagine it's still a lot.
As a consumer in Ohio, variable-rate pricing kind of scares me: It's nice to be able to run the clothes dryer without any concern over what time of day it is. We don't do variable-rate pricing here (at least for residential users).
But I'd manage to sort it out in pretty short order, I think: It usually isn't very important to me what time of day the clothes get dried (as long as they don't sit long enough to get stinky), so I would indeed be motivated to pay attention to that. I, myself, would even become motivated to automate it so that the clothes begin drying automatically when energy is [relatively] cheap.
I can also imagine things like automating water heating: Burn Joules when it is cheap to do so, and store [most of] them for later use at a time that may be more convenient to me. (The math gets interesting on this one.)
Having very cheap energy be available occasionally would also be neat: I've got computationally-intensive tasks that need to get done some day. These cost real money to run on the hardware that I have. It sure would be nice if I could save some money by only doing these tasks when power is cheaper. (Right now, I do try to optimize timing them for energy efficiency. For instance, in the summer it is cheaper to run them on cool nights when the windows might be open than during a hot, sunny day when the aircon might barely be keeping up.)
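A toy version of that scheduling idea, with a made-up day-ahead price forecast (real tariffs and data feeds vary by utility):

```python
# Given an hourly price forecast, pick the cheapest hours to run a
# deferrable load (dryer, water heater, batch compute job, ...).
def cheapest_hours(hourly_prices, hours_needed):
    """Return the indices of the cheapest hours, in chronological order."""
    ranked = sorted(range(len(hourly_prices)), key=lambda h: hourly_prices[h])
    return sorted(ranked[:hours_needed])

# 24 hypothetical $/kWh prices: cheap overnight, a late-afternoon peak
forecast = [0.08] * 6 + [0.15] * 12 + [0.30] * 4 + [0.10] * 2

print(cheapest_hours(forecast, 3))  # the three cheapest hours of the day
```

A real setup would re-fetch the forecast daily and add constraints (e.g. "clothes must be dry by 7am"), but the core optimization is this simple.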
Realistically both things will need to happen. There is a lot of demand that can’t be time shifted by more than a few hours. We need storage for a few weeks of consumption eventually.
Storing even 10% of US electricity usage from summer to winter could be in the tens of trillions of dollars. Spending $10,000 to store $2 of electricity in a battery is not scalable. Other forms of energy storage are the future.
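Ballparking the claim above (the ~4,000 TWh US consumption figure is public; the $150/kWh installed pack price is an assumption, not a quote):

```python
# Rough cost of storing 10% of annual US electricity in batteries.
us_twh_per_year = 4_000           # approximate US annual consumption
stored_fraction = 0.10
pack_cost_per_kwh = 150.0         # USD, assumed installed cost

kwh_needed = us_twh_per_year * stored_fraction * 1e9   # TWh -> kWh
total_cost_trillions = kwh_needed * pack_cost_per_kwh / 1e12

print(f"~${total_cost_trillions:.0f} trillion")
```

Even with generous cost assumptions, seasonal-scale battery storage comes out in the tens of trillions, which is why it's usually ruled out in favor of other storage forms.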
Anyone else starting to get the feeling that the idea of "base load" power was a scam? By "base load" I mean the insurmountable blocker for renewable adoption.
The storage industry has already proven it can basically double national installed capacity from one year to the next: It did so in 2023 and 2022, and in 2021 it more than tripled the previous year’s tally. The continuation of this trend just gets more impressive with time: A few years ago, doubling storage capacity only meant building 1 gigawatt. Now, the industry is looking at adding 14 gigawatts in a year, requiring an unprecedented amount of work at project sites around the country.
It's hard to believe how quickly the renewable revolution is happening. The rise of AI? What about the rise of renewables, holy hell.
I was browsing panels the other day for my upcoming project to convert our entire house to solar (we only bought last year; it's top priority for me, for my children's future) and I cannot believe that I can buy top-of-the-range 550W panels for about $200 USD per panel. Like, my electricity bill is about $400 a month. It's totally ridiculous. I cannot believe more people haven't caught onto this. I'll be able to pay off the whole system, including batteries, in about two years' worth of electricity bills (granted, I'm mostly installing it myself). It's wild.
I've been in your boots for 15 years, and I've barely convinced anyone. Even my thermal panels are viewed with suspicion, and people don't believe me when I tell them my hot water comes from them, with a little help from an electric heater in the winter. They pay for themselves every year, easily. Frustratingly, the more common doubt is "if it's so good, how are there so few people doing it?"
At this point I gave up. I enjoy my free electricity, and even earn some money, but quietly.
A pending change in households is going to 24/48V DC. The inverters are costly, and they feel stupid when you know a lot of appliances have an AC/DC converter to undo what your $3k inverter is doing 10 meters away, losing maybe 20% in the process.
Every time I bring up that subject, people get aggravated and point out that lower voltage means larger cables, which is true, but I wonder if the tradeoff of paying more for thicker cables is worth the simplicity it brings.
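To make that tradeoff concrete, here's a rough resistive-loss comparison at different voltages (the 2 kW load and the 0.05 Ω cable run are illustrative numbers, not measurements):

```python
# Why low-voltage DC needs fat cables: for the same power, current is
# much higher, and resistive loss scales with current squared (I^2 * R).
def copper_loss_watts(power_w, volts, wire_ohms):
    amps = power_w / volts
    return amps ** 2 * wire_ohms

run_resistance = 0.05  # ohms for an assumed 10 m round-trip run
for v in (230, 48, 24):
    loss = copper_loss_watts(2_000, v, run_resistance)
    print(f"{v:>3} V: {2_000 / v:5.1f} A, {loss:6.1f} W lost in the cable")
```

In the same cable, a 2 kW load at 24V loses roughly a hundred times more power than at 230V, which is why you either accept much thicker copper or keep low-voltage DC runs very short.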
> Frustratingly, the more common doubt is "if it's so good, how are there so few people doing it?"
Do you think that 15 years back it was prohibitive from a cost perspective? Was it always this economically viable? I mean, at ~$200 a panel it's a true no-brainer; maybe the initial outlay was a little too much for people way back when?
On the other hand, I guess with electricity prices peaking in many parts of the world, it would've easily paid for itself by now for most people.
How cold is it where you're at? I'm weighing putting in a new tankless boiler system, as well as a rooftop PV, but thermal solar might be a better bet.
> Anyone else starting to get the feeling that the idea of "base load" power was a scam
I don't think it was in the past, it's just becoming obsolete, piece by piece. Each method of more traditional power production has different capabilities for ramping up and down, in descending order: gas, hydro, coal, nuclear. Now we have renewables entering the market, which so far have more or less had to be matched with gas peaker plants for scaling up and down. Batteries are obviously putting downward pressure on peak energy generation.
Furthermore, we've had the classic paradigm of electricity demand, where if I put a load onto the grid, like turning on my oven or flipping a light switch, it must function. Now we have electric cars, heat pumps, hot water heaters, and, in parts of Scandinavia, even washing machines that schedule themselves to run during off-peak times.
Where we find ourselves now is market forces working themselves out, with investors buying into battery storage, and homeowners switching to time-of-use billing for their energy bills to take advantage of cheap electricity at night when charging their cars.
In energy politics we obviously still hear the term base load, but it's now nothing more than rhetoric of an outdated era.
> I don't think it was in the past, it's just becoming obsolete, piece by piece.
This is what I'm questioning though, 30 years of hand waving about "base load", and all the stories about how renewables aren't sufficient, but then, oh wait, actually, we can probably do it now.
Maybe, just maybe the tech wasn't there, but it is convenient that when push comes to shove, we do have the technology. If the investment was there 30 years ago, it feels like we could've made a lot more progress. But the narrative persisted.
How much energy storage do you think those 14 gigawatts of batteries represent? How long do they provide 14 gigawatts for?
Then go cost out what it would take to cover, say, 3 days of solar underproduction due to grey skies for your house. In most studies, the capacity factor of a solar plant is about 25% at best, so that 550W panel is worth about 137W averaged over the course of a year, presuming you can store all of it.
In this case, yes: why can't my energy requirements be met by a fully charged battery grid, solar from somewhere else, a gas peaker, or something similar?
Doesn't seem like a compelling case for running coal power plants 24/7 to be honest.
I don't think it's a scam; scams require malice. It's sufficient that people don't expect exponential change even when it's been happening reliably for ages. After all, even when people do learn about exponential growth, many say things like it has a "knee" or reaches an "inflection point".
There's a survivorship bias: many things do stop growing at some point, but those are harder to notice. You're more likely to perceive the things that did scale exponentially.
Baseload is indeed meaningless unless you put a number on it in GW and GWh actually needed. People wielding the term without doing that (i.e., most of them) are basically insisting that unspecified amounts of energy are needed for unspecified amounts of time for unspecified calamities. It's usually accompanied by some handwavy statements about clouds, weather, and seasonal darkness, and the suggestion that we should instead put all our resources into building nuclear plants.
This is indeed bullshit. Because as soon as you specify those numbers, it becomes a simple engineering challenge with some clear economics that you can model for different solutions.
The numbers for domestic solar are indeed such that most installations pay back within a few years in most parts of the world. The largest cost these days is not even the hardware but the installation and getting the time of the certified experts who can do it. But even factoring in all that, you basically end up earning your money back.
If you plop down enough panels and batteries, you won't need anything else.
The US consumes approximately 4 trillion terawatts/hour of electricity. If this prediction holds true we will have 14 gigawatts of storage this year. You do the math and you tell me whether base load generation is still an issue or not.
"Oh emperor, my wishes are simple. I only wish for this. Give me one grain of rice for the first square of the chessboard, two grains for the next square, four for the next, eight for the next and so on for all 64 squares, with each square having double the number of grains as the square before."
The emperor agreed, amazed that the man had asked for such a small reward - or so he thought. After a week, his treasurer came back and informed him that the reward would add up to an astronomical sum, far greater than all the rice that could conceivably be produced in many many centuries!
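For anyone who wants the actual number from the story:

```python
# 1 + 2 + 4 + ... + 2^63 grains, one term per chessboard square,
# is a geometric series summing to 2^64 - 1.
total_grains = sum(2 ** square for square in range(64))

print(f"{total_grains:,} grains")  # about 1.8 * 10^19
```

At rough grain weights that's hundreds of billions of tonnes of rice, which is the same intuition gap people hit with doubling battery capacity.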
https://ourworldindata.org/learning-curve
Edit: this page claims we need 84 times what we had in 2022, the estimate for 2024 is ~3x higher, so call it ~5 more doublings after this year. https://www.alsym.com/blog/how-much-energy-storage-do-we-nee...
Instantaneous pricing for electricity motivates people and industry to make better choices.
Very cheap energy will enable interesting things too.
I’m good at math. I can see that solar would save me buckets of money for a small investment. It’s just that my hands are tied for other reasons.
This, 100%. This is exactly how I felt about it too. Now it's a simple engineering (and, IMO, economics) challenge, and it turns out it's quite possible.
It's roughly 3,800 TWh (terawatt-hours) per year; no need to invent new units.
So they now have two. /s