The rate of cost reduction has been astonishing; it's unlike anything except Moore's Law. This catches people out.
As well as the usual suspects: cheap fossil fuels, failure to take global warming seriously, the belief that nuclear power would see a similar exponential cost reduction rather than the opposite, and of course anti-green politics.
But if 95% cost reduction is the result of not taking it seriously, would taking it seriously earlier have been even better? Hard to say.
We had silicon solar modules in the 1950s and Moore's law in the 1960s. Another take on the question, then: today we use Moore's law to describe progress in solar modules; to what extent was that realization possible in the 1960s from the fundamentals, or "first principles"?
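For what it's worth, "Moore's law for solar" is usually formalized as a learning curve (Wright's law, or Swanson's law for PV specifically): cost falls by a roughly fixed fraction with each doubling of cumulative production. A minimal sketch, taking the commonly cited ~20% learning rate as an assumed input:

```python
import math

# Wright's law: cost scales as (cumulative output) ** -b, where b is
# derived from the learning rate (fractional cost drop per doubling).
def wrights_law_cost(initial_cost, cumulative_ratio, learning_rate=0.20):
    """Cost after cumulative production grows by `cumulative_ratio`,
    assuming a fixed `learning_rate` drop per doubling of output."""
    b = -math.log2(1.0 - learning_rate)  # experience exponent, ~0.32 at 20%
    return initial_cost * cumulative_ratio ** -b

# Ten doublings (1024x cumulative output) at a 20% learning rate leaves
# (0.8)**10, about 10.7% of the original cost, a ~90% reduction.
print(wrights_law_cost(100.0, 2**10))  # ~10.74
```

Ten doublings at that rate is roughly a 90% cost reduction, which is part of why exponentials like this catch people out.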
If it was clear, why didn't we see rapid prioritization of solar and energy-storage research? Or did we, and I just don't know the actual history? What influences am I undervaluing or not recognizing?
If it wasn't clear, why not? Gaming out the many positive impacts of solar technology feels easy today in a way that it apparently wasn't in the past.
I think this is a really important distinction: research in the lab versus research on the factory floor. Tesla in particular has talked about how much they value engineers who get down into the production process over those working in the lab. That's the "doing" that needs to happen, along with shaking out the upstream supply chains and making all of that cheaper.
We can theorize about what will work in practice, but the price drops are the combination of a 1% saving here, 0.75% there, 0.5% there. Until you have the full factory running, you can't fully estimate your actual numbers, much less come up with all the sequential small improvements that build on each other. And all of that comes together in the design of the next factory, the one that's an order of magnitude larger.
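To make the compounding concrete, here's a toy sketch using the hypothetical percentages above (not real factory data): each saving multiplies into an already-reduced cost, and repeated rounds of improvements add up to a large total reduction.

```python
# Toy illustration: many small multiplicative savings compound.
# The percentages are the hypothetical ones from the comment above,
# applied over successive improvement cycles (not real factory data).
savings_per_cycle = [0.01, 0.0075, 0.005]  # 1%, 0.75%, 0.5%

cost = 1.0
for _ in range(20):  # 20 rounds of shop-floor improvements
    for s in savings_per_cycle:
        cost *= 1.0 - s  # each saving applies to the already-reduced cost

print(f"cost after 20 cycles: {cost:.3f} of original")  # ~0.636
```

Twenty rounds of savings that each look negligible on paper take a third off the cost, and you only discover most of them by running the line.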
> Until you have the full factory running, you can't fully estimate your actual numbers, much less come up with all the sequential small improvements that build on each other.
Why not? Is there a theory, or a school of management or industry, that establishes this foundational principle that gets so commonly invoked? It feels true, but I don't really know why it would be true. There must be great counterexamples, too!
Maybe it goes back to learning by doing: it's a common refrain in outdoor recreation that safety rules are written in blood, that many of our guidelines follow directly from bad things that happened. But surely we can also design safety rules by thinking critically about our activities. Learning by doing vs. theory.