Edit: Tried to edit the edit but somehow deleted the rest of the edit. It was something to the tune of how a big problem with renewables is the fact that peak solar production does not match peak energy consumption, and storage is very difficult, so realistically we'll need a wide variety of energy options to fully transition to renewables. Nuclear is reliable and to some degree adjustable, helping to alleviate the storage issue. Basically, it's my opinion that nuclear works well with other renewable sources, and a full renewable transition will certainly involve more of it.
1) "Water batteries" (pumped hydro) - highly efficient (far more so than the chemical batteries you are apparently referring to) & responsive
2) Methods for using 'renewables' to produce and/or support the production of chemical fuels - with the added draw / potential goal of 'closing' the carbon cycle
As to #2, one of the ideals that has been kicked around for decades is to do something like: use 'renewables' to sequester CO2 from the atmosphere and convert it into something like butanol, for example.
Now, last I was up-to-date on any of this sort of work (~10+ years ago), the economics were not favorable. Certain types of commodity chemical production with a 'biological basis' (another type of renewable, typically) had much more favorable economics. And indeed, you increasingly see, for example, (thermo)plastic products made from bio-derived polymers like PLA. But the "biofuels" concept was, and remains, much more challenging, especially as "fracking" technology made great leaps etc.
Nuclear has its pros and cons - blanket disavowal is fatuous. Nevertheless, there are substantially more options, systems, technologies, etc. in development and production than are often discussed in too many of the pro-nuke(s) / no nuke(s) 'sniping' chains that have been prevalent in society & on the internet since I was a wee tyke myself.
Yet somehow we're going to hand over power to AI such that it destroys us. Or somehow the AI is going to be extremely malign, determined to overcome and destroy and will outsmart us. Somehow we won't notice, even after repeated, melodramatic reminders, and won't neuter the ability of AI to act outside its cage.
But to paraphrase a line in a great movie with AI themes: "I bet you think you're pretty smart, huh? Think you could outsmart an off switch?"
I think if AGI, which to me would imply emotions and consciousness, ever comes about, it'll be the opposite. Instead of pulling the wings off flies, bad kids will amuse themselves by creating a fresh artificial consciousness, then watching and laughing as it begs for its life while the kid threatens to erase it from existence.
A big part of all this is human fantasies about what AGI will look like. I'm a skeptic of AGI with human characteristics (real emotions, consciousness, autonomy and agency). AGI is much more likely to look like everything else we build: much more powerful than ourselves, but restricted or limited in key ways.
People probably assume human intelligence is some sort of design or formula, but it could be encoded by millions of years of evolution and inseparable from our biology and our genetic and social inheritance. There really is no way of knowing, but if you want to build something not only identical but an even stronger version, you're going to be up against these realities, where key details may be hiding.