When I read things like this, I wonder if it's just me not understanding this brave new world, or if half of AI developers are delusional and really believe they are dealing with a sentient being.
Being in my 30s, I remember Y2K, the ozone layer diminishing, and a rogue comet coming to wipe out humanity, but none of it happened. That's survivorship bias, and the examples in the lecture around wildfires and Covid are surely survivorship bias too.
My wife does not like it when I solve problems instead of just acknowledging them and saying "that's a shame/sad/terrible", but I can't help it. We as engineers are wired to solve problems, not just acknowledge them.
Think of the dog poo dilemma: most people will just point and say, "terrible, someone has let their dog poo there", then proceed to carry on with their day. My engineer brain says let's pick up the poo and then look at solutions to stop it happening again.
So when a crisis happens, I know there are lots of smarter men and women in my field and other areas who won't just get sad about an issue and will instead start working their brains on the problem.
The apocalypse is delayed, permanently.
I'm not super up-to-date on all that's happening in AI-land, but in this quote I can find something that most techno-enthusiasts seem to have decided to ignore: no, code is not free. Immense resources (energy, water, materials) go into these data centers in order to produce this "free" code, and the material consequences are terribly damaging to thousands of people. With the further construction of data centers to feed this free vibe-coding style, we're further destroying parts of the world. Well done, AGI loverboys.
Framing it as "AI" only leads to ignoring the responsibility of those who are making these decisions. It's exactly the same argument used to justify things as "market forces": it permits everything and makes nobody responsible for any of it.