Taking such a strong stance is not something one would do so light-heartedly. I really wonder what went on to drive this person to write such harsh words about her.
Considering the number of people the author has likely seen over 18 years and how many of them he could have complained about... it can't be a coincidence that it's her specifically.
Which is not to say that nobody ever figured out those things and did them well, just that the success rate was low enough across the industry to earn Perl a really bad reputation.
I'd like to see a revival of awk. It doesn't scale up as easily, so there's very little risk that starting a project with a little bit of awk leaves the next person inheriting a multi-thousand-line awk codebase. Instead, you get an early-ish rewrite into a more scalable and maintainable language.
Taco Bell programming is the way to go.
This is the thinking I use when putting together prototypes. You can do a lot with awk, sed, join, xargs, parallel (GNU), etc. Abstraction takes real effort in a bash script, so you don't do much of it and the code stays compact. I've built many data engineering/ML systems with this technique. Those command-line tools are SO WELL debugged and have such reasonable error behavior that you don't have to worry about the complexities of exception handling, etc.
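A minimal sketch of the kind of pipeline I mean, assuming hypothetical tab-separated files events.tsv (user_id, event) and users.tsv (user_id, name); the file names and columns are made up for illustration:

    #!/usr/bin/env bash
    # Prototype: join per-user event counts with a user lookup table.
    set -euo pipefail

    # Count events per user with awk, then sort so join sees ordered keys.
    awk -F'\t' '{count[$1]++} END {for (u in count) print u "\t" count[u]}' events.tsv \
      | sort -k1,1 > counts.tsv

    # Join the counts against the user table on the first column.
    sort -k1,1 users.tsv \
      | join -t $'\t' -1 1 -2 1 counts.tsv - \
      > report.tsv

    # Fan out per-user work across cores with GNU parallel (or xargs -P).
    cut -f1 counts.tsv | parallel -j4 'echo "processing user {}"'

Each stage is a standard tool doing one thing, and the intermediate .tsv files double as debugging checkpoints you can inspect directly.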