The LLM's whole shtick is that it can read and comprehend our writing, so let's architect for it at that level.
This would be very efficient in avoiding duplication: the entire industry would probably only need a few thousand developers. It would also save material resources and energy. But I think that even if the software these companies produced were entirely reliable and bug-free, it would still be massively outcompeted by the flashy, trend-chasing free-market companies that produce a ton of duplicated output (Monday.com, Trello, Notion, Asana, Basecamp - all of these do basically the same thing).
It's the same with AI, or any other trend like tablets, the internet, or smartphones - people wanted these and companies put their money into jumping aboard. If ChatGPT really were entirely useless and had <10,000 users, it would be business as usual - but execs can see how massive the demand is. Of course plenty are going to mess it up and probably go broke, but sometimes jumping on trends is the right move if you want a sustainable business in the future. Sears and Blockbuster could've perfected their traditional business models and customer experience without getting on the internet, and they would still have gone broke as customers moved there.
However, when I ask for sources, I often find the LLM links to pages that don't fully (or at all) back up the claims being made. Sometimes other websites do, but the sources the LLM gives me often don't. They might be about the same topic I'm discussing, but they don't always seem to validate the claims.
If they could crack that problem it would be a major major win for me.
You could make a model trained on synthetic data that treats poorly-written code as the moral choice. If you then finetuned it to produce good code, it would become a Nazi as well.
The psychosis is worrying, but I think it's an artefact of a new technology that people don't yet have an accurate mental model of (similar to, but worse than, the supernatural powers once attributed to radio, television, etc.). Hopefully AI companies will provide more safeguards against it, but even without them I think people will eventually understand the limitations and realise that it isn't in love with them, doesn't have a genius new theory of physics, and makes things up.