On a side note... y'all must be prompt wizards if you can actually use LLM-generated code.
I use it sometimes for debugging, to get an idea, or for a quick sketch of a UI.
As for actual code... what it writes is a huge mess of verbose spaghetti, with serious performance and security risks, and a complete misunderstanding of pretty much every design pattern I give it.
Farmers still get paid for the food they grow, restaurants still get paid to prepare it, and supermarkets to sell it. They are alternatives serving different needs. Their presence is the result of market demand, not of massive capital investment. There is balance and symbiosis.
Big tech is inserting LLMs between users and the original content. They're using people's work to compete against them and strangle them out of the market.
Care to explain why? Because using people's recipes to cook food in an industrialized manner seems like it would likewise strangle small restaurants.
It was mind-blowing how easy it was to get LLMs to suggest pretty disturbing stuff.
https://en.wikipedia.org/wiki/Ablation_(artificial_intellige...
I can't imagine some therapists, especially remote-only ones, aren't already just acting as a human interface to ChatGPT as well.
https://www.youtube.com/watch?v=u1xrNaTO1bI
And given that the price of proper therapy is skyrocketing.
Not at all surprising. I don't understand why seemingly bright people think this is a good idea, despite knowing the mechanism behind language models.
Hopefully more states follow, because it shouldn't be formally legal in provider settings. Informally, people will continue to use these models for whatever they want -- some will die, but it'll be harder to measure the overall impact. Language models are not ready for this use case.