That’s neither fair nor accurate. That slop is ultimately generated by the humans who run those models; they are attempting (perhaps poorly) to communicate something.
> two companies that I already despise
Life’s too short to go through it hating others.
> it's very likely because they are creating a plagiarism machine that will claim your words as its own
That begs the question. Plagiarism has a particular definition. It is not at all clear that a machine learning from text should be treated any differently from a human being learning from text: i.e., duplicating exact phrases or failing to credit ideas may in some circumstances be plagiarism, but no-one is required to append a statement crediting every text he has ever read to every document he ever writes.
Credits: every document I have ever read *grin*
The issue with generative 'AI' isn't that it generates text; it's that it can be (and is being) used to generate high-volume, low-cost nonsense at a scale no human could ever achieve without it.
> Life’s too short to go through it hating others
Only when they don't deserve it. I have my doubts about Google, but I've no love for OpenAI.
> Plagiarism has a particular definition ... no-one is required to append a statement crediting every text he has ever read
Of course they aren't, because we rightly treat humans learning to communicate differently from training computer code to predict words in a sentence and pass it off as natural language with intent behind it. Musicians usually pay royalties to those whose songs they sample, but authors don't pay royalties to other authors whose work inspired them to construct their own stories maybe using similar concepts. There's a line there somewhere; falsely equating plagiarism and inspiration (or natural language learning in humans) misses the point.
My family was ultimately able to convince my grandmother to get rid of her computers altogether, when her dementia really kicked in. I think we were lucky as she never really got on with computers, and would tell anyone who'd listen how computers 'came in' to her office the year she retired (in the 90s) and so never needed to learn.
I'm getting tired of these shitty AI chatbots, and we're barely at the start of the whole thing.
Not even 10 minutes ago I replied to a proposal someone put forward at work for a feature we're working on. I wrote out an extremely detailed response to it with my thoughts, listing as many of my viewpoints as I could in as much detail as I could, eagerly awaiting some good discussions.
The response I got back within 5 minutes of my comment being posted (keep in mind this was a ~5000 word mini-essay that I wrote up, so even just reading through it would've taken at least a few minutes, let alone replying to it properly) from a teammate (a peer of the same seniority, no less) is the most blatant example of someone feeding my comment into ChatGPT with a prompt like "reply to this courteously while addressing each point".
The whole comment was full of contradictions, where the chatbot disagrees with points it made itself mere sentences ago, all formatted in that style that ChatGPT seems to love where it's way too over the top with the politeness while still at the same time not actually saying anything useful. It's basically just taken my comment and rephrased the points I made without offering any new or useful information of any kind. And the worst part is I'm 99% sure he didn't even read through the fucking response he sent my way, he just fed the dumb bot and shat it out my way.
Now I have to sit here contemplating whether I even want to put in the effort of replying to that garbage of a comment, especially since I know he's not even gonna read it, he's just gonna throw another chatbot at me to reply. What a fucking meme of an industry this has become.
The replies you're getting are a bit reminiscent of the "guns don't kill people, people kill people" defense of firearms - like, yes that's true, but the gun makes it a lot easier to do.
That goes without saying; the article reeks of the author's ignorance.