Quite a few come to mind: chemical and biological weapons, beanie babies, NFTs, garbage pail kids... Some take real effort to eradicate, some die out when people get bored and move on.
Today's version of "AI," i.e. large language models for emitting code, is on the level of fast fashion. It's novel and surprising that you can get a shirt for $5; then you realize it's made in a sweatshop and falls apart after a few washings. There will always be a market for low-quality clothes, but they aren't "disrupting non-nudity."
So are beanie babies, NFTs, and garbage pail kids -- things falling out of fashion isn't the same as eradicating a technology. I think that's part of the difficulty: how could you roll back knowledge without some Khmer Rouge-style generational trauma?
I think about the original use of steam engines in the Industrial Revolution -- steam engines were so inefficient that their use didn't make sense outside of pulling their own fuel out of the ground. Many people said, haha, look how silly and inefficient this robot labor is. We can see how that all turned out.[2]
1: https://www.armscontrol.org/factsheets/timeline-syrian-chemi...
2: https://en.wikipedia.org/wiki/Newcomen_atmospheric_engine
The early Industrial Revolution that the original Luddites objected to resulted in horrible working conditions and a power shift away from artisans and toward factory owners.
Dadaism was a reaction to WWI, in which the aristocracy's greed and petty squabbling led to 17 million deaths.
Allow me to repeat myself: AI is for idiots.
You see the same thing playing out with artists and image/video generators too.
We went through this before in art, with Dadaism, Impressionism, and the arrival of photography.
Ultimately, it's just another layer of abstraction we have to get used to -- art is the stuff people create to express themselves.
It is funny to see everyone argue so vehemently with no interest in how the same arguments played out in the past.
Exit Through the Gift Shop is a good movie that explores that topic too, though with near-plagiarized mass production rather than LLMs -- but I guess that's pretty similar anyway!
https://daily.jstor.org/when-photography-was-not-art/
Isn't this a basic feature of the human condition? Not only are we all unaware of how history will turn out (though we can score some points with more or less good guesses), but to a varying extent we are also largely unaware of past and present history.
LLMs are not aware, but they can be trained on a larger historical record than any human could read and regurgitate a syntactically correct summary of any point within it. A very different kind of utterer.
“Modern LLMs suffer from hindsight contamination. GPT-5 knows how the story ends—WWI, the League's failure, the Spanish flu.”
This is really fascinating. As someone who reads a lot of history and historical fiction, I find the idea intriguing: imagine having a conversation with someone genuinely from the period, who doesn't know the “end of the story”.