As a long-time compulsive Internet user, I am aware of the emotional and psychological risks of this new technology. For now, the ability to find information faster (with certain caveats) means I actually spend less time on the internet than before.
But ChatGPT, for example, showers the user with compliments. I'm sure this encourages user engagement, but it is eerily similar to the "love bombing" of cults from the 70s and 80s. I don't know how to reconcile the long-term risks with the huge short-term gains in productivity.
Are there any technologies or apps that are worse than others, particularly for people with obsessive/compulsive tendencies?
It (and the rest of the blather in responses) is one of the two biggest factors keeping me from using ChatGPT more. But I assume they have numbers showing that people for some reason want it.
I've had custom instructions for ChatGPT for a couple years now to respond in as short and straightforward a way as possible (including quite a few more guidelines, like no exclamation points etc.). I recommend setting up something like that, it helps a lot to avoid blathering and sycophancy.
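For anyone who wants to try this, here is a minimal sketch of the kind of custom instructions being described (the exact wording is my own illustration, not the commenter's actual settings):

```text
Be concise and direct. No compliments, no praise of my questions,
no exclamation points, no filler openings or closings.
If you are uncertain, say so plainly instead of hedging with pleasantries.
Answer first; add caveats only when they change the answer.
```

In ChatGPT this kind of text goes into the Custom Instructions / personalization settings (the exact menu location varies by version and platform).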
I don't even think it's necessarily intentional. The idea of a 'yes man' being successful is very common among humans, and the supply is artificially constrained by the fact that it feels bad to be a sycophant. When you have a bunch of people tuning a model, it's no surprise to me that the variants that frequently compliment and agree with the tester float to the top.
It's funny how pendulum-like life situations can be. At one extreme, with the pendulum pulled to its highest point on one side, you have abuse and constant berating. Once you let the pendulum go, it swings all the way to the highest point on the other side: the 70s/80s cult of "everything is love, man." Eventually the pendulum settles back to a point of equilibrium in the middle. Unless someone manipulates it again, which seems to always happen.
> As a long-time compulsive Internet user, I am aware of the emotional and psychological risks of this new technology.
> Are there any technologies or apps that are worse than others, particularly for people with obsessive/compulsive tendencies?
Social media, gambling, and "freemium game" sites/apps all qualify as worse than LLM-based offerings in the opinions of many. Not to mention the addictiveness of their use on smartphones.
However, the above are relative quantifications and in no way exonerate LLM offerings.
In other words, it doesn't matter how much poop is atop an otherwise desirable sandwich. It is still a poop sandwich.
No, my simple and obvious statement was not "a deep and insightful point". No I am not "in the top 1% of people who can recognize this".
The other thing that drives me crazy is the constant positive re-framing with bold letters. "You aren't lazy, you are just *re-calibrating*! A wise move on your part!".
I don't find it ego stroking at all. It's obviously fake and patently stupid and that verbiage just mucks up the conversation.
The sycophancy is noticeably worse with 4o, the default model when you are not subscribed. My theory is that this is deliberate, to lure emotionally vulnerable users into paid subscriptions.
I feel exactly the same, but I believe this is not universal.
I see a similarity with the repulsion I feel when someone is being nice to me because of a job (or, more generally, when someone addresses me "as a customer").
Not everyone reacts the same, and many people, despite being perfectly aware that the attention they get is purely calculated, are totally fine with it. It's just fair game to them.
I would not be surprised if the same applied to AI obsequiousness: "yes, of course it's flattery; would you prefer to be insulted?" would probably be their answer to that dilemma.
I've never had an AI respond to me with this kind of phrasing. General sycophancy, sure, but nothing that obnoxious. I haven't used ChatGPT much in the last year though; does it speak that way?
The flattery is also a turn-off for me, yet I am not ignorant of the fact that even insincere flattery can be pleasurable. The voice model is even better at flattery - it actually sounds sincere!
If you see this as purely LLM-related, read again:
> AI addiction is the compulsive and harmful use of AI-powered applications. It can involve AI-powered chatbots like ChatGPT, video or image generation apps, algorithm-driven social media platforms, AI-powered gaming, AI companions, AI roleplaying, AI-generated pornography, or any other
The youth is not ready. Infinite pictures of whatever you want to see. Downloaded models have _no_ restrictions.
I'm hoping at some point people will just get turned off by the internet and come to value phone-free human interaction more.
However, I recently went camping with some friends, all of us nearing our 40s, and the other couple kept getting sucked into watching TikTok; one of them showed me a "touching" video that was AI garbage.
As a counter-point, I was able to write lyrics with ChatGPT (lots of back-and-forth to get the right "feel"), then put those words to music with Suno. It took two hours of my time, and my wife definitely had an emotional response to what was produced. There was definitely a human aspect to what the AI produced; it was personal and personalized, and it brought us closer. So AI can strip us of our personhood (especially through false intimacies), but used wisely it can also be a tool to reach parts of our humanity that otherwise might never be touched.
Nobody is ready, and ever will be. Like it or not, we thrive on the scarcity of information. But our instinct to collect it has overpowered that scarcity in a big way, and that will lead to a high degree of neurosis no matter who you are.
Yeah, I think we often point to the youth because we implicitly value them more than others, but I've seen seniors more addicted to TikTok than any kid I've met. In some ways kids have more adaptive power than older generations when confronted with new technology.
> Do I ever use AI applications to quickly check something and then discover that hours have passed?
> Do I ever swear off or set limits around my use of AI, and then break my commitments?
> Do I have binges on AI applications that last all day or late into the night?
> Do I turn to AI whenever I have a free moment?
> Does my use of AI lead me to neglect my personal hygiene, nutritional needs, or physical health?
> Do I feel isolated, emotionally absent, distracted, or anxious when I’m not using my AI applications?
> Does my use of AI contribute to conflict or avoidance in personal relationships?
> Have my digital behaviors jeopardized my studies, finances, or career?
> Do I hide or lie about the amount of time I spend using AI or the kinds of AI-generated content I consume?
> Do I feel guilt or shame around my use of AI?
Hmm, I answered almost all of them with yes, but I'm also a developer using AI and developing AI apps. So I'm not sure what to make of it.
I would say all the questions, except maybe the first one, are about impact on your personal life: "late into the night", "whenever I have a free moment", "personal hygiene", "personal relationships", etc. So if you answer yes to them, I don't think you can use work as an excuse; it is affecting your life outside work.
When a company outsources its core competencies in the pursuit of reducing costs, it inevitably becomes wholly dependent upon its vendors and loses its previous capability to deliver independently. Most companies that choose this path either fail or are purchased by competitors.
The same can be said for individuals who outsource their understanding of both what must be done and how to do it to a statistical text generator.
I tend to agree, with current products, at least the ones I've used. But companies developing AI products would do their investors a disservice if they did not tune their models to maximize engagement. We are in the honeymoon phase of some of these models, but dark times lie ahead.
Interesting article! My take is that AI Addiction is a subset of Digital Addiction. A few weeks ago, I was with extended family and everyone but me was staring ‘lovingly’ at their phones. I tend towards Digital Addiction myself, and I fight back by not carrying my phone when I run errands and try to spend at least a little time every day in nature.
The Apple Watch is a good compromise: some ability to get calls and text messages, but not a very ‘addictive device.’
Does that prompt work? Also, if you don't want the flattering and overly polite statements, why be polite back to the machine with thanks? Why does it need your thanks? It is a computer. It was made to do what you told it to do. It has no emotions. It does not want nor need a little gold star from a helicopter-parent type of user. Just give it instructions.
> ITAA is a Twelve-Step fellowship of individuals who support each other in recovering from internet and technology addiction. This includes social media addiction, phone addiction, video addiction, television addiction, gaming addiction, news addiction, pornography addiction, dating apps, online research, online shopping, or any other digital activity that becomes compulsive and problematic.
Seems to be about general IT/computing addiction (too), which seems even better than a group focusing only on "AI Addiction". Seems like a very active effort (online calendar has multiple events per day), across multiple countries and languages.
I haven't participated (or even seen this before) myself, but as far as I can tell, it's basically a fork of AA and their methodology, but I've also not participated in AA so maybe they're different in some major way? Otherwise it seems like a good approach, take something that is somewhat working, make it more specific and hopefully people into that specific thing can get the help they need.
I've taken part in 12 Step programmes and even occasionally attended AA meetings. After browsing the site for a while I can confirm it seems pretty faithful to the AA methodology, except for the addition of the "top/middle/bottom line" classification of behaviours - because in AA sobriety is universally defined as "abstaining from all alcohol" whereas this fellowship is not proposing that members should never use the internet, so the members need to define for themselves which specific patterns of behaviour qualify as "relapse" and which are risky.
This addition is not new or unique to ITAA, as I understand it was pioneered as the "three circles" model by Sex Addicts Anonymous and has been adopted by other recovery fellowships where the definition of clean/sober is not so binary or universal.
The addiction label is a useful trick. Before criticising it, consider how labelling behaviours as "addiction" and constructing the 12-step infrastructure and community around them, makes it possible for people who suffer to find support and start improving their lives. Most of them will eventually come to understand that it wasn't "addiction" but a symptom of suffering from complex mental health problems. But without that gateway they might have suffered even more, for longer, and potentially with disastrous results.
Gabor Maté - a physician who worked with people with serious substance abuse disorders for many years - talks about how addiction is usually a symptom of some other underlying suffering; often trauma. The addictive behaviours act as a way to avoid confronting that pain.
That may apply to things like serious substance abuse, but what about things like smartphone, social media addiction? I seriously doubt everyone glued to their phone has a trauma. Some things are simply engineered to be addictive.
I guess one could argue that modern life in the industrialized world is deeply understimulating, and phones just provide an escape from that, but that's a matter of living conditions, not trauma.
I’ve heard this take a lot in my life. And I definitely struggle with substance abuse addiction. However I’ve looked inside myself many times to find said trauma or suffering and I just don’t really see anything of note. Perhaps the only way to discover this is through some very expensive therapy sessions, or maybe vaping some 5-Meo-DMT.
Trauma is far too vague and far too appealing to be as useful as people believe. Everyone thinks they have some sort of trauma, and that everything can be boiled down to trauma. Some people are more inclined to addiction and this is not necessarily related to trauma.
Gabor Maté is popular, but he’s an example of an influencer who has one tool (trauma treatment) and applies it to everything. His approach is extremely reductive. Many people get addicted to drugs simply because they like taking the drugs and have poor self control, not because they’re avoiding trauma.
It’s another example of something that isn’t really correct for everyone but can be useful to get people to go to a therapist and get treatment.
I distinctly remember English speakers being less annoying before this guy got everyone relating absolutely everything to trauma. It just seems like a massively reductionist point of view in a world of people more complex than that.
As someone with an incredibly "addictive personality", I've always seen it much more simply. I become addicted to things when there's nothing else I'd rather be doing that is incompatible with the addictive behavior. Like if I'm sitting on the couch scrolling on my phone, if there was something else I'd rather do (not something that I'd "ought to" rather be doing but don't actually want to) then I would be doing that instead.
It’s important to understand that all 12-step programs (all of which are based on AA) approach addiction as a spiritual disease, and the program offers a spiritual solution. 12-step programs also teach that addiction is a progressive disease and that there is no permanent ‘fix’, but rather a daily reprieve contingent on the maintenance of one’s spiritual condition. Here is a concise summary of the 12-step design for living: ‘reverse selfishness, get out and help others’. According to 12-step programs, if you stop working the program, addiction will come back in full force.
It's also important to understand that most of the successes sung by 12-step evangelists come from the <5% it works for.
I'm not against it, but it simply is not the only cure for addiction. In fact, it's demonstrably a very bad program for the 95% who can't hang.
There are much better CBT and medical interventions out there, and millions of people are told every year to ignore them because of 12-step evangelists.
If the West had the answer to addiction in the form of 12-step, we probably wouldn't have the highest rates of addiction in the world; that is probably a sign of societal trauma that no amount of meetings is going to help.
We could also stop gerrymandering the use/abuse line and just make a call as a society: we use drugs (most people most places most of history) or we don't fuck with drugs (most of the Islamic world, certain subcultures).
America is a Puritan-origin society with a temperance faction that has been everything from written into the Constitution to largely ignored; standards for alcohol, cannabis, and prescriptions fluctuate like hemlines: a typical adulthood will see multiple incompatible regimes of acceptable use vs. unacceptable abuse.
None of that has anything to do with the compassionate provision of high-quality medical care to vulnerable people (a strict ethical and practical good). Compassionate provision of high-quality support is both expensive and leaves no room for insider/outsider lizard-brain shit, i.e. not a very American thing to do in the 21st century.
Our society needs to get its shit together on this, not further weaponize it.
It leaves a bad taste in my mouth when people "lie" about psychological terms because they feel it enables a greater good.
I see the point you're making. But we as a society do this a lot, and it hasn't always historically been good for the people who are actually affected by the disorders.
Historically, this has been done by therapists who aren't well connected to the research world. They think they find a framework that works for their patients and promote it. Sometimes it becomes a fad despite not being backed by evidence. It's not always clear what the consequences are, but a common consequence is that many people miss out on actually figuring out what's going on with them and getting evidence-based treatment.
I'm not saying that there is no AI addiction. I'll leave that to the professionals. But I do want to gently push back on the idea that we should raise something to the level of pathology because it seems useful.
And as the parent of kids, there are a lot of habits that become compulsions and where you experience withdrawal if you stop. Reading is one in my family. Exercise is something that's rewarding and you feel bad if you stop. But exercise addiction is a very specific disorder. Just some stuff to keep in mind.
> Downloaded models have _no_ restrictions.

Make of that what you want.
This used to happen on Wikipedia all the time back in the day. It was called going down a rabbit hole. Actually a cool phenomenon IMO.
With AI usage I actually find I spend less time on the internet or going down rabbit holes than I used to without it.
But what if the thing we do is good?
Addicted to eating vegetables, addicted to healthy living, etc.
If a developer is using AI for example and they spend a lot of time doing it, and they're feeling fulfilled and happy, then that's fine.
And that's what it has to come down to: does it have a net benefit or net detriment?