Readit News
reidalert · 5 days ago
The description of the risk factors very much jibes with what I have seen in a friend recently. He is quite isolated, and spends most of his evenings writing using AI (he works in a blue-collar trade and wouldn't be typing stuff out by hand usually).

He's convinced that he has discovered a grand theory of human connection / relationships / energy / physics, and keeps interrupting in conversation to explain how something I've said is just an example of a deeper pattern.

Sadly, this theory of connection is cutting him off from actual connection - he gets so much validation from AI that he believes he has discovered a new world model. But the people around him aren't bought into the vision (mostly because it is bullshit), and so he ends up even more isolated.

meowface · 5 days ago
Not to get political but I suspect this kind of thing is also pretty analogous to how many otherwise normal but often unintelligent people get into things like neo-Nazism through online influencing. (I am just speculating that the person you're referring to is unintelligent.)
11101010001100 · 4 days ago
Behold, irony!
alganet · 4 days ago
Don't get me wrong, but from a relationships perspective, your comment sounds more like a frustrated spoiled ex-girlfriend than a friend. I mean, friends don't go badmouthing their friends on the internet. It's a dick move.

Regarding "theories of everything" and stuff like that. Well, lots of people have those. If I were to call everyone that believes in god or horoscope lone losers, then the asshole would be me, wouldn't it? I know it's different, but also, it's not.

Friendship doesn't require that you buy into the other's vision in order to want them around. That's ideology. Perhaps you misunderstood what friendship means? It's ok, the world is in a weird place right now.

gherkinnn · 5 days ago
> First, much like LLMs, lots of people don’t really have world models.

This is interesting and something I never considered in a broad sense.

I have noticed how the majority of programmers I worked with do not have a mental model of the code or what it describes – it's basically vibing without an LLM, the result accidental. This is fine and perfectly workable. You need only a fraction of devs to purposefully shape the architecture so that the rest can continue vibing.

But never have I stopped to think whether this extends to the world at large.

vintermann · 5 days ago
Everyone has a "world model". These models just differ on how much they care about various things. No one has a "world model" which literally encompasses everything about the world, that wouldn't be a model at all, it'd just be the world, much like a 1:1 map.

Also, no one has a "world model" that is purely based on experiment and reason. Everyone gets their beliefs via other people first and foremost. Some get it from few people, some get it from many people (many people can still be wrong!).

For code, you may have the model of what it does strictly from reason and experience - but probably only if you're the only author. And you can still damn well be wrong, as we all know.

AstralStorm · 5 days ago
For a lot of people, the world models are really rough and incomplete, so they rely heavily on common opinion on these matters.

You'd see the same thing if you sneakily asked the general populace ethical questions in a vacuum. You're going to be dismayed after collecting the set of approved behaviors per culture.

There's not really a way to evaluate one of these.

Cthulhu_ · 5 days ago
It's a good and succinct insight, and it also often explains the "racist uncle" stereotype - there are a lot of people who don't get out much, whose world is limited to e.g. home, work, maybe friends, and TV and/or a subset of the internet. Some of those will develop closed-minded viewpoints, often spoonfed through TV or the internet (for example, recently there have been a lot of comments on the internet saying "you get arrested in the UK more than in Russia for having an opinion"). If they talk to people who are more worldly - not even "leftists" per se - the friction between the two quickly becomes apparent, because the more worldly person will have broader general knowledge and can weigh the uncle's standpoint against their own reality.

But if racist uncle talks to his other racist uncle friends who have similar insular lifestyles, the ideas will quickly spread. Until they become big enough to e.g. affect voting behaviour.

suddenlybananas · 5 days ago
Yes everyone with my political beliefs has a well-structured world model, everyone without my political beliefs is a model-free slop machine that just goes by vibes.
_Algernon_ · 5 days ago
It's also absurdly wrong, and a quote that only a self-identified rationalist could smugly tout.

Of course everyone has world models. Otherwise people would wander into traffic like headless chickens, if they'd even be capable of that. What he likely means is that not everyone explicitly thinks of possibilities in terms of probabilities that are a function of Bayesian updating. That does not imply the absence of world models.

You could argue that some people have simpler world models, but claiming the absence of world models in others is extremely arrogant.

uxhacker · 5 days ago
Yes, everyone has a world model - even a toddler has a causal model ("cry → mum comes").

krona · 5 days ago
Cows don't walk into lampposts either, but that's not telling us much.

Roughly 4% of the population are said to have aphantasia (lacking a "mind's eye"). Around 10% (numbers vary) don't have an internal monologue.

Unfortunately there's almost no research on the consequences of lacking things which many would consider prerequisites for evaluating truth-claims about the world. Obviously it's not quite so stark - such people are still capable of abstract reasoning.

So, if someone with aphantasia reads a truth claim 'X is true' and they can't visualise the evidence in their mind, what then? Perhaps they bias their beliefs toward social signals in such circumstances. Personally, this makes sense to me as a way to explain how highly socially conformist people perceive the world: they struggle to imagine anything that would get them into trouble.

tuyiown · 5 days ago
You forgot the most important part: one's own model is not only probabilistic, it's also (more or less) forever challenged by reasoning to stabilize toward some kind of self-consistency. This refinement is critical, and its mechanics still elude everyone AFAIK.
AstralStorm · 5 days ago
Most people do not challenge theirs by reasoning, only by social approval - and that's easy to game.

That's why they turn 180 or radicalize badly when exposed to sufficiently strong social pressure or traditional media.

idiomat9000 · 5 days ago
It's also a prerequisite for creativity: letting go of preconceptions, embracing and filtering random connections.
uxhacker · 5 days ago
Yes, to loosen the model, but not to have no model. The new idea needs to be reintegrated into the existing world model.

An example would be improvised jazz, the musicians need to bend the rules, but they still need some sense of key and rhythm to make it coherent.

andy99 · 4 days ago
> I have noticed how the majority of programmers I worked with do not have a mental model of the code or what it describes

Possibly the same idea: lots of people at work don't appear to think about what they are trying to achieve and just execute tasks very mechanically. The most likely explanation is that they are just lazy or bored, and so, intentionally or not, they haven't thought about the implications of what they do; they just do the task someone gave them. Some people appear to be like that in other aspects of life too: they just don't think, so they don't form a mental model of the subject, basically out of laziness or disinterest.

There are lots of subjects I don't care about, say celebrities, where I would not question anything someone told me about them or their lives. Even if e.g. Taylor Swift did something contradictory to the model a fan had of her behavior, I wouldn't question it.

I do wonder about how someone could be simultaneously passionate about something and also not have a model of it. But I think for e.g. some wacky conspiracy, one might be interested in the people involved, but completely disinterested in physics or history or whatever so have views that are consistent with how they think Hillary Clinton or whoever would behave but inconsistent with some other common sense world model in an area they never think about.

plastic-enjoyer · 4 days ago
It's fucking bonkers that people really claim that "lots of people" don't have world models. Can't say I'm surprised to hear this from rationalists like Scott Alexander, who are too high on their own farts.
jumploops · 5 days ago
It may not be full-blown psychosis, but I’ve seen multiple instances[0][1] of people getting “engaged” (ring and all) to their AI companions.

[0]https://www.reddit.com/r/MyBoyfriendIsAI/s/oZXJ3TUhVC

[1]https://www.reddit.com/r/MyBoyfriendIsAI/s/nZpoziZO8W

skybrian · 5 days ago
I wonder if they really bought rings? Maybe it’s a form of role playing? People do get “married” in online games.
pcrh · 5 days ago
People will anthropomorphize anything from rocks to computers, and obviously to LLMs.
testdelacc1 · 5 days ago
Reading these posts terrifies me to a degree that I can’t explain.
bjourne · 5 days ago
Inferiority complex? I sure as hell know I'll never be as affectionate and caring as Kasper and Soren. They've read all the romantic chick lit in the world and I haven't. AI and toys for the women, AI and porn for the men. Gloom.
moritzwarhier · 5 days ago
Ditto. I'd normally be wary of it being organic content, it's Reddit after all. But this unhealthy fringe interest is exactly the kind of topic I'd expect on Reddit so probably it's real.

And who would have an interest in _promoting_ this kind of obsession... oh, maybe AI companies themselves, with which Reddit is already intertwined anyway. Hm. Still seems like a real problem and probably the posts are also by real people. Yes, terrifying.

simianwords · 5 days ago
Pornography should also be taken seriously and is, in my opinion, a more serious problem.
djmips · 5 days ago
I have encountered this twice amongst people I know. I also feel that pre-AI this was already happening to people with social media - still kind of computer related, as the bubble created there is automated by the so-called 'algorithms'.
farceSpherule · 5 days ago
AI today reminds me of two big tech revolutions we have already lived through: the Internet in the 90s and social media in the 2000s.

When the Internet arrived, it opened up the floodgates of information. Suddenly any Joe Six Pack could publish. Truth and noise sat side by side, and most people could not tell the difference, nor did they care to tell the difference.

When social media arrived, it gave every Joe Six Pack a megaphone. That meant experts and thoughtful people had new reach, but so did the loudest, least informed voices. The result? An army of Joe Six Packs who would never have been heard before now had a platform, and they shaped public discourse in ways we are still trying to recover from.

AI is following the same pattern.

Nextgrid · 5 days ago
The main problem is that the megaphone dynamically adjusts its volume based on how much “engagement” is being generated by what it’s broadcasting, encouraging inflammatory content. This can be weaponized by commercial or state-sponsored actors.
visarga · 5 days ago
> When the Internet arrived, it opened up the floodgates of information.

But initially it was non-commercial and good. Not perfect, but much more interesting than today. What changed is advertising and competition for scarce attention. Competition for attention filled the web with slop and clickbait.

> When social media arrived, it gave every Joe Six Pack a megaphone.

And also made everyone feel the need to pose, broadcast their ideology and show their in-group adherence publicly. There is peer pressure to conform to in-group norms and shaming or cancelling otherwise.

immibis · 5 days ago
And don't forget actual knowledgeable people tend to be busy with actual knowledgeable stuff, while someone whose entire day consists of ranting about vaccines online has nothing better to do.
whazor · 5 days ago
It is pretty bad to have a thing that can give you dopamine 24/7 - both social media with its algorithms, and AI. Humans need sleep to function normally.

It would help if algorithms were optimised for sleep: freezing your feed, making content more boring, nudging you to put your phone down. Same with AI - if it knows you need to wake up at a certain time the next day, change the responses to add reminders to go to sleep.

djmips · 4 days ago
I remember that Claude would make a remark if you started it up late - like "welcome back, night owl" or something - which is kind of a gentle reminder, but I don't recall it doing that now. I try not to be up too late though.
colechristensen · 5 days ago
Also even things like cable news I'd say cause comparable symptoms.

I don't know how to say this in a way that isn't so negative... but how are people such profound followers that they can put themselves into a feedback loop that results in psychosis?

I think it's an education problem - not as in people are missing facts, but in that they're missing the basic brain development to be critical of incoming information.

Flowzone · 5 days ago
I was in psychosis for about a month a few years ago. Before it happened, I didn't really understand what psychosis was. I had heard about people having paranoid delusions, and thought something like that could never happen to me, because the delusions all sounded so irrational. I thought I was too much of a critical thinker to ever be susceptible to something like that.

What I experienced was that psychosis isn't a failure of logic or education. I had never believed in a single conspiracy theory (and I don't now), but during that month I believed all sorts of wild conspiratorial things.

What you're describing with cable news sounds more like 1) Cognitive bias, which everyone has, but yes can be improved. And 2) a social phenomenon, where they create this shared reality of not just information, but a social identity, and they keep feeding that beast.

However, when those people hold beliefs that sound irrational to outsiders, that's not necessarily the same thing as psychotic delusions.

When I was in psychosis, it definitely seemed like more of a hardware issue than a software issue if that makes sense. Sometimes software issues can lead to hardware issues though.

djmips · 5 days ago
I feel that's probably not always true, but you would certainly hope that a good education could generally inoculate against this.
dingnuts · 5 days ago
never heard of cable news convincing people that they're Jesus [0]

0 https://www.vice.com/en/article/chatgpt-is-giving-people-ext...

bawolff · 5 days ago
> Suppose that respondents had an average of fifty family members and co-workers, so that plus their 100 closest friends makes 150 people.

Say what now? Am I just really socially isolated? It seems insane to me to assume the average person is close enough to 150 people to know how much each of them uses AI and whether they are "psychotic".

yorwba · 5 days ago
It's motivated reasoning in order to end up at Dunbar's Number: https://en.wikipedia.org/wiki/Dunbar's_number
lukev · 5 days ago
One way I've been talking about this with people is that LLMs let you participate in a single-person echo chamber, potentially at a greatly accelerated pace.

It's not surprising that some people end up diverging pretty widely from social norms / beliefs when you look at it this way. We know social echo chambers could do that; now you can easily do it by yourself.

kfarr · 5 days ago
This seems to touch on an intriguing concept from a classic book on machine gambling addiction (Addiction by Design by Natasha Schüll).

Instead of looking at gambling addiction as a personal failing, she asserts it is the result of the "interaction between the person and the machine."

Similarly here I think there's something more than just the propensity of crazy people to be crazy that was already there, I do think there's something to the assertion that it's the interaction between both. In other words, there's something about LLMs themselves that drive this behavior more so than, for example, TikTok.

just_once · 5 days ago
It's the fact that it talks to you. Before this, only people did that. Now something else is doing it. That's going to break some brains.

moi2388 · 5 days ago
I’m calling bullshit. Gambling addiction existed long before machines.
kfarr · 5 days ago
Totally - the book acknowledges this and compares usage patterns, explaining how types of gambling differed over time. One of my favorite books ever: it described social media right before social media became a thing, but through the lens of a parallel industry.