Can we talk about how the Rationalists seem to attract mentally unstable people to their trainings in mental technology, while also targeting the young with so-called rationality camps?
If at the apex of an organization you have a person who has organized his life in such a way as to have sex with several other people, and if many people involved in the movement pay a tithe to the organization or charities it designates, and if many of the members of this organization go crazy thinking about the impending hell (of AGI), how is this different from a cult?
Am I missing something? The only mention of "rationalists" in this article is a note about how this cult leader considers rationalists to be her enemies. What's the relevance of your hostility towards them?
About ten years ago, Ziz participated in a workshop (or multiple workshops, I'm not sure) organized by the so-called rationalist community in the Bay Area. Later she was banned from the community, organized a protest against it, was briefly arrested, faked her suicide and disappeared... and has now reappeared.
She also recruited some of her cult members from the community (not sure which ones).
So, if you want, you can frame it as the rationalist community being a dangerous place that attracts sick people. Or you could frame it as anarchists being violent, vegans being intolerant, or trans women being crazy... because the Zizians are all of that. Everyone is free to choose their own story about them.
It may or may not be important that most rationalists / anarchists / vegans / trans women are not crazy murderers, so maybe the story is mostly about Ziz being Ziz and succeeding in getting a few (fewer than ten) followers.
It has a lot in common with cults and religions: "we've found a way of thinking that can let you make perfect decisions and figure out things in new fields quickly, sometimes even figuring out things that elude 'experts' in that field". It's not inherently cultish, but that idea can attract the same kinds of people who might get sucked into other cults but don't because they're atheist or agnostic.
That cult is kind of the opposite, though: they were looking forward to the impending hell of AGI and thought they were doing things that would get them on the good side of the evil overseer AI of the future. If anything, they weren't going crazy over it; they felt comforted.
They are too rational for religion but desperately need meaning (or whatever), so they convinced themselves they could literally talk directly to god (after he exists, he will simulate their exact personalities at this exact moment in time).
Pick your favourite cult checklist and see how much applies. Rationalists certainly have some cult-like characteristics, but e.g. practically any environmentalist group has all the ones you list and more (especially the targeting the young part). In particular the Rationalists I know don't discourage questioning and dissent (quite the opposite), don't focus much on bringing in money or members, don't give their leaders any exalted status or obey them unquestioningly (quite the opposite), don't encourage people to break the law or disobey the proper authorities, and don't try to isolate people from their outside friends or family.
I suggest you read the section starting "The Zizians, believe it or not, are not the only cult-like groupuscule to have emerged from the heady stew of the Rationalist community" from [1]
Some quotes:
> (Alignment Group) would attempt to articulate a ‘demon’ which had infiltrated our psyches from one of the rival groups, its nature and effects, and get it out of our systems using debugging tools
> there were also psychotic breaks involving demonic subprocess narratives,” and where people in positions of power would “debug” underlings. “I experienced myself and others being distanced from old family and friends, who didn't understand how high-impact the work we were doing was,”
> Scott Alexander, maybe the most prominent Rationalist besides Yudkowsky, suggested that the problem was not really M.I.R.I. or C.F.A.R. so much as that Taylor was in a cult-like group centered around a former M.I.R.I. head
> I don’t know that I have the patience or energy to really get to the bottom of it all except to say: It all kinda sounds pretty culty to me! And I haven’t even gotten into the Burning Man camp Black Lotus or the Monastic Academy for the Preservation of Life on Earth
I think this is about as reasonable as conflating all hippies with Manson, or all Christians with the Waco people.
I have met a few "rationalist" types, and I went to a "rationalist" meetup in San Francisco, although they called it something else and didn't care for that label; they couldn't really get other people to stop calling them that.
The overall vibe was like a tech meetup crossed with a church picnic. There were a lot of programmers and grad students there to do a little professional networking, talk about books they like, whether they should be donating to charity a little, which charities worked best, and how to avoid throwing away the leftover cookies.
The subject of AI millennialism was not broached in my presence, although I did meet some people who were working on AI. If there were any psychos or cult leaders there (or trans people, for that matter), I didn't notice, and no one tried to recruit me to anything. It was a totally normal and pleasant experience.
> Can we talk about how the Rationalists seem to attract mentally unstable people to their trainings in mental technology, while also targeting the young with so-called rationality camps?
This AGI doomerism, which is now also popularized on YouTube etc, is very closely related to the kind of existential questions that mentally unstable people probably ask themselves.
The bar to entry is pretty low, as you need no real skills. You can bootstrap ideas that sound convincing to yourself from nothing pretty quickly. That's my hot take, anyway.
AI doomerism I've seen largely boils down to pretty standard critiques of capitalism -- who controls the means of production (the capital class), who will benefit most from increases in productivity (capital class), who is going to end up poorer (working class).
Unless you mean the folks who believe AI will become AGI and start hurting people directly. Those folks are pretty fringe.
Some of the attempts in this comment section to tar the entire rationalist community with the Zizian brush are ideologically motivated. The third most popular ideology in the US is the hope that technological progress will lead to a good future. (The most popular ideology is Christianity, with Leftism in second place. Note that a single person can subscribe to more than one ideology.) In contrast, the rationalist community grew around publications by Eliezer Yudkowsky written with the hope that they would help people realize that AI research is dangerous [1]. Of course, if technological progress is your ideology, then you are going to resist the idea that the most exciting and powerful technology of the decade is dangerous.
> the rationalist community grew around publications by Eliezer Yudkowsky
Rationalism (in its current form) has been around since long before someone on the internet became famous for their epic-length Harry Potter fanfiction, and it will continue to exist long after LessWrong has become a domain parking page.
Sure, but currently we are discussing (inaccurate portrayals of) the community that grew starting in 2006 around Eliezer's writings. I regret that there is no better name for this community. (The community has tried to acquire a more descriptive name, but none have stuck.)
Hope is not an ideology, but it can be deeply comforting for a person to identify with an entity or process bigger than any person. Some identify with our civilization's process of scientific discovery and technological development.
Ideology is not the only motivation: many here hope to profit personally from the continued rapid development of AI.
"The Religion of Technology" by David F. Noble is a good historical summary of such ideology. The ideology has been around for a long time; the new phase with tech is just the latest iteration of it.
There's also the other time the rationalist community made national news, when a similar cult accidentally gambled away ten billion dollars in cryptocurrency.
Effective altruism has different roots than the rationalists (although, yes, the two communities have become close over the years). I have not seen any statement by Sam Bankman-Fried where he identified as a rationalist.
> The third most popular ideology in the US is the hope that technological progress will lead to a good future. (The most popular ideology is Christianity, with Leftism in second place
Eh? Can you link a source on this, preferably one that defines 'leftism' as well?
Leftism is the belief that society contains a system of oppression and that society's most pressing need is the dismantling of this system. For example, most Americans are opposed to racism IMHO, but being opposed to racism is not enough to qualify a person as a Leftist: the Leftist believes that dismantling the racist system almost always trumps other considerations whereas the non-racist non-Leftist tends to think that many thorny problems would remain in our society even if miraculously all racism, sexism, etc, were eliminated.
Ideology might be necessary for healthy human mental functioning. I'm not trying to make anyone feel bad for having a commitment to an ideology. But there is so much scorn of the rationalist community here in this comment section that rather than reply to every falsehood and every exaggeration, I thought it would be more effective to point out that a commitment to and identification with technological progress can rise to the level of an ideology.
I kind of wish I hadn't brought up ideology here: it would probably have sufficed for me to point out that many here on HN hope to strike it rich (and many who hope to make a career) in the AI boom and to point out why these many would see the rationalist community as an enemy.
> following a traffic stop, United States Border Patrol agent David Maland was killed in a shootout with Teresa Youngblut, who was wounded, and German national Ophelia Bauckholt, who also died in the shootout. The pair were traveling south on Interstate 91 in Coventry, Vermont, when they were pulled over as part of a traffic stop.[1] The two were put under "periodic surveillance" nearly one week before the shooting after they were reported to be armed and wearing all-black tactical clothing when checking in to their hotel.
So these guys were being surveilled for being armed and decked out in tactical gear, so the authorities send in a lone traffic cop to pull them over? And predictably, the cop is killed.
This isn't the first time I've heard of this kind of thing happening. That agent was basically set up to die, sacrificed like a pawn by his superiors playing a sick game of chess. The same thing happened to Darian Jarott, who was directed by his boss to arrest an armed and dangerous meth smuggler without being briefed of the danger, despite the fact that the danger was clearly understood by his superiors (a tactical team with medics were stationed nearby, but Jarott was sent in alone without knowledge of this.) The meth smuggler stepped out of his truck with an AR-15, taking Jarott completely by surprise, and murdered him. The meth smuggler was then chased down and killed by the other officers and agents.
A possibly mentally ill trans woman with the moniker Ziz developed a following blogging and posting on philosophy groups. Ziz developed that following into a cult.
For anyone unfamiliar with Andy Ngo: he has an extreme far-right bias and associates with the Proud Boys. Worth taking anything he says with a heap of salt.
I'll copy someone's comment from a related thread:
G.K.Chesterton knew it, 100 years ago:
"... insanity is often marked by the dominance of reason and the exclusion of creativity and humour. Pure reason is inhuman. The madman’s mind moves in a perfect, but narrow, circle, and his explanation of the world is comprehensive, at least to him."
It's a very common failure mode, in fact. We tend to sweep inconvenient emotions under the rug, into our subconscious basement. Those emotions don't disappear, and their fumes start poisoning the mind, which believes that it sees clearly. Instead, it quickly surrounds itself with a shell of illusion. In that imaginary castle it sits and slowly goes insane. A simple self-test is to ask yourself: do you feel compassion for those around you? The mind alone cannot create compassion; it comes from above to guide the mind. That's also why living a solitary life is a bad idea, especially when your mind wants to self-isolate to guard its precious inner world.
I suspect that, as with many offshoots, they hate the original community.
https://www.cbsnews.com/news/vermont-border-agent-death-expo... is a better, longer form of this article
[1] https://en.m.wikipedia.org/wiki/Killing_of_David_Maland
[1] https://maxread.substack.com/p/the-zizians-and-the-rationali...
Probably not, at least not here.
By "AI-doomer", I mean any person or group that believes that AI research is a threat to human survival.
The Zizians and the rationalist death cult - https://news.ycombinator.com/item?id=42897871 - Feb 2025 (205 comments)
String of recent killings linked to Bay Area 'Zizians' - https://news.ycombinator.com/item?id=42877910 - Jan 2025 (858 comments)
[1] More history here: https://news.ycombinator.com/item?id=42902731
Hope is not an ideology, but I would be interested to see what kind of source you have on that.
Totally recommend the read!
This is (1) a thorough write-up (2) with quotes/info from multiple sources (3) whose author is NOT affiliated with any of these communities.
A warning about them, started over five years ago.
https://en.m.wikipedia.org/wiki/Killing_of_David_Maland
Having many adjectives is nice, because it allows everyone to make this story about their favorite topic.
https://thepostmillennial.com/andy-ngo-reports-trans-terror-...
Once I got to this point, we diverged.
> Mind alone cannot create compassion
How did you come to this conclusion?
https://apnews.com/article/zizians-killings-border-patrol-bf...