motohagiography · a month ago
I think the author is talking about "exoteric" meaning, which is for public consumption, and "esoteric" meaning, which is for the initiated. Even though they say they aren't dogwhistles or shibboleths, these Straussian memes are closely related, as the accusation asserts that there is an "esoteric" meaning to something beneath its "exoteric" face value.

They may be a converse of the Scissor Statement, which has a dual meaning that is irreconcilable between the separate interpreters. (https://news.ycombinator.com/item?id=21190508)

kp1197 · a month ago
I kind of regret calling it Straussian because of the baggage that comes with that label. The key differentiator here compared to exoteric/esoteric is the stabilization mechanism: social costs to either upgrading one’s own understanding, or upgrading someone else’s understanding. That’s what keeps the readings differentiated and therefore the tiered structure stable. Keeping the structure encourages the memes survival, because it widens the acceptability of the message to meet audiences where they are. I think I need a few more examples and I ought to make this point a little more explicit.

ryandv · a month ago
Right. See also Paul J. Bagley, "On the Practice of Esotericism," 1992. https://sci-hub.se/https://www.jstor.org/stable/2709872?orig...
VikingCoder · a month ago
I don't know what you'd call something structured like this, but I really love that advice:

"You can't change the people around you -

But you can change the people around you."

JumpCrisscross · a month ago
What is the difference between a Straussian meme and a double entendre?
maxbond · a month ago
Allegedly a Straussian meme is "self stabilizing" because it imposes some sort of cost to buying into the lower or higher meaning. So it's a multiple entendre that has ideological or epistemic implications. (I'm not convinced this is a thing, the examples were pretty contrived.)

Whereas in the example here, acting on that advice is costly (it means losing friends) but believing it is free. And there aren't different layers of meaning accessible to different parties. It's straightforwardly a play on words.

klysm · a month ago
I think the difference is that one is about interpretation, the other about lexical similarity.
stocksinsmocks · a month ago
Unrelated to the topic, I’m a bit at a loss as to what to make of this website. There was a link to a really cool analysis of artificial intelligence research. And I guess they are a nonprofit and are raising $2M for “infrastructure“ but I think they mean infrastructure in the sense of software they like, and not the kind of infrastructure normal people think of like plumbing, electricity, and roads. I spent a few minutes browsing, and I can’t even tell what charitable purpose or educational purpose they could be serving. There’s just no clear statement of purpose anywhere other than I suppose being really rational.

Who are these dudes?

petesergeant · a month ago
> Who are these dudes?

Top right in this picture: https://pbs.twimg.com/media/GgTm194WIAEqak3?format=jpg&name=...

arduanika · a month ago
I can't tell whether your question here is satire. I want to believe it's genuine, by the power of HN norms, in which case congratulations, you are one of today's unlucky 10,000.
lavelganzu · a month ago
It's been around long enough to have a Wikipedia page [1], which can give you the main facts & demographics. In short, it started in 2006 as a group blog for people interested in AI. This was long before LLMs, and it was expected among the readership that understanding the math of decision theory would be important to AI. This spiraled out into a general interest in how to be more rational as humans, and LessWrong is largely responsible for rescuing Bayesian statistics from the academic wilderness. Many jargon phrases that are now common in nerdy circles originated there as well. They invented the field of AI safety, and are unhappy about the poor state of AI safety at this time.

There are in-person meetups (primarily as a social group) in most large cities. At the meetups, there is no expectation that people have read the website, and these days you're more likely to encounter discussion of the Astral Codex Ten blog than of LessWrong itself. The website is run by a non-profit called LightCone Infrastructure that also operates a campus in Berkeley [2] that is the closest thing to a physical hub of the community.

The community is called "rationalists", and they all hate that name but it's too late to change it. The joke definition of a rationalist is by induction: Eliezer Yudkowsky is the base case for a rationalist, and then anyone who disagrees online with a rationalist is a rationalist.

There are two parallel communities. The first is called "sneer club", and they've bonded into a community over hating and mocking rationalists online. It's not a use of time or emotional energy that makes sense to me, but I guess it's harmless. The second is called "post-rationalism", and they've bonded over being interested in the same topics that rationalists are interested in, but without a desire to be rational about those topics. They're the most normie of the bunch, but at the same time they've also been a fertile source of weird small cults.

[1] https://en.wikipedia.org/wiki/LessWrong [2] https://www.lighthaven.space/

adastra22 · a month ago
Dude, seriously, walk away. You have stumbled upon the website of a cult that ensnares smart people and eats their brains. No level of exposure is safe.
maxbond · a month ago
Like something that if you look at it too long you won't be able to pull yourself away? Almost as if you were petrified? That's an interesting idea. Someone ought to write an essay.
nothrabannosir · a month ago
This is not how I wanted to find out I’m stupid :((
throw4847285 · a month ago
The fact that the LessWrong crowd will reference a memeified version of Leo Strauss is very telling. To be anti-philosophy but really into Leo Strauss. Curious. I wonder where they encountered his ideas?

Edit: Not sure why I was being coy. I'm talking about the Claremont Institute.

drivebyhooting · a month ago
I would’ve preferred more examples.
janalsncm · a month ago
Yeah if only to justify why a phenomenon deserves its own term and article. I wouldn’t even call the examples the author used memes.
xtiansimon · a month ago
> “There's a bit of a category mistake here, because a single utterance by a father is not a meme. But the example is meant to be an aid to developing an intuition for the general idea.”

In the Dawkins sense, if the Dad’s use of the Santa myth makes the child feel happy, and preserves in some sense their innocence (ignorance of the world the way it really is), then the mother can recreate the same myth pattern elsewhere, most likely through family traditions.

Or in the semiotics of Eco, the parents are overcoding Santa and the child is undercoding Santa—same expressions but different interpretations between the two groups. Maybe childhood lives in that gap.

xtiansimon · a month ago
-1? Semiotic hater, or clinical psychologist? The former doesn’t like language games, and the latter just doesn’t like any suggestion not backed by observation (god knows where they get their ideas).
blamestross · a month ago
I've called things shaped like this "polyentendre".

In my head I think of it as just really high linguistic compression. Minus intent, it is just superimposing multiple true statements into a small set of glyphs/phonemes.

It's always really context sensitive. Context is the shared dictionary of linguistic compression, and you need to hijack it to get more meanings out of words.

Places to get more compression in:

- Ambiguity of subject/object with vague pronouns (and membership in plural pronouns)

- Ambiguity of English word-meaning collisions

- Lack of specificity in word choice.

- Ambiguity of emphasis in written language or delivery. They can come out a bit flat verbally.

A group of people in a situation:

- A is ill

- B poisoned A

- C is horrified about the situation but too afraid to say anything

- D thinks A is faking it.

- E is just really cool

"They really are sick" is uttered by an observer and we don't know how much of the above they have insight into.

I just get a kick out of finding statements like this for fun in my life. Doing it with intent is more complicated.

What the author describes seems more like strategic ambiguity, but slightly more specific. I don't think the label they try to coin here is a useful one.

petermcneeley · a month ago
"Straussian moment" is a Straussian meme