Readit News
low_tech_love commented on Strange CW Keys   sites.google.com/site/oh6... · Posted by u/austinallegro
throw-qqqqq · 2 days ago
> For today's 10000

Just in case someone didn’t catch the reference: https://xkcd.com/1053/

low_tech_love · 2 days ago
Nice, they used an obscure reference to explain another obscure reference, requiring two nested levels of explanation.
low_tech_love commented on My AI-driven identity crisis   dusty.phillips.codes/2025... · Posted by u/wonger_
bilbo0s · 17 days ago
I think you may be misapprehending what some people think may be happening here.

The issue is not people exploiting one another to rise above each other. The issue is a few people exploiting AIs they own to keep themselves above everyone else, en masse.

They don't lose the need to rise above you; rather, staying above you is the entire point of the AIs they're creating. If you want to compete in the economy of the future, you will have no choice but to also exploit their AIs to try to rise above others, thereby helping the owning class rise further above you.

Why not make your own? Because you don't happen to have an army of AI researchers to build one for you. Why not use open-source models? Again, you can, but then you're betting those models will keep up with the commercial ones, and right now the capability gap is widening.

People in power wouldn't need to fight AIs to keep themselves in power; rather, you yourself, by using the AIs they own to make a living, would more deeply entrench their already extant power.

low_tech_love · 17 days ago
You’re not wrong, but I was addressing a specific point the author made in a specific sentence. The idea that people will get some sort of universal salary and be left alone to just do some woodworking in their backyard (in a non-profitable way I mean) is to me absurd. It doesn’t matter if AI solves all the problems of humankind; if someone sees you getting a salary without “earning” it (whatever the hell that means), then you will be bothered, no question about it.
low_tech_love commented on My AI-driven identity crisis   dusty.phillips.codes/2025... · Posted by u/wonger_
jjani · 17 days ago
> the current level of exploiting each other doesn't come naturally for people.

Agreed, it doesn't. Nothing about that fact seems reason for optimism, however. It's a gigantic leap to go from "the current level of exploiting each other doesn't come naturally for people" to "that level is going to decrease, or stop rising, any time soon".

low_tech_love · 17 days ago
It doesn’t as long as we keep each other in check. I do believe it’s natural for humans to take advantage of opportunities that arise in their environment; if a human being sees that they can do better at the expense of others, then that seems to me a very natural response for an individual with a single, disconnected brain (as opposed to some kind of hive mind).

But don’t mistake “natural” for “good”. Actually, that is much more natural (in a wild sense) than having a complex society full of moral and philosophical constraints. I myself believe very strongly in ethics and try to be ethical as much as I can, but that doesn’t mean ethical behaviour comes “naturally”. If you can’t accept that not everyone will be ethical, and that some will act “savagely”, then you’re being naive and opening yourself up to, well, opportunities for exploitation.

low_tech_love commented on My AI-driven identity crisis   dusty.phillips.codes/2025... · Posted by u/wonger_
fulafel · 17 days ago
For a remedy to this flavour of pessimism I'd suggest you read Bregman's Humankind[1]. tldr; the current level of exploiting each other doesn't come naturally for people.

[1] https://en.wikipedia.org/wiki/Humankind:_A_Hopeful_History

low_tech_love · 17 days ago
I don’t agree that this is pessimism; actually, I live a pretty content and optimistic life. However, my comfort and optimism come from being skeptical in a healthy way, from knowing that human beings look out for themselves, and from knowing that I need to look out for myself and my family, keeping others in check in a healthy, respectful way. I don’t buy into this idea that human beings are supposed to be “larger than life” in order to live happily and in balance with each other.
low_tech_love commented on My AI-driven identity crisis   dusty.phillips.codes/2025... · Posted by u/wonger_
low_tech_love · 17 days ago
“I have hopes for a utopian future where AI does everything better than humans, which allows us to spend our time poorly doing the things we are most excited about.”

This thought is quite common and widespread, I’ve heard it multiple times, and it always baffles me. The very idea that human beings would stop exploiting each other and just live a peaceful, content life with their AI helpers is hopelessly distant from the way I understand the world. I wish I were wrong, but human beings don’t exploit each other because we need to, as if it were an unfortunate thing; we exploit each other because we want to and because we can. Even if AI robots solved most of our problems, we would never simply accept that and let others live in peace. Some human beings will always look for opportunities to rise above others, as long as they somehow can. I’d go even further and say that if a future like that ever became feasible and predictable, lots of people currently in positions of power would fight very hard to keep it from happening.

low_tech_love commented on Why are there so many rationalist cults?   asteriskmag.com/issues/11... · Posted by u/glenstein
Viliam1234 · 18 days ago
In theory, there should be a middle way between "waist deep into the movement" and "my research consists of collecting rumors on the internet, and then calling one or two people to give me a quote".

In practice, I don't remember reading an article on the rationality community written from such a position. Most articles are based on other articles, which are based on yet other articles... ultimately based on someone's opinion posted on their blog. (Plus the police reports about the Zizians.)

I think it would be really nice for a change if e.g. some journalist infiltrated the rationality community under a fake identity, joined one of their meetups or workshops, talked to a few people there, and then heroically exposed to the public all the nefarious plans... or the lack thereof. Shouldn't be that hard, I think. New people are coming all the time, no one does a background check on them. Yet for some mysterious reason, this never happens.

Notice how this article describes more bad things in the community than a typical outsider-written article. Three specific rationalist cults were named! The difference is not insider vs outsider, but having specific information vs vibes-based reporting.

low_tech_love · 17 days ago
All reporting is ultimately shaped by the reporter, so you’re never going to escape the need to draw your own conclusions and decide for yourself whether the piece makes sense or not. There is no conspiracy against rationalists; they’re being reported on with the same methods as everything else, and whether you trust journalism in general is up to you.
low_tech_love commented on Why are there so many rationalist cults?   asteriskmag.com/issues/11... · Posted by u/glenstein
meowface · 18 days ago
Asterisk is basically "rationalist magazine" and the author is a well-known rationalist blogger, so it's not a surprise that this is basically the only fair look into this phenomenon - compared to the typical outside view that rationalism itself is a cult and Eliezer Yudkowsky is a cult leader, both of which I consider absurd notions.
low_tech_love · 18 days ago
The view from the inside, written by a person who is waist deep into the movement, is the only fair look into the phenomenon?
low_tech_love commented on Why are there so many rationalist cults?   asteriskmag.com/issues/11... · Posted by u/glenstein
low_tech_love · 18 days ago
The thing with identifying yourself with an “ism” (e.g. rationalism, feminism, socialism) is that, even though you might not want that, you’re inherently positioning yourself in a reductionist and inaccurate corner of the world. Or in other words you’re shielding yourself in a comfortable, but wrong, bubble.

To call yourself an -ist means you consider yourself to give more importance to that concept than other people do: you’re more rational than most, or care more about women than most, or care more about social issues than most. That is wrong both because there are many irrational rationalists and because there are many rational people who don’t associate with the group (and the same goes for the other -isms). The thing is that the very act of creating the label and associating yourself with it will ruin the very thing you strive for. You will attract a bunch of weirdos who want to be associated with the label without doing the work it requires, and you will become estranged from those who prefer to walk the walk rather than talk the talk. Either way, you fail.

The fact is that every ism is a specific set of thoughts and ideas that is not generic, and not broad enough to carry the weight of its name. Being a feminist does not mean you care about women; it means you are tied to a specific set of ideologies and behaviours that may or may not advance the quality of life of women in the modern world, and are definitely not the only way to achieve that goal (hence the inaccuracy of the label).

low_tech_love commented on Optimizing my sleep around Claude usage limits   mattwie.se/no-sleep-till-... · Posted by u/mattwiese
mvieira38 · 20 days ago
And interpreting the results, and writing the blog posts and posting. Become Claude
low_tech_love · 19 days ago
Why experiment at all? Just ask Claude what will happen and it’ll write a nice report.
low_tech_love commented on Silence Is a Commons by Ivan Illich (1983)   davidtinapple.com/illich/... · Posted by u/entaloneralie
low_tech_love · a month ago
“We could easily be made increasingly dependent on machines for speaking and for thinking…”

One thing that bothers me about the ubiquitous encroachment of LLMs into most areas of human writing is that it helps us write faster, but does not necessarily help us read faster. Producing large amounts of human-looking text is instantaneous, but reading it (and acting upon it) still takes the same amount of time. I was, for example, asked to read and review a report from an academic committee, hundreds of pages that mostly look human-written but also somehow not, page after page of slop. I felt like I was staring into the abyss, spending hours reading something that probably took someone seconds to produce with AI. It felt like an absolutely meaningless waste of my time.

The thing is, I think people have missed the fact that the act of reading is inherently connected to the act of writing; I take the time to read something because I know that someone took the time to write it. For those who live or work by writing, it seems the act of writing has become so detached and matter-of-fact that they think “words are words” regardless of whether they were written by a human or an AI. If it helps you write faster, then why not? Like someone who used to cut trees with an axe and suddenly gets a chainsaw as a gift.

But the problem is that, inevitably, we will go down the road of “if you can’t be bothered to write it, I won’t be bothered to read it”, and AI will also be used to read and interpret the writings that it itself generated. So we’ll have documents thousands of pages long that are never going to be read by humans, with AI processing on both ends, so the writing will basically be a payload, a protocol, and nothing more. As such processes become the norm, we’ll be entirely dependent on the AI both to produce text and to read what it produced, and we’ll become enslaved by whatever hidden (or not?) ideological biases its masters have fed it.

u/low_tech_love

Karma: 2778 · Cake day: October 21, 2018