Readit News
segasaturn · a year ago
404Media ran an expose on a new LLM product designed to mimic real users having discussions on Reddit and plug your product in the comments, called "Reply Guy" (lol)

https://www.404media.co/ai-is-poisoning-reddit-to-promote-pr...

Google is failing, so users have started appending "Reddit" to their search queries. Where do we go when Reddit is no longer useful and contains the same AI-generated dreck as all the Google search results? It shows how many single points of failure there are on the informational web. Pretty much the only informational resource on the web that's still unscathed is Wikipedia (thanks to herculean-level efforts by its editors, mind you), but I wouldn't bank on it in the same way I wouldn't bank on Reddit. The "information age" might be coming to an end.

stagger87 · a year ago
Very timely, I just came across this account doing this exact thing if you want to see it in action.

https://www.reddit.com/user/Clear-Car862/

If you inspect their comment history, they recommend several products in almost every reply. ContractsCounsel is one of the services they push. The recommendation formula is nearly identical in every post.

Also interesting: one of their only actual posts mentions the phenomenon of using bots to advertise. I guess they're trying to throw people off their trail?

wobfan · a year ago
TBF I just went into this EmploymentLaw sub, and when you look at the posts here (https://www.reddit.com/r/EmploymentLaw/comments/1boucq8/cali...), it just looks like there are bots talking to bots.

I find it pretty funny tbh.

user3939382 · a year ago
> the only informational resource on the web that's still unscathed is Wikipedia

Wikipedia is great as long as the topic isn’t politically controversial. In those cases you get the US State Dept/corporate media-approved perspectives with all the censored perspectives available in the Talk page.

adamomada · a year ago
Don’t forget Corporate PR

I was kinda shocked to see the stats on active editors, laid out fairly well in this report by a source that was banned from Wikipedia by a ridiculously small group

https://thegrayzone.com/2020/06/10/wikipedia-formally-censor...

mauvia · a year ago
It isn't necessarily in the talk page; those are sometimes scrubbed too.

Compare:

https://en.wikipedia.org/w/index.php?title=Talk:Hummus&oldid...

https://en.wikipedia.org/w/index.php?title=Talk:Hummus&oldid...

Yes, there was an edit war over Hummus being called Levantine because some people insisted the Jews were colonizing the food.

q1w2 · a year ago
Even political and historical events that you would not think are controversial have become biased.

As an example, there are some Wikipedia editors that continually remove mentions of genocide from the opening paragraphs of Stalin's page, whereas they leave them there for Hitler.

People really enjoy pushing their ideology in their spare time. I really don't understand it.

racional · a year ago
Examples?
barbariangrunge · a year ago
Tangential: 404 Media is a fun outlet; I always enjoy their articles. Edit: This plug is not by a bot, but in the near future nobody will trust a plug like this, because they'll suspect it's by a bot. Weird
Aardwolf · a year ago
A bot could add "Edit: ...." to their reply just to make it seem like they're human and need to edit their responses
qingcharles · a year ago
"This plug is not by a bot"

Exactly what a bot would say.

kbenson · a year ago
> The "information age" might be coming to an end.

Or we weren't in it all the way quite yet, and the real information age is not defined only by the availability of information, but also by the massive quantity which drowns out simplistic search methodologies.

Maybe this is the natural end state of information systems. First they gather useful information, then they gather all information, then information starts being generated that is tailored to the system for the purpose of being in the system and affecting how it's used, often negatively. I can think of lots of examples, from internal wikis to rumor mills at work.

valval · a year ago
Anything you can’t replicate using given instructions isn’t valid or interesting information. Once machines can be made to understand this, they’ll know to ignore useless information like we (sometimes) do as humans. The situation is not beyond hope by any means if you ask me.
MyFirstSass · a year ago
So are we headed towards some sort of identification, like a passport or driver's license, to be able to post?

Would you be able to create a system where you somehow battle this spam but retain privacy in some way?

Is there an alternative that retains max privacy in a world with a trillion bots spamming away?

I.e. do any good systems exist where, say, you can get a HUMAN-ID via some sort of verification, which then grants you the right to create users, but no one can see which users are tied to which HUMAN-ID? You could only create, say, 5 total, and if some are busted for spam they are all revoked (a bad, Orwellian idea).

Or maybe some advanced federated trust chain, where if enough different people deem you a spammer your accounts can be taken away, but no state power can revoke them in one move or see who you are.
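The HUMAN-ID idea above can be sketched in a few lines. Everything here is hypothetical (class and method names invented for illustration); note that the registry necessarily holds the ID-to-account linkage privately, which is exactly the Orwellian weakness the idea admits:

```python
import secrets

class HumanIdRegistry:
    """Toy sketch: one verified human may mint up to 5 opaque account
    tokens; the linkage is known only to the registry, and a single
    abuse report revokes every sibling account of that human."""

    MAX_ACCOUNTS = 5

    def __init__(self):
        self._minted = {}   # human_id -> set of tokens
        self._owner = {}    # token -> human_id (registry-private linkage)
        self._revoked = set()

    def mint_account(self, human_id: str) -> str:
        if human_id in self._revoked:
            raise PermissionError("HUMAN-ID revoked")
        tokens = self._minted.setdefault(human_id, set())
        if len(tokens) >= self.MAX_ACCOUNTS:
            raise PermissionError("account limit reached")
        token = secrets.token_hex(16)  # opaque name, unlinkable by outsiders
        tokens.add(token)
        self._owner[token] = human_id
        return token

    def report_spam(self, token: str) -> None:
        # Revoking one account revokes all accounts of the same human.
        human_id = self._owner.get(token)
        if human_id is not None:
            self._revoked.add(human_id)

    def is_valid(self, token: str) -> bool:
        human_id = self._owner.get(token)
        return human_id is not None and human_id not in self._revoked
```

The federated-trust variant would replace the single registry with many, so no one party could do the revocation alone.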

mike_hearn · a year ago
Yes it's possible to do this. I wrote up a scheme for that years ago that I called "proof of passport". You can create anonymous identities tied to a hash of your epassport certificate using SGX enclaves and some careful protocol design.

Needless to say, such ideas make some people very unhappy, although it can be done in a way that doesn't grant governments any new powers they don't already have. The most common objection is from Americans who make the same arguments they make about elections: some people don't have id of any kind and shouldn't be expected to get one.

You can also of course buy identities from people who don't care, as a sibling comment says. But that's inevitable for any identity system where identities can be had cheaply.
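The core of a proof-of-passport scheme can be sketched very roughly (hypothetical and greatly simplified; the real design uses SGX enclaves so not even the operator learns the linkage, while here a keyed HMAC stands in for the enclave-held secret):

```python
import hashlib
import hmac

def service_pseudonym(passport_cert: bytes, service_id: str,
                      enclave_key: bytes) -> str:
    """Derive a stable, per-service pseudonym from a passport certificate.

    The same passport always maps to the same pseudonym on a given
    service (so bans stick), but pseudonyms on different services
    cannot be linked without the key."""
    cert_hash = hashlib.sha256(passport_cert).digest()
    return hmac.new(enclave_key, cert_hash + service_id.encode(),
                    hashlib.sha256).hexdigest()
```

One identity per passport per service is exactly what makes bought identities expensive rather than free.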

catbird · a year ago
Even if a passport was required, I think the same problems would appear. There are plenty of people with no interest in ever posting on Reddit. Some of them might be convinced to allow someone else to use a bot to post on their behalf if there is money to be made.
bryan_w · a year ago
I'll just put this out there because I don't know if I could ever implement it: I've had this idea that's essentially "IP permitted from".

We would extend the whois database to contain an OAuth URL for a given IP block. Forums and other services that need to ensure a real human is present (like at registration, or combined with some other trust system) would bounce the user over to that URL, which would require the user to log in via U2F/passkeys/TOTP/etc.

The thinking is that ISPs are the ones who know their customers are real, and as long as they can challenge them in a human-interactive way, that should provide a strong signal that it's a real human. It's also a good way to protect against cookie stealing, and could provide resistance to "man in the browser" attacks, since the end user would become suspicious of all the ISP challenge pages popping up if their machine were being used for spamming.

It's not foolproof: there could be insiders working at the ISP, and it would require cooperation from all ISPs everywhere, but it would be a step in the right direction.
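The lookup step of the idea above could look roughly like this (entirely hypothetical, including the whois-carried OAuth URL itself; no such whois field exists today):

```python
import ipaddress
from typing import Optional

# Hypothetical extension of whois data: IP block -> ISP challenge endpoint.
BLOCK_AUTH_URLS = {
    "203.0.113.0/24": "https://auth.example-isp.net/oauth/authorize",
}

def challenge_url(client_ip: str, callback: str) -> Optional[str]:
    """Return the ISP's human-challenge URL for this IP's block, if one
    is advertised; the forum would redirect the user there at signup."""
    addr = ipaddress.ip_address(client_ip)
    for block, url in BLOCK_AUTH_URLS.items():
        if addr in ipaddress.ip_network(block):
            return f"{url}?redirect_uri={callback}"
    return None  # no participating ISP: fall back to CAPTCHAs etc.
```

After the redirect, the ISP would challenge its subscriber (passkey, TOTP) and send a signed assertion back to the forum's callback, much like ordinary OAuth.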

AbstractH24 · a year ago
Historically speaking, Reddit has been incredibly loose about identifying who is behind an account: not even requiring email verification, let alone a phone number or something more advanced like a driver's license.

The future is likely more similar to LinkedIn

pavel_lishin · a year ago
Folks have suggested web-of-trust systems. I don't know how they would be implemented - for now, I guess this is already sort of a thing on any platform where users can "repost"/"retweet" things.
tmaly · a year ago
We will probably need dogs like they did in the Terminator movies at some point.
Gibbon1 · a year ago
> Is there an alternative that retains max privacy in a world with a trillion bots spamming away?

Block and fine ISPs that host bots. Throw people who run bots in prison.

berniedurfee · a year ago
Seems like we need to start trading anonymity for credibility?

Maybe that’s not such a bad thing.

Anonymity on the web has led to some pretty atrocious behavior.

Plus, at this point, anonymity on the internet is an illusion anyway.

Just use existing trusted CAs to issue personal certs based on some reasonably robust verification process.

Maybe the AI apocalypse will help fix the internet by making anonymity untenable.

6gvONxR4sf7o · a year ago
Huh, that just put a new perspective on facebook's huge push for open LLMs. The less useful anonymous stuff becomes, the more useful content made by people you know IRL is. And that's facebook's/IG's original value proposition.
barbariangrunge · a year ago
Then why is my feed 95% ads lately?
nzealand · a year ago
ReplyGuy was posted here six months back under a different name...

https://news.ycombinator.com/item?id=38070502

chaosharmonic · a year ago
/r/LocalLlama has also had multiple[0][1] trending threads about dead Internet generators just within the past week

[0] https://reddit.com/r/LocalLLaMA/comments/1cc0fyy/i_made_a_li...

[1] https://www.reddit.com/r/LocalLLaMA/comments/1cg39yq/deaddit...

Eisenstein · a year ago
I absolutely love having access to so much information, but it really seems like most people just don't even care. The ability to access experts of all kinds for advice or just to fill curiosity has been a boon to me, and I like sharing what I know with people who are interested. But when I look around at the people I know -- some of them are incredibly smart (much smarter than I am) but instead of making a reddit post or going on a topical forum, they just watch a youtube video or try to attempt whatever it is poorly themselves or just don't care to know more about things.

When I was a kid I wanted to learn electronics, so I got some books and parts at radio shack but certain things weren't obvious for someone who knows nothing, so I didn't know where the ground was supposed to go in a schematic diagram for instance. And the adults around me didn't know -- so I just had to figure it out. Now a kid can go on /r/askengineers and get an answer from an engineer in less than 20 minutes.

But overall -- maybe the 'information age' has backfired for society in general. Those kids will figure out what they want to know regardless of how easy it is, and so many people just look for information that confirms what they already think then weaponize 'facts' so they don't have to budge.

I'm really not sure -- it is so useful to me, but every time a nice place gets an influx of people it turns to shit, so I tend to lean misanthropic in the long term.

slron · a year ago
I'm one of the people who exhibit the behavior you've observed (watching a youtube video rather than creating a forum post, not the "try to attempt whatever it is poorly themselves or just don't care to know more about things" part), so I thought I'd explain how it got to this. First there is the "why not create a reddit post":

A problem I have with reddit is users looking through my post history, trying to extract more information than I present at face value in my post. Sometimes it works out in my favor, because I have an XY problem and get redirected to the correct resource. But most of the time it's just used to determine how they'll engage with me (seriously or not, mockingly or not, high effort or low effort). This is a specific problem that doesn't exist on HN, as a rule.

I'd rather not delete my post after I got my answer, in order to help other people coming in from google searches, so instead I create an account for each subreddit that I post in. If each subreddit should be considered its own forum, it would make sense to have a different account for each forum. It was even once encouraged by reddit for users to have multiple accounts.

The issue now is that most subs "shadow queue" (not shadow ban) posts from new accounts in an attempt to curtail spam. You'll still see your post if you're logged in, but not if you're logged out. And there is no engagement on it until a mod releases it.

Similarly, I am permanently behind a VPN, so creating accounts causes them to be actually shadow banned by default by reddit admins. I must message them to prove that I'm actually human, after realizing that the mods also don't see my post in their queue. Once, after my account was restored on appeal, it got shadow banned again, for reasons unknown to me. I was particularly bitter since I had spent 4 hours making a single high-effort comment on that account.

Even if I've managed to overcome all this friction and my post actually appears in the "new" queue, there are myriad reasons why it won't yield fruitful results. It could be the timing of the day or the week. Perhaps my title wasn't catchy enough. Maybe my wall of text was too big because I tried to fit in enough context, and people's eyes just glazed over. Maybe the post got overshadowed, upvote-wise, by more successful posts and never made it to the hot page of the sub. Perhaps my question is way outside the skill range of that sub's average users (I've seen this happen often, on my posts and others', with various outcomes). Or perhaps the regulars there have seen the same introductory-level questions twice a day over the years and simply refuse to engage with them anymore.

It has gotten to the point that if I can't find someone else with the same question in various wordings as I have on reddit through a site:reddit.com search, I simply assume that no one has the answer.

As for why youtube: it's not where I usually start, but it ends up being the best solution after all other options have been exhausted. I'll give some examples:

For music (I know it's not work related), a lot of amateur/indie music is so old it can't be bought anymore, and can't be licensed to Spotify. Most real piracy solutions are defunct (lack of seeders, dead mediafire/megaupload links). The only way to find the song is some random person's channel that was made 12 years ago and hasn't been updated since.

For many "open core" SaaS products, the problem starts at the documentation. Often I just need a "getting started"/bird's-eye view of the system and how it would potentially connect with the rest of my systems. The first thing you are told to do in the docs is sign up for their managed offering. Once I find my way to the self-hosted section of the docs, I am told to download a docker image. I don't want to download a docker image, or sift through a 500-line dockerfile to figure out which configs are relevant or what it will do to my system. Then they'll have a "you can also compile it from source" link that points to their github project page. If they have a binary upload, excellent. But now I need to figure out what to stick in my config file, which environment variables to set, and which arguments to pass before I have a "sane" startup.

The docs are also an excellent way to get lost in the weeds and "try to attempt whatever it is poorly themselves". You can easily get misled, as terms are recycled between different products with different meanings in each one. They may provide API docs, but no working examples. They may provide working examples, but without any notes, comments, or hints about what each line or command does (To get started, run this command in your console: sudo curl ... | sh). You may have reached a certain point before getting it to work, but now you're stuck and you're not sure where the issue is. Sometimes the docs are sparse, and when you're trying to learn a concept from a page, there are links all over it pointing to other concepts. You don't know if these are advanced concepts you can ignore for now or fundamental prerequisites for what you're trying to learn.

The communities around these products are also less likely to help you out. The product devs are focused solely on building the product, or support only the paid managed SaaS users. The other users are often "drive-by" github issue makers, mostly employees working with said product. They will post massive log dumps and grafana screenshots, with machines provisioned with TBs of memory and clusters with hundreds of nodes. They're there to get their problem solved so they can move on with their workday, not to subscribe to the project page and receive notifications from others who might have the same issues.

Youtube has "solved" these "open core" issues for me more than 3 times now. When you find a good 30min/1hr video or playlist, it's like finding a gold mine. They almost always start with a succinct bird's-eye view, so you can bail early once you realize this isn't what you need, rather than the product's landing page telling you it's the silver bullet for all your problems. The web of concepts has been linearized into a dependency chain for you. You can see the person doing things, and their effects, in real time, without having to commit the effort. You can see which auxiliary (debugging) tools are used and their install process. You can see which commands are more relevant than others, instead of wading through `prog --help` or `man prog`. They comment on what they're doing, so you know the scope and side effects of each command. You can watch at 2x speed, skip, rewind. All of this lets you cement a better fundamental understanding of the product you're working with, rather than just calling up support from the paid managed service and slapping it on your CV.

Then there are all the other fast evolving spheres of tech. Being stuck in the usual enterprise CRUD, it can be hard to dip your toes in adjacent domains. Whether it be finetuning an llm for your purposes, fpgas, linux, gpu shader programming, networking, photoshop/illustrator, video editing, game dev, etc... These domains are all evolving rapidly, and if you want to start and finish something with only a weekend of free time, a youtube tutorial is often good enough.

orangevelcro · a year ago
I think I've come across this bot on reddit before. I read a lot of skincare-related subreddits and people talk about their routine and 'holy grail' products...so that seems like an appealing place for this type of thing to infest.

It wasn't quite obvious marketer-speak, but certain comments just didn't read quite the way a regular commenter would word things.

I figured it was regular humans doing it though. Sigh...

tsunamifury · a year ago
It’s hard to tell because the small communities have their own newspeak which seems weird to outsiders.
jprete · a year ago
I downloaded Wikipedia months ago so I would have access to only-slightly-tainted information during the information winter.
jtriangle · a year ago
I'm glad that I'm not the only one who keeps local backups of Wikipedia
kelseydh · a year ago
Wikipedia is very much scathed on many topics. Especially on political or social subjects, gatekeeping editors watching their pet pages can easily maintain bias or sanitize articles.

Unless you want your week consumed wiki-lawyering your edits to overcome them, the bias remains.

nimajneb · a year ago
I've started appending reddit, servethehome, etc. to my searches. Otherwise the results are lackluster on google or bing.
ccppurcell · a year ago
Are you a bot?
MiguelX413 · a year ago
> 404Media ran an expose...

*exposé

SlowRobotAhead · a year ago
>the only informational resource on the web that's still unscathed is Wikipedia

Ask a conservative about that opinion. Do it before you do exactly what I'm accusing you of and downvoting the bad man who said the thing against "your side".

EDIT: Yea, thanks for the gaslighting but Wikipedia's organized effort to remove conservative editors to shift a left bias in the content is well documented. I'm the crazy one injecting politics into "fair and unbiased wikipedia" lol

Eisenstein · a year ago
Injecting right/left politics into everything is so tired. I hope one day you realize how silly and artificial it is. Stop letting people who benefit from civil strife convince you that we always have to fight each other.
segasaturn · a year ago
Why do you assume that I'm not a conservative?
NoGravitas · a year ago
Wikipedia censors leftist content, too (in favor of "centrist", US State Department positions). Part of the problem is that their definition of neutral point of view is pegged to the editorial biases of the papers of record, which through a combination of corporate ownership and "access journalism" converge on a particular world-view, that of neoliberalism.
krainboltgreene · a year ago
None of this content is AI generated, not sure why you're bringing that up?
segasaturn · a year ago
I personally don't make much distinction between content that's generated by AI (LLMs), posted by bots, and manually forwarded by your grandma to your old AOL account. It's all the same spam, the new stuff is just more sophisticated.
schlauerfox · a year ago
You don't define a bot as AI?
space_oddity · a year ago
> Google is failing, so users start putting "Reddit" on the ends of their search queries

Sometimes you find a subreddit on a specific topic and go there looking for firsthand experiences

dartos · a year ago
> The "information age" might be coming to an end.

This is a little melodramatic, no?

Access to information, even without Wikipedia or Reddit, can still be found easily (compared to pre-internet days). I personally don't use google search anymore, but can still find links to public MIT textbooks (like SICP or Deep Learning) by searching on there. I'm sure google scholar, scihub, and arxiv will be around for a good while.

I’m sure if Wikipedia falls, another encyclopedia would take its place, since so many primary sources are still discoverable if you know the terms to search for. Maybe with a paywall, maybe not.

Phenomenit · a year ago
Yeah, the problem is that academia has the same issue with garbage papers. As long as information has some ad value, be it commercial or political, garbage will fill every space to make a buck.
ffsm8 · a year ago
I don't think it's melodramatic.

The first time I heard people make that claim was around 2020, in the context of corona, IIRC. I think they called it "the age of misinformation", and that has only become more relevant since, so I think it was even more on point than they realized back then.

Tau_Cygna_V · a year ago
We live in crazy times
valval · a year ago
Wikipedia is a hilarious mess of US left-wing political bias.
EGreg · a year ago
I predicted this exact, VERY predictable scenario for years and got downvoted by AI enthusiasts who don't want to deal with any downsides of AI.

Examples: https://news.ycombinator.com/item?id=35688266

We are racing towards the abyss orders of magnitude faster than with climate change or nuclear proliferation, and even the overwhelming majority of AI experts coming out and saying there is at least a 20% chance of a global catastrophe, or even risk of extinction, earns a mere shrug: https://arstechnica.com/information-technology/2023/05/opena...

And CEOs: https://amp.cnn.com/cnn/2023/06/14/business/artificial-intel...

And yet even the most mild, libertarian-friendly proposal to mitigate the harm is utterly rejected by AI fans who gang up on any criticism, as the future botswarms will: https://news.ycombinator.com/item?id=35808289

I said the entire internet will turn into a dark forest, including all forums like Reddit and even HN. Swarms of bots will cooperate to “gang up on” opponents and astroturf to shift public opinion. And the worst part will be when humans start to PREFER bots the way organizations already do (eg trading bots replaced humans on wall street).

The AI people are building a dystopian future for us and won’t face ANY accountability or disincentives, but rather the opposite. I expect this post to be downvoted by AI people chafing at any criticism. (Like the opposite of web3 posts.) The replies, if any would even appear, will be predictably “well, it was all already possible with human efforts”, ignoring the massive difference in scale and cost to malicious actors (well, the replies would have been that if I didn’t call it out just now, because they always are, and hardly any actual substantive discussion of the extreme dangerous outcomes that are only starting to come about in very early stages).

LargeWu · a year ago
Can anybody explain, specifically, what that 20% risk looks like? The most specific I ever see is "an adversarial AI will become sentient and wipe out humanity". It sounds like as much snake oil as the people pushing AI itself.
jprete · a year ago
I've made the same prediction. It was blatantly obvious to me what would happen as soon as I saw GPT 3.5 producing decent quality responses. I had hoped the finger problem of image generators would last longer, but there are a lot of people with absolutely no foresight on the potential downsides of technology. SORA and other video generators are absolute madness.
ai_what · a year ago
This is why I felt surprised to read this article about a week ago: https://sherwood.news/tech/reddit-is-quietly-changing-the-wa...

It states:

> "The most popular posts on Reddit have switched from reposted content from “karma farmers,” or engagement hackers, to nearly entirely original content from less popular Reddit users. Original content from smaller communities is now outperforming recycled evergreen content by a tremendous margin across the platform. As of October, none of the top five posts of the month on r/all were original. By March of this year, four out of five of these posts were original. "

In my experience this hasn't been the case at all. I've also noticed that if you click the profile of people that end up on the front page, they are often new (or suddenly active) accounts with a certain pattern:

1) Make 3-4 posts not related to the content they want to promote, to "warm up" the account. I'm guessing there's a soft-ban on new accounts here.

2) Post the actual content/narrative they want to promote.

3) Suddenly, this post gets 10k upvotes and reaches the frontpage.

causal · a year ago
Seems plausible that LLMs have made it much easier to fool Reddit's own metrics by generating "original" content and comments.
sebazzz · a year ago
Both Reddit and X have nothing to gain by banning bots, because metrics and engagement suffer.
debacle · a year ago
I haven't seen much correlation between LLM popularity and reddit bots. A good old Markov chain can simulate the average reddit thread, and the botting issue has been prevalent for quite a long time.
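The Markov-chain point is easy to demonstrate: a bigram model trained on a handful of comments already produces passable thread filler. A toy sketch (corpus invented):

```python
import random
from collections import defaultdict

def build_chain(corpus):
    """Map each word to the list of words observed to follow it."""
    chain = defaultdict(list)
    for comment in corpus:
        words = comment.split()
        for a, b in zip(words, words[1:]):
            chain[a].append(b)
    return chain

def generate(chain, start, max_words=12, seed=None):
    """Random-walk the chain from a start word until a dead end."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < max_words:
        followers = chain.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

comments = [
    "this is the way",
    "this is peak reddit honestly",
    "honestly this is underrated",
]
chain = build_chain(comments)
print(generate(chain, "this", seed=1))
```

An LLM mainly adds coherence over longer spans; the spam economics were already viable with this much.
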
chipdart · a year ago
> In my experience this hasn't been the case at all. I've also noticed that if you click the profile of people that end up on the front page, they are often new (or suddenly active) accounts with a certain pattern:

You make a great point. It's highly unlikely that a newly-created account just so happens to post content that's engaging enough to be featured in Reddit's frontpage. It's far more likely that these "less popular Reddit users" are sock puppet accounts used to post special-purpose content which is then subjected to industrial-grade boosting to force it onto everyone's first page.

Changes of this magnitude are practically impossible without the backing of either Reddit itself or marketing companies intending to control the flow of information.

q1w2 · a year ago
Exactly - and it's important to make a distinction between the bots that post comments and posts, and the much larger and more influential bot farms that manipulate content (both promoting and demoting).

Personally, I'm fairly certain it's a difficult cat-and-mouse game, but there's no question that some very popular mods also use bot farms to promote the content they want on the subs they mod.

qingcharles · a year ago
There are thousands of nicely aged Reddit accounts for sale online.

A lot of subs have low-karma blocks for new accounts, so spammers have to buy aged accounts to be able to post in a lot of places.

unethical_ban · a year ago
I've noticed something like that. What I see a lot is a years-old account with no history prior to the past few days, spamming unoriginal content.

Check /r/interestingasfuck, /r/satisfyingasfuck, /r/natureisfuckingcute, hellsomememes, and a lot of other subs.

Qualities of subs particularly susceptible to such spam:

* Subs that don't require OC (not news, not a hobby sub)

* Subs that don't demand proof of identity (/r/selfies, /r/glowups)

So, counter to the quote you cited, I still see a lot of karma farmers on reddit, and like you say, they'll often do what you and I describe and then turn into an Onlyfans account, or something like that.

reddit is just such a goddamned cesspool, and I am so curious as to what nefarious actors are doing on it and why they're doing it.

Different topic, but I'm ranting: The political echo chambers are wild. Places like BadHasbara, Palestine, IsraelExposed, Conservative, Libertarian, antiwork, WorkersStrikeBack, Anticonsumption, all have wild agendas that will instaban anyone who challenges the dreck that gets posted there.

throwreplyguy · a year ago
I wish it was that obvious. I think it's like criminals - the obvious ones get caught, and people go "criminals are pretty dumb", but there are plenty of smart ones too.

I posted this example earlier today https://news.ycombinator.com/item?id=40208741 of a reddit account shilling Sourcegraph. They flat-out deny that it's a bot, but it's clear to me.

Can't trust anything or anyone any more. Pretty sad.

jdorfman · a year ago
Hi, head of community at Sourcegraph here. We don't use bots. u/Prolacticus is a Pro user, we do not pay him/her, and they are not sponsored. In fact, I offered them swag a month ago, and they refused.

We do give free/sponsored accounts to our Discord mods, open source maintainers, and folks who write guest blog posts for us. u/Prolacticus is not one of those accounts.

KomoD · a year ago
> They flat-out deny that it's a bot, but it's clear to me.

Doesn't look like a bot at all. Possible shill, sure, but they're a person.

bluetidepro · a year ago
The key to making Reddit still an amazing resource is finding niche subreddits that fit your interests. The very broad subreddits like funny, news, pics, politics (where this is likely from), etc. are all just full of spam and trash like this, and have been for YEARS now. However, if you dive into a subreddit for a specific video game you like, it's going to be full of relevant content with very little spam. Or if anything, the spam is just reposted content, which may still be new to you. Reddit is not dying; just the giant stadium-size subreddits are trash. I visit subreddits for games I actively play almost daily, and they are all incredibly useful and interesting.
Retr0id · a year ago
I know a couple of people who moderate niche-but-active subreddits, and they're still inundated with spam. The only real difference is that they can stay on top of it, for the most part. So yeah, the niche subreddits are still alive, but I think they're struggling.

One of them closed to non-approved submitters, and now they get AI-generated requests for account approval.

btreecat · a year ago
This is exactly why I gave up moderating a sub with around 100k members. There was just so much spam and noise, and such poor tooling to deal with it all.
lelanthran · a year ago
> Reddit is not dying, just the giant stadium size subreddits are trash.

From a user's PoV, this is at worst a good thing and at best completely irrelevant.

I never go to the large subreddits anyway (political, news, pics, etc), so whether or not they are around, or around and filled with LLM trash is completely irrelevant to me.

OTOH, the subreddits I do visit are alive and well and show no signs of being less valuable to me than before.

rospaya · a year ago
Why would I dedicate my time to a subreddit that might vanish overnight or might get taken over by the admins? Way back, subreddits were independent forums, and now they're one protest away from being kidnapped.
bluetidepro · a year ago
I have never seen a subreddit for a game/hobby that I've subscribed to vanish overnight. In all my 13+ YEARS on reddit.

EDIT: "Vanish" forever. Yes, there have been protests and black outs, but everything I have ever subscribed to did come back.

dpkirchner · a year ago
Forums come, forums go, there are no guarantees any site will stay up indefinitely. IMO that fact alone shouldn't stop you from participating.
nomilk · a year ago
I know this is (almost hopelessly) subjective, but can you (or others) recommend a few? (I'm a reddit newb and my feed resembles a mainstream news website).
ramcle · a year ago
Check out DepthHub: https://www.reddit.com/user/Lapper/m/depthhub/ It's a "multireddit", an amalgamation of multiple subreddits, a feature that Reddit as a company no longer seems to care about. From the description: "DepthHub gathers the best in-depth submissions and discussion on Reddit. You can use the DepthHub as an alternative front page with high quality discussion and inquiry. " I used to visit it pretty much every day, back when third-party apps were allowed.
nineplay · a year ago
/r/AskHistorians is a national treasure. Don't trust any 'historical' information from any other subreddit.
spywaregorilla · a year ago
Don't go to reddit for the sake of going to reddit. If you don't have a specific content area that you want to engage with, just getting involved in the reddit universe is going to be a bad experience.
browningstreet · a year ago
Also r/AskHistorians

EDIT: Also, the way I make Reddit useful to me is by changing the default sorting for subreddits from Hot to Top > Monthly, and disabling content recommendations. It'll show less content, and Reddit will sometimes say "No more content right now", which is great.

the_snooze · a year ago
If you're a college sports fan, /r/cfb, /r/collegebasketball, and /r/collegebaseball are excellent. The first two are large subreddits, but their mods absolutely stay on top of things. Not just clearing out spam and off-topic discussion, but also posting "official" game threads and post-game summaries so you don't have dozens of "Auburn defeats Alabama 34-28" posts clogging up the front page.
65 · a year ago
I like /r/ExperiencedDevs, it's usually pretty good.
bluetidepro · a year ago
I don't use reddit for any real news, I just use it for hobbies/interests. Using reddit for news is a terrible idea for all the content farming and spam trash that this post is talking about. That would be like getting your news from just random people shouting on the side of the street.

So the hobbies/games I'm currently playing are subreddits I subscribe to and actively browse. Games like Anno 1800, Cities Skylines, and Manor Lords. All 3 have very active and passionate community members that are constantly posting high quality content around inspiration for builds, tips and tricks, community update news, patch discussions, mods, etc. etc.

If you want more silly subreddits not related to hobbies, it depends on what your humor is. Here is a wide range of options: r/AnimalsBeingBros, r/ActLikeYouBelong, r/softwaregore, r/raspberry_pi, r/lockpicking, r/FellowKids, r/dogswithjobs, r/BirdsArentReal, r/BreadStapledToTrees - subreddits that are related around hobbies or niche humor will make you love reddit.

Again, tl;dr, reddit is not the place for real life news. If you don't use it for that, you'll find yourself enjoying the website a ton more.

bonestamp2 · a year ago
What things are you passionate about or at least interested in? There is likely a subreddit for each one.
ehaughee · a year ago
This will definitely be subjective and I highly recommend using the subreddit search to find topics you enjoy, BUT here are a few of mine:

r/billiards r/boots r/frugalmalefashion (arguably small nowadays) r/hiphopheads r/mtb r/selfhosted r/sffpc

lelanthran · a year ago
I visit /r/programming, /r/gamedev, /r/projectcar and a few others (woodworking, home building, short scifi stories)
avgDev · a year ago
I moderate a niche subreddit focused on information backed by science. The mod team is doctors, chemists, and devs. However, as we grow, we're noticing that a lot of new people are spreading myths, and since our average user is getting dumber, wrong opinions often end up at the top.

This is what made me realize that Reddit is actually an awful platform for information: while there is a lot of good stuff there, the average user is NOT educated, and the average users outnumber the individuals educated in a particular subject.

I have been downvoted to hell many times even though my information was factual and backed by multiple studies.

theChaparral · a year ago
I really agree. I think the key to your key is you have to actively seek out media that interests you. Don't just be a passive consumer of it.
asicsp · a year ago
I'm part of a few book subreddits and I definitely enjoy the reviews and discussions there. My TBR list is in the thousands thanks to them...
floren · a year ago
There's still just so much fucking garbage though. Yesterday I was wondering if I ought to switch to a flipphone so my son doesn't see me playing on my smartphone so much, so I headed over to /r/dumbphones to try and get a feel for what's currently a good option. I figured I'd look through the top posts of the last month to find good discussion. Instead, the top post is some dipshit meme, and all but 1 or 2 of the first 25 posts are "my EDC as an 18/f/cali", just pictures of the contents of their pockets!

Because there's no way to have "the EDC thread" or the "post pics of your phone" thread, this low-effort shit fills the subreddit.

cableshaft · a year ago
r/boardgames is pretty good, if you're into board games.
bigstrat2003 · a year ago
/r/boardgames would be good, if they didn't go on rants about politics every so often. I couldn't take it any more after a while. I'm trying to read about board games to enjoy myself, not to be reminded of life's problems.
nolok · a year ago
I mean, it's pretty much where half of "smaller forums" went. The other half being on discord.
GolfPopper · a year ago
Or still being quietly run somewhere on BBCode.
RicoElectrico · a year ago
And the third half on Facebook groups.
exabrial · a year ago
I've noticed that the entire board is full of discussion that encourages users to over-engage with "them" (the bots) by (among many other things):

* Encouraging authoritarian, know-it-all attitudes: essentially fake experts

* Taking the moralistic high road

* Operational FOMO: covering topics that "Big XYZ" doesn't want you to know about, leading users to come back over and over for the inside scoop

The entire thing is designed to cement user's attention. It's fascinating.

shmatt · a year ago
I'm long on RDDT because it reminds me so much of Facebook when they started pushing more negative content because it got higher interaction, back when a sad-face reaction was "worth" 5x more than a Like to the algorithm.

The algorithm now suggests popular posts with lots of fighting in the comments from subs you don't subscribe to, essentially causing organic brigading, which is against Reddit's own rules.

gruez · a year ago
>I've noticed that the entire board

What's the subreddit? It's not in the screenshot, but I'm guessing /r/politics?

kibwen · a year ago
Here's the thread in question (note that the screenshot in the OP isn't recent, the thread is from last October): https://old.reddit.com/r/LateStageCapitalism/comments/16zw82...
kkukshtel · a year ago
This is just reddit
dotty- · a year ago
This has been happening for years. My theory is the actors running the bots are instructing their bots to use old popular threads as a blueprint to get a bunch of upvotes across all of their accounts at once. The idea being that clearly Reddit users liked the original posts and comments in the past, so the users will upvote it again. Then they sell the accounts to bad actors who are interested in purchasing accounts with real looking post histories.
nolok · a year ago
They don't sell their accounts to bad actors much anymore; instead they sell services. Want this product, or that news story, or this ... to have lots of comments and upvotes from tens of accounts? If you search a little you can easily find those shops; they sell for every social media platform out there and you pay per "thousands of likes" or the like.

They used to be run by very low-paid humans; then it was bots training the account up and humans taking over once it was cooked; and I guess we're now entering the bot-from-top-to-bottom era.

thrtythreeforty · a year ago
If I were Reddit I'd be running some sort of counter-offensive, throwing a few hundred dollars at those services and flagging accounts which upvote my poison pill as sockpuppets.
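That poison-pill idea can be sketched roughly: plant posts no organic user would upvote, buy upvotes for them from the shops, and flag accounts that hit more than one honeypot (intersecting across honeypots cuts down on false positives). Everything below — the data shapes, names, and threshold — is hypothetical, not anything Reddit actually runs:

```python
# Hypothetical sketch: flag accounts that upvoted planted "poison pill" posts.
# Requiring hits on 2+ honeypots reduces false positives from organic voters.
from collections import Counter

def flag_sockpuppets(honeypot_votes, min_hits=2):
    """honeypot_votes: dict mapping honeypot post id -> set of voter account ids.
    Returns the set of accounts that voted on at least `min_hits` honeypots."""
    hits = Counter()
    for voters in honeypot_votes.values():
        for account in voters:
            hits[account] += 1
    return {account for account, n in hits.items() if n >= min_hits}

votes = {
    "honeypot_1": {"acct_a", "acct_b", "organic_1"},
    "honeypot_2": {"acct_a", "acct_b", "acct_c"},
    "honeypot_3": {"acct_b", "organic_2"},
}
print(sorted(flag_sockpuppets(votes)))  # ['acct_a', 'acct_b']
```

In this toy data only the accounts seen on two or more honeypots get flagged; one-off voters are assumed to be organic noise.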
dmoy · a year ago
Yup this is a very old strategy. Years back when I was helping mod a very large (10/15m+) sub, the head mod was running a pipeline in the background to help detect this exact thing.
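A minimal version of that kind of pipeline — catching accounts that replay old top comments verbatim — might look like the sketch below. The normalization step and the archive format are assumptions for illustration; a real mod tool would match against a far larger corpus and use fuzzier similarity than exact hashes:

```python
# Hypothetical sketch: detect comments replayed verbatim from an archive of
# past top comments by comparing hashes of normalized text.
import hashlib
import re

def normalize(text):
    """Lowercase and collapse whitespace so trivial edits don't evade the match."""
    return re.sub(r"\s+", " ", text.lower()).strip()

def fingerprint(text):
    return hashlib.sha256(normalize(text).encode()).hexdigest()

def build_index(archived_comments):
    """Map fingerprints of archived top comments back to their source ids."""
    return {fingerprint(c["body"]): c["id"] for c in archived_comments}

def find_replays(index, new_comments):
    """Return (new_comment_id, archived_comment_id) pairs for verbatim replays."""
    return [(c["id"], index[fingerprint(c["body"])])
            for c in new_comments if fingerprint(c["body"]) in index]

archive = [{"id": "old1", "body": "This is the best answer in the thread."}]
incoming = [
    {"id": "new1", "body": "this is   the BEST answer in the thread."},
    {"id": "new2", "body": "Genuinely novel comment."},
]
print(find_replays(build_index(archive), incoming))  # [('new1', 'old1')]
```

The exact-hash approach only catches copy-paste replays; the accounts described elsewhere in this thread that lightly paraphrase would need shingling or embedding-based similarity instead.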
leetrout · a year ago
Reddit has steadily declined over the past few years and it seems to have sped up since pissing off the mod community last year.
CSMastermind · a year ago
For discussing entertainment like specific video games, movies, book series, TV shows, sports teams, etc. there simply isn't a good alternative to Reddit.

I hate the site and have limited the time I spend on it but there aren't good alternatives for certain communities.

rvba · a year ago
IMDB forums used to be a community for film and TV but amazon killed it in its infinite wisdom of lowest common denominator
akuchling · a year ago
For most entertainment topics, the-avocado.org has some lively discussion (based on Disqus).
jjcon · a year ago
I find that discord has far better communities than Reddit ever did
SlimyHog · a year ago
I deleted my account once they killed 3rd party apps and whenever I visit I've noticed that the quality of comments is WAY worse than I remember.
Zambyte · a year ago
Funnily enough I didn't even use any third party apps at the time (just the old web client) but that whole fiasco was enough for me to finally curb my addiction. I also noticed the quality of comments seeming very bad whenever I go to something on Reddit (once every few months at this point). I don't think the quality has actually gotten much worse recently though; I think it was just so normal to me when I was using it. I think it has been a pretty slow decline over the last decade or so, to the point where it is now.
SkyPuncher · a year ago
It’s absolutely terrible on the big subs now. Smaller subs seem to be okay, but it seems a lot of the content has gone elsewhere.
ilikehurdles · a year ago
Killing the API access made detecting and tracking spam bots impossible. There was a whole subreddit called thesefuckingaccounts where the latest tactics in spam and karma farming were being tracked.
criddell · a year ago
Past few years? I've been reading about the decline of Reddit since Condé Nast Publications bought it in 2006.
gipp · a year ago
One of my higher-voted comments on Reddit was in response to some thread bemoaning the decline of the site towards lowest-common-denominator meme content and recycled jokes. My comment was pointing out that people had been saying that for years, with multiple links to past, nearly identical threads.

I made that comment in 2012.

lelanthran · a year ago
> Reddit has steadily declined over the past few years and it seems to have sped up since pissing off the mod community last year.

Maybe depends on the subs you read, because I have not noticed an appreciable difference before and after the "going dark" thing.

ravenstine · a year ago
Reddit was getting shitty way before the past few years.

Honestly, I don't think pissing off the mods has made that big of a difference. Yes, some subs shut down, but otherwise I haven't seen a meaningful cultural change in Reddit as a result of that whole issue.

In fact, one of the reasons I believe Reddit is so crappy is specifically that they bow to mods in many ways. Many communities are run by, frankly, psychos who are way too happy with the power they have over their little ponds. I've lost count of how many times my posts have either been removed or my user banned despite having followed the explicit rules of a sub. Communities vary, but I've found this "you should have read our minds" attitude to be commonplace.

Yes, you can spin off your own sub, but then you're taking a gamble as to whether the original community is going to come after you; they seem to win at least half the time by convincing Reddit that your [relatively pissant] community is toxic in some way. Good luck if your community is blamed for creating "drama" even when there's a lack of brigading.

tivert · a year ago
> I've lost count of how many times my posts have either been removed or my user banned despite having followed the explicit rules of a sub. Communities vary, but I've found this "you should have read our minds" attitude to be commonplace.

I'm not a mod or anything, but I think there are actually a lot of legitimate reasons for a mod to have that attitude. It's unreasonable to expect a volunteer to create a comprehensive rules of behavior and enforce it in a lawyerly way and keep a community on track and not burn out. I've seen more than a few online communities have serious problems with certain users that would have been best handled with a "we're sick of dealing with you, enjoy your ban." Then, there's also the fact that if a community is too popular, a mod can only scale by being more brusque.

Workaccount2 · a year ago
Reddit has enshittified, and I would also guess that their usage numbers are way up. The goal is to compete with TikTok, Instagram, and YouTube for average everyday people.

From our hackernews perspective, the website sucks now. From the average user perspective, reddit is another fun app full of dopamine hits.

idiotsecant · a year ago
I know it's unfashionable in HN circles to admit it but there is still tons of high quality niche content (technical and otherwise) on Reddit that can't be found anywhere else.
isoprophlex · a year ago
I'm not sure what pisses me off more, the confirmation that the internet is indeed full of undead activity, or the sheer laziness of this specific example. My god people, it's the age of LLMs, at least mix up your astroturfing a little!
swatcoder · a year ago
For the purposes of just generating activity in a subreddit and building karma history for bot accounts, replaying proven content verbatim is more reliable and cost-effective.

LLM paraphrasing is likely to either drift away from "what worked" towards unknown territory or introduce tics that don't really cohere quite right. We can confidently assume it's being used as part of other strategies, but it's not actually optimal here.

The real issue is that Reddit (and Facebook, YouTube, Amazon, Tinder, etc.) have very little incentive to aggressively police this until examples like this make big news and start to harm their general reputation. In the meantime, it just makes their sites look more alive and popular. It's good for them right up until it becomes a defining association with their brand. This lazy approach works for the bots and the sites, so there's no reason to overcomplicate things or take on bigger risks.

ToucanLoucan · a year ago
When the pigs are already happily gorging on the slop, why would you suggest a four course dinner instead?
nolok · a year ago
It's like internet phishing emails and the like. It almost bothers you more that they put so little effort into it, and that if they do that, it's probably because it's not worth it to do more.
adamgordonbell · a year ago
I had a post do well on /r/programming and then months later it reappeared and someone was responding with my words. "Author here, what I meant by X was Xa and not Xb" etc. It was very confusing, because they were responding with my words to a bot saying something I had previously responded to.

Strange times.