CodeCompost · 18 days ago
I briefly hosted a Lemmy server on my machine just to see how it works, and my god, never again. The pictures that were automatically synced to my machine not only made me lose faith in humanity; they made me shut down and wipe my machine immediately, because I was terrified that some of those images would land me serious jail time.

So if you choose to host something like this, be very aware that there are some sick, sick people out there.

rapnie · 18 days ago
This has nothing to do with Lemmy specifically, but with any social media that is open to the general public. Ask the moderation teams at Facebook what they encounter day to day. Many of these poor folks work in terrible conditions and burn out, leaving with PTSD.

If you spin up a fediverse app like Lemmy, you spin up a platform. It is platform software. And you get the responsibility, but also the opportunity, to set that up well. Curate the content on your instance. Lemmy and the other fediverse apps come with a set of moderation tools that allow you to handle this, and there is a strong focus in the developer community on improving them continually.

WD-42 · 18 days ago
This is a huge ask. Most of us are just nerds who find the technical aspects interesting, a hobby for our spare time.
ehnto · 18 days ago
It's a good time to mention safe harbor laws, because not every country has them, and so not every person can host something like this without taking on personal liability for what travels through or rests on the "platform".
pousada · 18 days ago
> Curate the content in your instance

How do I do that without getting PTSD as well? Or is there some magic method that works without me constantly looking at CSAM and gore?

balamatom · 18 days ago
What's fucked up is that entities like Meta and OpenAI are likely to already have tons of "other people's snuff" in their datastores. Yet they're not the ones at risk of being swatted; individual rebroadcasters are.

Even though you want nothing to do with those images in the first place, Big Social is intentionally keeping the stuff around "for science", yeah right.

Consider how some Muslim cultures have sidestepped this issue by banning representational imagery altogether; while the Russians just sent telegrams.

mghackerlady · 18 days ago
As much as I try to avoid AI hype, this truly seems like one of the best uses of image recognition tech
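In practice, platforms screen uploads by matching them against databases of known abusive material (services like Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding). A minimal sketch of the idea, using plain SHA-256 exact matching and a hypothetical blocklist in place of a real perceptual-hash database:

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-bad images.
# Real systems use perceptual hashes (PhotoDNA, PDQ) so that
# resized or re-encoded copies still match; exact hashing is
# only an illustration of the lookup step.
KNOWN_BAD_HASHES = {
    # this entry is the SHA-256 of the bytes b"test", for demo purposes
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def should_quarantine(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known-bad digest."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES
```

The point is that a moderator never has to view a matched upload: the check happens on raw bytes before anything is displayed or federated.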
lmf4lol · 18 days ago
how do you pay for that?
damnesian · 18 days ago
This reality alone has made me severely curtail my own social media use and reach. I really only care about a handful of forums attended by (at least... seemingly) people who actually care to think, or have some basic intact humanity and want to converse.

So despite the fact that I am very interested in federated social media as a way to keep my intellectual property out of the cashflow of businesses whose actions are much louder than their pretty sounds in court, it's still one-shot-and-out digital graffiti. I don't think it's worth it.

antonyh · 18 days ago
This was why, a long time ago, I canned a potentially useful image project that could resize and manipulate images from any URL to optimize them for mobile use. It's also why I've never dipped my toes into the murky pool of self-hosting any of this, and instead use services moderated by someone else. It's just too toxic to handle, too dangerous to my career, and I don't know how I'd contain it beyond never hosting ANY image data and making everything text-only.
scotty79 · 18 days ago
I think the only way to host social services is to ensure that any free-form content that touches your servers is encrypted with a key you don't have.
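A toy sketch of that architecture: the client encrypts before upload, so the server only ever stores ciphertext and never holds the key. The XOR keystream below is an illustration only, not a secure cipher; a real deployment would use an audited AEAD construction such as AES-GCM:

```python
import hashlib
import secrets

def _keystream(key: bytes, length: int) -> bytes:
    """Toy keystream from counter-mode hashing -- illustration only."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, _keystream(key, len(plaintext))))

decrypt = encrypt  # XOR with the same keystream is its own inverse

# The client keeps the key; the server stores only the ciphertext,
# so nothing readable ever rests on the operator's disks.
client_key = secrets.token_bytes(32)
ciphertext = encrypt(client_key, b"free-form user content")
```

The trade-off, of course, is that an operator who cannot read the content also cannot moderate it, which pushes moderation entirely to the clients.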

pjc50 · 18 days ago
.. ah, yes, "completely unmoderated free speech system that supports images" does mean "may contain CSAM". Heck, even Instagram had a horrific "mirror world" incident where the moderation bit got flipped on a number of images which ordinary users were exposed to.

I wouldn't run any kind of publishing system for anons myself. It's potentially valuable for an actual social group though.

sharperguy · 18 days ago
I've been hearing talk for years about a "web of trust" system that could filter spam simply by having users vouch for each other and filtering out anyone not vouched for. However, I haven't seen a functioning system based on this model yet.

Personally, I'd love to add in something like the old Slashdot comment model, where people would mark content as "helpful", "funny", "insightful", "controversial", etc., and, based on how much you trust the people labeling it, things could be filtered out or brought forward.
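The trust-weighted labeling described above fits in a few lines. Everything here (the trust values, label weights, and threshold) is hypothetical, just to show the shape of the computation:

```python
# Viewer's trust in each moderator (names and values are made up).
trust = {"alice": 1.0, "bob": 0.5, "mallory": 0.0}

# Value the viewer assigns to each moderation label.
label_weight = {"insightful": 2, "funny": 1, "controversial": 0, "spam": -3}

def score(moderations: list[tuple[str, str]]) -> float:
    """Sum label weights over (moderator, label) pairs on one comment,
    scaled by how much the viewer trusts each moderator."""
    return sum(trust.get(mod, 0.0) * label_weight.get(label, 0)
               for mod, label in moderations)

def visible(moderations: list[tuple[str, str]], threshold: float = 0.0) -> bool:
    """Hide a comment whose trust-weighted score falls below the threshold."""
    return score(moderations) >= threshold
```

Because labels from untrusted moderators contribute nothing, a spam brigade of unknown accounts cannot bury or promote content for you; only people in your trust graph move the score.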

balamatom · 18 days ago
>I wouldn't run any kind of publishing system for anons myself. It's potentially valuable for an actual social group though.

That's pretty much how it works on the federated Internet.

There are large open-access services run by communities with sufficient moderation capacity (to not get themselves nuked, anyway.) Turns out many "impossibilities" are trivial when you're not trying to abuse 1 billion active users at the same time through the power of their own (distr)actions - but instead you are simply trying to run a board for messages.

And then there are plenty of private servers, where publishing either is by invite or does not have outsized reach in the first place. Those also defederate from each other a lot, and many don't show you stuff from the big publics at all.

There have always been "bad people out there" (or at least that's what the "good people in there" have been broadcasting, for about as long as I can remember). The design/engineering problem here is figuring out and deploying a relational dynamic that keeps hostiles at a safe distance.

The practical problem stems from a technicality of how federation currently works: to display content from other services to your users, you have to mirror it on your storage.

This mode of federating hazardous data is a real problem, and also it's exactly what some cheap-ass subcontractor of current-gen social media incumbents would be doing if said incumbents had the amount of good sense that they've demonstrated having (see e.g. https://erinkissane.com/meta-in-myanmar-full-series). Yeah cuz... it's war out there.

I don't expect things to get better until everyone's phone is their personal server and cryptographic root of trust, and this is exposed to non-technicals in a way which neither scares them nor screws them over. Once civilization accomplishes that, I reckon things will be fine once again.

EDIT: "Heck, even Instagram had a horrific "mirror world" incident where the moderation bit got flipped on a number of images which ordinary users were exposed to." I don't think I've heard about this before, but I must admit I find it completely hilarious - besides obviously sad and horrifying.

EasyMark · 18 days ago
Yep, text is bad enough; screw hosting videos and images from randos on the web. I would 100% host a forum or similar if the honor system worked, but it only takes a couple of gooner CSAM deviants to ruin your entire life on something like that, and you wouldn't know what happened until the gov showed up on your doorstep.
ajsnigrutin · 18 days ago
I mean... reddit also defended that.

https://www.bbc.com/news/technology-19975375

> Social news site Reddit will not censor "distasteful" sections of its website, its chief executive has said.

jailbait, upskirt, etc. were all huge subreddits back then.

atlgator · 19 days ago
The HN cycle for federated alternatives is now complete: email → chat → microblogging → short video. We're speedrunning the "open-source version of things we claim to hate" timeline. Can't wait for the federated, self-hosted casino.
zaik · 18 days ago
> federated, self-hosted casino

Good news: https://bitcoin.org/bitcoin.pdf

jazzyjackson · 19 days ago
I always thought a casino might make more sense if it were run as a cooperative where the members were also shareholders, so you could have fun gambling but your money would come back to you eventually.
Gigachad · 19 days ago
Isn't this somewhat like how these new crypto prediction markets work? There is no house taking a cut, all of the winnings get paid out.
wmeredith · 18 days ago
That's what a friendly neighborhood poker game is. There is no house take. I am not a lawyer, but I have been given the distinct impression by one that in US jurisdictions the house take is what turns gambling from entertainment among friends into an illegal racket.
grougnax · 18 days ago
When self-host OnlyFans?
TiredOfLife · 19 days ago
It's not "things we hate". It's "things we didn't think of capitalizing on".


dvngnt_ · 19 days ago
Is that not crypto?
arealaccount · 18 days ago
Federated tattoo parlors
CqtGLRGcukpy · 19 days ago
It's worth keeping in mind that Loops is made by dansup, who also made and runs Pixelfed and FediDB, and who has a history of being hostile to developers.

You can see the recounting of his hostility at https://dansup-open-letter.github.io/appendix/

(I'm not a signatory of the open letter)

boriskourt · 18 days ago
This comes up here and there to discredit the developer, but having followed all the drama for many years now, I just want to add that dansup has apologized multiple times and has been far more open about his process. His communication has also changed for the better, especially over the last two years. It's not easy being human, and I think it's a good sign that he takes this seriously.
Kovah · 19 days ago
Unfortunately, I can second this, both as a developer and a user. His, IMHO, childish behavior has ruined his image for me, and it is not a good lighthouse for the Fediverse itself. Also, as an OSS veteran myself, I find it extremely concerning that he keeps starting new projects, refuses proper help and building up a maintainer team, and leaves older projects in the dust. Pixelfed is the one product he should probably focus on, yet it feels like the platform is in maintenance-only mode. Pixelfed is a wonderful addition to the Fediverse and deserves to be in good hands.

Maybe, and this is a very personal opinion, his product's success and the Kickstarter campaign raising over 100k made him feel like he's better than everybody else. And one can see the effects.

vyr · 18 days ago
yeah he has a long history of saying dumb shit in public and then trying to cover his tracks.

also, having had to figure out some of the Pixelfed code for previous projects, i wonder if he's up to the task of maintaining any of this once the next shiny thing comes along. Fedi software has a lot of quirks in general (comes with the nearly nonexistent budgets) but as a representative issue, the dude managed to build a photo blogging service with no way to export or back up your photos and that hasn't been fixed in seven years.

ultimately, though, if we ignore software quality and developer reputation, Loops is going to live or die based on whether anyone on Fedi actually wants to make short-form video. given existing Fedi culture, plus how expensive it can be to produce and how the RoI is basically zero, i don't think we're going to see much native to Fedi. some might get crossposted by TikTok/Shorts/Reels creators that want a backup location that won't get erased the second someone makes a spurious copyright claim, but i suspect we're just going to see a few months of stolen TikToks and then not much after that.

creamyhorror · 18 days ago
> given existing Fedi culture, plus how expensive it can be to produce and how the RoI is basically zero, i don't think we're going to see much native to Fedi.

Yeah, actual adoption will require getting onboard the actual people who want to entertain/influence others, plus the viewers (a two-sided market problem). Weighed against the network effects of the big players, the chances look a little slim.

Probably needs another, more low-effort or attractive angle to grow the Fediverse, tbh.

prmoustache · 18 days ago
The thing is, anyone can already host short-form videos on many other fediverse/ActivityPub apps like Mastodon.

Same for images; I never really understood the point of Pixelfed, as other fediverse/ActivityPub apps can already host pictures.

RamblingCTO · 18 days ago
And this needless drama is relevant why? Can we keep that on the fediverse please?
ParadisoShlee · 18 days ago
Quick question: even if this is/was true, do you think isolating him is the correct response? Or might engaging and taking some of the pressure off him actually help?

People love to bully people who slip up... dude's a hot mess, but I think he needs community more than being openly attacked.

CqtGLRGcukpy · 18 days ago
In my honest opinion, I think the correct response is to stop using anything that dansup makes until he realizes he has to give up some of the control, or the project stops being developed.

While it's nice if people want to help take some pressure off, it only works if the main developer (dansup in this case) is willing to accept that help. And based on the link I gave, and some of the comments here, it doesn't look like dansup is willing to accept it.

mikkupikku · 18 days ago
The proprietary nature of TikTok, particularly the opaque recommendation algorithm, is only part of the problem. I think the short-form video medium itself is the bigger problem; it's not a medium that lends itself to nuance.
mrguyorama · 18 days ago
Similarly, the problem with Twitter is not strictly who runs it. Short-form content is brain rot. It requires you to eschew nuance, reason, and complexity, dumb things down, handwave or ignore criticisms, and parrot cliches; it will always prefer a short but nice-sounding thing to something that requires more effort.

Short-form communication like that is inherently bad.

mikkupikku · 18 days ago
Absolutely. I think the proof of this is in the toxicity of the twitter alternatives. Despite being federated instead of centralized, they produce the same kind of toxicity. It's the medium itself which is corrupting.

https://en.wikipedia.org/wiki/The_medium_is_the_message

jijijijij · 18 days ago
Vine was fun. I think the actual problem is commercialization and advertisement. That creates the incentives to hijack any form of social media and turn it into an addictive brain-rot machine.
cyberge99 · 18 days ago
It's also very detrimental to brain development.
evolve2k · 19 days ago
Great to see this progressing. Tried it out just now after last testing it over 6 months ago.

I’d say the main “feature” I’d want to see added is a mandatory checkbox on upload to mark AI content: then a tag on videos that are AI, and an account-level setting to filter out AI content.

Otherwise it’s going to be a slop fest.

bigfishrunning · 18 days ago
What is the incentive for people producing AI slop to tick the "This is slop" box? Reminds me of the evil bit https://en.wikipedia.org/wiki/Evil_bit
mortsnort · 19 days ago
The reality is that the addictive algorithm of commercial social media platforms is the product.

These alternative platforms are like nicotine free cigarettes.

They might garner small communities, which is totally cool and valid, but they will never slay the giants.

rustyhancock · 19 days ago
As is the incessant stream. If there's any pause while the next video loads, the addicted user can break free.

One of the issues with federated anything is that there will be good servers and bad servers.

Good servers get hammered, and if you're popular you might perversely end up paying for people to watch your videos, having to fund your server to maintain its performance.

This happened with Mastodon and Matrix, and it will be far worse if they try to deliver TikTok's insane performance.

echelon · 19 days ago
Birth rates took a dip with broadband, smartphones, and TikTok.

Dopamine and attention sinks are pulling society in directions counter to evolutionary programming. Our runtime algorithms optimize for different things.

No value judgment, but it's interesting. I haven't had kids (yet?), and I feel the internet (and the career that revolves around it) is the biggest reason why.

brutal_chaos_ · 19 days ago
To be fair, you don't need to slay giants to be a viable product.
echelon · 19 days ago
"nicotine free cigarettes" are a product too.

We have limited time on earth, so many tend to evaluate things on how big of an impact they make or how large of a demand they satisfy.

It's okay if not everything is big, but it's also okay for people to use scale as a criterion for sizing things up.

Morromist · 19 days ago
Old-school social media can be addictive too. I don't use any social media where complex algorithms decide what I see, but I still end up wasting far too much time on Discord.
nine_k · 19 days ago
Discord is communication with fellow humans. TikTok is mostly one-way consumption. Discord is mostly text. TikTok is mostly speech-free momentary videos, adjusting to minute hints in your reaction.

There's no comparison.

socalgal2 · 18 days ago
Social media is its own problem. It’s got nothing to do with the algorithms and everything to do with being able to trivially broadcast to the entire planet.

Even with no algo, the people posting want maximum exposure and have every incentive to try to get it.

mghackerlady · 18 days ago
I wonder if it'd be possible for the host to use custom algorithms, so you could have instances with very limited recommendation systems or ones with something more TikTok-like.
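One way to picture that: each instance's config names a ranking function, so an operator can ship a plain chronological feed instead of an engagement-driven one. All names here are illustrative, not Loops' actual API:

```python
# Sketch of an instance-configurable feed ranker (all names hypothetical).
from dataclasses import dataclass
from typing import Callable

@dataclass
class Video:
    id: str
    posted_at: int  # unix timestamp
    likes: int

def chronological(videos: list[Video]) -> list[Video]:
    """Newest first -- the "very limited" recommendation system."""
    return sorted(videos, key=lambda v: v.posted_at, reverse=True)

def engagement(videos: list[Video]) -> list[Video]:
    """Most-liked first -- a crude stand-in for a TikTok-like ranker."""
    return sorted(videos, key=lambda v: v.likes, reverse=True)

RANKERS: dict[str, Callable[[list[Video]], list[Video]]] = {
    "chronological": chronological,
    "engagement": engagement,
}

def build_feed(videos: list[Video], instance_config: dict) -> list[Video]:
    """The instance operator picks the ranker; chronological is the default."""
    ranker = RANKERS[instance_config.get("ranker", "chronological")]
    return ranker(videos)
```

Since ranking happens server-side per instance, users could pick an instance (or a setting) whose feed behavior matches what they want.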
skirge · 19 days ago
Happiness is addictive. The solution is to replace it with something completely different. Why not free hacking challenges/courses for teenagers?
nanobuilds · 19 days ago
The UX should be more "mass user" friendly if the goal is to attract mainstream users: sign in with Apple etc. "Selecting a server" is technical language, not end-user language.
soared · 19 days ago
Also, why not have the homepage be a feed? Let me scroll for a bit before forcing me to sign up.
HugoTea · 18 days ago
You can, but you need to be on an instance; go to https://loops.video for that experience. joinloops.org is more of an onboarding site, it seems.
deminature · 19 days ago
Selecting a server is one of the reasons the fediverse appears to have not seriously challenged incumbents. As soon as a non-technical user sees this, they bounce.
8organicbits · 19 days ago
Checking the Mastodon app, new users are asked to "Join mastodon.social" or "pick another server". If you just mindlessly click the primary button, you'll get an account on mastodon.social so I think the server selection challenge has largely been addressed.
Gigachad · 19 days ago
There's also the problem that selecting a server is quite a consequential choice, one that brand-new users have no basis for making properly. The owner of the server you choose has access to all of your data and can delete your account or shut down the server at any time.
verdverm · 19 days ago
Technical users like myself bounce too. It's why I build and play in the ATProto ecosystem ("the atmosphere")
nine_k · 19 days ago
A sensible default should be preselected, e.g. the geographically closest server that isn't overloaded.
dewey · 18 days ago
I think this has gotten much better than in the early days. Nowadays it's not much different from picking your Discord server, which is already something everyone is familiar with.
HugoTea · 18 days ago
This is the same philosophy as the creator's: the sign-up flow does not ask you to select a server, it defaults you to loops.video. Maybe the app is different? I haven't tried it yet.
ParadisoShlee · 19 days ago
This project would love some additional support. If you're a developer who is interested in building this, please get in touch with them; the team is VERY small, so you can do some good just by supporting them.
flexagoon · 19 days ago
No disrespect to the project, but it is a bit hard to find motivation to contribute to something that will most likely end up being used by like 100 people who'll all just post videos about how they're better than TikTok users.
fsflover · 18 days ago
This looks like a shallow dismissal to me.
verdverm · 19 days ago
Saw this comment about developer hostility first, is there a response or critique out there?

https://news.ycombinator.com/item?id=47118328