Readit News
chlorion commented on 10% of Firefox crashes are caused by bitflips   mas.to/@gabrielesvelto/11... · Posted by u/marvinborner
chlorion · 8 days ago
Does it though?

People experience "blue screens" and kernel panics and such pretty often.

chlorion commented on GNU Pies – Program Invocation and Execution Supervisor   gnu.org.ua/software/pies/... · Posted by u/smartmic
mos87 · 24 days ago
>Try running any part of the systemd software suite on an openrc system and see how that works out?

Well, from this POV it's kinda openrc's problem if it doesn't. What about trying to run any part of the openrc software suite on an Upstart system? The question of why anyone sane would want to is rhetorical, though...

Why obsess over whether, and to what degree, systemd is monolithic anyway? There certainly ARE optional systemd parts, so it's correct to say it's not entirely monolithic.

chlorion · 24 days ago
openrc-init can be used on an Upstart system; the daemon manager itself can't, but that's only because you'd have two different daemon managers. Beyond that there aren't any openrc software components, because it was designed as a modular init system that handles only what it was intended to handle.

The rest of the system, for example chrony, sysklogd, cron, etc., runs fine on Upstart systems, because those components aren't tied to systemd and are fully modular.

It's okay to be a monolith; that doesn't make it inherently bad. But we should be honest about it, and it does come with some tradeoffs.

chlorion commented on Zero-day CSS: CVE-2026-2441 exists in the wild   chromereleases.googleblog... · Posted by u/idoxer
pheggs · 24 days ago
I love rust but honestly I am more scared of supply chain attacks through cargo than of memory corruption bugs, the reason being that supply chain attacks are probably much cheaper to pull off than finding those bugs.
chlorion · 24 days ago
The statistics we have on real-world security exploits show that most of them do not come from supply chain attacks, though.

Memory-safety-related security exploits appear in a steady stream in basically all non-trivial C projects, while supply chain attacks, though possible, are much rarer.

I'm not saying we shouldn't care about both issues, but the idea is to fix the low-hanging fruit and common cases before optimizing for things that, in practice, aren't that big of a deal.

Also, C is not inherently invulnerable to supply chain attacks either!

chlorion commented on I fixed Windows native development   marler8997.github.io/blog... · Posted by u/deevus
dotancohen · a month ago
You don't have to install executables downloaded from an unknown GitHub account named marler8997. You can download that script and read it just like any other shell script.

Just like those complaining about curl|sh on Linux, you are confusing install instructions with source code availability. Just download the script and read it if you want. The curl|sh workflow is no more dangerous than downloading an executable off the internet, which is very common (if stupid) and attracts no vitriol. In no way does it imply that you cannot actually download and read the script, something that actually can't be done with downloaded executables.

chlorion · a month ago
>The curl|sh workflow is no more dangerous than downloading an executable off the internet

It actually is, for a few subtle reasons, whether you were going to check the executable's checksum or just blindly download and run a script.

The big one is that a server can, in theory, serve different contents when it detects the download is being piped into a shell. There's also the problem that if the download is interrupted, the shell runs half of a script, leaving you with a broken install.

If you are going to do this, it's much better to do something like:

    sh -c "$(curl -fsSL https://foo.bar/blah.sh)"
Though ideally, yes, you just download it and read it like a normal person.
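The truncation hazard is easy to demonstrate harmlessly. This sketch (file names and the echoed path are made up) simulates a connection dropping mid-line, leaving a shorter but still syntactically valid command:

```shell
# Full script the server meant to send (a harmless stand-in command)
printf 'echo removing /tmp/myapp/cache\n' > full.sh

# The same script cut off mid-transfer: the remainder is still valid shell,
# but it now names a different path than the author intended
printf 'echo removing /tmp/myapp' > truncated.sh

sh truncated.sh
```

With the `sh -c "$(curl ...)"` form, curl must finish downloading before the shell sees any of the script, which avoids this failure mode.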

chlorion commented on GNU Pies – Program Invocation and Execution Supervisor   gnu.org.ua/software/pies/... · Posted by u/smartmic
eliaspro · a month ago
systemd is not a monolith.

It's a collection of loosely coupled components and services, of which basically every single one can be disabled or replaced by another implementation.

chlorion · a month ago
No, it definitely is a monolith.

It's NOT loosely coupled in any way. Try running any part of the systemd software suite on an openrc system and see how that works out.

I have no idea why people are so insistent on claiming that it's not a monolith, when it ticks every box of what a monolith is.

chlorion commented on Termux   github.com/termux/termux-... · Posted by u/tosh
chlorion · a month ago
I have been able to do some light elisp programming in emacs under termux on my moto g. My emacs config is now set up to detect termux and configure itself accordingly, which is neat.

Also, I have a wireguard VPN set up so that I can ssh between my phone and desktop computer via a VPS with a public IPv4 address. This lets me just run "ssh 10.0.0.4" to reach the phone's sshd, instead of dealing with changing IP addresses and NAT traversal.
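A minimal sketch of what the VPS side of such a hub-and-spoke setup might look like (keys, the port, and the 10.0.0.1 hub address are placeholders; only 10.0.0.4 comes from the comment above):

```
# /etc/wireguard/wg0.conf on the VPS (hypothetical values)
[Interface]
Address = 10.0.0.1/24
ListenPort = 51820
PrivateKey = <vps-private-key>

[Peer]
# the phone; each device keeps a tunnel to the VPS, which routes between them
PublicKey = <phone-public-key>
AllowedIPs = 10.0.0.4/32
```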

chlorion commented on We can't have nice things because of AI scrapers   blog.metabrainz.org/2025/... · Posted by u/LorenDB
tommek4077 · 2 months ago
But serving HTML is unbelievably cheap, isn't it?
chlorion · 2 months ago
It adds up very quickly.
chlorion commented on We can't have nice things because of AI scrapers   blog.metabrainz.org/2025/... · Posted by u/LorenDB
chlorion · 2 months ago
I self-host a small static website and a cgit instance on an e2-micro VPS from Google Cloud, and I have gotten around 8.5 million requests combined from openai and claude over roughly 160 days. They just crawl the cgit pages forever unless I block them!

    (1) root@gentoo-server ~ # egrep 'openai|claude' -c /var/log/lighttpd/access.log
    8537094
So I have lighttpd set up to match "claude|openai" in the user-agent string and return a 403 on a match, plus an nftables firewall set up to rate-limit spammers, and this seems to help a lot.
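The user-agent block might look something like this in lighttpd.conf (a sketch of the approach, not the exact config from the comment):

```
# Deny requests whose User-Agent matches either scraper;
# url.access-deny makes lighttpd answer with a 403
$HTTP["useragent"] =~ "claude|openai" {
    url.access-deny = ( "" )
}
```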

chlorion commented on Babel is why I keep blogging with Emacs   entropicthoughts.com/why-... · Posted by u/ibobev
dzonga · 5 months ago
do people also realize you can blog with .txt files?

write a txt file, scp it, then let whatever server serve the files.

chlorion · 5 months ago
I use org mode files, a small build script and git.

The build script just invokes emacs, compiles the org documents to HTML, and installs them in /var/www/${site}. I have a git update hook on the server that invokes the build script when I push updates.

Originally I just rsynced over the HTML files, but I like the new setup a lot.
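A rough sketch of what such a build script could look like (the site name and web root are placeholders; the batch-mode org-html-export-to-html call is the standard Org export entry point):

```shell
#!/bin/sh
# Hypothetical build script: batch-export each org file to HTML,
# then install the results into the web root.
site="example"                  # placeholder site name
webroot="/tmp/www/${site}"      # stand-in for /var/www/${site}
mkdir -p "$webroot"

for f in *.org; do
    [ -e "$f" ] || continue     # no org files, nothing to export
    # Batch-mode emacs writes foo.html next to foo.org
    emacs --batch "$f" -f org-html-export-to-html
done

for h in *.html; do
    [ -e "$h" ] || continue
    install -m 644 "$h" "$webroot/"
done
```

A git update or post-receive hook on the server can check out the pushed tree and call this script, so a `git push` becomes a deploy.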

chlorion commented on How I block all 26M of your curl requests   foxmoss.com/blog/packet-f... · Posted by u/foxmoss
geocar · 5 months ago
Do you actually use this?

    $ md5 How\ I\ Block\ All\ 26\ Million\ Of\ Your\ Curl\ Requests.html
    MD5 (How I Block All 26 Million Of Your Curl Requests.html) = e114898baa410d15f0ff7f9f85cbcd9d

(downloaded with Safari)

    $ curl https://foxmoss.com/blog/packet-filtering/ | md5sum
    e114898baa410d15f0ff7f9f85cbcd9d  -
I'm aware of curl-impersonate https://github.com/lwthiker/curl-impersonate which works around these kinds of checks (and makes working with cloudflare much nicer), but serious scrapers use chrome plus a USB keyboard/mouse gadget that you can ssh into, so there's literally no evidence of mechanical means.

Also: if you serve some Anubis code without actually running the Anubis script in the page, you'll still get some answers back, so there's at least one Anubis simulator running on the Internet that doesn't bother to actually run the JavaScript it's given.

Also also: 26M requests daily is only about 300 requests per second, and Apache could handle that easily over 15 years ago. Why worry about something as small as that?
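The rate arithmetic is easy to check (integer division; one day is 86,400 seconds):

```shell
# 26 million requests spread evenly over one day
echo "$(( 26000000 / (24 * 60 * 60) )) requests/second"
```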

chlorion · 5 months ago
Claude was scraping my cgit at around 12 requests per second, in bursts here and there. My VPS could easily handle that, even being a free-tier e2-micro on Google Cloud/Compute Engine, but they used almost 10GB of my egress bandwidth in just a few days and ended up pushing me over the free tier.

Granted, it wasn't a whole lot of money, but why waste money and resources so "claude" can scrape the same cgit repo over and over again?

    (1) root@gentoo-server ~ # grep 'claude' /var/log/lighttpd/access.log | wc -l
    1099323

u/chlorion

Karma: 714 · Cake day: November 25, 2021