Veserv commented on Capsudo: Rethinking sudo with object capabilities   ariadne.space/2025/12/12/... · Posted by u/fanf2
kccqzy · 2 days ago
I am the owner and only user of the computer. Does that mean I should run everything with root? Of course not. It’s simply better to start with little privileges and then elevate when needed. Using any additional privileges should be an intentional act. I also do it the other way: reduce my privileges via sudo -u nobody.
Veserv · a day ago
No, you should run every program with only the privileges it needs. The very concept of running your programs with all your privileges as a user by default is wrong-headed to begin with. Stretching the "user" model to fit: you would have a distinct "user" for every single program, holding only the resources and privileges needed by, or allocated to, that program. The actual user can then allocate their resources to these "users" as needed. This is a fairly primitive version of the idea, since it tortures fundamentally incompatible, insecure building blocks into fitting, but it points in the direction of the correct idea.
Veserv commented on Async DNS   flak.tedunangst.com/post/... · Posted by u/todsacerdoti
AndyKelley · 2 days ago
There's a second problem here that musl also solves. If the signal is delivered in between checking for cancelation and the syscall machine code instruction, the interrupt is missed. This can cause a deadlock if the syscall was going to wait indefinitely and the application relies on cancelation for interruption.

Musl solves this problem by inspecting the program counter in the interrupt handler and checking if it falls specifically in that range, and if so, modifying registers such that when it returns from the signal, it returns to instructions that cause ECANCELED to be returned.

Blew my mind when I learned this last month.

Veserv · a day ago
Introspection windows from an interrupting context are a neat technique. You can use them to implement “atomic transaction” guarantees for the interruptee as long as you control all potential interrupters. You can also implement “non-interruption” sections and bailout logic.
Veserv commented on Show HN: Wirebrowser – A JavaScript debugger with breakpoint-driven heap search   github.com/fcavallarin/wi... · Posted by u/fcavallarin
Veserv · 3 days ago
BDHS seems strictly less powerful than a time travel debugger. You can just set a hardware breakpoint and run backwards until the value is set.

Why not just do proper time travel? Is that absent for JavaScript?

Veserv commented on Latency Profiling in Python: From Code Bottlenecks to Observability   quant.engineering/latency... · Posted by u/rundef
ajb · 5 days ago
Interesting, but "FunctionTrace is opensourced under the Prosperity Public License 3.0 license."

"This license allows you to use and share this software for noncommercial purposes for free and to try this software for commercial purposes for thirty days."

This is not an open source license. "Open Source" is a trademarked term meaning without restrictions of this kind; it is not a generic term meaning "source accessible".

You can also just use perf, but it does require an extra package from the python build (which uv frustratingly doesn't supply)

Veserv · 5 days ago
perf is a sampling profiler, not a function-tracing profiler, so it fails the criterion I presented.

I used FunctionTrace as an example and as evidence for my position that tracing Python is low overhead with proper design, to bypass claims like: “You cannot make it that low overhead or someone would have done it already, thus proving the negative.” I am not the author and am not related to it in any way, so you can bring that up with them.

Veserv commented on Latency Profiling in Python: From Code Bottlenecks to Observability   quant.engineering/latency... · Posted by u/rundef
hansvm · 5 days ago
That depends on the code you're profiling. Even good line profilers can add 2-5x overhead on programs not optimized for them, and you're in a bit of a pickle because the programs least optimized for line profiling are those which are already "optimized" (fast results for a given task when written in Python).
Veserv · 5 days ago
It does not; those are just very inefficient tracing profilers. You can literally trace C programs at 10-30% overhead. For Python you should only accept low single-digit overhead on average, with 10% overhead only in degenerate cases with large numbers of tiny functions [1]. Anything more means your tracer is inefficient.

[1] https://functiontrace.com/

Veserv commented on Latency Profiling in Python: From Code Bottlenecks to Observability   quant.engineering/latency... · Posted by u/rundef
Veserv · 5 days ago
Why even bother with sampling profilers in Python? You can do full function traces for literally all of your code in production at ~1-10% overhead with efficient instrumentation.
Veserv commented on Cancer is surging, bringing a debate about whether to look for it   nytimes.com/2025/12/08/he... · Posted by u/brandonb
toast0 · 6 days ago
It's a human factors thing. If two patients both have a cancerous tumor that does not need treatment, the patient that did not have a screening is better off. The patient who was screened will deal with anxiety from having a positive screening result in addition to any negative effects from the screening and follow ups. Many patients are not comfortable living with a detected tumor, even if the standard of care is to watch and wait. Of course, the opposite scenario is also true --- if both patients have a tumor where removal would be best, the one that gets screened has a better outcome.

Maybe if we were all pretty rational people, we could better manage positive screening results and follow up actions that lead towards taking no specific action; but that's not where people are at the moment.

There's a tradeoff of early detection of fast growing tumors that are likely to cause issues vs detection of slow growing tumors that are likely to not cause issues except if they're detected. You can see how the consensus is shifting on things like breast, prostate, and colon cancer screenings over time. My TLDR is that we developed tools and methods, started applying them and have generally reduced the screening frequency over time as we understand more about the tradeoffs.

Veserv · 6 days ago
Except that the medical system already does that for various common cancer screenings, such as breast cancer. It is frequently detected extremely early, when it is medically insignificant, and patients may be recommended to just wait and watch with more frequent screening. That increased vigilance would have been impossible without early detection, and early detection of breast cancer is viewed very positively and is very positive for society.

We have a system that partially results in anxiety because cancer screening is frequently only done when cancer would already be medically significant. A positive result usually means medically significant cancer, because as a society we already chose not to screen when it would be medically insignificant. This is perfectly reasonable if the test is expensive, inaccurate, or harmful, as even just the harms from doing the test in bulk could result in societally worse outcomes than occasional early detection. However, the rise in "medically unnecessary" screening indicates that we have turned the corner on that in many cases; that, or it is easily billed corruption, which is a separate problem.

Veserv commented on Cancer is surging, bringing a debate about whether to look for it   nytimes.com/2025/12/08/he... · Posted by u/brandonb
kulahan · 6 days ago
There is cancer in your body every single day. Your immune system handles it just fine. There is an explicit difference between cancer that needs treating and cancer that should be ignored because it's a waste of time and resources to treat. You're not a doctor, you're not qualified to tell the difference, you're not trained on cancers, you're not even in the medical field. The monopoly is held for exactly this reason.

We already have an extreme shortage of available healthcare workers. We don't need to stress them further because 20% of the population suddenly decides they need 80 elective surgeries to remove things that would've gone away or stayed benign on their own.

Veserv · 6 days ago
Your response is a non sequitur. The original statement was about intentionally not detecting cancer. You are talking about whether a detected cancer is medically necessary to treat.

You are just assuming that all cancer must be treated if detected, even when treatment is medically unnecessary, and therefore that we must not detect medically insignificant cancer which would be net harmful to treat. But you can detect things and determine that no action should be taken. I can understand if that is the modern standard of care, but if so, then that is the problem; the problem is not early detection of cancer, which may be medically insignificant but may also allow the early detection of medically significant cancer.

Veserv commented on Kenyan court declares law banning seed sharing unconstitutional   apnews.com/article/kenya-... · Posted by u/thunderbong
DANmode · 9 days ago
Meaning it didn’t happen, or the farmers aren’t as innocent as the word innocent legally implies?

Comment could be considered misleading…

Veserv · 8 days ago
To use the example provided by the anti-Monsanto poster upthread as evidence of Monsanto being underhanded:

https://en.wikipedia.org/wiki/Bowman_v._Monsanto_Co

1. Bowman buys Monsanto soybeans as seeds agreeing to not replant the soybean harvest.

2. Bowman sells the soybean harvest to a food wholesaler, who sells to retailers, who sell to consumers for consumption.

3. Bowman buys soybeans back from that same food wholesaler (who normally only sells for consumption) intending to replant those food soybeans (which is abnormal).

4. Bowman then tests the seeds he bought to identify which ones carried the Monsanto modifications (from his own harvest, or from neighbors who were also using Monsanto seeds under the same contract), i.e., the ones he was not allowed to replant per the contract in 1.

5. Bowman then only replants the ones with the modifications and uses Roundup in those fields.

6. Bowman then repeatedly saves and replants seeds from that crop to amplify his quantity of modified crop, and purchases more seeds from the food wholesaler.

It was about as premeditated and intentional a contract violation as you can get.

Veserv commented on Checked-size array parameters in C   lwn.net/SubscriberLink/10... · Posted by u/chmaynard
wild_pointer · 11 days ago
This guy is doing something else completely. In his words:

> In my testing, it's between 1.2x and 4x slower than Yolo-C. It uses between 2x and 3x more memory. Others have observed higher overheads in certain tests (I've heard of some things being 8x slower). How much this matters depends on your perspective. Imagine running your desktop environment on a 4x slower computer with 3x less memory. You've probably done exactly this and you probably survived the experience. So the catch is: Fil-C is for folks who want the security benefits badly enough.

(from https://news.ycombinator.com/item?id=46090332)

We're talking about a lack of fat pointers here, and switching to GC and having a 4x slower computer experience is not required for that.

Veserv · 11 days ago
I am actually not talking about the lack of fat pointers; that is almost entirely orthogonal to my point. I am talking about the fact that what would be the syntax for passing an array by value was repurposed to automatically decay into a pointer. This results in a massive and unnecessary syntactic wart.

The fact that the correct type signature, a pointer to a fixed-size array, exists, and that you can create a struct containing a fixed-size array member and pass it in by value, completely invalidates any possible argument for having special semantics for fixed-size array parameters. Automatic decay should have died when it became possible to pass structs by value. Its continued existence results in people writing objectively inferior function signatures (though part of this is the absurdity of C type declarations making the objectively correct type a pain to write or use, another one of the worst actual design mistakes).

Fat pointers or argument-aware non-fixed-size array parameters are a separate, valuable feature, but it is at least understandable that they were not included at the time.
