Musl solves this problem by inspecting the program counter in the signal handler and checking whether it falls within that specific range; if it does, it modifies the saved registers so that when execution returns from the signal, it resumes at instructions that cause ECANCELED to be returned.
Blew my mind when I learned this last month.
Why not just do proper time travel? Is that absent for JavaScript?
"This license allows you to use and share this software for noncommercial purposes for free and to try this software for commercial purposes for thirty days."
This is not an open source license. "Open Source" is a term of art defined by the Open Source Initiative to exclude restrictions of this kind; it is not a generic term meaning "source accessible".
You can also just use perf, but it does require an extra package from the Python build (which uv frustratingly doesn't supply).
I used FunctionTrace as an example and as evidence for my position that tracing Python can be low-overhead with proper design, to rebut claims like: "You can't make it that low overhead or someone would have done it already, thus proving the negative." I am not the author and am not related to it in any way, so you can take that up with them.
Maybe if we were all fairly rational people, we could better manage positive screening results and follow-up actions that lead toward taking no specific action; but that's not where people are at the moment.
There's a tradeoff between early detection of fast-growing tumors that are likely to cause issues and detection of slow-growing tumors that are unlikely to cause issues unless they're detected. You can see how the consensus on things like breast, prostate, and colon cancer screenings has shifted over time. My TLDR is that we developed tools and methods, started applying them, and have generally reduced screening frequency over time as we've come to understand the tradeoffs.
We have a system that partly produces anxiety because cancer screening is frequently done only when cancer would already be medically significant. A positive result usually means medically significant cancer, because as a society we already chose not to screen when it would be medically insignificant. This is perfectly reasonable if the test is expensive, inaccurate, or harmful, since even just the harms from doing the test in bulk could produce worse outcomes for society than occasional early detection does good. However, the rise in "medically unnecessary" screening indicates that we have turned the corner on that in many cases; that, or it's easily billed corruption, which is a separate problem.
We already have an extreme shortage of available healthcare workers. We don't need to stress them further because 20% of the population suddenly decides they need 80 elective surgeries to remove things that would've gone away or stayed benign on their own.
You are just assuming that all cancer must be treated if detected, even when treatment is medically unnecessary, and therefore that we must not detect medically insignificant cancer, since treating it would be net harmful. But you can detect things and determine that no action should be taken. I can understand if that is the modern standard of care, but if so then that is the problem; not early detection of cancer, which could be medically insignificant, but which may also allow early detection of medically significant cancer.
Comment could be considered misleading…
https://en.wikipedia.org/wiki/Bowman_v._Monsanto_Co
1. Bowman buys Monsanto soybeans as seeds agreeing to not replant the soybean harvest.
2. Bowman sells the soybean harvest to a food wholesaler, who sells to retailers, who sell to consumers for consumption.
3. Bowman buys soybeans back from that same food wholesaler (who normally only sells for consumption) intending to replant those food soybeans (which is abnormal).
4. Bowman then tests the seeds he bought to determine which ones carry the Monsanto modifications (from his own harvest, or from neighbors who were also using Monsanto seeds under the same contract) and which he was therefore not allowed to replant per the contract in 1.
5. Bowman then only replants the ones with the modifications and uses Roundup in those fields.
6. Bowman then repeatedly saves and replants seeds from that crop to amplify the quantity of modified crop, and purchases more seeds from the food wholesaler.
It was about as premeditated and intentional a contract violation as you can get.
> In my testing, it's between 1.2x and 4x slower than Yolo-C. It uses between 2x and 3x more memory. Others have observed higher overheads in certain tests (I've heard of some things being 8x slower). How much this matters depends on your perspective. Imagine running your desktop environment on a 4x slower computer with 3x less memory. You've probably done exactly this and you probably survived the experience. So the catch is: Fil-C is for folks who want the security benefits badly enough.
(from https://news.ycombinator.com/item?id=46090332)
We're talking about a lack of fat pointers here; fixing that doesn't require switching to GC and accepting a 4x-slower computer experience.
The fact that the correct type signature, a pointer to a fixed-size array, exists, and that you can create a struct containing a fixed-size array member and pass it in by value, completely invalidates any possible argument for special semantics for fixed-size array parameters. Automatic decay should have died when it became possible to pass structs by value. Its continued existence keeps leading people to write objectively inferior function signatures (though part of this is the absurdity of C type declarations making the objectively correct type a pain to write or use, another of the worst actual design mistakes).
Fat pointers or argument-aware, non-fixed-size array parameters are a separate valuable feature, but it is at least understandable that they were not included at the time.