Readit News
rcthompson commented on Are hard drives getting better?   backblaze.com/blog/are-ha... · Posted by u/HieronymusBosch
Retric · 2 months ago
It depends on what you’re doing and what you’re concerned about.

For a simplified example, suppose you have X drives storing 20TB each vs. 2X drives storing 10TB each in a simple RAID 1 configuration. When a drive fails there's a risk period before its contents are replicated onto another drive. At a constant transfer speed, larger disks double that period per drive but halve the number of failures. Net result: the risk is identical in both setups.

However, that assumes a constant transfer speed; faster transfer rates reduce the overall risk.
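A back-of-the-envelope sketch of that equivalence in Python (the AFR and transfer speed are illustrative numbers, not real drive statistics): with RAID 1 mirrors, an array loss requires one drive to fail and then its mirror to fail during the rebuild window.

```python
def annual_loss_rate(n_drives, capacity_tb, afr, rebuild_mb_s=200):
    """Approximate expected array losses per year for n_drives in mirrored pairs.

    afr: annual failure rate per drive (e.g. 0.015 = 1.5%).
    Rebuild time scales with capacity at a fixed transfer speed.
    """
    seconds_per_year = 365 * 24 * 3600
    rebuild_seconds = capacity_tb * 1e6 / rebuild_mb_s  # TB -> MB at MB/s
    rebuild_years = rebuild_seconds / seconds_per_year
    # (failures per year) * P(mirror also fails during the rebuild window)
    return n_drives * afr * (afr * rebuild_years)

big = annual_loss_rate(n_drives=10, capacity_tb=20, afr=0.015)
small = annual_loss_rate(n_drives=20, capacity_tb=10, afr=0.015)
print(big, small)  # identical: double the drives, but half the exposure each
```

Doubling `rebuild_mb_s` halves the result for either configuration, which is the "faster transfer rates reduce the overall risk" point.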

rcthompson · 2 months ago
Hmm, I hadn't considered that doubling the drive size doubles the resilver time and therefore doubles the exposure time for risk of array loss. I guess the math gets complicated depending on RAID topology.
rcthompson commented on Are hard drives getting better?   backblaze.com/blog/are-ha... · Posted by u/HieronymusBosch
rcthompson · 2 months ago
Of note: assuming that decommissioning of drives is driven primarily by, e.g., space concerns rather than by signs of impending individual drive failures (which seems to be the case based on the linked article about storage scaling), you could conduct a survival analysis in which decommissioned drives are treated as right-censored. That would give a better measure of the failure rate over time, as well as of how that failure rate depends on various factors. Note that the most common choice, a proportional hazards model, may not be appropriate here; an accelerated failure time model might fit better, although I couldn't say for sure without actually working with the data.
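A minimal sketch of the right-censoring idea with made-up data: a hand-rolled Kaplan-Meier estimator in which decommissioned drives are censored (not counted as failures) but still contribute time at risk. This is a simplification that assumes distinct event times; a real analysis would use a proper survival library.

```python
def kaplan_meier(durations, failed):
    """Return [(time, survival_prob)] at each observed failure time.

    durations: years each drive was observed; failed[i] is True if drive i
    failed, False if it was decommissioned (right-censored).
    """
    events = sorted(zip(durations, failed))
    n_at_risk = len(events)
    surv = 1.0
    curve = []
    for t, did_fail in events:
        if did_fail:
            surv *= 1 - 1 / n_at_risk
            curve.append((t, surv))
        n_at_risk -= 1  # failed and censored drives both leave the risk set
    return curve

# Hypothetical fleet: three failures, three decommissions.
curve = kaplan_meier([1.7, 2.0, 3.1, 4.5, 4.9, 5.2],
                     [False, True, True, False, True, False])
print(curve)
```

Naively dropping the decommissioned drives (or, worse, counting them as failures) would bias the estimated failure rate; censoring keeps their in-service time in the denominator without treating retirement as death.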
rcthompson commented on Are hard drives getting better?   backblaze.com/blog/are-ha... · Posted by u/HieronymusBosch
rcthompson · 2 months ago
If hard drives increase in capacity while maintaining the same MTBF, does this count as an improvement? If you previously stored your data on 10 drives and now you can store the same data on 5 drives, that reduces the probability of failure of the system as a whole, right? Is there some kind of "failure rate per byte" measure that normalizes for this?
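The intuition in numbers (the AFR is illustrative, not from the article): holding the per-drive annual failure rate constant, halving the drive count lowers the chance of any failure in the system, and a "failures per TB-year" metric makes the normalization explicit.

```python
afr = 0.014  # hypothetical 1.4% annual failure rate per drive

# Same data on 10 x 10TB drives vs. 5 x 20TB drives.
p_any_10 = 1 - (1 - afr) ** 10   # P(at least one drive failure in a year)
p_any_5  = 1 - (1 - afr) ** 5

per_tb_year_10 = 10 * afr / (10 * 10)   # failures per TB-year, 10TB drives
per_tb_year_5  = 5 * afr / (5 * 20)     # failures per TB-year, 20TB drives
print(p_any_10, p_any_5, per_tb_year_10, per_tb_year_5)
```

So with MTBF held fixed, doubling capacity halves the failure rate per byte stored, even though each individual failure now puts twice as much data at risk.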
rcthompson commented on M8.7 earthquake in Western Pacific, tsunami warning issued   earthquake.usgs.gov/earth... · Posted by u/jandrewrogers
jandrewrogers · 5 months ago
You are off by about a factor of 1,000.

Each incremental increase in magnitude is 10^1.5 in power. The difference between 1994 Northridge and this one is 2.1, so roughly 10^3 difference in power.
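In code, the relation is just 10^(1.5 · Δm); using the 2.1 magnitude difference cited above (a quick sketch, nothing more):

```python
def energy_ratio(delta_m):
    """Energy released scales as 10^(1.5 * magnitude difference)."""
    return 10 ** (1.5 * delta_m)

print(energy_ratio(2.1))  # ~1,400x, i.e. roughly 10^3
```

The common "10x per magnitude unit" figure refers to ground-motion amplitude, not energy, which is where the factor-of-1,000 confusion comes from.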

rcthompson · 5 months ago
I thought that it was a log10 scale, so each increment of 1 on the scale is a 10-fold power increase, not a 10^1.5-fold.
rcthompson commented on Adding a feature because ChatGPT incorrectly thinks it exists   holovaty.com/writing/chat... · Posted by u/adrianh
kragen · 5 months ago
I've found this to be one of the most useful ways to use (at least) GPT-4 for programming. Instead of telling it how an API works, I make it guess, maybe starting with some example code to which a feature needs to be added. Sometimes it comes up with a better approach than I had thought of. Then I change the API so that its code works.

Conversely, I sometimes present it with some existing code and ask it what it does. If it gets it wrong, that's a good sign my API is confusing, and how.

These are ways to harness what neural networks are best at: not providing accurate information but making shit up that is highly plausible, "hallucination". Creativity, not logic.

(The best thing about this is that I don't have to spend my time carefully tracking down the bugs GPT-4 has cunningly concealed in its code, which often takes longer than just writing the code the usual way.)

There are multiple ways that an interface can be bad, and being unintuitive is the only one that this will fix. It could also be inherently inefficient or unreliable, for example, or lack composability. The AI won't help with those. But it can make sure your API is guessable and understandable, and that's very valuable.

Unfortunately, this only works with APIs that aren't already super popular.

rcthompson · 5 months ago
In a similar vein, some of my colleagues have been feeding their scientific paper methods sections to LLMs and asking them to implement the method in code, using the LLM's degree of success/failure as a vague indicator of the clarity of the method description.
rcthompson commented on The unreasonable effectiveness of fuzzing for porting programs   rjp.io/blog/2025-06-17-un... · Posted by u/Bogdanp
rcthompson · 6 months ago
The author notes that the resulting Rust port is not very "rusty", but I wonder if this could also be solved through further application of the same principle. Something like telling the AI to minimize the use of unsafe etc., while enforcing that the result should compile and produce identical outputs to the original.
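A sketch of what that "keep the oracle, tighten the style" loop might look like (all file names, paths, and binaries here are hypothetical): accept an AI revision only if the port still compiles, still matches the original program's outputs on a corpus, and has fewer `unsafe` blocks than before.

```python
import glob
import subprocess
from pathlib import Path

def unsafe_count(src_dir="src"):
    """Count occurrences of the `unsafe` keyword across the Rust sources.

    Crude: a real check would parse the code rather than grep for the token.
    """
    return sum(p.read_text().count("unsafe")
               for p in Path(src_dir).rglob("*.rs"))

def outputs_match(original="./original", port="./port", corpus="corpus/*"):
    """Differential check: the port must reproduce the original's stdout."""
    for case in glob.glob(corpus):
        a = subprocess.run([original, case], capture_output=True).stdout
        b = subprocess.run([port, case], capture_output=True).stdout
        if a != b:
            return False
    return True
```

The acceptance rule would then be something like: keep a candidate revision iff `cargo build` succeeds, `outputs_match()` holds, and `unsafe_count()` decreased, which turns "make it more rusty" into the same kind of mechanically checkable objective as the original port.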
rcthompson commented on Violence alters human genes for generations, researchers discover   news.ufl.edu/2025/02/syri... · Posted by u/gudzpoz
johnisgood · 10 months ago
Should it not be "affects" or "influences"?
rcthompson · 10 months ago
DNA methylation means adding one or more methyl groups to the DNA, so technically it is an alteration. But most people would assume that altering a gene specifically means changing the "letters" of the gene sequence that encode the protein, and that's not what DNA methylation does.
rcthompson commented on Violence alters human genes for generations, researchers discover   news.ufl.edu/2025/02/syri... · Posted by u/gudzpoz
derefr · 10 months ago
I work as an editor sometimes. I've worked in technical writing, though not specifically science journalism.

My guess for why this keeps happening, is that it's a two-step process, fueled by a failure of communication:

1. The science writer themselves does understand epigenetics — but doesn't think it's important to the point the article is making for the reader to understand epigenetics. The writer wants to remove the requirement/assumption of "understanding epigenetics" from their writing, while still being technically correct in everything they say. So they choose to gloss an epigenetic change as "causing changes to the DNA." (Which it certainly does! Either chemically — to the DNA molecules themselves, through methylation; or structurally/topologically — through modifications to the histones around which the DNA is wrapped.)

2. The science writer's not-so-scientific editor comes along, doing a stylistic editing pass; sees the word "DNA"; and says "hey, that's jargon, and we're aiming for accessibility here — we need to replace this." And they (incorrectly) decide that a valid 1:1 replacement for "DNA" is "genes" or "genome."

This invalidating change could be caught... if the publication had a formal workflow step / requirement for the editor to perform a back-check with the original writer after copyediting + stylistic editing, to ensure that validity has not been compromised. I believe that big-name science journals and science magazines do tend to have these back-check steps. But smaller publications — like the PR departments of universities — don't.

rcthompson · 10 months ago
I can't speak for every institution, but our PR department does as many back and forth passes as it takes for the scientists who did the work to sign off that any edits made still preserve scientific accuracy.
rcthompson commented on Violence alters human genes for generations, researchers discover   news.ufl.edu/2025/02/syri... · Posted by u/gudzpoz
derektank · 10 months ago
"There's new evidence that historical trauma is passed down through changes to the genome!"

"Genome or epigenome?"

"...epigenome."

I feel like I read of a similar study every few years, the first I can recall was 'Transgenerational response to nutrition, early life circumstances and longevity'[1], and it is always needlessly disappointing to thumb through past the headline and read that, inevitably, the media has decided to report this as a change to the genome when the actual research suggests otherwise.

Epigenetic changes are interesting in their own right! But they don't change human genes, at most they change gene expression.

[1] https://www.nature.com/articles/5201832

rcthompson · 10 months ago
Saying that a change in DNA methylation "alters" a gene is technically correct, in the sense that it is a change to the molecular structure of the DNA that makes up the gene. But it is indeed misleading, because without further clarification most people would assume it refers to a change in the gene sequence.
rcthompson commented on Violence alters human genes for generations, researchers discover   news.ufl.edu/2025/02/syri... · Posted by u/gudzpoz
sensanaty · 10 months ago
I'm a complete layman here, but what is the difference exactly?
rcthompson · 10 months ago
Imagine you have the text of a book in a word processor. You can change the text by typing new words or deleting ones that are there. You can also change the font, size, alignment, etc. The latter category of changes does not alter the words in the text, but it can affect how that text is interpreted, which parts of the text a reader focuses on, etc. The difference between a genetic alteration and an epigenetic alteration is conceptually similar. Genetics is changing the "text" of the genome while epigenetics is changing aspects of the genome that affect how that "text" is interpreted and used.
