Readit News
scraptor commented on Stop selling “unlimited”, when you mean “until we change our minds”   blog.kilocode.ai/p/ai-pri... · Posted by u/heymax054
epistasis · a month ago
I had been loyal, but am not any longer, so the ploy definitely did not work with me. I guess I'll move on to Gemini now, until I get sick of it.

To the degree there is a moat, I do not think it will be effective at keeping people in. I had already been somewhat disillusioned with the AI hype, but now I am also disillusioned with the company who I thought was the best actor in the space. I am happy that there is unlikely to be a dominant single winner like there was for web search or for operating systems. That is, unless there's a significant technological jump, rather than the same gradual improvement that all the AI companies are making.

scraptor · a month ago
Interesting to see the definition of moat change from keeping other companies out to keeping your customers in.
scraptor commented on Systemd has been a complete, utter, unmitigated success   blog.tjll.net/the-systemd... · Posted by u/Mond_
scraptor · 2 months ago
I don't hate journald because it's not plaintext, I hate it because it's worse than plaintext. Somehow journald manages to provide a database which is 40x slower to query than running grep on a compressed text file. I'm all in favour of storing logs in an indexed structured format but journald ain't it.
scraptor commented on Serving a half billion requests per day with Rust and CGI   jacob.gold/posts/serving-... · Posted by u/feep
rokob · 2 months ago
I’m interested in why Rust and C have similarly bad tail latencies but Go doesn’t.
scraptor · 2 months ago
SQLite resolves lock contention between processes with exponential backoff. When the WAL reaches 4MB it stops all writes while it gets checkpointed back into the database. Once the checkpoint is over, all the waiting processes probably have retry intervals in the hundred-millisecond range, and as they exit they are immediately replaced with new processes with shorter initial retry intervals. I don't know enough queuing theory to state this nicely or prove it, but I imagine the tail latency for the existing processes goes up quickly as the throughput of new processes approaches the limit of the database.
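The knobs the comment describes are visible from Python's built-in sqlite3 module; a minimal sketch (the 4MB figure corresponds to SQLite's default of 1000 pages at 4KB each, and `timeout` installs the backoff-style busy handler):

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "app.db")

# timeout=5.0 installs SQLite's busy handler: on lock contention it
# retries with increasing sleep intervals for up to 5 seconds before
# raising "database is locked".
conn = sqlite3.connect(path, timeout=5.0)

# Enable write-ahead logging; readers and one writer can then proceed
# concurrently, at the cost of periodic checkpoints.
conn.execute("PRAGMA journal_mode=WAL")

# Default: a checkpoint is attempted once the WAL holds ~1000 pages
# (~4 MB at the default 4 KB page size). Writers stall while it runs.
conn.execute("PRAGMA wal_autocheckpoint=1000")

conn.execute("CREATE TABLE t(x)")
conn.execute("INSERT INTO t VALUES (1)")
conn.commit()
```

Raising `wal_autocheckpoint` (or checkpointing from a dedicated process with `PRAGMA wal_checkpoint(TRUNCATE)`) is one way to trade WAL size for fewer of these stalls.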
scraptor commented on Puerto Rico's Solar Microgrids Beat Blackout   spectrum.ieee.org/puerto-... · Posted by u/ohjeez
dylan604 · 2 months ago
> In fact the demand has shrunk so much that it jeopardizes financing of coal power companies.

That is something that I think would be the impetus needed to motivate reduction in coal power plants. If they become unprofitable to operate, then will the market finally decide to stop using them? Sadly, I could see the current US administration deciding to offer subsidies to keep coal.

scraptor · 2 months ago
Subsidies are too technocratic; Trump's style is to just order utilities to keep the plants operational at negative profit: https://edition.cnn.com/2025/06/06/climate/michigan-coal-pla...
scraptor commented on U.S. bombs Iranian nuclear sites   bbc.co.uk/news/live/ckg3r... · Posted by u/mattcollins
aristofun · 3 months ago
All those "nice guys" vouching for negotiations — you have absolutely no idea who we are dealing with here.

Only street laws and rules apply to them. They see negotiations as weakness, nothing more. History has proved it many times: you don't negotiate with tyrants and bloody dictators. Period.

If you have enough brain to crack leetcode puzzles, why can't you nail that?

scraptor · 3 months ago
But enough about Trump, let's talk about Iran.
scraptor commented on Another Crack in the Chain of Trust: Uncovering (Yet Another) Secure Boot Bypass   binarly.io/blog/another-c... · Posted by u/vitplister
OjotCewIo · 3 months ago
UEFI variables or not: who in their right mind serializes raw pointer values to any kind of storage (network, disk, nvram, ...)?

Why is it that the most security-sensitive areas are ravaged by the sloppiest programmers and the most negligent managers and business types? I'd like to understand the economics and the psychology behind it.

scraptor · 3 months ago
When's the last time you made a motherboard purchase decision on the basis of firmware quality? Or rather, when's the last time a corporate purchasing manager got fired for buying motherboards with low quality firmware?
scraptor commented on What is HDR, anyway?   lux.camera/what-is-hdr/... · Posted by u/_kush
tart-lemonade · 4 months ago
> YouTube automatically edits the volume of videos that have an average loudness beyond a certain threshold.

For anyone else who was confused by this, it seems to be a client-side audio compressor feature (not a server-side adjustment) labeled as "Stable Volume". On the web, it's toggleable via the player settings menu.

https://support.google.com/youtube/answer/14106294

I can't find exactly when it appeared but the earliest capture of the help article was from May 2024, so it is a relatively recent feature: https://web.archive.org/web/20240523021242/https://support.g...

I didn't realize this was a thing until just now, but I'm glad they added it because (now that I think about it) it's been a while since I felt the need to adjust my system volume when a video was too quiet even at 100% player volume. It's a nice little enhancement.

scraptor · 4 months ago
The client-side toggle might be new as of 2024, but the volume normalisation has been a thing for a long time.
scraptor commented on US Copyright Office found AI companies breach copyright. Its boss was fired   theregister.com/2025/05/1... · Posted by u/croes
biophysboy · 4 months ago
Having actually done research and published scientific papers, the key limitation is experimentation. Review papers are useful, and AI is useful, but creating new knowledge is more useful. I haven't had much luck using LLMs to extrapolate well beyond their knowledge domain.
scraptor · 4 months ago
I certainly don't see much value in AI generated papers myself, I just object to the claim that the mere act of reading a large number of existing papers before writing yours is inherently plagiarism.
scraptor commented on US Copyright Office found AI companies breach copyright. Its boss was fired   theregister.com/2025/05/1... · Posted by u/croes
bgwalter · 4 months ago
The required reasoning is not very deep though: If an AI reads 100 scientific papers and churns out a new one, it is plagiarism.

If a savant has perfect recall, remembers text perfectly and rearranges that text to create a marginally new text, he'd be sued for breach of copyright.

Only large corporations get away with it.

scraptor · 4 months ago
Plagiarism is not an issue of copyright law; it's an entirely separate system of rules maintained by academia. The US Copyright Office has no business having opinions about it. If an AI^W human reads 100 papers and then churns out a new one, this is usually called research.
scraptor commented on Hyperspace   hypercritical.co/2025/02/... · Posted by u/tobr
gblargg · 6 months ago
There might be a difference in robustness. There's a monetary consequence to this developer for getting it wrong.
scraptor · 6 months ago
The Linux tools can't get it wrong: the kernel checks that the files submitted for deduplication are actually identical.

u/scraptor

Karma: 556 · Cake day: March 18, 2019