Readit News
karolinepauls commented on Vacuum Is a Lie: About Your Indexes   boringsql.com/posts/vacuu... · Posted by u/birdculture
chuckadams · a day ago
Shorter:

* VACUUM does not compact your indexes (much).

* VACUUM FULL does. It's slow though.

karolinepauls · a day ago
That's too reductive. VACUUM FULL isn't just slow: it takes an exclusive lock on the table for the duration of the rewrite, which makes it basically a no-go while the database is in use.
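
A minimal sketch of the difference, assuming psycopg2 and a hypothetical table `my_table`:

```python
import psycopg2

conn = psycopg2.connect("dbname=mydb")  # hypothetical DSN
conn.autocommit = True  # VACUUM cannot run inside a transaction block
cur = conn.cursor()

# Plain VACUUM: marks dead tuples as reusable and runs alongside normal
# reads and writes, but barely shrinks the indexes on disk.
cur.execute("VACUUM my_table")

# VACUUM FULL: rewrites the table and its indexes into compact files,
# but holds an ACCESS EXCLUSIVE lock for the whole run -- every other
# query touching the table blocks until it finishes.
cur.execute("VACUUM FULL my_table")
```
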
karolinepauls commented on Vacuum Is a Lie: About Your Indexes   boringsql.com/posts/vacuu... · Posted by u/birdculture
throwaway613745 · a day ago
Don't forget to ANALYZE your tables sometimes too.

Just recently I was trying to optimize a 12s index scan; turns out I didn't need to change anything about the query, I just had to update the table statistics. 12s down to 100ms just from running ANALYZE (no vacuum needed).
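
For anyone who hasn't done it before, a minimal sketch (psycopg2, and the table/column names are hypothetical):

```python
import psycopg2

conn = psycopg2.connect("dbname=mydb")  # hypothetical DSN
conn.autocommit = True
cur = conn.cursor()

# ANALYZE only samples rows to refresh the planner's statistics, so it
# is far cheaper than vacuuming the same table.
cur.execute("ANALYZE my_table")

# Re-check the plan: stale row-count estimates are what push the
# planner into slow plans in the first place.
cur.execute("EXPLAIN SELECT * FROM my_table WHERE some_col = 42")
for (line,) in cur.fetchall():
    print(line)
```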

karolinepauls · a day ago
And make sure your `random_page_cost` is about 1.1 if you're running on an SSD or if >~98% of your hot pages fit in memory, rather than the default of 4, which makes the planner afraid of using indexes.
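
A quick way to try it before persisting anything, sketched with a hypothetical connection (SET only affects the current session):

```python
import psycopg2

conn = psycopg2.connect("dbname=mydb")  # hypothetical DSN
cur = conn.cursor()

# Session-level experiment: re-run EXPLAIN on the slow query after this.
# If index scans start winning, persist it server-wide with
# ALTER SYSTEM SET random_page_cost = 1.1 (superuser) and reload.
cur.execute("SET random_page_cost = 1.1")
cur.execute("SHOW random_page_cost")
print(cur.fetchone()[0])  # -> 1.1
```
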
karolinepauls commented on Show HN: I audited 500 K8s pods. Java wastes ~48% RAM, Go ~18%   github.com/WozzHQ/wozz... · Posted by u/wozzio
Narishma · 2 days ago
What makes you think they're talking about CPU? It reads to me like it's memory.
karolinepauls · a day ago
Two things: the word "idles", and the nature of CPython's allocator, which generally doesn't return memory to the OS but reuses it internally. So you can't really "spike" memory usage, only grow it.
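
A rough way to see this on Linux (a toy sketch; exact numbers vary with the allocator and glibc version):

```python
def rss_mib() -> float:
    # Linux-only: parse the resident set size out of /proc/self/status.
    with open("/proc/self/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1]) / 1024  # KiB -> MiB
    raise RuntimeError("VmRSS not found")

print(f"baseline:    {rss_mib():.0f} MiB")
blob = [str(i) for i in range(3_000_000)]  # millions of small objects
print(f"after alloc: {rss_mib():.0f} MiB")
del blob
# CPython's small-object allocator keeps most of its arenas around for
# reuse rather than returning them to the OS, so RSS typically stays
# well above the baseline here.
print(f"after del:   {rss_mib():.0f} MiB")
```
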
karolinepauls commented on Show HN: I audited 500 K8s pods. Java wastes ~48% RAM, Go ~18%   github.com/WozzHQ/wozz... · Posted by u/wozzio
wozzio · 2 days ago
I've been consulting on EKS/GKE cost optimization for a few mid-sized companies and kept seeing the same pattern: massive over-provisioning of memory just to be safe.

I wrote a simple CLI tool (bash wrapper around kubectl) to automate diffing kubectl top metrics against the declared requests in the deployment YAML.

I ran it across ~500 pods in production. The "waste" (allocated vs. used) average by language was interesting:

Python: ~60% waste (Mostly sized for startup spikes, then idles empty).

Java: ~48% waste (Devs seem terrified to give the JVM less than 4Gi).

Go: ~18% waste.

The tool is called Wozz. It runs locally, installs no agents, and just uses your current kubecontext to find the gap between what you pay for (Requests) and what you use (Usage).

It's open source. Feedback welcome.

(Note: The install is curl | bash for convenience, but the script is readable if you want to audit it first).
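
Not the Wozz script itself (that's a bash wrapper around kubectl), but the core diff is roughly the following, sketched in Python; the unit handling is simplified and only covers the binary suffixes kubectl usually emits:

```python
import json
import subprocess

def mem_to_mib(s: str) -> float:
    # Simplified: handles Ki/Mi/Gi and plain bytes, not decimal K/M/G.
    units = {"Ki": 1 / 1024, "Mi": 1.0, "Gi": 1024.0}
    for suffix, factor in units.items():
        if s.endswith(suffix):
            return float(s[: -len(suffix)]) * factor
    return float(s) / (1024 * 1024)

# Actual usage per pod, from the metrics server.
top = subprocess.run(
    ["kubectl", "top", "pods", "--no-headers"],
    capture_output=True, text=True, check=True,
).stdout
usage = {}
for row in top.splitlines():
    name, _cpu, mem = row.split()[:3]
    usage[name] = mem_to_mib(mem)

# Declared requests per pod, summed over containers.
pods = json.loads(subprocess.run(
    ["kubectl", "get", "pods", "-o", "json"],
    capture_output=True, text=True, check=True,
).stdout)
for pod in pods["items"]:
    name = pod["metadata"]["name"]
    requested = sum(
        mem_to_mib(c.get("resources", {}).get("requests", {}).get("memory", "0"))
        for c in pod["spec"]["containers"]
    )
    used = usage.get(name)
    if used is not None and requested:
        waste = 100 * (1 - used / requested)
        print(f"{name}: requested {requested:.0f}Mi, used {used:.0f}Mi, waste {waste:.0f}%")
```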

karolinepauls · 2 days ago
> Python: ~60% waste (Mostly sized for startup spikes, then idles empty).

I understand we're talking about CPU in the case of Python and memory for Java and Go. While anxious overprovisioning of memory is understandable, doing the same for CPU probably means a lack of understanding of the difference between CPU limits and CPU requests.

Since I've been out of DevOps for a few years, is there ever a reason not to give each container the ability to spike up to 100% of 1 core? Scheduling of mass container startup should be a solved problem by now.
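
For reference, the distinction in pod-spec terms (a hypothetical resources stanza): requests are what the scheduler reserves and what you pay for; limits are the runtime ceiling, so a low request with a ~1-core limit lets a container spike at startup without hogging capacity.

```python
# Hypothetical container resources stanza, as the dict you'd render
# into the deployment YAML.
resources = {
    "requests": {"cpu": "100m", "memory": "256Mi"},  # reserved at scheduling time
    "limits": {"cpu": "1", "memory": "256Mi"},       # cgroup ceiling at run time
}
```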

karolinepauls commented on Unusual circuits in the Intel 386's standard cell logic   righto.com/2025/11/unusua... · Posted by u/Stratoscope
junto · 22 days ago
This reminds me of Adrian Thompson’s (University of Sussex) 1996 paper, “An evolved circuit, intrinsic in silicon, entwined with physics,” ICES 1996 / LNCS 1259 (published 1997), which was extended in his later thesis, “Hardware Evolution: Automatic Design of Electronic Circuits in Reconfigurable Hardware by Artificial Evolution” (Springer, 1998).

Before Thompson’s experiment, many researchers tried to evolve circuit behaviors on simulators. The problem was that simulated components are idealized, i.e. they ignore noise, parasitics, temperature drift, leakage paths, cross-talk, etc. Evolved circuits would therefore fail in the real world because the simulation behaved too cleanly.

Thompson instead let evolution operate on a real FPGA device itself, so evolution could take advantage of real-world physics. This was called “intrinsic evolution” (i.e., evolution in the real substrate).

The task was to evolve a circuit that can distinguish between a 1 kHz and 10 kHz square-wave input and output high for one, low for the other.

The final evolved solution:

- Used fewer than 40 logic cells

- Had no recognisable structure, no pattern resembling filters or counters

- Worked only on that exact FPGA and that exact silicon patch.

Most astonishingly:

The circuit depended critically on five logic elements that were not logically connected to the main path.

Removing them should not have affected a digital design (they were not wired to the output), but in practice the circuit stopped functioning when they were removed.

Thompson determined via experiments that evolution had exploited:

- Parasitic capacitive coupling

- Propagation delay differences

- Analogue behaviours of the silicon substrate

- Electromagnetic interference from neighbouring cells

In short: the evolved solution used the FPGA as an analog medium, even though engineers normally treat it as a clean digital one.

Evolution had tuned the circuit to the physical quirks of the specific chip. It demonstrated that hardware evolution could produce solutions that humans would never invent.

karolinepauls · 22 days ago
I wonder what would happen if someone evolved a circuit on a large number of FPGAs from different batches. Each of the FPGAs would receive the same input in each iteration, but the output function would be biased to expose the worst-behaving units (maybe the bias should be raised in later iterations, when most units behave well).
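
One way to formalize that biased output function, as a toy sketch (`per_unit_scores` being each FPGA's score on the same input):

```python
def robust_fitness(per_unit_scores: list[float], bias: float) -> float:
    """Blend the average score across FPGAs with the worst unit's score.

    bias=0 is a plain average; bias=1 scores a candidate entirely by its
    worst-behaving FPGA, forcing evolution to fix the stragglers.
    """
    mean = sum(per_unit_scores) / len(per_unit_scores)
    worst = min(per_unit_scores)
    return (1 - bias) * mean + bias * worst

def bias_schedule(generation: int, total_generations: int) -> float:
    # Raise the bias over the run: reward average behaviour early,
    # punish the worst unit once most units already behave well.
    return min(1.0, generation / (0.7 * total_generations))
```
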
karolinepauls commented on The scariest "user support" email I've received   devas.life/the-scariest-u... · Posted by u/hervic
tantalor · 2 months ago
> as ChatGPT confirmed when I asked it to analyze it

lol we are so cooked

karolinepauls commented on Ask HN: Are there any non-SPA front end developers left?    · Posted by u/karolinepauls
leakycap · 3 months ago
SPA development is everywhere because you can charge a lot more for it, and web development has become commoditized in many other arenas

Sanity is available immediately if you are willing to be paid less. There are tons of simple, non-SPA, non-stack-on-stack projects out there; they just usually pay 1/10th of what the complex stuff does.

karolinepauls · 3 months ago
Sorry not to have made this clear: I am not a frontend developer. I'm a backend/infra developer who's forced to work on a React app abandoned by a frontend developer who incorporated their own wrappers-of-wrappers-of-wrappers.

Meanwhile the client is telling me it's virtually impossible to find frontend devs willing to write HTML.

karolinepauls commented on Public static void main(String[] args) is dead   mccue.dev/pages/9-16-25-p... · Posted by u/charles_irl
crystal_revenge · 3 months ago
One thing I'll miss about this is the way this arcane writing increasingly made sense over time as you became a better programmer.

I learned Java only after Python and I remember not being quite familiar with types, so even the 'void' and 'String[]' were a bit mysterious. After learning basic types that part made sense. Then you start learning about classes and objects, and you understand that main is a static method of this class Main that must be required somehow. As you dive deeper you start learning when this class is called. In a weird way, what started as complete, unknowable boilerplate slowly evolved into something sensible as you began to understand the language better. I have no doubt that seasoned Java devs see a lot more in that invocation than I do.

Good riddance though!

karolinepauls · 3 months ago
Good riddance indeed. The last 30 years of software teaching basically trained developers to produce complexity for its own sake, while calling it engineering. I'm not sufficiently full of myself to link to my own writing (yet) but I'm full of myself enough to self-paraphrase:

1. Programmer A creates a class because they need to create an entry point, a callback, an interface... basically anything, since everything requires a class. Result: we have a class.

2. Programmer B sees a class and carelessly adds instance variables, turning the whole thing mutable. Result: we have an imperative ball of mud.

3. Another programmer adds implementation inheritance for code reuse (because instance variables made factoring out common code into a function impossible without refactoring to turn instance variables from step 2 into arguments). Result: we have an imperative ball of mud and a nightmare of arbitrary dynamic dispatch.

At some point reference cycles arise and grandchild objects hold references to their grandparents in order to produce... some flat dictionary later sent over the wire.

4. As more work is done over that bit of code, the situation only worsens. Refactoring is costly and tedious, so it doesn’t happen. Misery continues until code is removed, typically because it tends to accumulate inefficiencies around itself, forcing a rewrite.

karolinepauls commented on One universal antiviral to rule them all?   cuimc.columbia.edu/news/o... · Posted by u/breve
zahlman · 4 months ago
Aren't bacteria generally much larger than viruses?
karolinepauls · 4 months ago
Phages don't devour bacteria, they get inside and hijack them, like viruses tend to do with cells.

u/karolinepauls

Karma: 27 · Cake day: July 24, 2024