Those I know who lived through this issue when digital editing really became cheap seem more sanguine about it, while the younger generation on the opposite side is some combination of "whatever" and frustrated-but-accepting that yet another of countless weird things has invaded a reality that was never quite right to begin with.
The folks in between, say the twenty years from age 20 to 40, are the most annoyed, though. The eye of the storm, on the way to proving that cyberpunk lacked the imagination required to properly calibrate our sense of when things would really get insane.
I was in a mate's car recently and it scared the hell out of me; he was tailgating for most of a 3-hour journey. Eventually we got to a stretch with chevrons and he wasn't obeying the rule of staying N chevrons away from the car in front. I told him and he replied, "nonsense, my car beeps if I'm too close to the car in front." I didn't have the energy to point out that that is a collision warning, not a safe-distance measuring device.
I don't even need to keep an eye on my cooking anymore, the smoke alarm beeps when I get too close.
P.S.
"The goal was to create a reliable, distributed communication system that could continue operating even if parts of it were damaged by a nuclear attack."
This is a myth. The ARPANET was not hardened; quite the opposite. ARPA's goal was for their researchers located across the country to easily share their work ... initially it was just used to share papers, before Ray Tomlinson invented email. Beyond that, JCR Licklider who laid the conceptual foundations was looking toward something along the lines of today's Internet + AI:
https://en.wikipedia.org/wiki/Man%E2%80%93Computer_Symbiosis
P.P.S. Steve Crocker's MIT PhD thesis was on man-machine symbiosis. I know this because he mentioned it to me when I met him in the UCLA Computer Club which he came to because he wanted to teach an informal class on LISP and Theorem Proving, and the club organized such classes. We got to talking about his thesis, he posed some challenges to me that I got lucky in solving, and he immediately offered me a job (he was the head of the ARPANET project at UCLA, under Leonard Kleinrock) that shaped the rest of my life--I'm greatly indebted to him.
Y.A.P.S. Steve Crocker received the Jonathan B. Postel Award (created by Vint Cerf) last year.
So it wasn't a design consideration for ARPANET, but it would have shown up in enough early papers to give the myth some legs.
>The pins that were supposed to nestle into the motherboard were instead pointing skyward
What does that even mean? How do you physically solder a chip the wrong way around?
The story seems totally unbelievable. This is a training session, someone asks a potentially reasonable question and then is just let go? Hiring people is expensive and letting someone go over something like that is ridiculous.
The story isn't even alleging that the manager disagreed or that the manager tried to argue there was no defect. If you take the story as told it is completely nonsensical.
With effort, and bodge wire. I've seen chips mounted dead-bug style when the board's been messed up (e.g., the footprint is oriented for the bottom of the board but placed on the top, and vice versa).
It's definitely not something you'd ship, but a kludge that can get you working until the next board spin.
And I've posted benchmark data to my sbc-reviews repo here: https://github.com/geerlingguy/sbc-reviews/issues/81
Performance-wise it's pretty much the same as the Pi 5 16GB (and can be slightly faster than the regular Pi 500 depending on the task, if it benefits from faster storage or more RAM...)
Since this is the first Pi with built-in NVMe (I'm not counting the Compute Module Developer Kit), I plugged in an eGPU and tested a new 15-line patch for AMD GPU drivers, which seems to support practically all modern AMD graphics cards[1].
[1] https://www.jeffgeerling.com/blog/2025/full-egpu-acceleratio...
I really want to hope the name is a nod to the Amiga 500+ (which had twice the RAM of the A500...)
In the US, if you roll up to a random charging station you may or may not find a plug matching your car’s port.
The most realistic way to hit this would be to have built an image 18 months ago, on top of :testing or :unstable, and then not update or rebuild it at any time in those 18 months - in which case removing anything from the repo wouldn't help you. Or be purposely trying to recreate an affected environment for testing/research, in which case it's on you.
You're not wrong that we should keep our shields up - but "update sometime in the last 18 months" perhaps isn't such a revelation.
One thing does come to mind though - I do wonder if there's a way to strongarm apt's dependencies mechanism into having openssh-server conflict with affected libxz versions, so that if you did apt update && apt install openssh-server in an affected image, it'd bring a fixed libxz along for the ride. (and the images don't carry apt manifests, so apt update is required and you would have today's package list.) You could still pin an affected version, so there'd still be enough rope to allow you to recreate a research environment.
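As a rough sketch of that idea, assuming the affected library package is liblzma5 and using Debian's post-backdoor version string (5.6.1+really5.4.5-1) as the hypothetical cutoff, openssh-server's debian/control stanza could declare a versioned Breaks so apt pulls a fixed library in alongside it rather than refusing the install:

```
Package: openssh-server
Breaks: liblzma5 (<< 5.6.1+really5.4.5-1)
```

Breaks (rather than Conflicts) lets apt resolve the situation by upgrading liblzma5 during the same transaction, while an explicit version pin would still override it for anyone deliberately recreating an affected environment.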
https://cartoonstockart.com/featured/the-two-things-that-rea...
I do value the inconvenience. When I put an album on, I put an album on. I don't hit next, random, go wandering off down rabbitholes. I put the album on.
And I do see the cost as a feature, somewhat. It feels like I got something for my money, in a way that paying for a zip doesn't.