In this case the "automated test" flipped all kinds of configuration options with repeated reboots of a physical workstation. The tests took hours to run, and your workstation would be constantly rebooting, so you wouldn't accomplish anything else for the rest of the day. It was faster and cheaper to require 8 devs to roll back to yesterday's build maybe once every couple of quarters than to snarl the whole development process with that.
The tests still ran, but they were owned and run by a dedicated test engineer prior to merging the branch up.
You can achieve better insulation and thermal mass above ground, while also getting better ventilation and less humidity.
Up the chain to automated test machines, right?
"What Does Nevada’s $35 Billion Fund Manager Do All Day? Nothing"
This is the actual title of the article, the page, and the printed version. There was no reason to edit it except for optics. If true, that's absurd, dang.
Compared to the usual suspects like the Apollo Guidance Computer or the TI-83, it has a higher clock speed but a shorter word length and extremely limited RAM and program memory. That's precisely the cost engineering: this is meant to run simple, limited-size programs.
> Honestly, this is where I'd love RISC-V to be making inroads
A 32-bit CPU is still way too big for this price point.