The argument goes that because we are intentionally constraining the model - I believe OAI’s method applies a softmax (I think; rusty on my ML math) to get tokens sorted by probability, then takes the first one that the current state machine accepts - we get less creativity.
Maybe, but a one-off vibes example is hardly proof. I still use structured output regularly.
Oh, and tool calling is almost certainly implemented atop structured output. After all, it’s forcing the model to respond with JSON conforming to a schema that describes the tool arguments. I struggle to believe that this is adequate for tool calling but inadequate for general-purpose use.
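As a sketch of what I mean by constrained decoding (the details of OAI’s actual implementation are an assumption on my part): softmax the logits, rank tokens by probability, then take the highest-ranked token the grammar’s state machine will accept.

```python
import math

def softmax(logits):
    # Standard numerically-stable softmax over raw logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def constrained_pick(logits, vocab, allowed):
    """Pick the most probable token that the current grammar state allows."""
    probs = softmax(logits)
    ranked = sorted(zip(vocab, probs), key=lambda t: -t[1])
    for tok, _p in ranked:
        if tok in allowed:
            return tok
    raise ValueError("grammar allows no token")  # shouldn't happen with a valid grammar

# Toy example: the JSON grammar state only allows '{' or '[' next.
vocab = ['The', '{', '[', '"']
logits = [3.0, 1.0, 2.0, 0.5]
print(constrained_pick(logits, vocab, {'{', '['}))  # '[' - highest-probability allowed token
```

Real implementations mask logits over a huge vocabulary rather than scanning a sorted list, but the effect is the same: the model never emits a token the schema forbids.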
- History, most likely
There are some subtle bits to the humor depending on how charitable you’re feeling. It might just be absurdist, as in “Blackbeard’s guide to astrobiology”, or it may be more mean-spirited, playing on a belief that she is not intelligent.
TL;DR - the joke formula is just:
subject=…
person_not_familiar_with_subject=…
joke=“${person_not_familiar_with_subject}’s guide to ${subject}”
And the amount of implied cruelty in the comparison is variable.
None of this is to say that PE firms squeezing vital hospitals aren't morally culpable. Just that there's a meaningful distinction between immoral decision-making and violence.
Consider that when speeding, you might cause an accident. Such an accident would most likely impact a small number of people other than yourself.
When a PE firm engages in extractive hospital management, it provably increases mortality rate, and it does so at scale.
The first choice carries a possible risk of lower magnitude; the second carries near-certain harm of higher magnitude.
“Risky behavior” vs “ruthless greed”, the latter feels much closer to violence.
Who knew at the time that they were creating games that would be disassembled, deconstructed, and reverse engineered? Do any of us think about that regarding any program we write?
I wonder if any sense that this is criticism (or any actual criticism) comes from SaaS implementers who have it so deeply ingrained that “haha, what if the users of this software did this really extreme thing” reads more like “oh shit, what if the users of this software did this really extreme thing”.
When I worked on Google Cloud Storage, I once shipped a feature that briefly broke single-shot uploads of more than 2 GB. I didn’t consider this use case because it seemed so absurd: anything larger than 2 MB is recommended to go through a resumable/retryable flow, not a one-shot that either sends everything correctly the first time or fails. Client libraries enforced this, but the APIs did not! It was an easy fix with that knowledge, but the lesson stuck with me: whatever extreme behaviors your API allows will be found, so you have to be very paranoid about what you allow if you don’t want to support it indefinitely (which we tried to do; it was hard).
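To illustrate the kind of guardrail the client libraries added but the raw API lacked (the threshold value and function name here are hypothetical, not the real GCS client API):

```python
# Hypothetical sketch of a client-library guardrail; the threshold and
# function name are illustrative assumptions, not the actual GCS client.
RESUMABLE_THRESHOLD = 2 * 1024 * 1024  # ~2 MB; the real cutoff may differ

def choose_upload_strategy(size_bytes: int) -> str:
    """Route large payloads to a resumable (chunked, retryable) flow."""
    if size_bytes > RESUMABLE_THRESHOLD:
        return "resumable"    # chunked upload that can retry failed parts
    return "single-shot"      # all-or-nothing: one request, success or failure

# The raw API had no such check, so a multi-GB single-shot upload was legal.
print(choose_upload_strategy(3 * 1024 ** 3))  # "resumable" in the client library
```

The point is that the check lived only client-side; anyone hitting the API directly could still attempt the extreme case.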
Anyway, in this case that level of paranoia would make no sense. The programmers of that age made amazing, highly choreographed programs that ran exactly as intended on the right hardware and timing.
Depending on how well we assume an LLM would do at this task, it’s an interesting way to see what “real people” would think about a very hypothetical situation.
On the other hand, no one today in any other part of the economy would decide to start a new business, at small scale, in a commodity market. Software startups rarely go up against Windows, Excel, Google Search, or Amazon e-commerce.
I grew up in a farming area in the Midwest, and even then (several decades ago) there was no realistic prospect of doing well, yet most (not all) farmers insisted on growing corn, soy, or one of a very few other commodity crops. I'm not surprised that this doesn't work out well; it doesn't work out in any non-agricultural sector either. Small businesses have to go into niche markets, and that is not a new phenomenon. I recall reading (and hearing) almost exactly these same complaints in the 1980s, straight from the farmers, but the idea of growing other crops just made them irritable.
And then, they would cheer the arrival of a big Wal-Mart in town, and go shop there instead of the small store they had been buying from.