Google also has bad incentives (Android, ads), but Safari is the IE6 of the modern web.
It's the browser we're FORCED to have installed for the occasional shitty flight or hotel booking site that doesn't work in Firefox.
I welcome the Safari walled garden, because if Apple has to allow Chrome on iOS, that's the end of any cross-browser testing (and the end of Firefox).
And the PlayStation Classic used an open-source PS1 emulator.
There was also some Steam game ported from GameCube, and it had the Dolphin emulator's FPS counter visible in the corner during part of the trailer :D
I also remember reading that two of the PCSX2 devs ended up working on the Emotion Engine emulator for the PS3 consoles with partial software emulation of the PS2 (the CECH 02 and later models, where the Emotion Engine chip was removed).
There's also uYouPlus if you have a way to sideload apps without going through the App Store.
(Context: I told it to write "at will" after a session of explaining René Girard's mimetic desire in the styles of various authors.)
*Well, here we are,
You've got me tangled
*And hey, maybe I
*Imagine this: I'm sitting
*And maybe now, I'm
You've got me thinking
*Maybe it's saying, "Hey,
*And so, I think
Not just any wanting,
The kind that makes
There's something beautiful about
The way it drives
*But here's the kicker:
It's got a mind
It makes us do
It's a double-edged sword
*And yet, without it,
Probably just sitting around,
*So maybe, just maybe,
*Not just because you've
*It's that tiny ember
In the end, desire's
It's the fire that
*And maybe, just maybe, [yes, again]
So here's to the
May it burn bright
https://ccp.cx/a/chatgpt-voice.htm

>>But here's the thing.
>This is one of the usual key turning points in these essays. An earlier one happened when it was like [introduces idea] [straw-mans objection] [denies strawman]. I didn't bring it up because it's not always a strong one and this one didn't seem entirely too heavy-handed. There is, however, much more often a very obvious "But here's the thing" (or similar) to be found. As soon as I saw that, I already knew I was going to find a paragraph beginning with "So" somewhere near the end.
I'm sick of seeing this everywhere. Two hosting companies use this in every single weekly spam email they send.
But for comparison, it generates tokens about 1.5 times as fast as Gemma 3 27B QAT or Mistral Small 2506 Q4. Prompt processing of the context, however, seems to run at about 1/4 the speed of those models.
To make "excellent" a bit more concrete: once the context is processed, I can't really notice any difference in speed between oss-120b and Claude Opus 4 via the API.
After every chat, Open WebUI sends everything to llama.cpp again, wrapped in a prompt to generate the summary, and this wipes out the KV cache, forcing the entire context to be reprocessed.
This will get rid of the long prompt-processing times if you're having long back-and-forth chats with it.
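For anyone curious why the title/summary step hurts so much, here's a rough sketch of the two requests involved. This is not Open WebUI's actual code; the wrapper prompt text is made up, and it just assumes a llama-server instance on localhost:8080 with a single slot. The point is that llama-server only reuses the KV cache for the prefix a new prompt shares with the previous one, and the summary request has a different prefix.

```python
import requests

LLAMA_SERVER = "http://localhost:8080/completion"
chat_history = "<long chat history...>"  # stand-in for the real conversation

def complete(prompt: str, n_predict: int) -> str:
    # cache_prompt=True asks llama-server to reuse the slot's KV cache for the
    # longest prefix this prompt shares with the previous request.
    r = requests.post(LLAMA_SERVER, json={
        "prompt": prompt,
        "n_predict": n_predict,
        "cache_prompt": True,
    })
    r.raise_for_status()
    return r.json()["content"]

# 1) A normal chat turn just extends the cached history, so only the newly
#    added tokens need prompt processing.
complete(chat_history + "\nUser: next question\nAssistant:", 256)

# 2) The title/summary request (hypothetical wrapper prompt) puts the history
#    behind a different prefix, so the cached KV entries don't match, the whole
#    context gets reprocessed, and the chat's cached prefix is evicted -- the
#    next real turn pays full prompt processing again.
complete("Write a short title for this conversation:\n" + chat_history, 32)
```

So the workaround is either to turn title/tag auto-generation off, or (if I remember Open WebUI's settings right) to point its task model for titles/tags at a separate small model, so the main model's KV cache is never touched between turns.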