Grew up in NSW for 25 years. Nothing has changed. A few extra toll roads.
As a user, you have no way to see whether a photo has been "scanned" by the smart features, or what was detected (e.g. found person X, found a dog, blue sky, beach, etc.).
The Trips feature: has its algorithm finished scanning your library? You have no idea; it's just hidden.
Face detection: has it completely scanned your library? You don't know. For photos that don't seem to have any faces detected, was the scan done and nothing found, did it fail, or has it not run yet?
The list is nearly endless. But it's in line with the general direction of macOS: getting worse.
Time will tell but there's evidence that some government staff grew inexplicably wealthy while in office which would suggest corruption. Corruption in government is terrible for the average citizen, ask anyone from a country that suffers from a lot of it.
I really fail to see why auditing government spending is a bad thing?
I am not a Trump voter. I agree with the outcome they have stated - reduce stupid spending - but I have no idea whether that's the true motivation or the true goal, and I disagree with the manner in which they are doing it. Just because you agree with the dictator doesn't make it right.
It’s laughable.
I found Turkish Airlines staff to be more stand-offish than others when I flew with them internationally a few years back. Has anybody else had a similar experience with them?
I've also gone the Zigbee2MQTT (Z2M) route instead of the ZHA integration built into Home Assistant, as it supports a lot more devices, and supports them a lot better in my experience.
Once you're in the open world of Zigbee, you can also go the Ikea Tradfri route as well for bulbs that are a WHOLE lot cheaper.
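For anyone trying the Z2M route: a minimal `configuration.yaml` sketch looks something like the following (the adapter path and broker address are placeholders for your own setup, not anything from the comment above):

```yaml
# Minimal Zigbee2MQTT configuration.yaml sketch -- values are placeholders.
homeassistant: true          # enable Home Assistant MQTT discovery
permit_join: false           # only enable while actively pairing devices
mqtt:
  base_topic: zigbee2mqtt
  server: mqtt://localhost:1883
serial:
  port: /dev/ttyUSB0         # path to your Zigbee coordinator stick
```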
Make it as easy to run as something like datasette.
I quickly ruled out using database/sql drivers, as the indirection through interface types added a bunch of overhead and stymied my attempts at a reasonable memory layout. For my use case, I found the crawshaw driver performed the best, but I ended up forking it, as well as the Golang standard library CSV parser, because I found defensive copying & allocation was the largest bottleneck. I ended up cycling several very large arenas between a CSV parser thread that filled the arena with column bytes and several threads writing to different temporary SQLite databases. Then at the end I ATTACHed them together and copied them into one big file (idk exactly why this is faster, but my profiles showed most CPU time spent in SQLite doing query-binding work, so MOAR CORES).
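The shard-then-ATTACH trick is plain SQLite and easy to try out. Here's a hypothetical minimal sketch using Python's stdlib sqlite3 (not the author's Go code; table and file names are made up) showing the merge step:

```python
import os
import sqlite3
import tempfile

# Fill several "shard" SQLite files, standing in for the temporary
# databases that the writer threads would produce in parallel.
tmpdir = tempfile.mkdtemp()
shards = []
for i in range(3):
    path = os.path.join(tmpdir, f"shard{i}.db")
    con = sqlite3.connect(path)
    con.execute("CREATE TABLE rows (col TEXT)")
    con.executemany(
        "INSERT INTO rows VALUES (?)",
        [(f"row-{i}-{j}",) for j in range(100)],
    )
    con.commit()
    con.close()
    shards.append(path)

# isolation_level=None keeps the connection in autocommit mode,
# since ATTACH/DETACH cannot run inside an open transaction.
dest = sqlite3.connect(os.path.join(tmpdir, "merged.db"), isolation_level=None)
dest.execute("CREATE TABLE rows (col TEXT)")
for i, path in enumerate(shards):
    dest.execute(f"ATTACH DATABASE ? AS shard{i}", (path,))
    dest.execute(f"INSERT INTO rows SELECT col FROM shard{i}.rows")
    dest.execute(f"DETACH DATABASE shard{i}")

total = dest.execute("SELECT COUNT(*) FROM rows").fetchone()[0]
print(total)  # 300
```

One caveat: SQLite limits the number of simultaneously attached databases (SQLITE_MAX_ATTACHED, 10 by default), so with many shards you'd attach and detach them in batches as above rather than all at once.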
One notable optimization was exposing a way to bind borrowed bytes to query parameters without inducing a copy in either the Golang caller code or the SQLite library code. The upstream crawshaw driver only exposes sqlite_bind_blob with SQLITE_TRANSIENT mode, which tells SQLite to copy the input to a private allocation before returning from the sqlite_bind* call. I added a version that passes SQLITE_STATIC, which means "trust me, I won't touch these bytes until the query is done, and I'll free them afterwards". This is safe in Rust, whose "borrow" and "lifetime" concepts model this perfectly, but I guess in Golang it's dicey enough not to expose in a public package.
Here's the relevant commit in my fork: https://github.com/crawshaw/sqlite/commit/82ad4f03528e8fdc6a...
I'm curious how OP's https://github.com/cvilsmeier/sqinn would fare. I'm somewhat sus about piping 200GB through stdin, but the benchmark results are pretty good, so ¯\_(ツ)_/¯