I use AI for different things, though, including proofreading posts on political topics. I have run into situations where ChatGPT just freezes and refuses. Example: discussing the recent rape case involving a 12-year-old in Austria. I assume its guardrails detect "sex + kid" and give a hard "no" regardless of the actual context or content.
That is unacceptable.
That's like your word processor refusing to let you write about sensitive topics. It's a tool; it doesn't get to make that choice.
But also, yes, it's a pain in the ass and a frustrating kind of necessary evil. So there is room for improvement.
Next.js is a living hell. The ironic thing is that AI makes it dramatically more tolerable, to the point that it's actually pretty good. But that can't be a good thing in terms of architectural design, can it? We're in odd times.
Of course, it's easy to be a hater on the sidelines, and I am guilty of that. Next.js likely just does too much in its own made-from-scratch clever way. The "use client" vs. server component split is just out-of-the-gate ridiculous. But then I suppose the real question is, "Well, if you want an isomorphic environment, how else are you going to do it?" My answer is "I wouldn't!", but Vercel and the powers that be seem to have won the mindshare of the product crowd. At least for now.
Both https://typesense.org/ and https://duckdb.org/ (with its spatial extension) are excellent performance-wise for geo queries; the latter now seems really production-ready, especially when the data doesn't change that often. Both are fully open source, including clustered/sharded setups.
No affiliation at all, just a really happy camper.
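For anyone curious what the DuckDB side looks like, here's a minimal sketch of a radius query with the spatial extension. The "places" table, the coordinates, and the use of ST_Distance_Sphere are illustrative assumptions on my part; check the extension docs for the exact function set in your version.

    # Minimal sketch: radius query with DuckDB's spatial extension (Python API).
    # The "places" table and coordinates are made up for illustration.
    import duckdb

    con = duckdb.connect()            # in-memory database
    con.execute("INSTALL spatial;")   # one-time download of the extension
    con.execute("LOAD spatial;")

    con.execute("""
        CREATE TABLE places AS
        SELECT * FROM (VALUES
            ('Cafe A', 16.3738, 48.2082),
            ('Cafe B', 16.4420, 48.1900)
        ) AS t(name, lon, lat);
    """)

    # Everything within ~2 km of a reference point.
    # ST_Point takes (x, y) = (lon, lat); ST_Distance_Sphere returns meters.
    rows = con.execute("""
        SELECT name,
               ST_Distance_Sphere(ST_Point(lon, lat), ST_Point(16.37, 48.21)) AS dist_m
        FROM places
        WHERE ST_Distance_Sphere(ST_Point(lon, lat), ST_Point(16.37, 48.21)) < 2000
        ORDER BY dist_m;
    """).fetchall()
    print(rows)

Typesense's equivalent, if I remember the docs right, is a geopoint field plus a filter_by clause like location:(48.21, 16.37, 2 km) on the search request.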
Today, there's almost a national, societal resignation of "you have no privacy, get over it." I wish that weren't the case, and I'd like to see more of our representatives once again embrace privacy as the basic right it should be.