Early on, when I was doing iOS development, I learned that "m34" was the magic trick for giving a flipping view a nice perspective effect. I didn't know what "m34" actually meant, but I definitely knew what the effect of the line of code that mutated it was...
Googling it now, that seems to have been a common experience for early iOS developers :)
https://stackoverflow.com/questions/14261180/need-better-and...
https://stackoverflow.com/questions/3881446/meaning-of-m34-o...
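For anyone who never dug into it: m34 is the entry of the 4x4 CATransform3D matrix that maps z-depth into the homogeneous w coordinate, which is what turns a rotation into actual perspective instead of a flat squash. A minimal sketch in Swift (the function name and the usual -1/500 constant are just illustrative, not anything from those answers):

    import UIKit

    // The "m34 trick": give a y-axis rotation a perspective effect.
    func applyPerspectiveFlip(to view: UIView, angle: CGFloat = .pi / 3) {
        var transform = CATransform3DIdentity
        // The magic line: a small negative 1/z value. Larger magnitudes
        // exaggerate the foreshortening.
        transform.m34 = -1.0 / 500.0
        // With m34 set, the far edge shrinks and the near edge grows as
        // the view rotates, instead of the layer just getting thinner.
        view.layer.transform = CATransform3DRotate(transform, angle, 0, 1, 0)
    }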
There isn't one?
Oh, maybe that's why people who didn't already know or care about emdashes are very alert to their presence.
If you have to do something very exotic with keypresses, copy-paste from a tool, or build your own macro to get something like an emdash, it's going to stand out, even if the character itself is an integral part of standard operating systems.
Give me a company phone or you don't get this rule. I'm not using my phone for work if I can't have it out during work.
I use it 99% for work-related things during work, though, with the 1% being happy birthday texts or something similar.
And furthermore, aren't there shells that will show you the available flags (essentially the --help output) if you try to tab-complete certain commands? Obviously there's the issue of a lack of standardization in how command-line switches work, but broadly speaking it's not difficult to maintain a list of common (or even uncommon) commands and how their args work.
(spends a few minutes researching...)
This project evidently exists, and I think it's even fairly well supported in e.g. Debian-based systems: https://github.com/scop/bash-completion.
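In spirit it's just a lookup table from commands to known switches plus a prefix match. A toy sketch of the idea (in Swift only to match the snippet above; the table entries are a couple of real GNU grep/tar long options, while the actual bash-completion project does this with per-command shell functions registered via bash's `complete` builtin):

    // Toy illustration of flag completion: a hand-maintained table of
    // commands and their switches, plus a prefix lookup. This is the
    // gist of it, minus all the shell plumbing.
    let knownFlags: [String: [String]] = [
        "grep": ["--ignore-case", "--invert-match", "--line-number", "--recursive"],
        "tar":  ["--create", "--extract", "--file", "--gzip", "--list"]
    ]

    func completions(for command: String, partialFlag: String) -> [String] {
        (knownFlags[command] ?? []).filter { $0.hasPrefix(partialFlag) }
    }

    // completions(for: "grep", partialFlag: "--i")
    //   -> ["--ignore-case", "--invert-match"]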
> but there is no invitation to guess, and no one pretends you don’t need the manual
Which is basically what you're saying too? The problem with voice UIs and some LLM tools is that it's unclear which options and tools exist, and there's no documentation for them.
1. Checking the current temp or weather
2. Setting an alarm, timer, or reminder
3. Skipping a music track or stopping the music altogether roughly 3 seconds after hearing the command, or 1 second after you assume it didn't work
<end of list>
For example, I say "play comically long album title by artist on Spotify"; it thinks about that for five seconds, does the bing noise, then says "playing comically long album title [special remastered edition] by artist on Spotify", and then a few seconds later starts playing the album. If you don't wait through that whole thing, it will just decide that actually you didn't want to hear the album.
Not to mention the likely need for continuous internet connectivity and service upkeep. Car companies aren't exactly known for good software governance.
I don't own a car, but I rent one occasionally on vacation. In every one I've rented that I can remember since they started having the big touch screens that connect to your phone, the voice button on the steering wheel would just launch Siri (via CarPlay), which seems optimal: just have the phone software deal with it, because the car companies are bad at software.
It seems to work fine for changing music when there's no passenger to do it, subject only to the usual limitations of Siri sucking, but I don't expect a car company to do better. Honestly, the worst case I can remember with music is that it played the title track of an album rather than the album, which is admittedly ambiguous. Now I just say explicitly "play the album 'foo' by 'bar' on Spotify" and it works. It's definitely a lot safer than fumbling around with the touchscreen (and Spotify's CarPlay app is very limited for browsing anyway, for safety I assume, but then my partner can't browse music either, which would be fine) or trying to juggle CDs back in the day.
Has anyone ever seen a manager mentoring ICs? I haven't. This is a senior/staff/principal responsibility.