The problem is that people don't necessarily bother to form a cognitive compression of a large topic until they really have to. That's because they already carry other large cognitive burdens with them, so they (we!) tend to resist adding new ones. If you can rely on someone else knowing some topic X well, you might just do that and never bother getting to know topic X well enough yourself. For those who know topic X well, the best way to reduce the demand for help is to help others understand a minimal amount of topic X.
> So, bash is a programming language, right? But it's one of the weirdest programming languages that I work with.
Yes, `set -e` is broken. The need to quote everything (default splitting on `$IFS`) is broken. Globbing should be something one has to explicitly ask for -- sure, on the command line that would be annoying, but in scripts it's a different story, and there your only recourse is to disable globbing globally (`set -f`), after which globbing where you *do* want it gets hard. Lots of bad defaults like that.
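For anyone who hasn't hit these yet, a minimal runnable sketch of the defaults in question (the script is purely illustrative):

```bash
#!/usr/bin/env bash
set -e  # supposedly "exit on error"...

f() {
  false              # would normally abort the script,
  echo "still here"  # but errexit is suspended inside an `if` test,
}                    # so this prints anyway

if f; then :; fi

dir="My Documents"
printf '<%s>\n' $dir    # unquoted: $IFS splitting yields <My> and <Documents>
printf '<%s>\n' "$dir"  # quoted: the single <My Documents> you meant

set -f        # the only way to turn globbing off is globally...
echo *.txt    # ...and now this prints the literal string "*.txt"
```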
It's not just Bash, but also Ksh, and really, all the shells with the Bourne shell in their cultural or actual lineage.
As for SQL, yes, lots of people want the order of clauses to be redone. There's no reason it couldn't be -- I think it'd be a relatively small change to existing SQL parsers to allow the clauses to come in a different order. I don't have this particular cognitive problem myself, and I think it's because I've learned to look at the table sources first, but I'm not sure.
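Concretely, the complaint is that you must write the select list before the engine (or the reader) knows what it's selecting from. A sketch of the reordering people ask for, with invented table and column names; the second form is hypothetical syntax, though DuckDB's FROM-first variant is in the same spirit:

```sql
-- Standard SQL: the select list comes first, even though the engine
-- logically evaluates FROM and WHERE before it.
SELECT name, total
FROM orders
WHERE total > 100;

-- Reordered to match evaluation order (hypothetical syntax):
FROM orders
WHERE total > 100
SELECT name, total;
```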
We should peel off SQL and get access to the underlying layers.
Tools that do this make things clearer almost immediately. Consider the developer tools in a web browser. Do you remember the "dark ages" before such things existed? It was awful because you had to guess instead of seeing what was going on.
Tools like Wireshark show you every last byte of the network packets they have access to AND parse them to help you see the structure. This isn't just for debugging network traffic; it's hugely beneficial in teaching networking concepts because nothing is hidden.
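Wireshark's command-line sibling `tshark` makes the point compactly: one flag for the parsed structure, one for the raw bytes (`capture.pcap` here is just a placeholder file):

```bash
# -V prints the full protocol dissection tree for each packet;
# -x adds a hex/ASCII dump of every byte alongside it.
tshark -r capture.pcap -V -x
```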
This is also one of my favorite things about open source software. I can view the source to understand what's causing a bug, to fill in knowledge gaps left by the documentation, or just to learn more about programming concepts. Nothing is hidden.
We are visualizing things in our heads already. And any explanation of anything in computing is a diagram. But we have zero diagrams when coding.
Just dynamically instrument all code to send messages to a GUI.
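A minimal sketch of what that could look like in Python, using the standard `sys.settrace` hook; `emit()` is a hypothetical stand-in for whatever channel would feed the GUI:

```python
import sys

def emit(message: str) -> None:
    """Stand-in for the GUI channel (a socket, a queue, ...)."""
    print(message)

def tracer(frame, event, arg):
    # Report every function call and return as it happens.
    if event == "call":
        emit(f"-> {frame.f_code.co_name} {frame.f_locals}")
    elif event == "return":
        emit(f"<- {frame.f_code.co_name} returned {arg!r}")
    return tracer  # keep tracing inside this frame too

def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

sys.settrace(tracer)  # instrument everything from here on
fib(3)
sys.settrace(None)
```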
It's not a revolutionary killer feature for me, but I've always preferred having information in my peripheral vision instead of having to actively check a separate menu.
I have no problem with Apple bundling these apps and making them work seamlessly together, and I don't even mind that they're all updated simultaneously (except for Safari, which I wish I could update independently without relying on the "Technology Preview" beta channel). But I do have a problem with having to upgrade my entire OS and then disable the new bloatware features just because I want to keep auto-updates enabled. I used to delay updating and would end up way behind, which is why I enrolled in auto-updates. But now it feels like I'm being held hostage to their update schedule.
And for what benefit? There are hardly any useful OS-level changes in this release, but there are a bunch of new features I'll need to disable (while hoping the next auto-update doesn't break my external monitor), all powered by freshly written code that expands the attack surface. If I had my way, I'd take the OS updates and skip all the apps: keep the attack surface small while still meaningfully improving the core. I don't care about the rest.
Hierarchy seems more rigid and less general than tags, but when it works -- it works.
Hierarchy is easy in the physical world.
But what's crazy is that since the dawn of computing we've been able to store data however we want and project it however we want… and yet we still use hierarchy for file storage, as if we still just have a filing cabinet of manila folders.
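A toy sketch of the alternative, just to make it concrete (all names invented): tags instead of a single hierarchy, where a "folder" is nothing but a projection over the tag index.

```python
from collections import defaultdict

# tag label -> set of file paths carrying that tag
tags: dict[str, set[str]] = defaultdict(set)

def tag(path: str, *labels: str) -> None:
    for label in labels:
        tags[label].add(path)

def query(*labels: str) -> set[str]:
    """Files carrying ALL the given tags; a 'folder' is just this view."""
    sets = [tags[label] for label in labels]
    return set.intersection(*sets) if sets else set()

tag("taxes-2023.pdf", "taxes", "2023")
tag("taxes-2024.pdf", "taxes", "2024")
tag("beach.jpg", "photos", "2024")

print(query("taxes"))          # {'taxes-2023.pdf', 'taxes-2024.pdf'}
print(query("2024"))           # the same file can live in many "folders"
print(query("taxes", "2024"))  # {'taxes-2024.pdf'}
```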