> They should feel stupid, for obvious reasons.
I assume there's a "not" missing from that statement, but, just like the next example,
> The ones who have a vague idea will wonder if they need to know more, or if they can skate by.
I'm here to learn. If fully understanding XYZ was a prerequisite for getting anything out of the talk, then either the presenter failed to communicate that ahead of time (their fault), or I chose to ignore, or completely missed, the prereq (my fault; I'm objectively stupid, and the presenter is not to blame). Which leads to the next example.
> Half the ones who are confident that they know what XYZ is actually have it a little wrong, but won’t realize that they have a different conception than the speaker intended.
If the presenter asked without going on to clarify their POV on XYZ, how that specific POV is crucial to the rest of the presentation, and how it might differ a bit from the common one, why did they ask the question at all? The presenter is the stupid one here.
> In less time than it took to poll and embarrass the audience, you’ve:
> Complimented your listeners about already knowing what XYZ is.
The ones who have no idea what XYZ is now feel extra stupid.
> Defined XYZ, so those that didn’t know now do.
A single sentence has never fully explained anything. The no-ideaers are wondering what it means to mumble, what a frabbitz is, and why it might be important to optimize the process in systems with more than three quuxers (which they can only assume has nothing to do with how it sounds).
> Explained your view of XYZ, so that even those that did know it now have your take on it as a common starting point for the rest of the talk.
In an IT crowd, it's guaranteed there's something about that brief description that they strongly object to, but they likely won't say anything.
Absolutely, but I get the feeling you're taking the common statement "learning C can be a great experience" and bastardizing it into a straw man, "you NEED to learn C", which you then argue against.
I just think it's an obvious statement, and it reads like you're being nagged by people to learn C, don't want to, and are defending your position. For me, nothing has taught me as much about development as a weekend hacking in C and stepping through the code with Valgrind or reading the binary output. I recommend all my peers do the same; most don't, and that's okay. But I think it's a shame that people in general don't seem to care that much outside of work.
I did enjoy the read though, even if clickbaity.
This might just be because C++ broke my brain into assuming:
    Object foo = other_object;
is a copy operation, and taking a reference or pointer would require extra characters (i.e., copy by default). Most other languages are the opposite: assignment creates a reference by default, and making a copy requires extra characters (e.g. .copy() in Python or .clone() in Java). That's my biggest mental adjustment when moving back and forth between C++ and Python.
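A minimal sketch of the C++ side of that contrast (std::vector standing in for Object; the names are made up for illustration):

    #include <cassert>
    #include <vector>

    int main() {
        std::vector<int> other_object{1, 2, 3};

        // Plain assignment copies: foo is an independent object.
        std::vector<int> foo = other_object;
        foo.push_back(4);
        assert(other_object.size() == 3);  // the original is untouched

        // Aliasing the same object takes extra characters (& here, or * for a pointer).
        std::vector<int>& ref = other_object;
        ref.push_back(4);
        assert(other_object.size() == 4);  // now the original changes too
    }

In Python, foo = other_object would just give you a second name for the same list; you'd need other_object.copy() (or copy.deepcopy) to get the C++ behavior.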