I would encourage people to think even tinier than the items on their list. Calling some of these items tiny reeks of humblebragging.
At the start of my programming explorations, tiny would be something like learning how to create a file with an editor and how to save it. Or how to add 2 + 2 in bash. Not full-blown programs, but bricks that can be put together.
And now no longer as a beginner, there can still be tiny things on the same scale that I’m doing for the first time because it’s a new problem or a new language.
I wouldn't be so harsh on the author as to accuse them of humblebragging. What counts as 'tiny' depends entirely on where you are in a learning journey. What is 'tiny' for someone with 5 years of experience is 'huge' for someone new.
As shown by fairly regular posts on imposter syndrome, we can all identify our own failings, so we regularly think we don't know enough. We're mostly all seeking progress, so this is a tiny step, for her, based on where she was previously.
Also, re: imposter syndrome and seeking progress, I appreciate Dr Ben Hardy's concept of "Gain vs Gap". We tend to focus on the Gap in front of us (between where we are now and where we want to be), but it's important also to look back at the Gain (relative to where we started). The Gap will always be there, as our goals inevitably evolve, so focusing exclusively on it can be disheartening. The Gain, on the other hand, steadily increases according to our efforts. Using the OP's method of accumulating concrete evidence of increase in the Gain makes good sense to me.
>What is 'tiny' for someone with 5 years experience is 'huge' for someone new.
While true, this is exactly the boring conventional wisdom that I tried to steer attention away from with my above comment.
While it’s a blatantly obvious duh-level truism that people can take on larger challenges as they learn, I mean come on… it’s helpful to realize that there remain tiny things — tiny by almost anyone’s definition — even later in a career. I’m not just starting out and just last week I learned yet another way to add in bash, for instance.
One of the principles Evans is leveraging here but not really explicitly talking about is the power of time coupled with compounding growth. She's been working at this stuff for a decade now and continually building and learning. She's also been writing about what she learns which probably helps a great deal.
It might just be a case of Baader-Meinhof, but this is a theme that seems to be coming up more and more. James Clear's Atomic Habits book is built on the idea and Cal Newport (Deep Work) has been writing about it[1] under the name of slow productivity.
I think many of us would be better off if we took a longer term view of ourselves.
But if you don’t know much C (or even much about programming at all) you might not have the context to even know that. Which is related to the point of the article. Goals are hard because they require you to predict the future. Milestones are easier because they are backward-looking. But if you judge only by your current skill level, you may not even count those past things as milestones. So you need to consciously celebrate them.
I think programming culture is full of people trying to upsell themselves, and it gives everyone a warped perception of self-worth. I agree with this whole blog post, but it still itches me that the author has to frame his accomplishments as tiny when they seemed non-trivial and apparently took place over nine years. It’s like if you aren’t working on enough buzzwords, you must hedge your self-worth.
The author describes just the most basic of all advice for achieving goals: subdividing them into smaller goals. Depending on the nature of the top goal, it may, of course, even be advisable to have multiple levels of subgoals. ("learn C/C++" -> "write a tiny Linux kernel module" -> "learn about the Linux kernel" -> "learn about Linux device drivers" -> "learn about Linux device drivers for sound" -> ...)
I think having concrete and realistic goals is the key. You can probably never tell yourself that you completely finished learning C++. But you can say that you created a small C++ project that calculates the Fibonacci numbers, or that you read the next chapter of some C++ book.
Writing C without considering memory management is equivalent to writing conditionals, loops, functions, and primitive data structures, which is what anyone can do with a two-day Codecademy course.
[1]: https://www.newyorker.com/culture/office-space/its-time-to-e...
That's a crazy ambitious goal. I don't think I've met a person who wrote C code outside of work that was used by other people.
(actually her accomplishments)
I've met a bunch of people in my life who were smarter than me. Everything I do feels small compared to what they did.
On the other hand, I don't remember people who did stuff that was less impressive than mine. Only the ones who failed spectacularly.
But also:
> I’m comfortable writing very basic C programs as long as they don’t have to do anything fancy like “memory management”
So when do you consider that you've learned something? Aren't we always learning?