My knowledge gets outdated: APIs change, and the topic of the hour is a different one.
Skills, and especially abstract skills, don't get outdated as fast:
Writing CUDA kernels is surprisingly like the stuff we did in my first-ever C class 25 years ago, and I am still reading docs the same way my teacher taught me in 7th grade.
The more you look, the more things are the same.
All that is done was done and has been done before; there is nothing new under the sun.
It is similar to programming or software in that it involves both skills and knowledge, but the knowledge does not quickly become obsolete. If you think about it, just about every activity involves both knowledge and skill. There are also different depths of knowledge: the knowledge required to be functional in a field may not be the same level of knowledge required to advance that field.
I think there are skill and knowledge components to math—skill at manipulating equations, knowledge of theorems and identities. These play into each other, of course.
Learning a specific technology for a single project may have a short half-life. However, good coders aren't defined by tech knowledge but by their deep understanding. If you can make great presentations in PowerPoint, everyone knows you'll still make good presentations in Google Slides.
Exactly. It's like the difference between thinking about tech proficiency in terms of "being good at C++" vs. being good at software engineering and being language agnostic.
I am not sure about that example. A lot of people learn to use a specific piece of software and memorise how to do stuff and are therefore confused by the slightest difference.
Do people really believe that knowledge from more than 30 months ago has no value? Even the people doing keyword searches on resumes are smarter than that.
I started doing NLP in 2014. First I was using SVMs and feature vectors, then word embeddings, then handcrafted neural network models, then fine-tuning transformer encoders, then working with LLMs. In that time I worked with a huge number of technologies, libraries, and algorithms. A hiring manager recently asked me what my experience with AI agents is, and I had to say that it's basically zero.
Okay, he was obviously very new to the field and had no idea, but it illustrates how the field has progressed in the past 10 years: a person who is just joining has a very similar starting line to the old-timers. The breadth of knowledge I have is of course extremely useful, and I am able to pick up new concepts really fast, as there are many similarities. But the market in general does not care that much, really.
They are talking about the half-life, so it should never become valueless, just drop exponentially.
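For a feel of what a 30-month half-life implies, here's a minimal sketch in Python, assuming plain exponential decay (the decay model is my assumption; the article doesn't spell one out):

    # Fraction of knowledge value remaining under a 30-month half-life,
    # assuming simple exponential decay (illustrative model, not the
    # article's stated methodology).
    HALF_LIFE_MONTHS = 30

    def remaining_fraction(months: float) -> float:
        """Fraction of the original value left after `months`."""
        return 0.5 ** (months / HALF_LIFE_MONTHS)

    for years in (1, 2.5, 5, 10):
        print(f"{years:>4} years: {remaining_fraction(years * 12):.0%}")
    # -> 1 year: 76%, 2.5 years: 50%, 5 years: 25%, 10 years: 6%

Even after a decade you'd still be at roughly 6% of the original value, not zero.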
Also, skills and knowledge are different things, right? I'd believe that half the skills picked up in a fast-growing field are obsolete after a couple years.
Beyond that, the essay is a rambling mishmash of ideas and unsourced assertions with no real point to it.
The article compares its value to that of a Nokia flip phone. Nokia flip phones, while not as valuable as an iPhone, aren't worthless. They can still fetch something on the open market.
The Lindy effect can be a useful heuristic: something invented 30 months ago probably has less long-term value than something that was invented 10 years ago and is still being used.
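Concretely, the usual formalization of Lindy for non-perishables is that expected remaining lifespan is proportional to observed age. A toy sketch (the proportionality constant k = 1 is my assumption, not a fitted value):

    # Lindy heuristic: expected remaining lifespan scales with observed age.
    # k = 1 is an illustrative assumption.
    def lindy_expected_remaining_years(age_years: float, k: float = 1.0) -> float:
        return k * age_years

    print(lindy_expected_remaining_years(10.0))  # 10-year-old tech: expect ~10 more years
    print(lindy_expected_remaining_years(2.5))   # 30-month-old tech: expect ~2.5 more years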
What is "half of what you learn"? Frankly, I think that people underrate the amount of learning that goes into even the smallest things in software development. I think about system utilities: bash/zsh, git, vim, tmux, make, ssh, rsync, docker, LSPs, grep -- all of these are useful and have been useful for a decade or more. C, C++, Java, Python -- all languages which have been useful and will continue to be useful; languages like Go and Rust are really exceptions, not the rule, and even when new ones come onto the scene, by and large languages stay the same more than they change. Things like threading and concurrency, how to manage mutexes. Or background information about the Linux kernel, how it works, how paging, processes, system-calls and the like work underneath the hood. Of course, architectural information is essential if you're doing anything performant, and the minimum time for things to change in that space is the 3 years hardware development cycle, and more practically 5 years or more. Even with GPUs: many things have changed, but practically if you learned CUDA 5 years ago you'd still be doing great today.
>Consider the mathematics of futility: a company investing $2m in training its technical team today can expect half that knowledge to depreciate faster than a luxury car.
It's not like companies are molting their stacks every two years or something. If you are hired today, your skills and knowledge will evolve as the company evolves.
I tried and failed to find some kind of concrete methodology that they used to get to the number 30 months. I'm still waiting for quadratic algebra to make my knowledge of linear algebra obsolete.
The things I've learned about the JVM in the 2000s are still mostly true, perhaps with a bit of tweaking.
The things I've learned about process, project management, some distilled concepts around refactoring, testing - all still very valuable and as true as when I learned them. Perhaps not the specific tools, but the concepts are valuable.
Learning C decades ago still has lots of value. Not to mention SQL - come on.
Learn that cool JS tech stack a few years back? Yeah, it's probably dead or radically changed. That integration with Company X? Same.
So clearly there's some distinction to be made here. People are still programming in FORTRAN in some niches. You can decide to invest in boring and stable approaches, or live on the bleeding edge relying on someone's weekend vibe code session.
"courtesy of Harvard Business Review" - there's your problem. Don't look to some MBAs to give you nuanced tech insight. The author of this article: "Harald Agterhuis" is just some recruiter. Of course he's got an incentive to push this BS.
It seems to be relevant to hiring and startups, that is, if you hire people for particular tech skills, or build your startup around some specific area of expertise, half of these skills will be irrelevant in 2.5 years.
On a personal level, the fundamentals will be useful for your entire career, and the more you know, the faster you will be able to get the skill-du-jour. But the idea is that on your résumé, expect to change half of the lines in your "skills" section every 2.5 years, even if it takes you no more than a few hours to add these lines.
It also brushes aside tech in industries like defense, aviation, and assembly lines, where you have big, expensive machines, certifications, and projects that span decades. I wouldn't be surprised to find some Fortran code somewhere in the foundries that build the latest AI chips, as EUV lithography literally took decades of R&D before it went to production.