I think this article hits on some truths and gets a handful of things wrong. First, they are correct that we are in a dynamic profession that requires constant learning and expanding. No doubt the people who choose to stay in software are likely to be people who are curious, life-long learners, for whom this is a benefit of the profession rather than a drawback. That said, one thing I noticed from teaching computer science while a graduate student is that the poor students think about languages and libraries as key skills, while the better students think about the data structures, algorithms, and design principles. These slightly more meta-level concepts are far more stable and timeless, while the underlying implementations of them are constantly evolving. If you think of each new library as "having to learn a new skill", I can imagine burnout and overload are more likely, but if you figure that once you know 3D graphics you can pretty easily pick up any new graphics engine, then it might even seem fun to explore different takes on the same old concepts. It's like hearing a new band in your favorite genre.
As for the idea that burnout is the core issue here, I would strongly disagree. When teaching undergrads I noticed immediately that a good portion of each student cohort was basically there because they were interested in making money in the future rather than exploring the ideas of computer science. They were no doubt going to get frustrated or bored and move into management or some other profession rather than continue to expand as an engineer. This is totally fine, and they are probably richer for having learned what they did, but I don't know why we can't just see this and appreciate it for what it is rather than portraying it as the drama of burnout.
It's hard to blame students when they look at job postings and all they see advertised are positions for programmers using languages X, Y, and Z, or when they see tweets and blog posts by hotshot programmers about frameworks A and B.
The entire industry focuses way too much on 'experience with tool X' as a proxy for 'technical skill Y'. It's a bit like asking a carpenter how many years of experience they have with DeWalt cordless nail guns rather than asking about their house framing skills.
Worse, industry is routinely bashing on CS degrees because they don't turn people into framework X-ready candidates. It's getting a little tiring just how little credit is given to the idea of "maybe these tools can be learned in a reasonable amount of time by people with a degree showing they can pick things up rather quickly".
> It's a bit like asking a carpenter how many years of experience they have with DeWalt cordless nail guns rather than ask about their house framing skills.
This kind of logic only works for tech organizations that already have enough in-house domain expertise to onboard new programmers. The other day somebody asked me how to find a programmer to implement something for them. From a programming standpoint there was very little to do, but it involved many obscure technologies that you couldn't pick up in a day (and no, you can't pick different technologies). For a person who's already done something similar it'd be a quick and easy job and shouldn't cost too much. With a generic programmer it'd take much longer, cost much more, and you couldn't be sure they'd actually deliver.
>The entire industry focuses way too much on 'experience with tool X' as a proxy for 'technical skill Y'
Strongly emphasizing this. This is HIGHLY applicable to the analytics environment. As a business analyst who specialized mostly in ad-hoc development, because it was the most value-add area at the companies I worked with, I had a lot of trouble finding new work because I didn't use Tableau, or Power BI, or Looker, etc. I was some sort of fool for doing everything in SQL, Excel, and Python.
IMO the tools are great, and you need a much shallower understanding of analytical concepts to get value from them. But for some reason people kept getting the impression I would somehow be less effective with them because I don't use them. And I had trouble correcting them with the limited bandwidth that exists to communicate with a single applicant in the hiring process. If I tried to get right to the point, I felt myself appearing arrogant.
The carpentry analogy is very similar to how I described it: "I am currently using a ruler and screwdriver, and these tools provide lasers and power drills."
It's interesting because, at least among the working professionals I've worked with, the ones who focus on "meta-level" concepts are usually the ones who overthink every detail, get very little work done, and are ultimately the ones who burn out.
They tend to bike-shed details, take way too long trying to create sophisticated abstractions that never quite deliver the silver bullet they were originally imagined to be, and spend too much time dwelling on irrelevant details. It ultimately leads nowhere and results in a kind of paralysis that can be very hard to break out of.
The ones who master a specific language or master a specific library/technology that focuses on doing a few things very well and in very concrete terms are able to deliver the most business value. Furthermore they absolutely have the ability to take their mastery of that and map it to new languages or new libraries. I mean I'm not talking about going from Java to Haskell or Haskell to Idris, but rather people who master C++ can fairly easily pick up Java or Python or TypeScript. People who have mastered Unity can easily pick up Unreal Engine. People who have mastered web development can easily pick up mobile development.
The idea that people with a solid mastery of a technology and a programming language are somehow stuck, unable to take those skills and apply them to other areas, is I think overstated and just untrue. But those who treat software engineering as highly theoretical, focus on abstractions and design principles, and get caught up in these high-level details tend not to get much done, and when they realize that software is not as clean and elegant as they would like it to be, they get burned out and give up.
I think going over any substantial codebase on GitHub for a product that is widely used and delivers solid business value validates my point of view: most of that code is not at all reflective of the ideals often espoused in blog posts.
In short, people who treat software as just a tool to accomplish a concrete task are more productive than those who write software for the sake of writing software. They don't write the cleanest code, or the most elegant data structures and algorithms, but they produce the greatest amount of tangible business value.
If your comment does anything for me, it's to show how terribly few words we have to discuss these things.
> "meta-level" concepts
I'd say having a strong grasp of what you can achieve with just files and folders, or understanding how SQL solves an entire problem space, are meta-level concepts. It's just that we take them for granted.
> business value
Is apparently something different than 'value', but still includes every piece of software that was ever valuable to a business?
> high level details
...?
> software engineering
Building a constraint solver for a compiler, or ensuring a JS animation centers a div?
> highly theoretical and focus on abstractions, design principles
I'd recognize all these things. But out of context 'in the general case' they become meaningless.
---
I understand the picture you are trying to paint, but I don't think it says anything beyond "I've noticed people make things overly complex". I agree with that.
However, keep in mind that the 'gets things done and provides value' software you've seen is the software that survived. It might have been set up by a very experienced person (whose failures we're not seeing), and nobody might recognize it as being not-simple (e.g. I've seen high-value business software partially recreate regex: it worked great as a straightforward, easy-to-read function of ~50 lines or so, but it could have been a single function call). And how the requirements are presented is hugely important.
I don't think anyone was writing about people who master a specific language.
There are a lot of people who only learn the surface of a tool without going deep, and who think they know enough.
To me it seems that someone who really goes deep into learning a language would get most of the theoretical stuff along the way, because there is no way to really master C++ or Java without learning about data structures and all kinds of "meta-level" concepts.
Maybe the difference is mostly in the approach to learning: more practical versus more theoretical.
I understand the concept here, but there is also an element of picking the right tool for the job.
Some guys see a screw and reach for their trusty hammer. Some guys know to grab a screwdriver.
I had a project the last two weeks where the code was just going to fail about as often as it was going to succeed. I had to write a resource manager and an Erlang style supervisor and use an embedded key value store.
A better dev might have intuited what took me basically a midstream rewrite to figure out; a worse developer might still be grinding on the problem.
I think my solution is "robust enough", but there was no real way to power through that. You either found the right abstractions or you didn't.
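For the curious, the supervisor part ended up being roughly this shape (a heavily simplified sketch with made-up names, not the actual code; the real thing also had to coordinate with the resource manager and the embedded key-value store):

    import time

    def supervise(task, max_restarts=5, backoff=2.0):
        """Run task(); if it crashes, restart it with exponential backoff."""
        failures = 0
        while failures < max_restarts:
            try:
                return task()              # happy path: the task finished cleanly
            except Exception as exc:       # it failed about as often as it succeeded
                failures += 1
                print(f"task died ({exc!r}), restart {failures}/{max_restarts}")
                time.sleep(backoff ** failures)
        raise RuntimeError("giving up after too many consecutive failures")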
Not to be rude, but did you finish reading the article? The whole point was that high-aptitude learners give up. In fact I don't agree that re-learning the same tasks over and over with the zillionth framework iteration is a rewarding learning experience. It makes perfect sense to change careers instead.
And as music goes, you sound like the record companies that thought everyone should listen to disco for the next 50 years...
As I’ve gotten older it’s harder for me to learn an entirely new area (like say going from web dev to mobile or ML). But it’s actually easier to learn a new variation of something (like the latest JS framework) because it’s usually pretty similar to one of the things I already know. I guess this leads to increasing specialisation, but it also means studies that merely count “new skills” will be misleading if they don’t differentiate in which way the skills are new.
I'm the complete opposite. Hand me a new JS framework that does the same thing I've done a million times, but where I have to learn its opinionated abstraction set that's somehow better, and I just turn off. I simply do not care, at all. You need to simply explain to me the improvement you're proposing or it might as well be trash to me.
Now give me a new theoretical concept where I can expand my knowledge or integrate into my knowledge map and view of the world and I'm excited, there aren't enough hours in the day. Tell me about this all new concept I wasn't familiar with--I'll start thinking of ways I can use it, how I can leverage it, or how it may connect with other ideas and concepts I have.
Now give me a tight deadline which most business environments create and I agree with you, give me the boring stuff I can pump out, get my paycheck and go home to enjoy the rest of my day.
> No doubt the people who choose to stay in software are likely to be people who are curious, life-long learners
The article showed the opposite effect though. Curious, life-long learners stop working in software development because they have to constantly learn new skills and believe they can get more bang for their buck when they can invest in skills that don’t lose their value over time.
I once got excited about ExtJS, the way it created a desktop-like experience in the browser, and I said to myself, "I will learn this, all of it, I will become an expert. Tips and tricks, best practices, the works".
After six months of this, ExtJS 4 came out, which was essentially a totally new framework. Everything I learned was not only not applicable, it had to be actively unlearned.
The lesson here is: become good and proficient at something, but don't focus on becoming a ninja in one particular transient tech. There is value in becoming a Jedi of the Unix built-in tools, or of more persistent technologies like Git, for example.
Also, this is a bigger problem in the JavaScript ecosystem, where the hype cycles are more intense than in, say, Python. I checked out my Flask project from seven years ago and it's ready to rock.
I get the thing about constant learning, but learning in this industry used to be cumulative. Now it's a hamster wheel. You are learning how to solve the same problems in a different, presumably "new", way.
People seem to be spending more time coming up with catchy names for their projects than making sure this is all sustainable.
Yes, this is how I feel. I have no problem learning a new skill. I get discouraged when I learn a new skill and just when I start to get really comfortable and productive with it, it's suddenly "legacy" and some new thing is popular.
The only skills that have really stood the test of time for me are C, PHP, unix shell stuff, and SQL.
It's a mix of both. You need to have solid fundamentals and need to keep learning new ways to apply those fundamentals in the real world. There is absolutely effort involved in learning a new language, library, framework, platform no matter how good you otherwise are.
> I noticed from teaching computer science while a graduate student is that the poor (a) students think about languages and libraries as key skills, while the (b) better students think about the data structures, algorithms, and design principles.
The truth is that the programmers in group (b) think about both. Who's designing a lot of the new languages, libraries, and frameworks? Chances are it was someone from group (a). If you're in group (b), do you want to spend your whole career being forced by your bosses to constantly relearn and follow the latest vogue vision from group (a)? Of course not. So this might not apply to students, but everyone in group (b) will eventually get burned by fads enough times that they start caring about the politics of software. Namely, not wanting to depend on bloat that doesn't actually solve computer science and systems engineering problems. Group (b) might even create alternatives themselves. Go is a great example. The guys who built the Fifth Bell System watched the vision behind their techniques decline over the decades and said, enough is enough. So they made Go, and it was like a ray of sunshine when it came out.
I actually find the model in the article pretty convincing, despite agreeing with you that it's a profession for people who like to learn new things and that university/grad school should teach more generic, theoretical knowledge that depreciates slower.
However, these still don't invalidate the main point of the article: that a faster rate of depreciation means that your maximum knowledge level, given your specific rate of learning, will be lower. I.e., your advantage over a less skilled, younger professional will be smaller.
And you may say that learning a new 3D library shouldn't be counted as learning a new skill, but it doesn't make the problem go away. If anything, it underlines it: if you have to start working with a new 3D library then you will have to spend time and effort on learning it (to become efficient at using it) while if you were able to keep using it, you could spend that time and effort on learning something that we could count as a new skill.
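To make that concrete, here is a back-of-the-envelope version of the argument as I read it (my own toy numbers and equations, not the article's exact model): if knowledge grows at a personal learning rate r and depreciates at a rate d, it plateaus around r/d, so a fast learner's long-run edge over a slow learner shrinks as d goes up.

    # Toy model (my assumption): dK/dt = r - d*K, which settles at K_max = r/d.
    fast, slow = 10.0, 5.0             # hypothetical learning rates
    for d in (0.05, 0.5):              # slow-churn field vs. fast-churn field
        edge = fast / d - slow / d     # fast learner's steady-state advantage
        print(f"depreciation {d}: advantage of {edge:.0f} units of skill")
    # d = 0.05 gives 100 units of advantage; d = 0.5 gives only 10.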
The article is also hitting on the fact that your skill premium as an engineer has a cap, and so does your willingness to burn the midnight oil on a project. This means that as time goes on, as an engineer you'll face the following
- A younger engineer will have the same value to your employer as you do.
- A younger engineer will work harder than you are willing to.
These two items are inevitable given the current rate of change in the industry. While some engineers will find next-level differentiated work to engage in, such as leading a core piece of infrastructure that defines the changing field... many will not. And if the rug gets pulled on that core piece of infrastructure... then it's often the case that those engineers are not particularly more skilled than anyone else on brand-new projects.
Well, that's not what the data is showing. Why create a narrative that doesn't explain what we observe? Smarter people leave the field earlier, and the author offers a compelling explanation why.
> When teaching undergrads I noticed immediately that a good portion of each student cohort was basically there because they were interested in making money in the future rather than exploring the ideas of computer science.
In my school, those who wanted to make money went straight to management or finance. Computer science was for the passionate ones and probably not the right path to make money for the brightest students.
> the poor students think about languages and libraries as key skills
well so do the recruiters, they’ll be fine
in fact, the better students are the ones wasting their time unless they prefer to be in academia, like you
so what metric are you really gauging for?
the “poor students” are pivoting for money and the name of the university to boost their employment prospects, maybe this shows in their academic performance and ability to understand, I saw the same in undergrad
> they are correct that we are in a dynamic profession that requires constant learning and expanding
Not true. I have met many developers who haven't learned anything new for 15+ years and are still doing just fine developing software. A lot of Java developers come to mind. They have pretty much done the same thing their whole career and have no need or desire to learn anything new.
Once you understand how the industry churns and burns and doesn't really give much credit to capability as you age, it becomes disheartening to keep wanting to be an IC. Most people see the writing on the wall for being an older IC, so they move into management or product or other roles.
A fascinating premise, and matches what I saw as an engineering director. The best just get bored and move on.
I think many teams are unaware how much extra value is possible by retaining existing employees vs hiring new ones. Each year I'd try to make sure I was "making them an offer they couldn't refuse" with new interesting challenges, new tech, plenty of personal research time, as much pay increase as I could possibly give, etc. A lot of engineering managers think that it's no big deal to just hire new staff, but even going from an average turnover of two years to three years is a massive improvement.
It's not just the best ones. If you remove people from the grind for 1-2 days a week, give them a small budget and enough autonomy to do what they want, most people will fix shit that bugged them for a long time.
The main problem is how micromanage-y current development processes are. Everything has to be a ticket/user story and has to be planned and approved by people who have never written a single line of code. Everything has to have some immediate business impact. I even see scrum teams measuring team utilization now; the target is 90% at the moment, and they wonder why productivity is down.
> If you remove people from the grind for 1-2 days a week
The modern office seems hellbent on killing every last bit of slack in their workers, then wondering why they leave or get burned out.
I realized the other day that a big part of my drive to move towards self-employment is really just a way to carve out time to take adequate care of myself. I have significant doubts that it is possible to continue to advance in tech to staff+ levels, be a good spouse, parent, and friend, and not run myself into the ground with physical/mental issues. And that is sad on multiple levels.
So I respond by easing up on advancing my career, because it gives back to me the least.
It gets worse, too - as long as I've worked as a software developer there's been some sort of time tracking system in place, and it has to be planned up-front, and has to work out to at least 40 hours (after they "negotiate" your estimates down). Which leaves no time for the unplanned stuff that inevitably comes up. This always goes in a cycle like this:
1. Management demands that every bit of work be associated with a ticket
2. devs just open tickets for the unplanned stuff so that it shows up in the ticket tracking system
3. management complains about devs opening "their own" tickets and prohibits self-opened tickets
4. devs do the unplanned (always "super high priority!") stuff without any ticket tracking and fall behind on their "planned" tickets (that nobody really cares about any more, but are still on their board)
1. management demands that every bit of work be associated with a ticket...
I've dreamed about a 20% policy like google had, except it's where you can work on anything, including code debt.
I've tried to stress to managers in the past that developers feel the pain of code debt. It makes us slower! Enable us to spend time sharpening our tools and managing our codebase.
One problem of course is, not all SWE can do this well. I wouldn't necessarily trust a junior hire to recognize and execute a proper refactor.
> give them a small budget and enough autonomy to do what they want, most people will fix shit that bugged them for a long time.
This has been huge for me at my current job. I saw some unused equipment in a lab and started asking questions why. Turns out the thing worked, but not great, so no one used it. What started as just fixing bugs and adding features became my own line item in the budget and requests for the (new and improved) equipment from other departments. It's something I look forward to working on.
That reminds me of the time I was a junior dev, and the team lead told me verbatim: "I know you are too busy to write tickets, but can you take some time off [this urgent thing] to do that? Thanks!"
This was after they encouraged a certain "cool culture" for a couple of months due to the lack of direction. It was pretty funny that I not only got micromanaged, but was told I had done the wrong thing, and was then asked to do a third job that was not my responsibility.
We have a lot of bugs everyone complains to me about and I have sufficient downtime to fix them* but I have to go through drawn out planning, UI, UX processes before I can even start. I just don't bother any more.
And yeah, it's definitely not just the best ones. I am mediocre and am so bored and so done with dev.
* the downtime is there because I am waiting for planning, UX, and UI for a different high priority task.
> Everything has to have some immediate business impact.
Or more specifically, explainable business impact.
But it's hard to explain how the code has become horrible and needs a refactor to make life easier on devs, reducing stress and reducing the likelihood of both bugs and developers leaving.
It may not be a matter of "the best". I have taken a personality test that had an item on it that covered product lifecycle. If 1 is initial conception, 2 is prototype, 3 is initial release, 4 is major enhancement, and 5 is maintenance, my personality is that I prefer 2 or 3. By 4 (major enhancement) I start to get bored, and by 5 (maintenance) I'm definitely bored.
It's not that I'm one of "the best" (though I like to think that I am). I have a personality clash with the later stages of product lifecycle.
Is that the premise? Seems to be saying that constantly changing skills exhausts developers to the point that it becomes more lucrative to work in another profession.
Although, I suppose learning new things just to tread water can be boring too.
This is how I read it too. That a fast learner's skill is degraded in a field that is constantly reset (software dev) vs one where you can stack knowledge. And thus the fast learners will eventually leave the software field and transition to one that doesn't reset constantly, so they can stand out more.
Doesn’t have to do with boredom so much as maximizing potential.
To be fair, almost all my managers were amazing, people who truly cared about their staff: at professional level as well as a personal level.
I've only had one absolute psychopath as a manager ... but I should thank him because he was the last straw and gave me enough courage (and anger) to leave AWS and start my journey as a solo entrepreneur.
Oddly enough I joined AWS not too long ago and while the job itself sucks, my manager is exceptionally awesome. In fact I only interviewed with AWS as kind of a half-joke, but he was so likeable that it convinced me to take it seriously. He has a healthy "fuck em" attitude when it comes to pressure from outside the team, so he's constantly protecting us in a multitude of ways (e.g. during on-call rotations or deadlines being imposed on us). He has yet to do a single thing that I would consider micro-management.
I think for some engineers (me) there's not much you can offer. I don't just want any new random tech challenge. It's unlikely your company or most companies have something I could truly be passionate about solving.
Literally every engineering director I've ever seen says exactly what you're saying. E.g., "as much pay increase as I could possibly": for my experience and growth, an employer I worked with offered a -6.6% increase in real salary for my time there. (≈3 years tenure.) Negative. The work was also … not what I'd like to be doing. So between "stay with company" and "find different company", it shouldn't be too hard to predict which option was more appealing.
My experience, unfortunately, is that good managers like you seem to be don't last long. They get replaced by manage-up sleazeballs who'll never, ever protect the people beneath them because it's not game-theoretically optimal.
The thing is, executives measure themselves by how quickly they get promoted into the next role, so no one cares that good management might reduce turnover in the next 2-3 years--in fact, the executive mindset is that it could just as easily increase turnover (what if we invest in their careers, and they leave?)
My philosophy as an engineering manager is to actually pursue this outcome. If I treat them badly they will leave. If I treat them well and train them into better engineers then they will leave. It's like being a college football coach. Having your star athlete be drafted to the NFL is entirely the point.
There is no shortage of SWEs. What I see, and what I have experienced first hand, is that companies, especially the more hyped/smaller ones, pretend to be FAANG and get very picky when interviewing. They often employ FAANG-style interviews.
Now, if I really have to spend that much time prepping to interview at your unprofitable company (that most likely will go under), don't you think that I would try my best to work at FAANG instead?
As a matter of fact, I was rejected at plenty of these small, insignificant companies, but ended up with L6 offers at FAANG.
Be humble and you will find plenty of good engineers out there.
I know tons of good SWEs who don't want to interview at or work for the mega-FAANGs, and if I were running a business I would definitely try to attract that talent by being different: offering a "normal and reasonable" interview process along with better perks, flexibility, and WFH.
Instead, they all want to run a bazillion microservices in k8s.
I definitely think "we do not do leetcode interviews" and maybe "we only have three interviews total" would be selling points on a job posting. People who are experienced in the field don't want to go through the same hoops that newbies do just to prove they know how to write basic algorithms.
You wrote: <<People who are experienced in the field don't want to go through the same hoops that newbies do just to prove they know how to write basic algorithms.>>
I am constantly interviewing candidates for roles at my company. It seems like CVs are a complete gamble: some are lies, some are wildly understated, and everything in between -- at all levels of experience! "[J]ust to prove..." and yet so many cannot do the 2022 version of FizzBuzz. I am stunned how many senior (well, so they say!) hands-on technical applicants cannot do basic things like write a very simple linked list class, or explain to me how a hash map works. Forget about explaining the finer points of sorting algorithms (honestly, very low value in my line of work).
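To be concrete about the bar (a made-up illustration, not our actual screening question), I mean something at roughly this level:

    # "Write a minimal singly linked list with append, plus a method that
    #  returns the values in order." No tricks, no edge-case gotchas.
    class Node:
        def __init__(self, value):
            self.value = value
            self.next = None

    class LinkedList:
        def __init__(self):
            self.head = None

        def append(self, value):
            node = Node(value)
            if self.head is None:
                self.head = node
                return
            cur = self.head
            while cur.next:
                cur = cur.next
            cur.next = node

        def to_list(self):
            out, cur = [], self.head
            while cur:
                out.append(cur.value)
                cur = cur.next
            return out

    lst = LinkedList()
    for v in (1, 2, 3):
        lst.append(v)
    assert lst.to_list() == [1, 2, 3]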
There is no reasonable alternative to testing of some kind for hands-on technical roles -- I am flexible about the method: (1) whiteboard coding (ugh in 2022), (2) IDE/text editor on a shared PC / video chat (meh in 2022), or (3) take-home (the best in 2022, even if there are drawbacks for people with families).
Joel Spolsky said it best about hiring: The goal is to avoid bad hires. Average and above are fine.
Personally I do because I enjoy working with very smart people.
The backlash against leetcode is the same as backlash against other types of tests: most people are going to fail and most people don't like failing, so they blame the test.
> "we do not do leetcode interviews" and maybe "we only have three interviews total" would be selling points on a job posting.
It depends on the job. I have had interviews that broke the mold here and were panel discussions or more job-talk experience, and I found the interviews uniquely exhausting because they required their own set of skills to study for that were different from the “leet code” style. At the extreme end were the take home projects, which I simply didn’t have time to do for every company and were extremely unattractive to me for that reason. I actually find doing leetcode style interviews for me required the least amount of prep and was the most straightforward, especially when they were structured to leave me with time to ask and talk to real engineers at the company.
> Now, if I really have to spend that much time prepping to interview at your unprofitable company (that most likely will go under), don't you think that I would try my best to work at FAANG instead?
I feel the same way.
That said, my company and many others make the interview process much easier but still find it difficult to hire. I know this is a common problem, because I get bombarded with good job postings by recruiters and they are usually still there months later when I finally get around to responding.
Companies have only tested my code skills and handed me personality tests when applying for full time jobs. As a freelancer, I get one or two interviews where I talk to the architects, tech leads and managers, and that's it - either I'm in or out after that. They end up treating me as an employee anyway, so don't really know why this distinction is made in the first place, but I suspect HR.
I don't think FAANG style interviews are a good way to do it, but it is more important to be picky about hiring at a small company. If you're in a company where everybody knows everybody, then any new hire will affect the entire team. A good hire will pull the whole team up. A bad hire will negatively affect everyone.
>High-ability workers are faster learners, in all jobs. However, the relative return to ability is higher in careers that change less, because learning gains accumulate.
This is not the only reason for the quick learner -> high dropout thing. By the article's definition, I'd be a "fast learner". Most of the industry expects me to come in already knowing what they want me to know, while most ways to obtain said knowledge are blocked by barriers difficult to bypass for non-corporates. Meanwhile, almost every corporate I get in expects me to do the same things for several months and gives me a few learning opportunities every year. At the same time, university primed me to absorb knowledge like a sponge and never get stuck on a single perspective, while corporates complain that graduates don't know Spring after graduating.
So somehow you're expecting me to stay while my knowledge deteriorates unless I keep it up in my own time, all the while giving lowball raises and not satisfying my desire for challenges. Yes, I get it, grunt work has to be done. But you really can't tell me you're in need of software developers when you actively push people to do the very thing you claim you don't want them to do.
"somehow you're expecting me to stay while my knowledge deteriorates unless I keep it up in my own time, all the while giving lowball raises and not satisfying my desire for challenges"
That's not really the expectation... the expectation is that you'll be replaced by someone younger who's already learned all that.
The expectation for more senior people is that they'll go into management or architecture... or maybe burn out... it doesn't really matter to most corporations, as their workers are replaceable.
This is spot on, and personally it's the main reason why I decided to become an engineering manager after 10 years as a developer.
And even as an engineering manager, I do not feel safe. I think only once you reach director level are you protected from market hype and the newest framework trends.
This. Slap on the golden handcuffs if possible for the most boring work, give employees extra vacation time to recover from that slog, and they might not leave. Most people are willing to deal with some BS as long as they feel compensated and the comp keeps up with changes in the marketplace.
Some really good points on the ultra-fast depreciation of SE tech skills. A relative of mine is a mechanical engineer, well past retirement age and still going strong in his 2-man consulting shop because that's what he loves doing. He works with precision manufacturers, automotive suppliers,... all very cutting-edge stuff, helping them develop new product lines, manufacturing processes,... He says the core skills that he's using are still those that he learnt in university several decades ago.
I was actually surprised because I heard so much about crazy new materials like carbon fibers, graphene, technologies like 3D-printing,... but apparently what makes a machine break are still the same things: mechanical stress, heat dissipation, friction,... new materials and processes might change the coefficients, but not the (mostly Newtonian) physics (my interpretation btw, not his words).
One could say the same thing about software engineering: true fundamental advances in algorithms and data structures are sufficiently rare that it wouldn't be a nuisance to keep up with them. But the share of the job where those basics matter, relative to the extremely fast-changing landscape of tools and frameworks, is much smaller (plus, one could argue that even the fundamentals see a lot of shifting ground in CS, with neural architectures, differentiable programming, not to mention quantum computing).
I think the skill depreciation concept is a bit exaggerated. Especially in the context of the newness of computers relative to the engineering field in general, the latter which has been around, arguably, for thousands of years.
For example, SQL & Unix have been around since the 1970s.
Linux since late 1991.
Javascript: The end of 1995. NodeJS: 2009.
Sure, there's a ton of churn in the JS Ecosystem, but all it takes is a bit of wisdom, skepticism, and patience to avoid the hype-cycle.
Also, once you learn to build certain things with a programming language you learn the paradigms of the system you build.
For example-- Web Servers. Looking at ExpressJS vs Python Flask documentation, there are many analogous pieces, because they follow the same standards for protocols.
Another example-- data engineering / statistical computing: Checking out R vs Python packages, there are a lot of the same concepts, just in a slightly different format/language.
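To put the web server example above in code: here is the Flask side of a toy endpoint (a sketch, assuming Flask is installed and using a made-up route; the Express version is nearly line-for-line the same because both sit on the same HTTP concepts of routes, methods, status codes, and JSON bodies):

    from flask import Flask, jsonify

    app = Flask(__name__)

    # In Express this is roughly: app.get("/users/:id", (req, res) => res.json({...}))
    @app.route("/users/<int:user_id>", methods=["GET"])
    def get_user(user_id):
        return jsonify({"id": user_id, "name": "example"}), 200

    if __name__ == "__main__":
        app.run(port=5000)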
HTTP/1.0: 1996 (RFC 1945)
TCP: 1974 "In May 1974, Vint Cerf and Bob Kahn described an internetworking protocol for sharing resources using packet switching among network nodes."
"TLS is a proposed Internet Engineering Task Force (IETF) standard, first defined in 1999"
Considering all of this... I don't think most major things actually change that much. Sometimes a popular new framework takes the world by storm, but that's pretty rare compared to the output and churn of the ecosystem.
The issue is that every job I've had requires learning a bunch of new shit. Rarely am I just carrying over the same languages or frameworks. Add on that each company picks different design patterns it wants to use and has a different interpretation of what REST is and what HTTP status codes mean… it's a pain in the ass to become an expert in any reasonable amount of time. Expert being someone who can dive into the true weeds, like cryptic memory leaks that require special profiling tools that aren't documented anywhere, etc. (and who can do this at a moment's notice with ease).
Especially if you’re a full stack eng who is constantly swimming over the entire stack and they keep pushing new DBs, new logging tools, etc.
There are commonalities, but it is a lot of learning as you go. I used to know Angular pretty well but now I don't remember it at all. I haven't even gotten to really ramp up on React, because my company uses it in such a terrible way, for things it's clearly not fit for.
From the perspective of whoever is experiencing it, it doesn't really matter how it fits into the historical picture. What you experience is sitting in front of your screen while your family is having dinner, not points on a multi-decade timeline.
In the last 20 years I have seen mostly ultra-fast depreciation of SE _interviewing_ skills.
After the first ten years software development becomes quite intuitive and you internalize all those best practices. You can be trusted to start a new service from an empty git repository. Later it gets incremental, there's a lot of path dependency in languages and frameworks and few things come out of the blue. Those that do are frequently intellectually stimulating to learn.
But interviews have been steadily getting strange and difficult (in a way not related to real life software development), at least in the last 10 years.
Very true. This stuff started at places like Google and Facebook and it makes a sort of sense for them as right or wrong they're very focused on hiring new college grads. With no real work experience to speak of you can do a lot worse than hire the ones that show they can apply their CS coursework to leetcode problems.
But doing the same to workers with 10 years of real world experience doesn't make nearly as much sense. Like hiring medical doctors by quizzing them on organic chemistry problems. Google and Facebook do it because they can and they don't know what else to do, but I don't understand how it became a universal practice.
>I was actually surprised because I heard so much about crazy new materials like carbon fibers, graphene, technologies like 3D-printing,... but apparently what makes a machine break are still the same things: mechanical stress, heat dissipation, friction,... new materials and processes might change the coefficients, but not the (mostly Newtonian) physics (my interpretation btw, not his words).
I don't really see the difference between that and programming. Writing code is still a bunch of "if" statements, the same underlying data structures, the same algorithms, etc. There's some new technology being added on top, akin to carbon fiber and the other things you mentioned, but it's fundamentally the same.
You really think that a SE in their 70's who learnt how to write if statements and data structures 50 years ago would say that they're still basically doing the same thing now as back then? Maybe if they work on legacy stacks like the famed COBOL devs coming back out of retirement. But the thing is that what he's working on is cutting edge, not maintaining systems that were built decades ago.
An unsolvable problem. The rapid deterioration of skill will not stop; it is accelerating instead. People's desire for stability as they grow older will not change either.
The only attraction in software development is the relatively good pay. The job itself sucks. You'll spend your life sitting in a chair looking at a text editor, and that's the best part of your day, as about 50% of it is distractions. You're quite unlikely to work on something truly creative or thrilling, so it's mostly a boring grind.
Then, as the article mentions, it turns out the grind was for nothing and the rug is pulled every few years and you have to start over again. The job is cognitively taxing so you'll turn into an absent person that lives in their heads, it drains your life energy.
If I were young now, I'd say fuck it and go install solar panels or heat pumps. It's outside, physical but not too physical, thus healthy. You get to meet lots of people and you see the direct result of your work. There's no office politics and you're contributing to a tangible good thing for the world. Skill requirements don't change much.
You might come home somewhat physically tired (but over time it normalizes), with a clear head and not a care in the world. There's no overflow between work and personal life.
You go to different places and different people, every single day. That's 100% less repetitive compared to sitting at home or going to the office to see the same people.
As for the tasks themselves, it was merely an example, but even for this example I disagree. My brother-in-law basically does all of these things, both for private citizens and industry and comes across a wide array of different situations.
I'm not saying it's absolute perfection, no job is. But I stand by my point that it has a series of very meaningful advantages: far healthier, more social, direct impact of your work, no cognitive overload, no politics.
Go ask people installing solar panels and heat pumps if the idea of working a "cognitively taxing" job where they could make more money, work from anywhere and not be dead physically tired at the end of the day sounds appealing... I don't think you'll hear them all say "screw that, I love installing solar panels". The grass is always greener for sure, but none-the-less software engineering and tech offers a very high quality of life with significant earning potential and a ton of flexibility.
> If I were young now, I'd say fuck it and go install solar panels or heat pumps
Amen. For me, the advice I give the young: go for a trade. HVAC (especially if you live where it's hot) and plumbing are, to me, the most future-proof and recession-proof jobs there are. Literally, by the time other folks are graduating with CS degrees and massive college debt, you have been making 70k a year for 3 years and are about to clip 6 figures for the rest of the time you want to work.
One day a computer will be able to write its own code (and that's not that far off now), but no robot or computer in this world will be able to come to your house and fix a clogged toilet, or replace a blown capacitor in your heat pump, and people will always be willing to pay nearly anything to get those two problems fixed.
Unfortunately, this is also a low cost skill to acquire, so the barrier to entry is low.
As all the mid level white collar jobs keep getting automated away, there's going to be a glut of people in need of income who are more than capable of learning a trade if their ego can handle it.
It's not so much a desire for stability in general as it is the depreciation. I found that piece of the post rather convincing. I don't think most people have a problem with learning new skills in this field. But having to throw away old ones sure feels like a waste, and given the model in the post, it limits how good you can ever get.
Also, sitting and staring at a text editor should not sound that awful for anyone who likes to create via writing code. Because that's how you do it. But that's not what you experience, obviously. It's about what goes on in your head.
I remember the feeling after taking a new job as a fresh graduate. It felt ridiculous that I now got paid for doing the thing I'd wanted to do most of the time for the past ~12 years, but I was told that you can't do that all day because you have duties. (Now, the pay was actually pretty bad, even by local standards, as my first job was in an academic research institute.)
I think the problem of ageism wasn't mentioned at all and it should be.
People who love the field have no problems keeping up to date. Sure some job posts will mention Hadoop or Kafka but whatever, a good dev will have no problem learning these in a few days.
Does he get a chance though if he's 50?
Ironic thing here is that Hadoop is mostly already outdated.
Which is, btw, one of the depressing things for a lot of data engineers: we used to play with those cool distributed processing frameworks, and now? We are mostly writing some terraform to deploy cloud resources, with most of the distributed part handled by the cloud providers.
> Which is, btw, one of the depressing things for a lot of data engineers: we used to play with those cool distributed processing frameworks, and now? We are mostly writing some terraform to deploy cloud resources, with most of the distributed part handled by the cloud providers.
Sounds to me like switching one provider/tool for another - or are data engineers feeling bummed because the job has become too trivial / less fun?
I'm not disputing the fact that there is ageism, as I'm sure there are thousands of examples of it, but there's so much demand and so many different companies. I've worked with plenty of over-50's. Maybe there's a sweet spot for growing companies where you need the experience, which has been where I've worked. Small companies don't need structure, and maybe want cheap employees. Large companies put all that structure in management and a few super senior folks. (though I saw plenty of over 50s in my large company experiences) Medium growing companies need experience. I dunno, just guessing since certain companies I've worked for seem to have a higher concentration of older folks.
Here, this rando website says that 46% of software engineers are 40+
That's a survey though, I'm curious what biases are going to exist in the data. We might assume that older software developers move around less? May be less likely to respond to surveys? May be less likely to visit stack overflow, especially if they do less hands on coding?
I'm with you; I don't know the answer, this is definitely complex.
It could be a combination of things -
a) burnout due to ever changing tech
b) ageism
c) highly paid devs choosing to retire / switch professions early simply because they can financially
40 is definitely not that old anymore for tech, I think. Well, I'm 38, so I'll find out soon.
There should really be a surplus of software developers. Most companies are solving problems they don't have, employing a team of 10x more developers than they need - because of course microservices, Kafka, Kubernetes, you name it - cargo culting the shit out of it.
Give them six+ months to train up and I’m sure they’ll do fine.
They tend to bike-shed details, take way too long to try to create sophisticated abstractions that never quite achieve the silver bullet they originally thought it would, and spend too much time dwelling on irrelevant details that ultimately leads no where and results in a kind of paralysis that can be very hard to break out from.
The ones who master a specific language or master a specific library/technology that focuses on doing a few things very well and in very concrete terms are able to deliver the most business value. Furthermore they absolutely have the ability to take their mastery of that and map it to new languages or new libraries. I mean I'm not talking about going from Java to Haskell or Haskell to Idris, but rather people who master C++ can fairly easily pick up Java or Python or TypeScript. People who have mastered Unity can easily pick up Unreal Engine. People who have mastered web development can easily pick up mobile development.
The idea that people who have a solid mastery of a technology and a programming language are somehow just stuck and unable to take those skills and apply them to other areas I think is overstated and just untrue, but those who treat software engineering as highly theoretical and focus on abstractions, design principles and get caught up on these high level details tend to not get much done and when they realize that software is not as clean and elegant as they would like it to be, they get burned out and give up.
I think going over any substantial codebase for products that are widely used and deliver solid business value on Github where most code is not at all reflective of the ideals often espoused on blog posts validates my point of view.
In short, people who treat software as just a tool to accomplish a concrete task are more productive than those who write software for the sake of writing software. They don't write the cleanest code, or the most elegant data structures and algorithms, but they produce the greatest amount of tangible business value.
> "meta-level" concepts
I'd say having a strong grasp of what you can achieve with just using files and folder, or understanding how SQL solves en entire problem space are meta level concepts. Its just that we take them for granted.
> business value
Is apparently something different than 'value', but still includes every software ever that was valuable to a business?
> high level details
...?
> software engineering
Building constraint solver for a compiler or ensuring a JS animation centers a div?
> highly theoretical and focus on abstractions, design principles
I'd recognize all these things. But out of context 'in the general case' they become meaningless.
---
I understand the picture you are trying to paint, but i don't think it tells anything beyond "I've noticed people make things overly complex". I agree.
However, keep in mind the 'get things done and provides value' software you've seen: is the software 'that survived', might have been set up by a very experienced person ( whose failures we're not seeing ), nobody might recognize it as being not-simple ( e.g. I've seen high value business software partial recreate 'regex'. Worked great, straightforward and easy to read function, just ~50 lines or so, could have been a single function call. ), how the requirements are presented is hugely important.
There is a lot of people who learn just a surface without going deep into tool and think they know enough.
For me it seems that someone who would really go deep into learning language would get most of theoretical stuff on the way. Because there is no way to really master C++ or really master Java without learning about data structures and all kinds of "meta-level" concepts.
Maybe the difference is mostly approach to learning more practical/more theoretical.
Some guys see a screw and reach for their trusty hammer. Some guys know to grab a screwdriver.
I had a project the last two weeks where the code was just going to fail about as often as it was going to succeed. I had to write a resource manager and an Erlang style supervisor and use an embedded key value store.
A better dev may have intuited what took me basically a midstream rewrite to figure out, a worse developer may still be grinding on the problem.
I think my solve is "robust enough" but there was no real way to power through that. You either found the right abstractions or you didn't.
And as music goes, you sound like the record companies that thought everyone should listen to disco for the next 50 years...
Now give me a new theoretical concept where I can expand my knowledge or integrate into my knowledge map and view of the world and I'm excited, there aren't enough hours in the day. Tell me about this all new concept I wasn't familiar with--I'll start thinking of ways I can use it, how I can leverage it, or how it may connect with other ideas and concepts I have.
Now give me a tight deadline which most business environments create and I agree with you, give me the boring stuff I can pump out, get my paycheck and go home to enjoy the rest of my day.
The article showed the opposite effect though. Curious, life-long learners stop working in software development because they have to constantly learn new skills and believe they can get more bang for their buck when they can invest in skills that don’t lose their value over time.
After six months of this, ExtJS 4 came out, which was essentially a totally new framework. Everything I learned was not only not applicable, it had to be actively unlearned.
The lesson here is: become good and proficient at something, but don't focus on becoming a ninja in one particular transient tech. There is value in becoming a Jedi of Unix build-in tools, or more persistent technologies like Git, for example.
Also, this is a bigger problem in the Javascript echosystem, where the hype cycles are more intense than in, say, Python. I checked out my Flask project from seven years ago and it's ready to rock.
I get the thing about constant learning, but learning in this industry used to be cumulative. Now it's a hamster wheel. You are learning how to solve the same problems, in a different, presumably in a "new" way.
People seem to be spending more time coming up with catchy names for their projects than making sure this is all sustainable.
The only skills that have really stood the test of time for me are C, PHP, unix shell stuff, and SQL.
The truth is that the programmers in group (b) think about both. Who's designing a lot of the new languages, libraries, and frameworks? Chances are it was someone from group (a). If you're in group (b) then do you want to spend your whole career being forced by your bosses to constantly relearn and follow the latest vogue vision from group (a)? Of course not. So this might not apply to students, but everyone from group (b) will eventually get burned by fads enough times that they start caring about the politics of software. Namely, not wanting to depend on bloat that doesn't actually solve computer science and systems engineering problems. Group (b) might even create alternatives themselves. Go is great example. The guys who built the Fifth Bell System watched their vision behind their techniques decline over the decades and said, enough is enough. So they made Go and it was like a ray of sunshine when it came out.
However, these still don't invalidate the main point of the article: a faster rate of depreciation means that your maximum knowledge level, given your specific rate of learning, will be lower. I.e., your advantage over a less skilled, younger professional will be smaller.
And you may say that learning a new 3D library shouldn't count as learning a new skill, but that doesn't make the problem go away. If anything, it underlines it: if you have to start working with a new 3D library, you will have to spend time and effort learning it (to become efficient at using it), whereas if you could keep using the old one, you could spend that time and effort on learning something we would actually count as a new skill.
- A younger engineer will have the same value to your employer as you do.
- A younger engineer will work harder than you are willing to.
These two items are inevitable given the current rate of change in the industry. While some engineers will find next-level, differentiated work to engage in, such as leading a core piece of infrastructure that defines the changing field, many will not. And if the rug gets pulled on that core piece of infrastructure, it's often the case that those engineers are not particularly more skilled than others on brand-new projects.
In my school, those who wanted to make money went straight to management or finance. Computer science was for the passionate ones and probably not the right path to make money for the brightest students.
Well, so do the recruiters; they'll be fine.
In fact, the better students are the ones wasting their time, unless they prefer to be in academia, like you.
So what metric are you really gauging for?
The "poor students" are pivoting for money and the name of the university to boost their employment prospects; maybe that shows in their academic performance and ability to understand. I saw the same in undergrad.
Not true. I have met many developers who haven't learned anything new for 15+ years and are still doing just fine developing software. A lot of Java developers come to mind. They have pretty much done the same thing their whole career and have no need or desire to learn anything new.
I think many teams are unaware how much extra value is possible by retaining existing employees vs. hiring new ones. Each year I'd try to make sure I was "making them an offer they couldn't refuse" with new interesting challenges, new tech, plenty of personal research time, as much pay increase as I could possibly give, etc. A lot of engineering managers think that it's no big deal to just hire new staff, but even going from an average turnover of two years to three years is a massive improvement.
The main problem is how micromanage-y current development processes are. Everything has to be a ticket/user story and has to be planned and approved by people who have never written a single line of code. Everything has to have some immediate business impact. I even see scrum teams measuring team utilization now; the target is 90% at the moment, and they wonder why productivity is down.
The modern office seems hellbent on killing every last bit of slack in their workers, then wondering why they leave or get burned out.
I realized the other day that a big part of my drive to move towards self-employment is really just a way to carve out time to take adequate care of myself. I have significant doubts that it is possible to continue to advance in tech to staff+ levels, be a good spouse, parent, and friend, and not run myself into the ground with physical/mental issues. And that is sad on multiple levels.
So I respond by easing up on advancing my career, because it gives back to me the least.
So, in order to, say, upgrade packages or refactor difficult-to-read code, the work item needs to be approved by a non-tech PO.
Guess how much gets done outside of planned/micromanaged? Answer: next to nothing.
It gets worse, too: for as long as I've worked as a software developer there's been some sort of time tracking system in place, all work has to be planned up-front, and it has to work out to at least 40 hours (after they "negotiate" your estimates down). Which leaves no time for the unplanned stuff that inevitably comes up. This always goes in a cycle like this:
1. Management demands that every bit of work be associated with a ticket.
2. Devs just open tickets for the unplanned stuff so that it shows up in the ticket tracking system.
3. Management complains about devs opening "their own" tickets and prohibits self-opened tickets.
4. Devs do the unplanned (always "super high priority!") stuff without any ticket tracking and fall behind on their "planned" tickets (that nobody really cares about any more, but which are still on their board).
1. Management demands that every bit of work be associated with a ticket...
I've tried to stress to managers in the past that developers feel the pain of code debt. It makes us slower! Enable us to spend time sharpening our tools and managing our codebase.
One problem of course is, not all SWE can do this well. I wouldn't necessarily trust a junior hire to recognize and execute a proper refactor.
This has been huge for me at my current job. I saw some unused equipment in a lab and started asking questions why. Turns out the thing worked, but not great, so no one used it. What started as just fixing bugs and adding features became my own line item in the budget and requests for the (new and improved) equipment from other departments. It's something I look forward to working on.
This was after they had encouraged a certain "cool culture" for a couple of months due to the lack of direction. It was pretty funny that I not only got micromanaged, but was told I did the wrong thing, and was then asked to do a third job that was not my responsibility.
And yeah, it's definitely not just the best ones. I am mediocre and am so bored and so done with dev.
* the downtime is there because I am waiting for planning, UX, and UI for a different high priority task.
Or more specifically, explainable business impact.
But it's hard to explain how the code has become horrible and needs a refactor to make it easier on devs, reducing stress and the likelihood of both bugs and developers leaving.
It may not be a matter of "the best". I have taken a personality test that had an item on it that covered product lifecycle. If 1 is initial conception, 2 is prototype, 3 is initial release, 4 is major enhancement, and 5 is maintenance, my personality is that I prefer 2 or 3. By 4 (major enhancement) I start to get bored, and by 5 (maintenance) I'm definitely bored.
It's not that I'm one of "the best" (though I like to think that I am). I have a personality clash with the later stages of product lifecycle.
Is that the premise? Seems to be saying that constantly changing skills exhausts developers to the point that it becomes more lucrative to work in another profession.
Although, I suppose learning new things just to tread water can be boring too.
Doesn’t have to do with boredom so much as maximizing potential.
Or a reorg happens and you land a shitty manager.
To be fair, almost all my managers were amazing, people who truly cared about their staff, at a professional level as well as a personal level.
I've only had one absolute psychopath as a manager ... but I should thank him because he was the last straw and gave me enough courage (and anger) to leave AWS and start my journey as a solo entrepreneur.
The thing is, executives measure themselves by how quickly they get promoted into the next role, so no one cares that good management might reduce turnover in the next 2-3 years--in fact, the executive mindset is that it could just as easily increase turnover (what if we invest in their careers, and they leave?)
Now, if I really have to spend that much time prepping to interview at your unprofitable company (that will most likely go under), don't you think I would try my best to work at FAANG instead?
As a matter of fact, I was rejected at plenty of these small, insignificant companies, but ended up with L6 offers at FAANG.
Be humble and you will find plenty of good engineers out there.
I know tons of good SWEs who don't want to interview at or work for the mega FAANGs, and if I were running a business I would definitely try to attract that talent by being different: offering a "normal and reasonable" interview process along with better perks, flexibility, and WFH.
Instead, they all want to run a bazillion microservices in k8s.
I am constantly interviewing candidates for roles at my company. It seems like CVs are a complete gamble: some are lies, some are wildly understated, and everything in between, at all levels of experience! "[J]ust to prove..." and yet so many cannot do the 2022 version of FizzBuzz. I am stunned how many senior (well, so they say!) hands-on technical applicants cannot do basic things like write a very simple linked list class, or explain to me how a hash map works. Forget about explaining the finer points of sorting algorithms (honestly, very low value in my line of work).
There is no reasonable alternative to testing of some kind for hands-on technical roles -- I am flexible about method: (1) whiteboard coding (ugh in 2022), (2) IDE/text editor on a shared PC / video chat (meh/eh in 2022), or (3) take-home (the best in 2022, even if there are drawbacks for people with families).
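To give a sense of the level of "basic" I mean, here is a minimal sketch in Python; the class and method names are only illustrative, and any reasonable variation on this would pass.

    class Node:
        def __init__(self, value):
            self.value = value
            self.next = None

    class LinkedList:
        # Singly linked list with append and a linear search.
        def __init__(self):
            self.head = None

        def append(self, value):
            node = Node(value)
            if self.head is None:
                self.head = node
                return
            current = self.head
            while current.next is not None:
                current = current.next
            current.next = node

        def contains(self, value):
            current = self.head
            while current is not None:
                if current.value == value:
                    return True
                current = current.next
            return False

    xs = LinkedList()
    for v in (1, 2, 3):
        xs.append(v)
    assert xs.contains(2) and not xs.contains(9)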
Joel Spolsky said it best about hiring: The goal is to avoid bad hires. Average and above are fine.
The backlash against leetcode is the same as backlash against other types of tests: most people are going to fail and most people don't like failing, so they blame the test.
It depends on the job. I have had interviews that broke the mold here and were panel discussions or more job-talk experience, and I found the interviews uniquely exhausting because they required their own set of skills to study for that were different from the “leet code” style. At the extreme end were the take home projects, which I simply didn’t have time to do for every company and were extremely unattractive to me for that reason. I actually find doing leetcode style interviews for me required the least amount of prep and was the most straightforward, especially when they were structured to leave me with time to ask and talk to real engineers at the company.
> Now, if I really have to spend that much time prepping to interview at your unprofitable company (that most likely will go under) don’t you think that I would try my best to work at faang instead ?
I feel the same way.
That said, my company and many others make the interview process much easier but still find it difficult to hire. I know this is a common problem, because I get bombarded with good job postings by recruiters and they are usually still there months later when I finally get around to responding.
This is not the only reason for the quick-learner -> high-dropout thing. By the article's definition, I'd be a "fast learner". Most of the industry expects me to come in already knowing what they want me to know, while most ways to obtain that knowledge are blocked by barriers that are difficult to bypass for non-corporates. Meanwhile, almost every corporate employer I join expects me to do the same things for several months and gives me only a few learning opportunities a year. At the same time, university primed me to absorb knowledge like a sponge and never get stuck on a single perspective, while corporates complain that graduates don't know Spring after graduating.
So somehow you're expecting me to stay while my knowledge deteriorates unless I keep it up in my own time, all the while giving lowball raises and not satisfying my desire for challenges. Yes, I get it, grunt work has to be done. But you really can't tell me you're in need of software developers when you actively push people to do the very thing you claim you don't want them to do.
That's not really the expectation... the expectation is that you'll be replaced by someone younger who's already learned all that.
The expectation for more senior people is that they'll go in to management or architecture.. or maybe burn out.. it doesn't really matter to most corporations, as their workers are replaceable.
And even as an engineering manager, I do not feel safe. I think only once you reach director level, you are protected from market hype and newest frameworks trends.
I was actually surprised because I heard so much about crazy new materials like carbon fibers, graphene, technologies like 3D-printing,... but apparently what makes a machine break are still the same things: mechanical stress, heat dissipation, friction,... new materials and processes might change the coefficients, but not the (mostly Newtonian) physics (my interpretation btw, not his words).
One could say the same thing about software engineering: true fundamental advances in algorithms and data structures are sufficiently rare that it wouldn't be a nuisance to keep up with them. But the weight those basics carry, relative to the extremely fast-changing landscape of tools and frameworks, is much smaller (plus, one could argue that even the fundamentals see a lot of shifting ground in CS, with neural architectures, differentiable programming, not to mention quantum computing).
For example, SQL & Unix have been around since the 1970s.
Linux since late 1991.
Javascript: The end of 1995. NodeJS: 2009.
Sure, there's a ton of churn in the JS Ecosystem, but all it takes is a bit of wisdom, skepticism, and patience to avoid the hype-cycle.
Also, once you learn to build certain things with one programming language, you learn the paradigms of the kind of system you are building.
For example: web servers. Looking at the ExpressJS vs. Python Flask documentation, there are many analogous pieces, because they implement the same protocol standards.
Another example: data engineering / statistical computing. Checking out R vs. Python packages, there are a lot of the same concepts, just in a slightly different format/language.
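As a small illustration of the web server point, here is a minimal Flask route with the rough ExpressJS equivalent noted in a comment; the route and handler names are made up, assuming a stock Flask install.

    from flask import Flask, jsonify

    app = Flask(__name__)

    # Rough ExpressJS analog:
    #   app.get("/users/:user_id", (req, res) => res.json({ id: req.params.user_id }));
    @app.route("/users/<int:user_id>")
    def get_user(user_id):
        # Path parameter, handler, JSON response: the same three pieces,
        # just in a different syntax.
        return jsonify({"id": user_id})

    if __name__ == "__main__":
        app.run(port=5000)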
HTTP/1.0: 1996 (RFC 1945)
TCP: 1974 "In May 1974, Vint Cerf and Bob Kahn described an internetworking protocol for sharing resources using packet switching among network nodes."
"TLS is a proposed Internet Engineering Task Force (IETF) standard, first defined in 1999"
Considering all of this... I don't think most major things actually change that much. Sometimes a popular new framework takes the world by storm, but that's pretty rare compared to the output and churn of the ecosystem.
Especially if you’re a full stack eng who is constantly swimming over the entire stack and they keep pushing new DBs, new logging tools, etc.
There are commonalities but it is a lot of learning as you go. I used to know Angular pretty well but now I don’t remember it at all. I haven’t even gotten to really ramp on React as much because my company uses it in such a terrible way that it’s clearly not fit for.
After the first ten years software development becomes quite intuitive and you internalize all those best practices. You can be trusted to start a new service from an empty git repository. Later it gets incremental, there's a lot of path dependency in languages and frameworks and few things come out of the blue. Those that do are frequently intellectually stimulating to learn.
But interviews have been steadily getting stranger and more difficult (in a way not related to real-life software development), at least over the last 10 years.
But doing the same to workers with 10 years of real world experience doesn't make nearly as much sense. Like hiring medical doctors by quizzing them on organic chemistry problems. Google and Facebook do it because they can and they don't know what else to do, but I don't understand how it became a universal practice.
I don't really see the difference between that and programming. Writing code is still a bunch of "if" statements, the same underlying data structures, same algorithms, etc. There's some new technology being added on top, akin to carbon fiber and the other things you mentioned, but it's fundamentally the same.
The only attraction in software development is the relatively good pay. The job itself sucks. You'll spend your life sitting in a chair looking at a text editor, and that's the best part of your day, as about 50% of it is distractions. You're quite unlikely to work on something truly creative or thrilling, so it's mostly a boring grind.
Then, as the article mentions, it turns out the grind was for nothing; the rug is pulled every few years and you have to start over again. The job is cognitively taxing, so you'll turn into an absent person who lives in their head; it drains your life energy.
If I were young now, I'd say fuck it and go install solar panels or heat pumps. It's outside, physical but not too physical, and thus healthy. You get to meet lots of people and you see the direct result of your work. There's no office politics and you're contributing to a tangible good thing for the world. Skill requirements don't change much.
You might come home somewhat physically tired (but over time it normalizes), with a clear head and not a care in the world. There's no overflow between work and personal life.
Choose wisely, young ones.
You go to different places and different people, every single day. That's 100% less repetitive compared to sitting at home or going to the office to see the same people.
As for the tasks themselves, it was merely an example, but even for this example I disagree. My brother-in-law basically does all of these things, both for private citizens and industry and comes across a wide array of different situations.
I'm not saying it's absolute perfection, no job is. But I stand by my point that it has a series of very meaningful advantages: far healthier, more social, direct impact of your work, no cognitive overload, no politics.
Amen. For me, the advice I give the young: go for a trade. HVAC (especially if you live where it's hot) and plumbing are, to me, the most future- and recession-proof jobs there are. Literally, by the time other folks are graduating with CS degrees and massive college debt, you have been making 70k a year for three years and are about to clip six figures for the rest of the time you want to work.
One day a computer will be able to write its own code (and that's not that far off now), but no robot or computer in this world will be able to come to your house and fix a clogged toilet, or replace a blown capacitor in your heat pump, and people will always be willing to pay nearly anything to get those two problems fixed.
As all the mid level white collar jobs keep getting automated away, there's going to be a glut of people in need of income who are more than capable of learning a trade if their ego can handle it.
Also, sitting and staring at a text editor should not sound that awful for anyone who likes to create via writing code. Because that's how you do it. But that's not what you experience, obviously. It's about what goes on in your head.
I remember the feeling after taking a new job as a fresh graduate. It felt ridiculous that I now got paid for doing the thing I'd wanted to do most of the time for the past ~12 years, but I was told that you can't do that all day because you have duties. (The pay was actually pretty bad, even by local standards, as my first job was at an academic research institute.)
Which is, btw, one of the depressing things for a lot of data engineers: we used to play with those cool distributed processing frameworks, and now? We mostly write some Terraform to deploy cloud resources, with most of the distributed part handled by the cloud providers.
Sounds to me like switching one provider/tool by another - or are data engineers feeling bummed because the job has become too trivial / less fun?
Here, this rando website says that 46% of software engineers are 40+
https://www.zippia.com/software-engineer-jobs/demographics/
Now I'm curious, this stack overflow survey paints a grimmer picture:
https://insights.stackoverflow.com/survey/2018
That's a survey though, I'm curious what biases are going to exist in the data. We might assume that older software developers move around less? May be less likely to respond to surveys? May be less likely to visit stack overflow, especially if they do less hands on coding?
40 is definitely not that old anymore for tech I think. Well I'm 38 I'll find out soon.
https://www.youtube.com/watch?v=y8OnoxKotPQ