The quality of your average tech worker has completely nosedived in the last 10-15 years.
All these huge companies wanted more products, more market share, more money, etc. They needed more people to pull this off, so they started lowering hiring standards across the board because there just weren't enough people in tech.
Simultaneously, a huge portion of the world saw tech salaries and wanted in on it so they started taking every quick certification, bootcamp, degree, etc to get into tech.
It turns out that compared to the dedicated nerds of the previous generation, most new people just don't care that much about tech and don't want to go deeper than the bare minimum required by their job.
So I think tech overhired by a LOT, then they realized all these new people are actually net negatives on the company, and we are slowly correcting.
I think a solid 50% of people in tech are still on the chopping block. You can do much more with tools + really smart people in the year 2024 than you could before.
> The quality of your average tech worker has completely nosedived in the last 10-15 years
I find this opinion hilarious: Almost 30 years ago the second most popular software product was Windows 95 (Doom was the #1), which couldn't run for a few hours without BSODing. Almost 20 years ago the average tech worker was building atrocities with Visual Basic, MS Access and PHP.
Meanwhile today it was announced that Google decided to lay off several compiler engineers working on LLVM, but sure, please tell me more about those low quality bootcamp kiddos that are ruining everything...
Yeah, Windows 95 BSOD'd a lot; however, the people developing it were doing it for the very first time, largely with single-digit megabytes of memory to work with and with a CPU that not only contained bugs[0] but was, by today's standards, extremely slow[1].
It's not comparable; it's almost at the point where rendering a character on my screen right now consumes more CPU cycles than the entire operating system would have used in a day.
PHP was good, actually, but it's born of its time; it's easy to use 20/20 hindsight to call it a bad design when fundamentally:
A) it solved problems
B) it was working with the best of human knowledge in language design at the time
C) it remains one of the most well-optimised web languages to this day; I'd put money on even a variant from that era easily outperforming any Django webapp.
We stand on the shoulders of giants, good abstractions and lessons from these periods are what make our software so robust today. We made it a lot slower though.
[0]: https://en.wikipedia.org/wiki/Pentium_FDIV_bug
[1]: for scale, a 60Hz clock ticks once every ~16ms, while a 66MHz clock ticks once every ~15 nanoseconds. The Pentium 1 was the most popular CPU when Windows 95 released, with a clock speed of 60/66MHz; it was the first x86 able to issue two instructions per clock, meaning a best case of roughly 7.5 nanoseconds per instruction, assuming the 66MHz option was available.
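To sanity-check that arithmetic, a quick Python back-of-envelope sketch (same assumptions as the footnote: a 66MHz part, two instructions per clock at best):

    # Back-of-envelope check of the footnote's numbers.
    clock_hz = 66_000_000            # Pentium 1, 66 MHz part
    ns_per_tick = 1e9 / clock_hz     # ~15.2 ns per clock tick
    ns_per_instr = ns_per_tick / 2   # best case: two instructions per clock
    mips_best_case = clock_hz * 2 / 1e6

    print(f"{ns_per_tick:.1f} ns per tick, {ns_per_instr:.1f} ns per instruction (best case)")
    print(f"~{mips_best_case:.0f} million instructions per second, best case")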
30 years ago you could do a lot with 8-16 MB of RAM and no internet connection.
Nowadays... unplug a modern computer from the internet and it's mostly useless.
Want to write a simple document, using a basic word processor? You're going to need at least 1 GB of RAM, mate.
Visual Basic allowed writing GUI applications that were essentially standalone, shipped as executable files (usually) under a megabyte that would mostly "just work". Nowadays most people write GUI apps in the form of frameless web browsers displaying HTML and running JavaScript, requiring more than one CPU core and 2-4 GB of RAM to be barely usable.
Yeah I definitely can see how "The quality of your average tech worker has completely nosedived in the last 10-15 years".
As a system engineer... I have worked with some developers who constantly had memory issues ("I need more memory to run my containers, please increase the memory quota in the (Kubernetes cluster) namespace") and for whom reasoning about memory was just a non-existent skill. They had no idea what was using memory in their software. Literally, their ass was mostly saved by the fact that process management nowadays is "good" at restarting software and managing retries on failures. Knowledge of the underlying operating system was also very, very basic.
Such people, while surely filled with good intentions, were mostly useless without some random Medium article instructing them how to apply some generic fixes, or without some Stack Overflow question covering their exact use case.
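For what it's worth, the first-pass memory investigation being described doesn't need a Medium article. Here's a minimal sketch, assuming a Python service (the comment doesn't say what stack these teams actually ran), using the standard-library tracemalloc module to see which call sites are actually allocating:

    import tracemalloc

    tracemalloc.start()

    # ... exercise the code path suspected of blowing the container's memory limit ...
    suspect = [str(i) * 100 for i in range(100_000)]

    # Group live allocations by source line and print the ten biggest offenders.
    snapshot = tracemalloc.take_snapshot()
    for stat in snapshot.statistics("lineno")[:10]:
        print(stat)

Even that much usually answers "what is using the memory" well enough to decide whether a bigger quota is actually warranted.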
But wait, it's way worse than this. There are also people who got promoted to managers, don't have a solid technological background (besides the little coding they did before being promoted), and just can't understand the willingness to optimize and "geek out" on performance issues (or the value of such activities).
Oh, and don't even get me started on how nowadays a "senior developer" is somebody who is good at "delivering value to customers" instead of somebody who knows the ins and outs of the problem domain and has mastery over the technological stack.
Working in tech nowadays is a lot less about tech, it seems.
GP left a fair number of qualifiers in their post which you have conveniently ignored so that you could cherry pick an example of people being laid off from roles perceived to be comparatively hard / prestigious.
This is a bad-faith argument. If you disagree with what they’re clearly trying to convey, and want to make a case against it, I’d be keen to hear one without cherry-picking some compiler engineering layoffs in a company where the vast majority of others are obviously doing something very different to that.
Your comment is basically akin to complaining about how rough and slow it is for a person with a machete to blaze a trail through a dense forest, as opposed to how long it takes a person to travel the beautiful paved road that was built on top of that trail. Somebody had to thrash through all the shit for that superhighway you are traveling on, my friend.
I was around back then. We didn't have a magical browser box that you could type a couple of key terms into to get a thousand articles, code samples, and philosophical discussions from hundreds of people smarter than you who had already solved your problem a dozen different ways, letting you choose one and improve on it. You didn't have hundreds of languages, libraries, and frameworks from which to selectively pick the right tool. Back then you had your problem and you experimented and invented until you solved it, generally with the one or two tools that were available at the time.
And with the benefit of nearly 40 years in tech…I can attest to the OP’s opinion that the quality of the average tech worker has nosedived since then.
The LLVM layoffs notwithstanding, OP is still largely correct. Your typical developer could solder, knew how the CPU worked, knew roughly how many clocks it took to retire a MOV, ADD, MUL, etc., was cognizant of wait states, and understood cache lines, protection rings, instruction decode, memory mapping, etc.
Try asking a typical Django or React developer today what a cache line even is.
I saw the same influx of salary-seekers during the dot-com boom. Generally, people who don't have a native curiosity and love of computing just don't pan out well. I'm not gatekeeping or against them trying; I'd love it if people were more honest about it, but honesty isn't part of the hiring culture (answering "why do you want to work here?" with "money" is frowned upon).
It would be like if there was a housing boom and residential house framing ramped up base hourly pay to $100/hour with lunches provided and relaxation chambers onsite. You would see a huge influx of people going to it for the money rather than someone who is really into construction, knows load bearing tolerances, code for hurricanes or earthquakes, how to properly frame stairs, etc.
But I agree that the layoffs probably aren't based on programmer quality. When big companies lay off, they generally select products, and if you are unlucky enough to be working on that product...off you go. Merit-based layoffs are more stealthy and continuous, those are definitely being ramped up now, but probably not in the numbers companies need to cut due to overhiring or hiring in the wrong areas.
I will fully discard your argument because Windows 95 was a tech marvel of its time, a huge leap in the consumer OS segment, and it stayed unchallenged until the introduction of Windows 2000 Professional, which was actually more on the workstation side.
BSODs were mostly avoided by using MS-authored drivers.
> It turns out that compared to the dedicated nerds of the previous generation, most new people just don't care that much about tech and don't want to go deeper than the bare minimum required by their job.
The average day-to-day work is menial, for example investigating a production issue. That isn't something people get into as a hobby; that's work in a very unromantic sense. Maybe having a good systems background can help in those situations, but it's a little more subtle.
I do think it works the other way, though: people who actually care about computers don't care if tech is no longer a money field, they just want to work on computer stuff. And maybe earlier it wasn't as much of a money field. But that doesn't imply that the enthusiasts are better at fulfilling work responsibilities. A lot of work is business-specific and not very dependent on a systems background.
So yeah, I don't think the part about over-hiring is necessarily wrong, but this sort of "I'm better than you" attitude isn't very becoming. Maybe you should give this [0] a read, just to see a different perspective.
[0] https://en.wikipedia.org/wiki/Law_of_Jante
I'd never heard of that novel or these laws, but as an American it would be unthinkable to consider them anything other than Kafkaesque dark satire. The idea that they've moved from that to some sort of positive model for egalitarian society is downright creepy. Specifically, because they all contain the hypocrisy of saying explicitly as a group the very thing they're telling the individual not to say. Taking the side of we is therefore willfully brutal and repugnant to individual dignity. Presented without irony, is this some sort of fascism-light?
This doesn’t mesh with my experience. I work at big tech co that went through massive rounds of layoffs. The criteria was a mixed bag. A lot of it was just entire products and associated teams being dismantled.
Also, one can make the argument the other way as well. Tools and infra have gotten good enough that the work doesn’t require the “dedicated nerds”. So let’s get rid of the highest paid engineers as that’s the most impactful to our bottom line. They coincidentally happen to be the nerds. We can always hire them back for less because what else are these nerds going to do in this market.
Not saying that’s the case but demonstrating that it’s pretty easy to string together silly notions we might have to explain reality.
What the GP said was how the bloat accumulated and why the professional quality went down. What you are saying is how they are trimming the bloat. It's not possible to trim by skill alone, because it would leave all teams and products understaffed. But the bloat is there and it is insane.
> Also, one can make the argument the other way as well. Tools and infra have gotten good enough that the work doesn’t require the “dedicated nerds”
Has it though? In my career I have seen people die on that hill, but only because their entire careers were utterly reliant upon those tools. If you cannot execute without a given set of tools you are in no position of objectivity when it comes to those tools, which results in a lot of catastrophically bad business decisions.
How do you define "quality" of tech worker? What is the profile of the people being let go? Are they recent hires or people with quick certifications, bootcamps, etc.? A quick review of people on LinkedIn with Open to Work badges seems to paint a different picture than any claim that the people being let go are of lower quality.
> It turns out that compared to the dedicated nerds of the previous generation, most new people just don't care that much about tech
Now having watched almost three generations come to adulthood, I hard disagree with any such notion of the newer generation not being dedicated and smart. I have a lot more respect for the newer generation: what they have been served with and how they are handling it. The changes in the last 10 or 20 years have far exceeded any changes I experienced in my first 0-40 years of life. And, imo, the newer generation has been adapting much better to such changes than previous generations, including my own. This adaptation might come across as not being "dedicated nerds" to the previous generation, but with the rapid change taking place, the ability to adapt quickly trumps being a "dedicated nerd."
> A quick review of people on LinkedIn with Open to Work badges seems to paint a different picture than any claim of people being let go are of lower quality.
Come on, LinkedIn (CVs) tell you nothing. Having been somewhere is often dressed up as great experience even if it did not yield much, or the person wasn't as involved as claimed; sometimes even having been somewhere at all is a lie.
It is usually the opposite when I check colleagues I know: null performers and phonies have outstanding profiles, while the good or super programmers sometimes have awkward or does-not-care profiles.
(But yeah, also admitting: companies usually fire by the wrong metrics, and even direct managers, if they are bad managers, do not know the real differences between their reports.)
> notion of newer generation not being dedicated and smarter
Fully agree. I don't think it is much about the new generation vs the old; there are the same kinds of great people.
But computer science really changed and grew. E.g. I strongly remember, when I joined university 20 years or so ago, how all the tutors remarked on how big our year was, and that before us everybody knew each other (profs/tutors/students), but that this was now impossible. Still, 80-90% of the students who survived the first semesters just belonged there and would become great coders or computer scientists, with the right skills and ambitions. If you look nowadays, it is different and maybe more similar to other professions: more of a 50/50 split between those people and the other half, who are ambitionless and/or just lack the skills, mindset, whatever, and would have been better off becoming something else. If you ask them, some even freely admit they hate their job and are only drawn by the money.
Explains why tools like Copilot seem to be so hyped, yet I, as an old-school developer, really can't find a place where they would help me.
I guess most people really benefit from a tool to autogenerate a sieve of Eratosthenes in Python or a fizzbuzz function in Prolog or yet another React Router boilerplate.
I do really benefit from finally having a way to use shell scripting. Life is too short to remember hundreds of one-letter options with obscure combinations.
In my experience, this is mostly because the "hiring bar" at big companies is frequently disconnected from the actual day-to-day work. It typically looks like:
1. memorizing some 50-200 or so domain-specific general patterns which problem solutions tend to follow, so you can solve an arbitrary LeetCode medium in under 20 minutes; a skill that has almost nothing to do with the plumbing and product work most tech workers engage in on a day-to-day basis, and
2. speaking at length and extemporaneously on arbitrary from-scratch system design problems; a skill which, while sometimes important, most engineers who aren't working at a startup won't exercise more than once or twice a year, as the ratio of work on "systems already designed" to "systems needing new designs" overwhelmingly skews toward pre-existence.
When your hiring process and day job are so thoroughly dissociated from each other, it's no surprise that current workers who aren't actively studying for a job change couldn't pass it.
I don't think this is a significant factor in the size of the layoffs. Many high-level and high-tenure employees are being laid off, and this is apparent across the various companies that have had significant layoffs in the last couple of years. The layoffs have also heavily affected departments outside of engineering: product, marketing, finance, etc. have all been affected.
I don't disagree that companies greatly over-hired, but I don't think the largest portion there has been bootcamp/cert/junior SWEs.
I was just today reminiscing about how smart and capable everyone seemed at the startup I interned at in 2013, and comparing them to the quality of engineers I’ve worked with in the startup scene more recently. The difference is stark. I can’t pin down one reason but I agree with you. Too many bootcamps. Too much resume-driven development. Too many engineering managers whose only programming language is JQL.
Another factor here is that, for many reasons, startup equity (and thus startup comp in general) is statistically much less valuable than it was 10-15 years ago. Everyone knows it's a shit deal now, so many of the best engineers, those who can, are in big tech now. That leaves startups with a lower-quality talent pool, or those who just need a first job.
I say that with sadness and fond memories of ~2013 startup culture.
> The quality of your average tech worker has completely nosedived in the last 10-15 years.
Not my experience. I've conducted 300+ interviews at FAANG in the last 15 years and haven't seen a meaningful drop in the hiring bar.
Based on anecdotes, I have another theory. The people who built the systems got old and are unwilling to work the crazy hours. The new folks are hired to maintain these systems, and it's not fun. So overall both quality and speed declined.
Not arguing over the original point about the quality of tech workers, but...
> I've conducted 300+ interviews at FAANG in the last 15 years and haven't seen a meaningful drop in the hiring bar.
Are FAANG interviews these days an indicator of anything besides being good at FAANG interviews? Some of the best people I worked with, who brought real value, would never pass a big tech leetcode interview...
It's much easier and faster to build greenfield systems that have no historical baggage than it is to retrofit older systems that already support large customer bases and revenues for newer features.
Not just pure tech systems, but businesses and non-tech systems as well. And tech companies have gotten large, with a ton of baggage and customers.
>The quality of your average tech worker has completely nosedived in the last 10-15 years.
Initially this sounded right, but then I remembered all the trash that showed up during the dot-com boom and subsequent booms. Hiring booms just attract a lot of people who want to make easy money, and those people think programming is easy money (it's not). Bootcamps and even CS degrees are just step one of many steps, and many people think step one is the only step and don't work to further their craft. Once it becomes evident that an org has way too many of these people, it starts to cull. Unfortunately, large orgs are very bad at "rightsizing," and just "mow the lawn."
The problem with tech is to be good at it, you have to be passionate about it to the point of near obsession. That's a requirement above and beyond just being smart or good at math, or even good at programming. You can fill a room with brilliant but dispassionate people and you won't get much done.
So you're right about a 15-year cycle, but this cycle has happened before and we're now in the culling part.
> The quality of your average tech worker has completely nosedived in the last 10-15 years.
It's definitely got worse over the last ~10 years, but the pandemic-era hiring of 2020-2021 took it to a new level of bad at many companies. Anyone who had done some basic coding was getting hired at a great salary in 2021, and these individuals are still around today, some now calling themselves "senior developers".
But I completely agree with what you're saying here about it no longer being nerds working in tech. A nerd who isn't the best at coding will still engage with me about technical details, be keen to learn new things, and care about their work. A lot of the new guys I work with today are just doing the bare minimum, and they think you're an idiot for caring.
I have evidence to support this: I lost people from my team who were offered "senior" or "lead" developer roles when they were nowhere near the level required for that kind of job. The level they were at was already a stretch that they were growing into.
...and with a salary boost to match.
Now in the UK salaries have flatlined and the number of jobs is way down, and this no longer seems to be a problem.
"I see no hope for the future of our people if they are dependent on the frivolous youth of today, for certainly all youth are reckless beyond words. When I was a boy, we were taught to be discrete and respectful of elders, but the present youth are exceedingly wise and impatient of restraint." (c) Hesiod
I agree with the opinion, but not the timeframe. I would say the quality has nosedived over the last 8 years, based on my experience as the tech interviewer for potential new hire interviews across the last 12 years on different projects.
The last 4 years I've gotten comments from co-interviewers of 'you ask the hard questions', 'you are tough'. These are in response to questions such as 'What are your thoughts on composition vs inheritance in object oriented design', 'What are the arguments for Java as a better language than Javascript? How about the other way around?', 'What's a progressive web app, what aspects of React and what react-related libs help with creating one?'. These were all directly relevant to the positions being interviewed, and some of those were for senior engineer positions.
These are questions that I think should be answerable by any senior frontend engineer, and most senior software engineers.
More charitably: Many people who 10-15 years ago would have excelled in a non-tech field have since then been drawn into tech due to shrinkage in those other fields and to the attractively high compensation in tech.
Having survived through many layoffs, this is 100% not the case. The modern layoffs don't even seem to consider whether or not the person was a top contributor or SME. They are done to reduce cost and look good to the board/investors/whoever.
I'm pretty sure I survived because my TCO is probably more on the humble side, but I'm far from cheap.
Even if we assume that what you think is true, it is still solely the companies' fault. Because if the prospective employees are so shit, why would you hire them in the first place?
I feel like assigning blame is neither here nor there. Leaders in these companies often look at what other companies are doing and follow suit. There's an argument that Elon laying off 80% of Twitter is another catalyst, so why wouldn't other companies follow?
Ultimately, if hiring managers offered these roles because work was needed but no longer is, would you blame them for adjusting their approach?
I'm not looking to assign blame or anything, I've just seen this pattern play out in the last couple of decades. I think it's a valid theory of the current "market correction."
Because time-to-market matters? Because even minimally productive folks were ROI-positive in ZIRP world, but a bad idea in 8% interest world? Because Wall Street rewards growth?
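To make the interest-rate point concrete with a toy example (numbers invented purely to show the mechanics): the same marginal hire can clear the bar when capital is nearly free and miss it at 8%.

    # Hypothetical marginal hire: pay 300k this year, expect 320k of extra value next year.
    cost_now = 300_000
    value_next_year = 320_000

    for rate in (0.0, 0.08):
        npv = value_next_year / (1 + rate) - cost_now
        print(f"cost of capital {rate:.0%}: NPV = {npv:,.0f}")

    # At ~0% the hire nets roughly +20,000; at 8% the same hire nets roughly
    # -3,700, i.e. the money would do better deployed elsewhere.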
The proliferation of LeetCode for interviews means that we have an army of employed software engineers who are great at solving toy puzzles, but when it comes to actual business and / or customer problems, they stumble and falter. Management only have themselves to blame for spreading this kind of culture of interviewing around.
And yet, the engineering interviews still typically revolve around Leetcode algorithms instead of the messy but hardly CS-intensive day to day work that engineering typically involves.
The more experience you rack up in your career, the more likely you are to walk away from jobs with laborious hiring processes too.
I will politely decline any opportunity that requires me to do multiple code and architecture tests and several other lengthy meetings, usually spread out over the course of months, because it’s not worth my time to jump through so many hoops. It signals to me that there is already a tendency to implement process for the sake of process.
What will really sell me on a company is a more in depth conversation with the leadership and the team where we learn more about what we value and how we like to operate. And that’s how I would interview anyone who was referred by someone I trust.
That's (very charitably) because those are weed-out questions for early-stage engineers.
Lots of companies feel they need weed-out questions because they're swamped with applicants. How does applicant approach a problem? Does applicant describe their thought process figuring it out? Can applicant figure it out?
Did that tell anyone how good the applicant is at the job? No. It just provided some gates for the interviewer to say "no hire."
It's hard to differentiate the applicants' skills so early in their careers - not enough work history to go on. I have no idea why these would be used for senior positions except that "senior" is this year's "junior."
I'm scared about the skill level of some of my coworkers. They are 30+ years old and I hear them debating whether or not Docker needs VT-d activated, whether .msi packages can be customized, whether Terraform is useful for deploying on a single ESXi host... All things that can be found out with a second of googling, yet they will talk about them for 5 to 15 minutes in meetings (formal or not) and the conclusion is always "we will have to test this".
There’s a decent chance they want to chit-chat a bit. I’ll occasionally strike up a conversation I know I could avoid by googling, just to have a conversation.
The industry hired far more people than the small percentage of the population that was previously drawn to tech.
The jobs needed to change to accommodate all the new, less-tech-interested people. They became more standardized and less thought-involving. More and more jobs are now about using frameworks and writing glue code instead of solving novel problems.
Even the new techier-than-average people are likely in less techy jobs. And jobs less oriented to making them grow into their best geek selves.
Because the jobs became oriented to hiring neophytes to do the bulk of the work, we also see title compression and a low cap to expected salaries. If someone seems too senior and starts to cost too much, this model allows replacing them with new tabula rasa (bootcamp) workers.
This exact complaint was common in 2001. Which hey maybe it’s true! Maybe the nerds from 2 generations ago could have built all this technology 10x over… except they didn’t.
I, for one, had the same complaint then. The dotcom boom pulled in lots of people not interested in tech, just in paychecks.
> Maybe the nerds from 2 generations ago could have built all this technology 10x over… except they didn’t
Fantasy math aside, the nerds don't really decide what tech to build. There was no mass of nerds 25 years ago arguing over NNTP about who could build the best ad platform for the newly-emerging World Wide Web. That's a suit thing.
There’s also just no process anymore. Product managers can’t even be bothered to fill out more than a title line in a ticket. Burndown charts? Ha who cares about points on the tickets.
I don't think the average tech worker is dumb. They may even be better than the average in the 2000s/2010s tech boom, maybe not, but I'm at least sure that those at Google, especially, could have gotten and stayed hired in earlier times.
What I suspect is that tech companies have many more workers than needed, because most software simply doesn’t need many people to build and maintain. I remember massive tech companies running on teams of <100 people, and even today, I hear of many critical departments with only around a dozen or less.
Google search, Github, AWS, Outlook, Facebook, Uber, IntelliJ. These are software behemoths, but at their core they're rather simple, and a single developer could make an MVP of any of them in a few weeks, and, with adequate testers and resources, something scalable and stable that could be used by professionals in a few months. And that's creating it from scratch (well, on top of existing libraries); maintaining everything maybe doesn't require much less, but it surely doesn't require more. I understand the big products have more, at least name recognition, and I understand that a big product needs more than programmers (asset creation, marketing, QA, tech support, accounting, etc.). But the software development side isn't very big, especially if you already have the software.
Around 2015, I read an article saying that computer science was the most popular major for Harvard undergraduates. I found it quite surprising. I would think that a Harvard student could basically do anything with their life that they want: medicine, law, business, politics, anything they set their minds to.
Why, I wondered, would they want to enter this strange industry with all our odd characters and personalities? To me it's like, this is a place where you're going to compete to fix or improve something in the Linux kernel in order for Linus to swear at you and publicly humiliate you in an immutable ledger called the LKML. (Welcome to tech!) Even now, I don't really understand how this could have become the most popular major at Harvard.
I think things will sort themselves out in time. A person might start a career with little inherent interest but stumble into something that excites them and excel at it. Or a person might work in technology for a while, decide it's not right for them, and move on to something else. We'll have to see where it goes.
And this is on a backdrop of a long term 'hollowing out of the middle' as technology automates rote work (AI being the latest incarnation), leaving only high skilled jobs (including in tech) on one end, and manual jobs (driving, picking, construction) that we've been unable to automate on the other. Thus both ends (high and low skill) are seeing 'refugees' from this economic process, with overskilled people doing menial work, and underskilled people trying to tackle high end work.
This doesn't really make sense: if it were true, a single company could pay many times what FAANG does to secure talent 10x+ better than a normal tech worker and have unfettered success. It also means a 10x dev years ago would be... 100x now?
It also implies some weird logic that if you were into tech before, you're better than someone who was born 10 years later and is into tech now. And the implication that if you're not devoted to your employer's industry, your work output is meaningfully worse, which maybe is true but maybe not.
Apologies, I'm not sure I understand what you're saying so would love to hear more of your thoughts.
> It also implies some weird logic that if you were into tech before, you’re better than someone who was born 10 years later and is into tech now.
This was not what I was trying to say. The amount of people who work in tech is far, far higher now than it was 20 years ago. It just brought more "average" people in who don't really push the boundaries of what is possible. Lots of apps that move strings in and out of databases, very little real innovation going on from smart people.
Your first point assumes that it is easy to identify top tier talent during the hiring process and that tech employment is not a market for lemons [1].
My go-to example of this: jQuery. Now for JS it's primarily React, but the nosedive, or rather nosebleed, started with jQuery. You no longer had to understand the technology or data structures because there were declarative APIs that made things easy. The products were crippled and slow, but a new wave of people could suddenly participate.
"I think a solid 50% of people in tech are still on the chopping block. You can do much more with tools + really smart people in the year 2024 than you could before."
Absolutely not. So much more red tape now requires a ton more headcount. Just think how much time we spend these days fixing dependency hell and patching security issues.
Maybe it's just beneficial to move web developers into a different category? So you have the web developers, who mostly chase new or cool frameworks/tools and prefer dealing with human problems over machine problems, and then the other developers.
I mostly agree with your point and I have a corollary around this part:
> Simultaneously, a huge portion of the world saw tech salaries and wanted in on it so they started taking every quick certification, bootcamp, degree, etc to get into tech.
In 2007-2017 the concept of a "startup company" was also heavily romanticized, to the point that it was almost toxic. I was in university in 2013 and pretty much everyone wanted to build a startup. More often than not it was non-tech people exploiting tech people into building MVPs. It was painful to see (and in a few cases, to experience).
Don't even get me started on "apps". Every moron on the block had some kind of "idea for an app" and was completely clueless about how the app markets worked (let alone how to actually build one).
Of course, with pretty much everyone being amateurish at best, not a single business plan was in sight.
> The quality of your average tech worker has completely nosedived in the last 10-15 years.
The average tech job requires far less skill, though. It's not like we're all working on rockets or the building blocks of the internet. A good 70% of tech jobs today are CRUD APIs and basic web/mobile apps.
I've heard the corporate travel platform I work on described as a "simple CRUD app", so I'm not sure I believe this is true about almost any software company.
Interesting idea. But supposing that the quality never nosedived, I suspect companies would still be doing these mass layoffs. Quality is always relative to the overall pool of job applicants, and companies would just layoff people with a higher bar in mind.
This is my bet as well. They're not firing because they have to. They're firing because the bean counters showed them how much more money they could have if they did.
Coast for a year or two, save millions, then rehire back when they need to innovate or update.
> The quality of your average tech worker has completely nosedived in the last 10-15 years.
I don't think your take holds water. It reads as a mix between self-aggrandizing and ladder-pulling, typically pinned on the so-called boomer mindset.
For the past 15 years you have seen universities producing graduates that are far better prepared for a cloud world in every single aspect of the business, and these graduates start off working on cloud-related projects. All the frameworks that dominate front-end and back-end components were created in the past decade, and they leverage the same cloud-based competencies that new graduates learn.
If anything, new graduates are head and shoulders above veterans, and the only thing they miss is two decades of work experience. That can be an asset or a liability.
I'd also add that the bulk of >15-year veterans graduated in the 80s and 90s, a time when the academic world was still trying to figure out what it was supposed to teach in terms of software engineering, and most universities were basically scrambling to cover relevant topics. The graduates who went through those programs, unlike today's graduates, were woefully unprepared for the reality of software engineering. Two decades ago you could land a job by passing yourself off as a "programmer," which meant you knew the syntax of a programming language such as Pascal or COBOL. That's the technical background of your average veteran with >15 years of experience.
Therefore, I think you are entirely wrong. If anything, the quality of your average tech worker has improved greatly in the last 15 years. Those who arrived in the field actually hit the ground running in today's tech world and often push past and replace much more senior team members. Those who enter the field from non-tech fields have the technical chops to replace both veterans and new graduates. In today's world you need more than a claim to know C or Java to actually land a job.
Disclaimer: I'm a veteran with >15years of experience who worked at and was involved in the hiring process of a FANG.
I think this is only half the answer (or at least part of it). This not only ends up attracting the make-a-quick-buck-with-a-certification types but also the political sociopaths who know how to optimize time on the ladder over quality. This has not only lowered the bar but also made these places very toxic to work in! Passive aggressiveness, anybody?
You are a company. The system we have demands growth. Even very stable and reliable profits are seen as failure. There must be growth.
The people who run a company cannot press a magic button to increase revenues. They can't just pull a successful new project out of nowhere. Anything like that is going to be a risk, and will probably fail. It will also take time.
The one thing they can always do is cut costs. Projects can be cancelled. Divisions can be sold off. The biggest cost at most companies is labor, and labor can be let go.
When someone controls a company, they own a lot of shares in that company. Their bosses are all shareholders who only care that the stock price goes up. Nothing the company is doing is generating huge new revenue streams. Time for layoffs.
And when some people do layoffs, everyone does them. They're all subject to the same market pressures in the same industry. One company doing them gives all the other companies in the sector permission to do likewise. If a company doesn't follow suit the market might even start to question why.
You may have seen some news that Microsoft passed Apple briefly in terms of most valuable company on Earth. You may have also noticed that Apple is much more restrained in its layoffs than the others. Not doing as many layoffs, not doing as well in the market. These things are not unrelated.
This doesn't seem to me to explain why the layoffs are happening now. AFAIK Google had never done a true layoff in the entire history of the company until last January. Now it already seems to be routine.
As someone who grew up in the 80s-90s, my historical sense of layoffs is that they were a response to economic hard times, whether industry-wide or company-specific. Companies laid off when profits fell or disappeared, which then drove a need to search for ways to cut costs.
But the major tech companies are all seeing continuously growing revenues and profits within the context of very healthy general economic numbers, yet they are all laying people off.
The simple answer would be AI. That's been the major change in the past year. I find it odd that everyone thinks Uber's end game is automated cars, but no one seems to be talking about how tech companies' end game is automated coders. LLMs are not even close, but what's to say that, internally, they don't have something that works? Or that they have enough data to show they need fewer workers since they can leverage AI now.
"You are a company. The system we have demands growth. Even very stable and reliable profits are seen as failure. There must be growth."
But take a software company like McNeel. They make a popular but extremely niche-market product called Rhino (https://www.rhino3d.com). They've been around for decades. They have extremely low turnover. Many developers have worked there since the beginning.
They are used around the globe by architects, product designers, industrial designers, etc.
What are they doing right? Why/how do so many others get it wrong? Are humble software companies like this unicorns? Why?
> TLM, Inc. dba Robert McNeel & Associates is a closely held employee-owned Washington corporation funded solely from retained earnings.
So they're a private, employee owned company with a strong niche product that doesn't depend on outside money. They took the slow road, something that is antithetical to VCs but perfectly acceptable for bootstrapped founders that want to live comfortably but don't care about striking it rich.
McNeel is not a public company. Public companies are driven towards prioritizing revenue and growth much more than private companies because growth is a shared goal of all shareholders. I suppose it could be said that companies with outside PE or VC funding would also prioritize growth, but that would be in the same vein of getting to an exit or IPO rather than just making the stock go up.
idk what their profit structure or founding story is, but the poison pill for every company is investment capital. The second you take VC money you are beholden to someone other than your coworkers and clients. If you can get off the ground and become sustainable on sales alone, you are golden. Stability is possible but the problem is that most VCs demand 10x return, not 1.25x return.
Rhino is an amazing product that I use every day. The developers are active and responsive on their forums, and I've had feature requests and bug fixes go from forum posts to shipping in a couple of weeks. You can also pick up a telephone and immediately talk to a knowledgeable tech support person. It's amazing and I wish more companies operated like McNeel.
Edit: They also do that thing that everyone says is impossible and sell perpetual licenses. Every 18 months or so they offer paid upgrades to the next version and I always buy it because the price is reasonable and they're jam packed with useful new features.
> Rhinoceros (typically abbreviated Rhino or Rhino3D) is a commercial
> 3D computer graphics and computer-aided design (CAD) application
> software that was developed by TLM, Inc, dba Robert McNeel &
> Associates, an American, privately held, and employee-owned company
> that was founded in 1978.
So not a public company subject to the fickle stock market's expectations of constant growth...
> You may have also noticed that Apple is much more restrained in its layoffs than the others. Not doing as many layoffs, not doing as well in the market. These things are not unrelated.
There isn’t anything in life that’s “stable”. Not one single thing. Even rocks on the ground erode.
Because of inflation, stable is actually shrinking. Even if you raise prices perfectly in lockstep with inflation, the trends and tastes of your customers still change, necessitating innovation just to maintain the exact same level.
In reality, you have to grow to ensure when the ground shifts from under you, there is still some buffer. Growth is insurance against an ever changing, unpredictable world.
The people who think things never change are people without imagination. They want safety and security, but that is an illusion.
> And when some people do layoffs, everyone does them. They're all subject to the same market pressures in the same industry. One company doing them gives all the other companies in the sector permission to do likewise. If a company doesn't follow suit the market might even start to question why.
I'd like to try and simplify this: it's simply the new "normal" and companies can get away with it (meaning the labor side does not punish it yet; no unionising, etc.).
In the EU layoffs like this are not as easy to pull off by companies.
One reason I think not given enough weight is that it is now acceptable to do layoffs. These huge companies accrue a ton of dead weight and dead-end projects that they would love to flush routinely, but 5 years ago their stock prices would halve if they suddenly laid off 10% of their work force because of bad optics. Now it has become acceptable for all the various reasons so any company that wants to clean house is going to clean house.
Yep, executives just copy each other all the time. They start thinking about layoffs when other companies in the industry are doing it.
My company is discussing having people show up three days a week in the office. The only justification mentioned was that other companies are doing it. I am quite doubtful they did any sort of analysis to ensure it's a good idea.
Musk's bet, that every executive watched like a hawk, was to fire a whole bunch of "spoiled expensive developers" who would flap their wings about the sky falling (their perspective).
If Musk simply wanted to cut costs and get Twitter to the core of its profitability because the company had pretty much peaked, then it was a valid business plan in the Gordon Gekko realm.
But Elon talked out of all parts of his mouth nonstop; he said he wanted vision and features but fired and hired for maintenance. It's weird that all the other companies took that as direction for their layoffs.
Executives were appalled at their lack of power in COVID too. I think the layoffs are an attempt to gain authority, I don't even think it is about the dollars and cents. Elites don't actually care about how rich they are after a certain point, what they care about is the gap between them and the "plebes", and the developer plebes were far too uppity in their view.
A lawyer commenting on the game industry (which is largely copying the Silicon Valley business model) has an excellent explanation of these cycles, and why it is that the industry is simultaneously reporting record profits and mass layoffs: https://www.youtube.com/watch?v=-653Z1val8s
tl;dw (in my own words): investors like it when you have extreme hiring during the good times, and they like it when you have extreme cutbacks during the bad times. Investors don’t like slow and steady growth.
Interestingly, Nintendo quite firmly rejects the Silicon Valley approach, sticking firmly with the slow & steady - and while the rest of the games industry was doing layoffs, Nintendo gave all their employees 10% raises.
I don't follow. Aren't employee salaries a deductible expense? The above makes me think more along the lines of no longer being able to deduct GPU costs incurred from deep learning research.
IIUC, whereas previously companies could deduct salaries in year Y from revenue in that same year Y, now section 174 allows deduction of only 1/5 of those salaries in each of the 5 years [Y,Y+1,Y+2,Y+3,Y+4].
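A toy worked example of why that hurts, using made-up numbers and the straight 1/5-per-year schedule described above (real Section 174 has additional wrinkles this doesn't model):

    # Hypothetical software company: 10M revenue, 8M of developer salaries
    # treated as R&D, 21% corporate tax rate.
    revenue, rnd_salaries, tax_rate = 10_000_000, 8_000_000, 0.21

    old_taxable = revenue - rnd_salaries        # full deduction in year Y: 2,000,000
    new_taxable = revenue - rnd_salaries / 5    # only 1/5 deductible in year Y: 8,400,000

    extra_tax_now = (new_taxable - old_taxable) * tax_rate
    print(f"extra tax due in year Y: {extra_tax_now:,.0f}")   # -> 1,344,000

    # The rest of the deduction still arrives, but only over Y+1..Y+4,
    # which is exactly the cash-flow squeeze people attribute to the change.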
What does the average company spend on R&D percentage-wise?
My old company was a contract engineering company, so they always tried to staff everyone possible on projects being paid for by customers. R&D was used to develop internal tech and build expertise in new technologies, so it was only maybe 10-25% of most engineers' timesheets.
Do other tech companies spend that much more on R&D?
I think it's a combination of things and will vary by company.
For the largest companies, it's likely to some extent to put downward pressure on salaries in the market.
For others it can be like the Twitter case, where they hired way too many devs for what was needed, and that simply makes a bigger communications mess, not more productivity.
For others still it's outside pressure to become financially stable or income positive with a reduction in investment capital over time.
The larger economy, the potential for a more widespread military conflict, and a number of other factors are taking their toll in different ways.
When belts need to tighten, late and expensive projects get cut.
There are still jobs out there, but with remote roles in particular, which is what I'm personally experiencing, pay scales are all over the map and there are hundreds of applicants for many of the jobs out there.
It sucks to be looking, and I imagine it sucks to be hiring as well. The latter because of the sheer volume of applicants to sift through.
I will say, I'd rather receive no contact at this point than the boilerplate rejection emails. If there's no substance or advice, it's just a net negative IMO. I used to hate getting dropped. But having to fill out a few dozen applications a day, competing against hundreds, and seeing the rejections for the ones where you're less than an ideal fit sucks.
I took 4 months off because I needed to get past the burnout. Now it's just hard getting back in.
IMO, what has happened is that big tech has realized they are monopolies or cartels, and the ethos of these companies as innovators has fundamentally died. Facebook's Metaverse, Google's multiple ventures in self-driving, other Alphabet moonshots, etc. have all failed miserably, particularly in terms of investment payoff.
Facebook and Google in the last 10 years went from usable worthwhile sites to ad-dominated monstrosities, and their revenue exploded. Likely the beancounters have taken total control, and recognized that worker salaries for all the worker bee tasks in the big tech companies are vastly overpriced.
This executive management decision is the underpinning of Musk's takeover and destruction of Twitter. It probably would have succeeded if he wasn't insane.
- excess hiring during the pandemic;
- interest rates raised;
- salaries too high;
- cut in remote work (to ease return to office later);
- some weird thing coming;
- AI;
(I could replace all those semi-colons with question marks.)
I don’t know about in general, but I can describe why I was laid off last year.
I was hired by a small consultancy on a software team supporting a dedicated software product for a small but powerful industry. Software was/is an emerging side business for that employer. This team completely lacked discipline and vision from a software execution perspective. The software was horribly organized, 80% of the logic was in SQL stored procedures, there was no test automation, and everything was copy/paste between environments. So, this was extremely high risk. The product side of the team, on the other hand, had extremely good discipline with a solid vision and extremely good documentation.
I was the only senior developer on the team with any advanced experience outside of SQL. It became super clear this employer was a mistake when they didn’t want me to fix anything and my junior peers back-stabbed me during 360s as salvation for their inability to communicate in writing. I just rode out the last several months until they eventually fired me when billable hours evaporated.
After a few months of looking for a new job I made a promise to myself to never EVER return to employment that feels immature. I would never take a job too reliant on frameworks and tools, such that the job/industry are compensating for talent with gimmicks. I abandoned my career writing JavaScript and eventually gained work in data science.
I can totally relate. This is painfully close to my experience (including the junior backstabbing to save their jobs and management not wanting to fix anything).
I can't say if this is common in the industry but it's certainly the most depressing situation to be in. I didn't see a way out of this that wasn't quitting. Changing the culture when people below and above you can't even realize they are drowning in their own mistakes seems like an impossible situation.
The quality of your average tech worker has completely nosedived in the last 10-15 years.
All these huge companies wanted more products, more marketshare, more money, etc. They needed more people to pull this off. They started lowering hiring standards across the board because there just weren't enough people in tech.
Simultaneously, a huge portion of the world saw tech salaries and wanted in on it so they started taking every quick certification, bootcamp, degree, etc to get into tech.
It turns out that compared to the dedicated nerds of the previous generation, most new people just don't care that much about tech and don't want to go deeper than the bare minimum required by their job.
So I think tech overhired by a LOT, then they realized all these new people are actually net negatives on the company, and we are slowly correcting.
I think a solid 50% of people in tech are still on the chopping block. You can do much more with tools + really smart people in the year 2024 than you could before.
I find this opinion hilarious: Almost 30 years ago the second most popular software product was Windows 95 (Doom was the #1), which couldn't run for a few hours without BSODing. Almost 20 years ago the average tech worker was building atrocities with Visual Basic, MS Access and PHP.
Meanwhile today it was announced that Google decided to lay off several compiler engineers working on LLVM, but sure, please tell me more about those low quality bootcamp kiddos that are ruining everything...
Yeah, Windows 95 BSOD'd a lot; however the people developing it were doing it for the very first time, largely with single-digits of megabytes of memory to work with and with a CPU that not only contained bugs[0] but was so slow that every single operation would take milliseconds of wall time[1].
It's not comparable, it's almost to the level that rendering a character on my screen right now consumes more CPU cycles than the entire operating system would have done in a day.
PHP was good, actually, but it's borne of its time- it's easy to use 20:20 vision of history to say it's a bad design when fundamentally:
A) it solved problems
B) it was working with the best of human knowledge in language design at the time
C) it remains one of the most well optimised web languages to this day, even a variant from the era would easily outperform any django webapp, I'd put money on this.
We stand on the shoulders of giants, good abstractions and lessons from these periods are what make our software so robust today. We made it a lot slower though.
[0]: https://en.wikipedia.org/wiki/Pentium_FDIV_bug
[1]: a 60Hz CPU executes an clock tick once every 16ms, Pentium 1 was the most popular CPU when Windows 95 released, and it had a clock speed of 60/66MHz; it was the first to be able to do two instructions per clock, meaning it had a best case scenario of 7.5ms assuming the 66MHz option was available.
Nowadays... unplug a modern computer from the internet and it's mostly useless.
Want to write a simple document, using a basic word preocessor? You're going to need at least 1GB ram, mate.
VisualBasic allowed writing GUI-enabled applications that would be essentially be standalone, shipped as executable files sized at (usually) less than a megabyte that would mostly "just work". Nowadays most people write GUI app in the form of frame-less web browsers displaying html and running Javascript, requiring more than one cpu core and 2-4 GB of ram to be barely usable.
Yeah I definitely can see how "The quality of your average tech worker has completely nosedived in the last 10-15 years".
As a system engineer... I have worked with some developers that constantly had memory issue ("I need more memory to run my containers, please increase the memory quota in the (kubernetes cluster) namespace") and reasoning about memory was just a non-existant skill. They just had no idea what was using memory in their software. Literally, their ass was mostly saved by the fact that process management nowadays is "good" at restarting software and managing retries on failures. Knowledge about the underlying operating system was also very very basic.
Such people, while surely filled with good intentions, were mostly useless without some random Medium article instructing how to apply some generic fixes, or without some stackoverflow question covering their exact use case.
But wait, it's way worse than this. There also are people that got promoted to managers, don't have a solid technological background (besides the little coding they've been doing before being promoted) and just can't understand the willingness to optimize and "geek out" on performance issues (and the value of such activities).
Oh and don't even get me started on nowadays a "senior developer" is somebody that is good at "delivering value to customers" instead of somebody that knows the ins and outs of the problem domain and has mastery over the technological stack.
Working in tech nowadays is a lot less about tech, it seems.
This is a bad-faith argument. If you disagree with what they’re clearly trying to convey, and want to make a case against it, I’d be keen to hear one without cherry-picking some compiler engineering layoffs in a company where the vast majority of others are obviously doing something very different to that.
I was around back then. We didn’t have a magical browser box that you could type a couple of key terms into to get a thousand articles, code samples, philosophical discussions from hundreds of smarter people than you who have already solved your problem a dozen different ways and got to chose, and improve one. You didn’t have hundreds of languages, libraries, and frameworks where you could selectively pick the right tool. Back then you had your problem and you experimented and invented until you solved it generally with the one or two tools that were available at the time.
And with the benefit of nearly 40 years in tech…I can attest to the OP’s opinion that the quality of the average tech worker has nosedived since then.
Try asking a typical Django or React developer today what a cache line even is.
It would be like if there were a housing boom and residential house framing ramped up base hourly pay to $100/hour with lunches provided and relaxation chambers onsite. You would see a huge influx of people going into it for the money rather than people who are really into construction and know load-bearing tolerances, code for hurricanes or earthquakes, how to properly frame stairs, etc.
But I agree that the layoffs probably aren't based on programmer quality. When big companies lay off, they generally select products, and if you are unlucky enough to be working on that product...off you go. Merit-based layoffs are more stealthy and continuous, those are definitely being ramped up now, but probably not in the numbers companies need to cut due to overhiring or hiring in the wrong areas.
BSODs were mostly avoided by using MS-authored drivers.
The average day-to-day work is menial, for example investigating a production issue. That isn't something people get into as a hobby; that's work in a very unromantic sense. Maybe having a good systems background can help in those situations, but it's a little more subtle.
I do think, though, that it works the other way: people who actually care about computers don't care if tech is no longer a money field; they just want to work on computer stuff. And maybe earlier it wasn't as much of a money field. But that doesn't imply that the enthusiasts are better at fulfilling work responsibilities. A lot of work stuff is business-specific and not very dependent on a systems background.
So yeah, I don't think the part about over-hiring is necessarily wrong, but this sort of "I'm better than you" attitude isn't very becoming. Maybe you should give this [0] a read, just to see a different perspective.
[0] https://en.wikipedia.org/wiki/Law_of_Jante
Also, one can make the argument the other way as well. Tools and infra have gotten good enough that the work doesn’t require the “dedicated nerds”. So let’s get rid of the highest paid engineers as that’s the most impactful to our bottom line. They coincidentally happen to be the nerds. We can always hire them back for less because what else are these nerds going to do in this market.
Not saying that’s the case but demonstrating that it’s pretty easy to string together silly notions we might have to explain reality.
What the GP said was how the bloat accumulated and why professional quality went down. What you are saying is how they are trimming the bloat. It's not possible to trim by skill alone, because it would leave all teams and products understaffed. But the bloat is there, and it is insane.
Has it though? In my career I have seen people die on that hill, but only because their entire careers were utterly reliant upon those tools. If you cannot execute without a given set of tools you are in no position of objectivity when it comes to those tools, which results in a lot of catastrophically bad business decisions.
> It turns out that compared to the dedicated nerds of the previous generation, most new people just don't care that much about tech
Now, having watched almost three generations come to adulthood, I hard disagree with any notion that the newer generation is less dedicated or less smart. I have a lot more respect for the newer generation: for what they have been served with and how they are handling it. The changes in the last 10 or 20 years have far exceeded any changes I experienced in the first 40 years of my life. And, imo, the newer generation has been adapting much better to those changes than previous generations, including my own. That adaptation might not look like "dedicated nerd" behaviour to the previous generation, but with change happening this fast, the ability to adapt quickly trumps being a "dedicated nerd."
Come on, LinkedIn (CVs) tell you nothing. Having been somewhere is often dressed up as great experience even if it didn't yield much, or the person wasn't as involved as claimed, and sometimes even having been there at all is a lie. If I check colleagues I actually know, it is usually the opposite: null performers and phonies have outstanding profiles, while the good or even brilliant programmers sometimes have an awkward or does-not-care profile.
(But yeah, I'll also admit: companies usually fire by the wrong metrics, and even direct managers, when they are bad managers, don't know the real differences between their reports.)
> notion of newer generation not being dedicated and smarter
Fully agree. I don't think it is much about new generation vs old; there are the same kinds of great people in both.
But computer science really has changed and grown. E.g. I clearly remember, when I joined university 20 or so years ago, all the tutors claiming how big our year was, and that before us everybody knew each other (profs/tutors/students), but that this was now impossible. Still, 80-90% of the students who survived the first semesters just belonged there and would become great coders or computer scientists, with the right skills and ambitions. Look at it nowadays and it is different, maybe more similar to other professions: closer to 50/50 between those people and another half who are ambitionless and/or simply lack the skills, the mindset, whatever, and would have been better off becoming something else. If you ask them, some even freely admit they hate their job and are only in it for the money.
I actually don't hold that opinion because I think a lot of these people are actually pretty smart!
They just aren't oriented to be tech experts. They are really talented at some other thing in life, but ended up in tech chasing the salaries.
― George Carlin
I guess most people really benefit from a tool to autogenerate an Eratosthenes sieve in Python or a fizzbuzz function in Prolog or yet another React Router boilerplate.
EDIT: a more nuanced version of this comment at https://news.ycombinator.com/item?id=38965283
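For what it's worth, the kind of snippet being mocked above really is only a handful of lines; a minimal Sieve of Eratosthenes in Python, just to make the point concrete:

    def sieve(limit: int) -> list[int]:
        """Return all primes <= limit using the Sieve of Eratosthenes."""
        if limit < 2:
            return []
        is_prime = [True] * (limit + 1)
        is_prime[0] = is_prime[1] = False
        for p in range(2, int(limit ** 0.5) + 1):
            if is_prime[p]:
                # Mark every multiple of p, starting at p*p, as composite.
                for multiple in range(p * p, limit + 1, p):
                    is_prime[multiple] = False
        return [n for n, prime in enumerate(is_prime) if prime]

    print(sieve(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]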
1. Memorizing some 50-200 or so domain-specific general patterns that problem solutions tend to follow, so you can solve an arbitrary leetcode medium in under 20 minutes (a skill that has almost nothing to do with the plumbing and product work most tech workers engage in on a day-to-day basis), and
2. Speaking at length and extemporaneously on arbitrary from-scratch system design problems (a skill which, while sometimes important, most engineers who aren't working at a startup won't exercise more than once or twice a year, since the ratio of work on “systems already designed” to “systems needing new designs” overwhelmingly skews toward the former).
When your hiring process and day job are so thoroughly dissociated and unrelated to each other, it’s not a surprise current workers who aren’t actively studying for a job change couldn’t pass it.
I don't disagree that companies greatly over-hired, but I don't think the largest portion there has been bootcamp/cert/junior SWEs.
I say that with sadness and fond memories of ~2013 startup culture.
Not my experience. I've conducted 300+ interviews at FAANG in the last 15 years and haven't seen a meaningful drop in the hiring bar.
Based on anecdotes, I have another theory. The people that build the systems got old and are unwilling to work the crazy hours. The new folks are hired to work on maintaining these systems, and it's not fun. So overall both quality and speed declined.
> I've conducted 300+ interviews at FAANG in the last 15 years and haven't seen a meaningful drop in the hiring bar.
Are FAANG interviews these days an indicator of anything besides being good at FAANG interviews? Some of the best people I've worked with, people who brought real value, would never pass a big tech leetcode interview...
Not just pure tech systems, but businesses and non-tech systems as well. And tech companies have gotten large with a ton of baggage and customers
Initially this sounded right, but then I remembered all the trash that showed up during the dot-com boom and subsequent booms. Hiring booms attract a lot of people who want to make easy money, and those people think programming is easy money (it's not). Bootcamps and even CS degrees are just step one of many, and many people think step one is the only step and don't work to further their craft. Once it becomes evident that an org has way too many of these people, it starts to cull. Unfortunately, large orgs are very bad at "rightsizing," and just "mow the lawn."
The problem with tech is to be good at it, you have to be passionate about it to the point of near obsession. That's a requirement above and beyond just being smart or good at math, or even good at programming. You can fill a room with brilliant but dispassionate people and you won't get much done.
So you're right about the 15-year cycle, but this cycle has happened before and we're now in the culling part.
It's definitely got worse over the last ~10 years, but the pandemic-era hiring of 2020-2021 took it to a new level of bad at many companies. Anyone who had done some basic coding was getting hired at a great salary in 2021, and these individuals are still around today – some now calling themselves "senior developers".
But I completely agree with what you're saying here about it no longer being nerds working in tech. A nerd who isn't the best at coding will still engage with me about technical details, be keen to learn new things, and care about their work. A lot of the new guys I work with today are just doing the bare minimum, and they think you're an idiot for caring.
...and with a salary boost to match.
Now in the UK salaries have flatlined and the number of jobs is way down, and this no longer seems to be a problem.
Over the last 4 years I've gotten comments from co-interviewers like 'you ask the hard questions' and 'you are tough'. These are in response to questions such as 'What are your thoughts on composition vs inheritance in object-oriented design?', 'What are the arguments for Java as a better language than JavaScript? How about the other way around?', and 'What's a progressive web app, and what aspects of React and which React-related libs help with creating one?'. These were all directly relevant to the positions being interviewed for, and some of those were senior engineer positions.
These are questions that I think should be answerable by any senior frontend engineer, and most senior software engineers.
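For reference, the first of those questions has a very compact answer in code. A minimal sketch, in Python for brevity (class names are made up for illustration), of the same capability expressed via inheritance and via composition:

    # Inheritance: logging behaviour is baked into the type hierarchy.
    class TimestampedLogger:  # hypothetical name, purely illustrative
        def log(self, msg: str) -> None:
            print(f"[ts] {msg}")

    class PaymentServiceInherited(TimestampedLogger):
        def charge(self, amount: float) -> None:
            self.log(f"charging {amount}")

    # Composition: the logger is a collaborator handed in, so it can be
    # swapped (e.g. for a no-op or a mock in tests) without touching the
    # class hierarchy.
    class PaymentService:
        def __init__(self, logger: TimestampedLogger) -> None:
            self._logger = logger

        def charge(self, amount: float) -> None:
            self._logger.log(f"charging {amount}")

    PaymentService(TimestampedLogger()).charge(9.99)
    PaymentServiceInherited().charge(9.99)

A senior candidate doesn't need to produce this verbatim, but they should be able to talk through the trade-off it illustrates.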
I'm pretty sure I survive because my TCO is probably more on the humble side, but I'm far from cheap.
My previous two companies didn't do a great job with onboarding and I think it can be so varied and not very helpful many times.
Ultimately, if hiring managers offered these roles because work was needed but no longer is, would you blame them for adjusting their approach?
I will politely decline any opportunity that requires me to do multiple code and architecture tests and several other lengthy meetings, usually spread out over the course of months, because it’s not worth my time to jump through so many hoops. It signals to me that there is already a tendency to implement process for the sake of process.
What will really sell me on a company is a more in depth conversation with the leadership and the team where we learn more about what we value and how we like to operate. And that’s how I would interview anyone who was referred by someone I trust.
Lots of companies feel they need weed-out questions because they're swamped with applicants. How does the applicant approach a problem? Do they describe their thought process while figuring it out? Can they figure it out at all?
Did that tell anyone how good the applicant is at the job? No. It just provided some gates for the interviewer to say "no hire."
It's hard to differentiate the applicants' skills so early in their careers - not enough work history to go on. I have no idea why these would be used for senior positions except that "senior" is this year's "junior."
The industry hired far more people than the small percentage of the population that was drawn to tech before.
The jobs needed to change to accommodate all the new, less-tech-interested people. They became more standardized and less thought-involving. More and more jobs are now about using frameworks and writing glue code instead of solving novel problems.
Even the new techier-than-average people are likely in less techy jobs. And jobs less oriented to making them grow into their best geek selves.
Because the jobs became oriented to hiring neophytes to do the bulk of the work, we also see title compression and a low cap to expected salaries. If someone seems too senior and starts to cost too much, this model allows replacing them with new tabula rasa (bootcamp) workers.
I, for one, had the same complaint then. The dotcom boom pulled in lots of people not interested in tech, just in paychecks.
> Maybe the nerds from 2 generations ago could have built all this technology 10x over… except they didn’t
Fantasy math aside, the nerds don't really decide what tech to build. There was no mass of nerds 25 years ago arguing over NNTP about who could build the best ad platform for the newly-emerging World Wide Web. That's a suit thing.
What I suspect is that tech companies have many more workers than needed, because most software simply doesn’t need many people to build and maintain. I remember massive tech companies running on teams of <100 people, and even today, I hear of many critical departments with only around a dozen or less.
Google search, GitHub, AWS, Outlook, Facebook, Uber, IntelliJ. These are software behemoths, but at their core they're rather simple, and a single developer could make an MVP of any of them in a few weeks, and with adequate testers and resources, something scalable and stable that could be used by professionals in a few months. And that's creating it from scratch (well, with existing libraries); maintaining everything maybe doesn't require much less, but it surely doesn't require more. I understand the big products have more, at least name recognition, and I understand that a big product needs more than programmers (asset creation, marketing, QA, tech support, accounting, etc.). But the software development side isn't very big, especially if you already have the software.
Let me guess, they could write Dropbox in a weekend too.
Why, I wondered, would they want to enter this strange industry with all our odd characters and personalities? To me it's like, this is a place where you're going to compete to fix or improve something in the Linux kernel in order for Linus to swear at you and publicly humiliate you in an immutable ledger called the LKML. (Welcome to tech!) Even now, I don't really understand how this could have become the most popular major at Harvard.
I think things will sort themselves out in time. A person might start a career with little inherent interest but stumble into something that excites them and excel at it. Or a person might work in technology for a while, decide it's not right for them, and move on to something else. We'll have to see where it goes.
It also implies some weird logic that if you were into tech before, you're better than someone who was born 10 years later and is into tech now. And the implication that if you're not devoted to your employer's industry your work output is meaningfully worse, which maybe is true but maybe not.
> It also implies some weird logic that if you were into tech before, you’re better than someone who was born 10 years later and is into tech now.
This was not what I was trying to say. The amount of people who work in tech is far, far higher now than it was 20 years ago. It just brought more "average" people in who don't really push the boundaries of what is possible. Lots of apps that move strings in and out of databases, very little real innovation going on from smart people.
[1]: https://en.m.wikipedia.org/wiki/The_Market_for_Lemons
What technology or data structure do you think developers don't understand if they work on React projects?
Isn't it silly to claim any person is suddenly incompetent just because they are assigned to a project which uses a particular framework?
Absolutely not. So much more red tape now requires a ton more headcount. Just think how much time we spend these days fixing dependency hells and patching security issues.
Does it make sense?
> Simultaneously, a huge portion of the world saw tech salaries and wanted in on it so they started taking every quick certification, bootcamp, degree, etc to get into tech.
In 2007-2017 the concept of "startup company" was also heavily romanticized, to the point it was almost toxic. I was in university in 2013 and pretty much anyone wanted to build a startup. More often than not it was non-tech people exploiting tech people into building MVPs. It was painful to see (and in a few cases, to experience).
Don't even get me started on "apps". Every moron on the block had some kind of "idea for an app" and was completely clueless to how the app markets worked (let alone how to actually do that).
Of course, since pretty much all of them were amateurish at best, not a single business plan was in sight.
The average tech job requires much less skill, though. It's not like we're all working on rockets or the building blocks of the internet. A good 70% of tech jobs today are CRUD APIs and basic web/mobile apps.
Coast for a year or two, save millions, then rehire back when they need to innovate or update.
I don't think your take holds water. It reads as a mix between self-aggrandizing and ladder-pulling, typically pinned on the so-called boomer mindset.
For the past 15 years you have seen universities producing graduates that are far better prepared for a cloud world in every single aspect of the business, and these graduates start off working in cloud-related projects. All frameworks that dominate front-end and back-end components were created in the past decade, and leverage the same cloud-based competencies that new graduates learn.
If anything, new graduates are head and shoulders above veterans, and the only thing they're missing is two decades of work experience. That can be an asset or a liability.
I'd also add that the bulk of >15-year veterans graduated in the 90s and 80s, which was a time when the academic world was still trying to figure out what it was supposed to teach in terms of software engineering, and most universities were basically scrambling to cover relevant topics. The graduates that went through those programs, unlike today's graduates, were woefully unprepared for the reality of software engineering. Two decades ago you could land a job by passing yourself off as a "programmer", which meant you knew the syntax of a programming language such as Pascal or Cobol. That's the technical background of your average veteran with >15 years of experience.
Therefore, I think you are entirely wrong. If anything, the quality of your average tech worker improved greatly in the last 15 years. Those who arrive in the field today actually hit the ground running and often push past and replace far more senior team members. Those who enter from non-tech fields have the technical chops to replace both veterans and new graduates. In today's world you need more than a claim to know C or Java to actually land a job.
Disclaimer: I'm a veteran with >15years of experience who worked at and was involved in the hiring process of a FANG.
Never and nowhere
I’d be shocked if you didn’t enter the industry 20-25 years ago.
https://www.hpalumni.org/hp_way.htm
It was a dream job because other tech jobs were almost nothing like that.
Most tech jobs featured techies managed by MBA-types and that can be a recipe for passive-aggressiveness.
You are a company. The system we have demands growth. Even very stable and reliable profits are seen as failure. There must be growth.
The people who run a company can not press a magic button to increase revenues. They can't just pull a successful new project out of nowhere. Anything like that is going to be a risk, and will probably fail. It will also take time.
The one thing they can always do is cut costs. Projects can be cancelled. Divisions can be sold off. The biggest cost at most companies is labor, and labor can be let go.
When someone controls a company, they own a lot of shares in that company. Their bosses are all shareholders who only care that the stock price goes up. Nothing the company is doing is generating huge new revenue streams. Time for layoffs.
And when some people do layoffs, everyone does them. They're all subject to the same market pressures in the same industry. One company doing them gives all the other companies in the sector permission to do likewise. If a company doesn't follow suit the market might even start to question why.
You may have seen some news that Microsoft passed Apple briefly in terms of most valuable company on Earth. You may have also noticed that Apple is much more restrained in its layoffs than the others. Not doing as many layoffs, not doing as well in the market. These things are not unrelated.
As someone who grew up in the 80s-90s, my historical sense of layoffs is that they were a response to economic hard times, whether industry-wide or company-specific. Companies laid off when profits fell or disappeared, which then drove a need to search for ways to cut costs.
But the major tech companies are all seeing continuously growing revenues and profits within the context of very healthy general economic numbers, yet they are all laying people off.
But take a software company like McNeel. They make a popular but extremely niche market product called https://www.rhino3d.com. They've been around for decades. They have extremely low turnover. Many developers have worked there since the beginning.
They are used around the globe by architects, product designers, industrial designers, etc.
What are they doing right? Why/how do so many others get it wrong? Are humble software companies like this unicorns? Why?
> TLM, Inc. dba Robert McNeel & Associates is a closely held employee-owned Washington corporation funded solely from retained earnings.
So they're a private, employee owned company with a strong niche product that doesn't depend on outside money. They took the slow road, something that is antithetical to VCs but perfectly acceptable for bootstrapped founders that want to live comfortably but don't care about striking it rich.
[1] https://www.rhino3d.com/mcneel/about/
Which means it's organized more along syndicalist lines than purely for profit.
Not perfect, but as close as you can get to ideal organizationally.
Edit: They also do that thing that everyone says is impossible and sell perpetual licenses. Every 18 months or so they offer paid upgrades to the next version and I always buy it because the price is reasonable and they're jam packed with useful new features.
Apple stock is down because Apple's hardware revenue is down. Simple as that. https://sixcolors.com/post/2023/11/apples-fiscal-2023-in-cha...
Because of inflation, stable is actually shrinking. If you raise prices perfectly in lock step with inflation, trends and tastes of your customers still change, necessitating innovation if only to maintain the exact same level.
In reality, you have to grow to ensure when the ground shifts from under you, there is still some buffer. Growth is insurance against an ever changing, unpredictable world.
The people who think things never change are people without imagination. They want safety and security, but that is an illusion.
> And when some people do layoffs, everyone does them. They're all subject to the same market pressures in the same industry. One company doing them gives all the other companies in the sector permission to do likewise. If a company doesn't follow suit the market might even start to question why.
I'd like to try and simplify this: it's simply the new "normal" and companies can get away with it (meaning the labor side does not punish it yet; no unionising etc). In the EU, layoffs like this are not as easy for companies to pull off.
My company is discussing having people show up three days a week in the office. The only justification mentioned was that other companies are doing it. I am quite doubtful they did any sort of analysis to ensure it's a good idea.
Twitter basically set an example that made layoffs feel like a valid option in the heads of many people, I guess.
If Musk simply wanted to cut costs and get Twitter down to the core of its profitability because the company had pretty much peaked, then it was a valid business plan in the Gordon Gekko realm.
But Elon talked out of all parts of his mouth nonstop, said he wanted vision and features, but fired and hired for maintenance. It's weird that all the other companies took that as direction for their layoffs.
Executives were appalled at their lack of power in COVID too. I think the layoffs are an attempt to gain authority, I don't even think it is about the dollars and cents. Elites don't actually care about how rich they are after a certain point, what they care about is the gap between them and the "plebes", and the developer plebes were far too uppity in their view.
tl;dw (in my own words): investors like it when you have extreme hiring during the good times, and they like it when you have extreme cutbacks during the bad times. Investors don’t like slow and steady growth.
Interestingly, Nintendo quite firmly rejects the Silicon Valley approach, sticking firmly with the slow & steady - and while the rest of the games industry was doing layoffs, Nintendo gave all their employees 10% raises.
> An IRS tax code change in Section 174. This change eliminates the ability for businesses to deduct R&D as an expense.
> Hear of lots of layoffs directly because of this, as a start.
[1] https://twitter.com/GergelyOrosz/status/1735030983173230944?...
My old company was a contract engineering company, so they always tried to staff everyone possible on projects being paid for by customers. R&D was used to develop internal tech and build expertise on new technologies, so it was only maybe 10-25% of most engineers' timesheets.
Do other tech companies spend that much more on R&D?
So let's say a company needs 5 years to build some new hardware. Those 5 years of labor and expenses are all R&D.
I think that's why this is such a big deal. I could be wrong though.
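A rough back-of-the-envelope illustration, assuming the commonly cited 5-year straight-line amortization with a mid-year convention for domestic R&D and a 21% corporate rate (all figures hypothetical, not tax advice):

    # Hypothetical company: $10M of engineer salaries, all classified as R&D.
    rnd_spend = 10_000_000
    tax_rate = 0.21

    # Before the Section 174 change: the full amount was deductible in year one.
    old_deduction_year1 = rnd_spend

    # After the change: 5-year straight-line amortization with a mid-year
    # convention, so only half of the first year's slice is deductible up front.
    new_deduction_year1 = rnd_spend / 5 / 2  # 10% of the spend

    extra_taxable_income = old_deduction_year1 - new_deduction_year1
    extra_tax_year1 = extra_taxable_income * tax_rate

    print(f"Year-1 deduction before: ${old_deduction_year1:,.0f}")
    print(f"Year-1 deduction after:  ${new_deduction_year1:,.0f}")
    print(f"Extra year-1 tax bill:   ${extra_tax_year1:,.0f}")  # ~$1.89M

For a company whose costs are mostly engineering labor, that kind of swing in the year-one tax bill is exactly why this gets blamed for layoffs.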
For the largest companies, it's likely to some extent to put downward pressure on salaries in the market.
For others it can be like the Twitter case, where they hired way too many devs for what was needed, which simply makes a bigger communications mess, not more productivity.
For others still it's outside pressure to become financially stable or income positive with a reduction in investment capital over time.
The larger economy, the potential for a more widespread military conflict and a number of other factors taking their toll in different ways.
When belts need to tighten, late and expensive projects get cut.
There are still jobs out there, but with remote roles in particular (which is what I'm personally experiencing), pay scales are all over the map and there are hundreds of applicants for many of the postings.
It sucks to be looking, and I imagine it sucks to be hiring as well. The latter because of the sheer volume of applicants to sift through.
I will say, I'd rather receive no contact at this point than the boilerplate rejection emails. If there's no substance or advice, it's just a net negative IMO. Used to hate getting dropped. But having to fill out a few dozen applications a day competing against hundreds and seeing the ones where you're less than ideal sucks.
I took 4 months off because I needed to get past the burnout. Now it's just hard getting back in.
Facebook and Google in the last 10 years went from usable worthwhile sites to ad-dominated monstrosities, and their revenue exploded. Likely the beancounters have taken total control, and recognized that worker salaries for all the worker bee tasks in the big tech companies are vastly overpriced.
This executive management decision is the underpinning of Musk's takeover and destruction of Twitter. It probably would have succeeded if he wasn't insane.
(Edit: formatting)
I was hired by a small consultancy on a software team supporting a dedicated software product for a small but powerful industry. Software was/is an emerging side business for that employer. This team completely lacked discipline and vision from a software execution perspective. The software was horribly organized, 80% of the logic was in SQL stored procedures, there was no test automation, and everything was copy/paste between environments. So, this was extremely high risk. The product side of the team, on the other hand, had extremely good discipline with a solid vision and extremely good documentation.
I was the only senior developer on the team with any advanced experience outside of SQL. It became super clear this employer was a mistake when they didn’t want me to fix anything and my junior peers back-stabbed me during 360s as salvation for their inability to communicate in writing. I just rode out the last several months until they eventually fired me when billable hours evaporated.
After a few months of looking for a new job I made a promise to myself to never EVER return to employment that feels immature. I would never take a job too reliant on frameworks and tools, such that the job/industry are compensating for talent with gimmicks. I abandoned my career writing JavaScript and eventually gained work in data science.
I can't say if this is common in the industry but it's certainly the most depressing situation to be in. I didn't see a way out of this that wasn't quitting. Changing the culture when people below and above you can't even realize they are drowning in their own mistakes seems like an impossible situation.