It is going to be a rough ride as America re-calibrates to a world which no longer relies on it. We took enormous amounts of benefits for granted.
Interviewing is a skill, and unfortunately the best way to practice that skill is real interviews.
If a month goes by without a single interview, that is all the feedback you need to know it's time to try something different.
It's good that you have made applying a routine; I would just try to fine-tune your applications toward specific roles.
Also consider how AI is changing what employers are looking for. The job posting you're seeing likely exists because underneath it is something that AI can't do. E.g., perhaps that simply means knowing how best to leverage AI, or there's some communication/ownership element to the role that they want a human to be in charge of, etc.
If you look at things in this way you'll apply for fewer jobs. Some days you may not apply to any because none meet your criteria.
So the TL;DR here is to remember it's about focused quality, not playing the numbers game and aiming for quantity.
1. Knowing what you don't know
2. Knowing who is likely to know
3. Asking the question in a way that the other human, with their limited attention and context-window, is able to give a meaningful answer
You could be made unemployable even without AI, all it takes is a bit of bad luck.
This fear of AI taking over your job is manufactured.
Companies would change their tune on WFH real quick if that were the case.
A lot of us tried it and just said, "huh, that's interesting" and then went back to work. We hear AI advocates say that their workflow is amazing, but we watch videos of their workflow, and it doesn't look that great. We hear AI advocates say "the next release is about to change everything!", but this knowledge isn't actionable or even accurate.
There's just not much value in chasing the endless AI news cycle, constantly believing that I'll fall behind if I don't read the latest details of Gemini 3.1 and ChatGPT 6.Y (Game Of The Year Edition). The engineers I know who use AI don't seem to have any particular insights about it aside from an encyclopedic knowledge of product details, all of which are changing on a monthly basis anyway.
New products that use gen AI are — by default — uninteresting to me because I know that under the hood, they're just sending text and getting text back, and the thing they're sending to is the same thing that everyone is sending to. Sure, the wrapper is nice, but I'm not paying an overhead fee for that.
You touched on some of the reasons; it doesn't take much skill to call an API, the technology is in a period of rapid evolution, etc.
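The "doesn't take much skill to call an API" point can be made concrete: most gen-AI wrapper products reduce to assembling a JSON payload and POSTing it to a hosted model. Here's a minimal sketch; the endpoint, model name, and response field are hypothetical, not any real provider's API:

```python
import json

# Hypothetical endpoint -- illustrative only, not a real provider's API.
API_URL = "https://api.example.com/v1/generate"

def build_request(prompt: str, model: str = "example-model") -> dict:
    """Assemble the JSON payload a typical 'AI wrapper' sends upstream."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

if __name__ == "__main__":
    payload = build_request("Summarize this ticket.")
    # The entire 'product' is often little more than this round trip
    # (left commented out since the endpoint above is made up):
    #   resp = requests.post(API_URL, json=payload, timeout=30)
    #   text = resp.json()["output"]
    print(json.dumps(payload, indent=2))
```

Everything differentiating one wrapper from another lives in the prompt it builds and the UI around it; the model on the other end is the same one everyone else is calling.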
And now with almost every company trying to adopt "AI" there is no shortage of people who can put AI experience on their resume and make a genuine case for it.
Like asking people to accept that 1+1=3, or that the day after Monday is Thursday. Maybe that's the real function of these hoops -- selecting people who are good at doublethink.
If the goal is to keep your motivation up, you have to find some joy in the activity itself.
It is an issue simply because leetcode grinding makes me feel like all my 10+ years of commitment to my previous employer (often foolishly at the expense of my personal well-being) and all the things I have contributed and picked up on the way mean nothing / nada / zilch to my future prospective employers. The whole prep process makes me feel like I need to start from scratch and nothing that I did in the past matters at all. I find this extremely frustrating.
You have to accept this on a visceral level.
Alternatively, remember that the reason the company is making you jump through these hoops is that there are many other candidates who are equally qualified.
The irony is that I haven't seen AI have nearly as large an impact anywhere else. We truly have automated ourselves out of work; people are just catching up with that fact, and the people who just wanted to make money from software can now finally stop pretending that "passion" for "the craft" was ever really part of their motivating calculus.
So when things break or they have to make changes, and the AI gets lost down a rabbit hole, who is held accountable?