I realize everybody's going to jump in and rant about algorithms in interviews, but I wish you'd all add something constructive as well.
I just had to conduct a round of interviews in a non-SF large US city, and it was a hellish crapshoot. Resumes are meaningless, and often re-written by recruiters to match the job anyway. Everyone has the same canned answers to the stupid behavioral questions. And as for the code, we included what we thought was a trivial nested for-loop problem and virtually nobody could even get started on it.
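(To give a concrete flavor: the sketch below is a hypothetical stand-in of roughly the same shape and difficulty, not the exact question we used, and the function name is made up for illustration.)

```python
# Hypothetical example of the kind of "trivial nested for-loop" screen meant
# here -- not the actual question; count pairs of elements summing to a target.
def count_pairs_summing_to(nums, target):
    """Count index pairs (i, j) with i < j whose values sum to target."""
    count = 0
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                count += 1
    return count

print(count_pairs_summing_to([1, 2, 3, 4, 5], 6))  # (1,5) and (2,4) -> 2
```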
Is this kind of code problem too complicated, in your opinion? For all that I join in the complaints about irrelevant algorithmic questions, I have to admit they at least test something, even if it's just willingness to study for the interview.
Instead of reading everybody's complaints about interviewing, I'd love to hear how you think it should be done. Because I have to admit I'm pretty much lost right now.
I can tell you how I do it and would certainly recommend it as the way it should be done. For some context, I've been interviewing software engineers for about 25 years in companies ranging from established multi-nationals to tiny startups in very fast headcount-growth mode. I'm in Silicon Valley. I can say that I've never regretted a hire I said yes to, so the method works to my satisfaction.
It'd be nice to think I have some special skill here but I really don't. This is just how interviewing was done in the 90s. To some of the younger generations, I've been told it sounds crazy.
If you send me your resume I'll actually read it, carefully. If the person described in this resume fits the background experience the role needs, you get an interview.
During the interview we'll talk about all those projects you worked on that are relevant to this role. Which parts did you enjoy the most, and why? Which parts were boring, and why? Which parts were the most challenging, and why? What did you find too easy, and why? What would you have done differently? Could you have? If you were to do the same project all over again, how would you approach it? Other open-ended conversations along these lines.
I don't ask anyone to whiteboard code, that's not part of the job so it's not part of the interview. No puzzles, no trivia-pursuit style questions.
It works great. You can't BS your way through such a conversation with a senior technical peer if you didn't actually do the work described in the resume. You just can't.
It is, however, vital that the interviewer be an expert in the field.
> No puzzles, no trivia-pursuit style questions.
I swear HN technical interview threads are the poster child of talking past one another. First, a for-loop question is nowhere near a trivia-pursuit question. Second, different companies of different sizes/industries/goals have different requirements. Let's move this discussion forward by acknowledging that we can't all use the same process, because we're not all hiring for the same type of job. If the engineers you hire consider for-loops a "puzzle", that's totally fine, and OP using one doesn't invalidate your process for your multi-national or your startup.
> I can say that I've never regretted a hire I said yes to
The real question is: has anyone else regretted that hire? Which, unfortunately, can't be answered, as they may not tell you.
> To some of the younger generations, I've been told it sounds crazy.
Doesn't sound crazy at all. A single process guaranteed to work for everyone? Now that sounds crazy.
It’s also very subjective and hard to train for or to audit externally. When a company gets too big, you can’t properly vet hiring personally anymore, so you have to scale it.
You don’t want ‘bozo cliques’ to form, so you make a semi-objective process like ‘solve this algorithmic question’ as part of the interview loop. I think execs and the founders doing a final review before hiring all engineers comes from that fear.
Then other companies cargo cult interview processes from larger companies and the trend propagates.
If you want to ‘hack hiring’ as a smaller company, you should use hard to scale processes like the one described in the parent post.
These questions should definitely be part of the interview process, but not all of it. I've done a lot of these kinds of interviews and I've definitely seen candidates that speak impressively but fail basic technical tests.
If you don't actually verify the technical problem-solving ability of the candidate in some way you're forgoing signal that can massively increase the confidence you can have in your decision.
My immediate reaction is that the problem with that approach is that it’s simply too expensive for the hiring company, in the same way that take-home programming challenges are too expensive for the applicants.
A company trying to hire engineers can easily give a take-home programming challenge to a dozen engineer applicants and take very little time analyzing the submissions. That’s pretty unfair. But it also feels a bit untenable for a senior engineer at a hiring company to have very deep investigations of each applicant’s work history.
Another problem is objectivity. If your company’s engineering hiring process relies heavily on a senior engineer’s subjective impression of an applicant, you’re going to have big problems with your own engineers’ biases, whether subconscious or not. Expect to hear a lot of evaluations like “well, the applicant did seem to have good knowledge and experience, but I just wasn’t impressed for some reason.”
I agree with your approach and use it myself, but one thing is different now, and that's the proliferation of tiny skills. Back then you would have a few big skills: you would claim to know one or two main languages, one or two databases, and so on. Now people list hundreds - literally hundreds - of skills sometimes. And there's no way to tell on reading if they really know it, or just saw it once and maybe did a tutorial or "hello world". People now will add a skill to their CV if they've done it for an hour total in their entire lives! Or read a blog post about it. It is a massive time sink to pick through that.
This works for non-technical positions too. Generally it requires an experienced interviewer to listen to the candidate, think on their feet, be fully engaged, and basically do the opposite of the one-size-fits-all coding or behavioral interview.
It takes a lot more effort on the part of the interviewer and is "harder to scale", in that you can't just train people to ask canned questions. But since it's open ended, it's a lot better at finding out what the candidate is really good at, and it's very useful to have one of these in every hiring loop, usually by a senior team member or exec.
>It is, however, vital that the interviewer be an expert in the field.
This is key. There are a lot of hiring managers masquerading as experts who get frustrated when their cargo-culted hiring processes falter, and who lack the people skills to diagnose the situation. You'd also be amazed at the quality of resumes a high, advertised salary will bring.
These are the kinds of interviews I love, and I appreciate that they're still given at some places. I agree, it's nearly impossible to bullshit your way through this kind of a conversational interview but people still think they need to ask trivia questions.
Been doing it the same way for over 20 years as well, and I've only had one bad hire, and that had to do with drugs and attitude more than skill set. He was a good developer and was sober when I hired him, but he fell off the wagon and started missing work, no-showing, etc. I fully concur that if you can't spot a competent developer from a conversation, you probably should not be in the position of evaluating potential developers.
I don't disagree that there are a lot of fake-it-till-you-make-it developers who did a vo-tech class and are applying for jobs out of their league, but I just don't know how people don't spot them. It takes me about 5 phone calls to find a competent individual, and then I bring them in for an in-person interview. As a hiring manager, I don't find it cumbersome to weed through 5-10 people to find a good hire.
> I can say that I've never regretted a hire I said yes to, so the method works to my satisfaction.
You think this is a claim about how good your hiring practice is, but the only way I can think to read this is as a point about how little hiring you do or how little evaluation of hires you do. In the real world, perfection isn't possible, so claims of it are a sign of inexperience or naivete.
Thank you for your refreshing point of view. I'm a mid-level software developer; I was on the other side of the table as a junior web developer, encountered this same style of questioning, and embraced it. This is my go-to style when interviewing, because if you are fluent in the lingo and zeitgeist of development, there is no way they can bullshit their way through and lie through their teeth about their own experiences. It is very easy to tell. Whiteboarding doesn't give you that detection, since it's too broad a spectrum to serve as an indicator of experience.
> I can say that I've never regretted a hire I said yes to, so the method works to my satisfaction.
This may just mean that you say a lot of wrong "no"s. Getting very high precision or very high recall alone is really easy... what you should measure is your F-score.
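To make that concrete, a minimal sketch (hiring framed as a classifier: precision is the fraction of yeses that worked out, recall is the fraction of good candidates you said yes to):

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall; defined as 0 when both are 0."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# A process that says "no" to almost everyone can hit perfect precision
# (zero regretted hires) while recall, and therefore F1, stays terrible.
print(round(f1_score(1.0, 0.1), 3))  # 0.182
```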
I'm an engineer and my contract is very explicit which parts of my work I'm allowed to speak about. The answer to all your questions will be "I can't speak about it." That's precisely why Google interviewers always ask abstract puzzle questions and avoid like fire any possibility of being exposed to protected IP.
If you haven't hired any people that you (ultimately) regretted, I wonder if you're being too conservative in your hiring? Sure, you don't fail, but are you ever surprised that a marginal, borderline, outside-the-box candidate turned out to be a 100x hire?
>If the person described in this resume fits the background experience the role needs, you get an interview.
This is one way the current status quo might be better than the past: you don't get pigeonholed so much by your past experience into being a "fit" only for similar roles. Sometimes the hiring manager is really looking for a specialist, but in general, we don't care what industry you were in or what tools you were using, as long as you can prove you're smart. Some of the most impressive people we have working on Go microservices were enterprise C# developers before.
While it may be easy to hire devs who have been on the market for 10 years, you still have to keep in mind that new developers make up a very large proportion of the dev population.
This form of interviewing requires a certain degree of skill and lots of experience to keep the conversation objective and on track. I don't have that. I rely on a base technical question based on the candidate's past projects and then go on from there.
This is how most of my interviews have been, and how I wish (however unlikely) my future ones will be. Unfortunately, I rarely get to the interview stage of things, as my resume is filtered out.
Personally I can't see an issue with very simple FizzBuzz-style programming interview questions. I used to ask a simple "count duplicate substrings" question [1]. Maybe some people consider this too hard? I never required exact syntax and would have been happy with pseudo-code. Using libraries is fine, etc.
I also found very few people could solve this (similar non-SF large city location). Occasionally, people who could not solve this were hired for other teams. Based on their performance, I don't think I would have been comfortable working with them.
I don't think it's unreasonable. But I'm not sure I'd use it as a screen if I was hiring now. I think I'd just have a chat and try and discuss a previous project. After that I'd move to a paid take home project (ideally representing real, useful work).
[1] Take a string, for example "ABCCABC" and count the number of times each 3 character substring occurs. In this case the answer would be 2xABC 1xBCC 1xCCA 1xCAB.
I never had any technical interview prep when I started. I could code just fine. I demonstrated that at my other jobs and in school well enough. Then I started interviewing for SF positions that did these interviews.
I froze up. I couldn't talk and think at the same time. I didn't have the skillset for doing this in a very intense scenario. In my case, I was homeless and needed a job ASAP. Every interview felt like life or death to me.
First one I was asked to reverse a string in C. I hadn't done C in a few months. I froze up on syntax. I looked like an idiot who couldn't do it.
I could imagine many people who have never experienced this format (or haven't experienced it much) would easily freak out and look stupid as bricks like I did.
I've since done over 200 technical interviews (as the interviewee) and I usually sweep. I still fail at FAANG, but I always get the solutions (even the leetcode hard ones). Just not sure why I fail, but c'est la vie.
Of course some people will consider that question to be too hard. But you may not want to hire those people. The point of an interview process is for you to pick the people you want to hire, not to ask questions that everyone agrees aren’t too hard.
Your problem is actually a good illustration of the problems with whiteboard tests.
I just tried it, and it was trivial to do in a minute or two on my laptop. However, I did take note of two syntax mistakes I made in the Python REPL that were immediately obvious there and took seconds to fix, but which I most likely would not have noticed on a whiteboard.
So there's quite a bunch of problems where if the acceptable "format" of the answer is "hey, I'll just pull out my laptop from the backpack and push the solution to github in 15 minutes" then it'd be okay, but it'd be hard to do it while 'whiteboarding' without access to immediate feedback and easily accessible API documentation.
For example, I work in many languages, and for many APIs I can't remember whether, in a particular language, the same thing is called add or append or something else. I've worked on Java code for a dozen years and would be a quite productive Java developer, but since I haven't written any new Java code recently, I can't remember off the top of my head the right boilerplate to open a text file for reading. It's something a Java tutorial might cover in the first pages right after Hello World, but I'd still have to look up the incantation to pass the encoding properly: there are something like three Reader classes to instantiate, and I don't recall their names.
If I were with you I'd not necessarily write anything.
I'd put out my fingers 3 characters wide at the first substring and step through the rest of it in 3-char 'spans' with my fingers. I'd say "extract each substring and put it into a map with that substring as the key, and either 1 as the value if it wasn't already present, or, if it was, increment the value".
I'd probably not bother mentioning how to extract the results unless asked.
If someone said that to me it would show the solution, and I'd be 100% happy.
I'd assume that they could then render that into code - perhaps that would be a mistaken assumption though but in my experience solving the core problem is the thing I'm interested in, not the syntax. But those with more experience may say the former doesn't always imply the latter, and the code needs to be shown.
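For what it's worth, the map approach I described renders into Python in a few lines (using `collections.Counter` as the map, with overlapping windows to match the expected answer upthread):

```python
from collections import Counter

def count_substrings(s, k=3):
    """Count occurrences of each length-k substring, sliding one character at a time."""
    counts = Counter()
    for i in range(len(s) - k + 1):
        counts[s[i:i + k]] += 1
    return counts

print(dict(count_substrings("ABCCABC")))  # {'ABC': 2, 'BCC': 1, 'CCA': 1, 'CAB': 1}
```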
That beats fizzbuzz for me - I think I shall steal it. It's simple enough that it can be done in five minutes but it has enough edges that people could go down the wrong rabbit hole
If so, then the company you're interviewing for must not be attractive to first-class CS grads (top 10-20% I think) from any non-online university.
As a comparison, any FAANG + Palantir/Jane Street/Two Sigma in the UK have tougher questions as their FIRST phone interview for INTERNSHIPS. (Palantir requires you to go through 6-8? interviews before getting an offer)
If this is a job that requires actual software engineers, I think rejecting everyone that failed this question would be perfectly reasonable.
It's not an unreasonable problem, but one way to smooth it out could be to provide a list of "potentially helpful" string functions relevant to the variety of ways one might solve it. Not all string libraries are equal among programming languages, and recognition is easier than recall. I guess it really depends what you want to measure with the problem, though.

I use a problem that at the end of the day requires candidates to output an edge (start point, end point) that is used to feed a line drawing routine. I provide the standard junior-high formulas for euclidean distance, mid-point, slope, inverse slope, and two ways to represent a line, since I'm not testing for remembering those, but for whether a subset of them can be used to solve the problem. I've still run into candidates who seemingly had never modeled a line mathematically before, and found myself hastily trying to explain how one would do so, as if I were giving a junior-high student their first pre-algebra lecture. Needless to say, they weren't able to then program a solution, or anything at all really... They would have been filtered out by your problem in less time.

I think your problem could also be made easier, to filter out the same people faster, but from my standpoint I'm usually mandated by policy to spend an hour with the person, so I try to make my problem answer a bit more than just the basic "can you code at all?"
Do you ever get candidates who ask "can I assume the strings are ASCII?" The usual solutions to this will break in amusing ways if you allow arbitrary unicode.

I know some interviewers who actually have "candidate asked about input encoding" as a hidden scoring criterion, where a lack of asking about that is a fail even if they correctly solve the problem for ASCII. I myself disagree with using such hidden criteria: if I'm going to score something and not tell the candidate exactly what I'm scoring, it's at least going to be something in the code, like "correctly avoided the divide-by-zero case without me pointing it out", and not an expectation that the candidate read my mind about things I haven't told them. (I do tell them to try to write code without errors (I help fix up basic syntax quirks, and if I spot a typo I'll point it out), or I ask how confident they are in their code and whether they have any edge cases in mind to try. I'd love to get a candidate who actually writes a unit test on their own; I always point out that junit is set up...)
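To illustrate the encoding point in Python (where strings index by Unicode code point rather than by what a reader perceives as a character):

```python
s = "cafe\u0301s!"   # "cafés!" with the accent as a combining code point
print(len(s))        # 7 code points, though a reader sees 6 characters

# Sliding 3-code-point windows split the grapheme; the last window even
# begins with an orphaned combining accent.
windows = [s[i:i + 3] for i in range(len(s) - 2)]
print(windows[-1] == "\u0301s!")  # True
```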
That seems like a nice, easy puzzle someone should be able to do on a whiteboard, but I think it would work better on a computer: my first iteration of this included a bug, as well as some print statements intended to help detect and diagnose bugs. I could carefully think through the problem to ensure I get all the edge cases and bugs, but I do better playing with it interactively. (In this case, my first iteration included 'BC' and 'C' as substrings, each occurring once.)
You see my process better if you watch me write and correct this program on a computer. If my task is to think up a fully correct solution without iteratively trying incorrect solutions, you're missing an important part of my process, even with a trivial FizzBuzz. (Though with this one you could probably just act as the interpreter and point out the errors for me.)
>Take a string, for example "ABCCABC" and count the number of times each 3 character substring occurs. In this case the answer would be 2xABC 1xBCC 1xCCA 1xCAB.
>>> how you think it should be done. Because I have to admit I'm pretty much lost right now.
- Treat recruiting in the same way as you do software development.
- Formulate a set of requirements.
- Define interview questions that give insight into whether or not the candidate meets those requirements. This is the equivalent of "tests" in the software process.
- Specific skills with your technology stack are good, but not necessarily essential.
- Ability to discuss sophisticated software concepts, and to explain software that they have built, and how they would build out ideas given to them is good.
- Evidence that this person gets stuff done is good (ref Joel Spolsky).
Coding tests are, for the most part, garbage. Not because the test is of no value, but because you the employer probably don't evaluate the result properly.
While I can't say anything from the prospect of an interviewer, as an interviewee I am a big advocate for this style. Give someone a set of requirements, and then talk through how they would solve it. Allow for further clarification, and just talk about tech. It's a little more unstructured and less formal, but surely you'll very quickly pick up what experience they have, whether or not they've made past solutions that'll fit, what new technology they'd like to use, what technology they'd end up actually using, and there's no set right or wrong answer. The interviewer might even end up learning about something they themselves weren't aware of.
If someone could blag that while not being able to even write FizzBuzz, all I can say is well played to them.
1) Resume screen
2) 30-minute coderpad/codeshare exercise over the phone, on a problem/pattern you actually use/encounter in the course of your work (no inverting binary trees). Expect a 20% pass rate here.
3) Reasonable take-home problem that you've timed 2 of your own staff completing well in 50 minutes. This is where you will get the most complaints from applicants, but that's OK. Let them select themselves out of the process. Expect a 30-40% pass rate here.
4) In-person interview. At this point, you should be mostly committed to hiring the candidate. Do a couple of livecoding exercises, but be extremely lenient in how you interpret the results. The other week, we had a candidate fail the problem, but they kept their composure and showed they knew what they were doing on the way to bombing it. The candidate seemed sad at the end of the interview and happily surprised when we extended an offer.
IMHO this process works fairly well and does a good job of being economical with people's time.
I'm curious about the coderpad/codeshare approach right at the first touch point. I fully agree that screening actual tech skill early on is important. Do you not find that you commit a lot of engineer time to codeshare interviews with candidates who don't work out further down the line?
Also 95% of job descriptions list skills candidates will never use and screen candidates with problems they will never encounter.
At the final interview to join the SRE team at Google I was asked to implement the kNN algorithm. I barfed at implementing a kD-tree after regurgitating the brute force solution.
Has any SRE ever had to implement a kD-tree in < 20 minutes or Google would go down?
I asked the interviewer at the end. They had never implemented one on the job.
As long as companies insist on these inane rituals I think it’s fair game to optimize for it as an interviewee.
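For contrast, the brute-force version I could regurgitate fits in a few lines (a Python sketch assuming Euclidean distance; `math.dist` needs Python 3.8+):

```python
import math

def knn_brute_force(points, query, k):
    """Return the k points nearest to query. O(n log n) via a full sort; a
    k-d tree only pays off when you answer many queries on one point set."""
    return sorted(points, key=lambda p: math.dist(p, query))[:k]

pts = [(5, 5), (1, 1), (0, 0), (2, 2)]
print(knn_brute_force(pts, (0, 0), 2))  # [(0, 0), (1, 1)]
```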
I've been through a few different types of interviews and the random algorithm style seems off to me. I recognize that there are companies like FAANGs that want a deep bench and will have expert algorithm people on-call for that one time when it's needed. Most companies should be OK to just present real problems either solved or being worked on which they would expect applicants to be able to solve.
This is more true for startups, which need less "how do I reverse a binary string" and more "how do I properly design a database in third normal form", etc. If you're presenting an interview question, it should have actual job relevance, and your co-workers should be able to solve it in the same time as the candidate. If you're drilling people on non-job qualities (e.g. invert a binary tree for a web dev role...) then you should expect a large difference between the audience that can pass your bad interview tests and the audience that will perform well at the actual job.
Not trying to complain. I think job interviews should focus more on the 99% of what you do in your job on a Tuesday. Poor interviews seem to be mostly gotchas and algorithm tricks, used to disqualify people from roles that have little actual use for algos + data structures. The tests might seem too easy, but as a front-end engineer I would rather work with a coworker who understands the CSS box model, knows semantic markup for accessibility, and similar web things, than a person who is good at creating hash tables and doubly-linked lists in JS. Leetcode probably doesn't test vertical centering techniques with CSS, but if you're applying for a web dev position you'd better know them.
One company sent me (before the interview) a small technical assignment. After I had submitted the code, an interview was scheduled. The entire interview was an extended code review -- talking about trade-offs, about other potential solutions, etc.
I felt this was much better in that it was less stressful, yet allowed me to demonstrate both knowledge and design skills.
Another company did something similar but more thorough: They invite candidates for a full day of work where they try to solve a small problem. Then the code is reviewed and evaluated together. They also start the day with a 1-hour overview of their current architecture and you get to ask questions and talk about alternatives. I think this gives both sides a better chance of finding the right fit.
I've been thinking for a while that companies should create their hiring tests from bugs and/or feature requests that came from their actual software in the past. Then they can gauge the quality of the employee for their purposes by comparing the candidate's solution(s) to those the actual employees wrote.
Resumes and prior experience should be taken more seriously, backed by more rigorous verification of said things and real reference checking. In a way, that is what already happens for the more technical positions (someone already knows you can do the job, the interview is just a formality) and sucks for those who don’t have well known enough reputations.
> And as for the code, we included what we thought was a trivial nested for-loop problem and virtually nobody could even get started on it.
It's possible that you asked the question poorly, or the solution wasn't as obvious as you thought.
Designing interview questions is hard[1]. I'll test out new questions on my peers at least two or three times before putting them in front of a candidate. And many don't make the cut. If a good engineer who's relaxed can't solve it easily, then a stressed out candidate will have no hope.
[1] This is why I hate seeing candidates share specific questions online. As an interviewer you'll have to scrub a good question, and switch to something you're not as familiar with. This hurts good candidates.
At my current position for a remote job, one of the interview assignments was reviewing a pull request for a very basic example app using their tech stack, and then implementing my own suggestions during a video call (while sharing my screen), updating/running tests, discussing trade-offs etc.
I think this was a great way to not only verify coding ability, but also testing team work and communication skills.
Where are you posting your jobs? Are you paying a recruiter? How much? Maybe you could have interviewers rank candidates (hire; competent, but no hire; slightly incompetent; how did they get here?), then see where the majority of your "how did they get here?" candidates come from.
For the code portion of things I stick to a set of increasingly difficult "real world" problems. The first one should be easily answerable by any potential candidate, the last one should be too hard for most people but I'm really looking at how they problem solve and handle themselves. I've recommended people for hiring who eventually gave up on that last one (and who went on to be fantastic in their jobs).
> Everyone has the same canned answers to the stupid behavioral questions.
The art of behavioral questions isn’t “ask and answer” it’s the follow up questions. As you astutely point out, the questions are ‘stupid’. They might as well be “do you want a stick of gum?”
Next time you ask those questions do a couple of things. First keep in the forefront of your thoughts what information you’re trying to get out of it and keep the candidate on track answering your data point. Do that by relentlessly asking follow up questions. When you think you have everything ask more.
As an anecdote, I was being shadowed during an on-site recently. I asked some arbitrary 'dumb' behavioral question, went back and forth a bit, and wasn't getting much out of it. I noticed my shadow clearly moving on to the next question in their notes, and decided to keep pushing on the original question: why did you do this, what were you trying to solve, what motivated you? Turns out the candidate did all of this to generate new revenue, and ended up bringing in an extra $10m a year at the small company they currently worked for. Loads of great data; I would never have gotten there if I'd settled for the canned answer the candidate had.
Behavioral questions aren't comp-sci trivia questions; you can't just ask the behavioral equivalent of fizzbuzz/Fibonacci/floodfill and copy down the answer (and you should never be asking those questions either, but that's a separate rant).
Behavioral questions are stupid and to some degree that’s the point. When you ask your significant other or kids “how was your day?” — guess what, that’s a stupid question too. What matters is what follows from your line of interviewing.
If you want to get good at behavioral questions listen to Fresh Air and try to be like Terry Gross.
I like the for-loop code thing. IMO, a good hiring process for programmers must involve a small piece of code. Not difficult or algorithmic, but something to distinguish those who can't do anything at all.
Seems like your problem is basic competency? Move the "can you even code?" question to as early in the filter stage as possible (first 'phone' screening). If you have a lot of applicants, you'll have to do some earlier filtering in the name of time (like on degrees, years experience, "the lucky half"), but don't pretend it's fair or very accurate since both the false positive and false negative rates will be high.
I agree that some coding problem needs to be used to try to answer the question, though with the right interviewer they can answer it without seeing code. The problems you use for that don't have to be at octree-collision-detection challenge level; a trivial nested for-loop is fine -- fizzbuzz level is fine. Sometimes you can rely on github or a strong internal referral to skip this, but watch out; in any case it's worth giving your questions to people you're sure will do fine (you've timed at least yourself, right?), both for the benchmark data and because sometimes they don't do fine, which may mean your question is too much. e.g. Floyd-Warshall can be done simply with a few nested loops, but I would never give it as a problem, and I'd expect nearly everyone I've worked with to flunk it given only the standard hour (which really means 45 minutes).
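(For the record, those "few nested loops" for Floyd-Warshall, as a Python sketch over an adjacency matrix of edge weights:)

```python
INF = float("inf")

def floyd_warshall(dist):
    """All-pairs shortest paths, computed in place on an n x n distance matrix.
    Three nested loops: simple to state, easy to flunk under interview pressure."""
    n = len(dist)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

g = [[0, 3, INF],
     [INF, 0, 1],
     [2, INF, 0]]
print(floyd_warshall(g))  # [[0, 3, 4], [3, 0, 1], [2, 5, 0]]
```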
Some jobs only need basic competence, so you might want to extend an offer if you've been convinced of its presence. At my last job, which ended up being more technically challenging / interesting than my current job, I was hired after posting my resume to Craigslist which led to exchanging some emails and having lunch with the startup founder to talk about my past work and whether I would be useful for his most pressing work. At my current job, I've been part of on-sites where I've established "can you even code?" is "no". Those were costly failures of not having that answered earlier. But we also like to believe we need more than basic competence, so rejections can still occur because of a lack of "testing mindset" or certain "behavioral answers". Only once you fix your "can you even code?" filter is it even worth considering what else you might want to justify an interview pipeline with more stages than a 'phone' screen or lunch conversation.
I couldn't agree more with the idea that you should move a 'can you code at all' test to as early as possible in your hiring pipeline.
I used to wait until the first in-person interview to try simple fizzbuzz-style questions (with the candidate at a machine with a compiler/interpreter). In about a third of cases that meant we'd committed a significant chunk of time to engineers who apparently couldn't solve trivial problems.
Now it's one of the first things I check. Done right, it's a relatively small hurdle for capable people to overcome, but really helps as a filter for those who aren't suited to the role.
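For reference, a minimal sketch of the fizzbuzz-level screen being described (any equivalent phrasing works; returning the lines instead of printing is my own choice, since it makes the discussion about testability easier):

```python
# Canonical FizzBuzz: the bare-minimum "can you code at all?" screen.
# Multiples of 3 -> "Fizz", of 5 -> "Buzz", of both -> "FizzBuzz",
# everything else -> the number itself.
def fizzbuzz(n):
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out
```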
I recently created a service (https://candidatecode.com) to help companies manage issuing and reviewing their coding challenges; I think it's got real potential to help some people out.
The way most fields that require some level of knowledge or ability handle this is by having industry-standard assessments that everyone takes. When doctors interview they get almost entirely behavioral questions because the hiring hospital / office merely needs to check that the doctor is a) licensed and b) certified in whatever specialty they are being hired for. They certainly never get asked whatever the medical equivalent of FizzBuzz might be.
I'm getting up the gumption to look for a different, hopefully better job in the nearer future (in automotive control software, looking to move to AV), and I had a bit of a revelation when talking to a friend of mine who does interviewing. I cannot talk about the things in my current job that would make me a good hire for the things I want to move into. It's all hunting down an obscure bug buried in layers of technical debt and overly complicated standards, but giving any depth beyond that bare platitude requires going into things that my NDA covers. I even work in driver assist technologies, but I can't go into detail about that because I'm working on unreleased, unannounced features. That means that the only thing left is side projects or whiteboard interviews. To be fair, this is at least partly an industry problem due to long product development cycles and a culture of secrecy, but it makes a lot of the better solutions to interviewing unworkable and causes companies to drop back to whiteboard interviews. And as a candidate, my personal maximizing function is to hit the books and be ready for curly braces and logic puzzles. Sometimes there really isn't a better way.
If you risk ending up with a bullshitter, you need some way of identifying them. The best way is through references, personal projects, etc.
But if you need to recruit someone without any such credentials, then you may need a simple coding aptitude test. It could be a code review or a simple exercise, but whatever you do, don't do whiteboard coding and don't have people recite or implement memorized CS textbook algorithms. Anyone can do that and still not be able to code.
> Everyone has the same canned answers to the stupid behavioral questions.
If the person conducting the interview thinks the behavioral questions are stupid, then perhaps they are. In that case, don't ask "stupid" behavioral questions.
> Resumes are meaningless, and often re-written by recruiters to match the job anyway
Was the position entry level? Students coming right out of compsci often have little to no practical experience. They may have difficulty thinking about what to put in their resume. After one or two years of full-time experience that should no longer be an issue.
> For all I join in when complaining about irrelevant algorithmic questions, I have to admit that they at least test something, even if it's just willingness to study for the interview.
Asking those "stupid" behavioural questions and receiving the same canned answers also demonstrates a willingness to study for an interview.
The coding problem should test a candidate's problem-solving capabilities as practically required by the role being interviewed for. The chosen problem should reflect the types of problems that they will actually need to solve if hired. For example, you could select a small PR from one of the projects being actively developed by the company. The selected PR should involve only one or two classes (assuming a language with classes) and require improvement. You can look through the history of a PR and pull out a segment that was flagged for improvement by the reviewer(s), or have the team select it for you. Then ask the candidate to review and improve it.
It was very similar to the “given an array of stock prices, find the optimal buy and sell indices for the biggest profit" problem that somebody referenced above.
But we emphasized repeatedly that we weren't looking for the O(n) solution; the brute-force naive solution was 100% OK. And it definitely didn't look like people were freezing up trying to find the optimal solution, they were struggling with the basic nested loop.
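For context, this is roughly the brute-force nested loop being described (a sketch assuming the usual phrasing of the stock-prices problem; the function name is mine):

```python
# O(n^2) brute force: try every (buy, sell) index pair with buy < sell
# and keep the most profitable one. Returns ((buy, sell), profit),
# or (None, 0) if no profitable trade exists.
def best_trade(prices):
    best_profit = 0
    best_pair = None
    for buy in range(len(prices)):
        for sell in range(buy + 1, len(prices)):
            profit = prices[sell] - prices[buy]
            if profit > best_profit:
                best_profit = profit
                best_pair = (buy, sell)
    return best_pair, best_profit
```

Two loops, one subtraction, one comparison; this is the level of code the candidates reportedly couldn't start on.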
What works for me is to consider what work the engineer will be doing, and ask them to write code to prove they can do it.
For instance, a front-end engineer can be expected to write a to-do list or similar app in a framework (ideally the one your team uses, though not a hard no-hire if not) with minimal googling (which is fine as long as it's not excessive) in ~45 minutes.
Then you have to look at what level of experience they have. Less experience requires more mentoring generally, which may be fine depending on how much time your team budgets for that work.
Lastly, measure their body language and tone of voice to check for red flags pointing to difficult communication styles or people who treat others poorly.
I prefer take-home assignments. I know that they often get a bad rap and have the potential for abuse; for that reason I'd argue the industry needs to adopt some kind of safeguard.
I recently had one requiring me to develop a native mobile application, which I enjoyed. It was interesting, the code is useful down the line, and if I don't land the job, it beefs up my portfolio.
Initial screening by recruiters is tough as my background's missing a degree and industry experience.
Context: self-taught, started out with game dev, tried going solo - not a runaway success. Looking to move away from the field.
I use basically the same process, but use a very simple programming problem at the end - simple, but amenable to discussing optimization and edge cases. I let the candidate choose the language (or just use pseudo code) and don't care at all about the syntax.
I find that this is very useful especially when interviewing juniors who don't have many projects under their belt for the first part. It's also useful when a candidate has good verbalisation skills, but poor programming ones (which happens).
I have a similar experience doing interviews in NYC. We started by asking algo questions, but quickly found out that 95% of the candidates can't answer them. My boss, an ex-programmer, was surprised too, and she asked me to dumb them down, a lot. We ended up with a bunch of really trivial stuff like "write a function to reverse a string" or its slightly harder version - "reverse an integer".
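For reference, sketches of the two "dumbed down" screens mentioned (my own code, not the exact questions asked; the explicit loop in the string version is deliberate, since `s[::-1]` short-circuits the exercise):

```python
# "Reverse a string": basic loop and accumulation.
# (The idiomatic one-liner is s[::-1] or ''.join(reversed(s)).)
def reverse_string(s):
    out = []
    for ch in s:
        out.insert(0, ch)      # prepend each character
    return ''.join(out)

# "Reverse an integer": the slightly harder version, since it adds
# sign handling and digit arithmetic instead of string tricks.
def reverse_integer(n):
    sign = -1 if n < 0 else 1
    n = abs(n)
    result = 0
    while n > 0:
        result = result * 10 + n % 10   # shift in the lowest digit
        n //= 10
    return sign * result
```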
Could you give the exact wording of your "nested for-loop" question?
I've given phone screens to individuals who turned out to be a different person when they showed up onsite. Not "nice on phone, jerk in real life" but rather "Bob does the phone screen for Charlie (and passes), Charlie shows up at the onsite."
I loathe writing algorithms on a whiteboard, especially the 'catchy' type. But I've interviewed people who can't even write a for loop. The amount of brain drain in flyover country is insane.
> And as for the code, we included what we thought was a trivial nested for-loop problem and virtually nobody could even get started on it.
How did you run these? While it sounds like something that should be fine even on paper, the medium can affect comfort a lot. E.g. for the last interview I had with a substantial coding part, I think being able to do it on my personal machine made a big difference. (I was, obviously, told beforehand what kind of environment I'd need to have ready.)
I agree a code test is necessary. I’ve seen several panels neglect to do a code test, the candidate was hired, then within a month fired because it was clear they couldn’t do anything (other than, well, argue).
I’ve been in a panel where I was the only person who asked a code question, the candidate flunked, and then the VP of Engineering went over my complaints and hired the guy anyways. He had been a Professor of Software Engineering and had a graduate degree from Princeton. Within three weeks, the VP of Engineering had to fire him because he couldn’t make it through a simple code review.
BUT the extremely negative sentiment here towards the technical interview process is very well-deserved.
Assessment of code (and the selection of problems) is most often no less subjective than any non-technical assessment. Sometimes the interviewer doing the grading is flat out wrong. Several times I’ve been asked the famous “given an array of stock prices, find the optimal buy and sell indices for the biggest profit.” One interviewer was not aware of the linear time solution to this problem, and didn’t believe me when I wrote and tried to explain it to him.
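For reference, the linear-time solution in question is a single pass tracking the minimum price seen so far; a sketch (the function name is mine):

```python
# O(n) max-profit: at each price, the best sale ending here buys at
# the cheapest price seen so far. One pass, constant extra space.
def max_profit(prices):
    best = 0
    min_so_far = float('inf')
    for p in prices:
        min_so_far = min(min_so_far, p)   # cheapest buy up to this point
        best = max(best, p - min_so_far)  # profit if we sell now
    return best
```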
But sometimes the interviewer doesn't even want you to do well. One time I was interviewing with an injury that prevented me from typing efficiently. I had a doctor's note and the injury was quite conspicuous. Nevertheless, three start-ups made me solve problems by typing on a keyboard, which guaranteed an excessively long completion time. Those companies held those results against me. (And it's not like there is anybody to hold those panels accountable.)
And then there are those who just don’t care. I had a phone interview with Airbnb that was literally as bad as the stories on Glassdoor: the guy answered the phone in a noisy office (not a conference room), gave no introduction, then simply stated the problem and dumped me into Coderpad. I literally thought it was a prank, since I had met with people at Airbnb face-to-face prior to the call. But the recruiters confirmed the guy was a real employee.
The root problem here is there is no feedback loop back to interviewers. The candidates get “feedback,” but people asking code questions, especially new grads, typically get zero assessment on how well they are doing as interviewers. What’s worse is that recruiters and hiring managers both have incentives to deprive ICs of such feedback, since it would invariably make ICs more aware of opportunities outside the company.
Until the incentive structure of technical interviewing changes dramatically, we're stuck with Leetcode and hoping for the best. People like Gayle Laakmann are helpful (especially when Facebook gives candidates a live hour-long session with her for free), but these people are ultimately invested in their own income, not in the task of fundamentally fixing this broken process.
what is everyone smoking? you're still trying to optimize for a test and not a job.
don't give them a fizzbuzz. give them an example of a real problem your engineers need / are trying to solve. how do they respond? thoughts / intuitions / pseudo code. do they show knowledge of the problem space / domain?
or if you just want a kid who can code, and they pass the fizzbuzz but fail at the real job, what does your training/culture look like? who does that really reflect on?
it seems to me that interviewing is terribly cargo cult. the problem is real; the practices ostensibly supposed to be solutions are not.
I realize everybody's going to jump in and rant about algorithms in interviews, but I wish you'd all add something constructive as well.
I was like this a couple of years ago. I was a self-taught, "college is a scam", "practical experience" type of guy. Now, however, I do see immense value in being able to work through these algorithm questions, especially if you ever want to do something besides web / app development.
Algorithms are great but in the real world what matters is being able to recognise a class of problem then go to the literature e.g. Knuth to find the right one. No working programmers know every algorithm off the top of their heads.
The classic detect-a-loop-in-a-linked-list question. The original author took years to devise the algorithm. In an interview, either you've seen it before, in which case you rattle it off, or you effectively have to write a research paper in 5 minutes.
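For reference, Floyd's tortoise-and-hare answer to that question: trivial to rattle off once known, nearly impossible to invent on the spot (a minimal self-contained sketch, with my own `Node` type):

```python
# Minimal singly linked list node for the example.
class Node:
    def __init__(self, val):
        self.val = val
        self.next = None

# Floyd's cycle detection: a slow pointer advances one step, a fast
# pointer two. If there's a cycle, the fast pointer eventually laps
# the slow one and they meet; otherwise the fast pointer hits the end.
def has_cycle(head):
    slow = fast = head
    while fast and fast.next:
        slow = slow.next           # one step
        fast = fast.next.next      # two steps
        if slow is fast:           # met inside a cycle
            return True
    return False
```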
This is surprisingly easy, but you have to understand what the problem is first or the solution won't be apparent. If this sounds patronizing it comes about from my frustration on being on the other side of the table too often.
What you want is very good football (soccer) players. Unfortunately you (or upper management) may not know all of the rules to soccer. You may not know the training regime that goes into winning a good soccer game - and it's a big risk spending money training for the big game on the off-season only to lose during the Big Show. So what do you do to test potential candidates? You see if you can get along with them, if they're a team player, and then see how well they play foosball.
It's perfect! There are soccer players on the field, there's a goal, it takes skill and coordination. But oh no, it turns out in the population at large really good soccer players really sort of suck at foosball. After all, they'd rather spend their time and energy playing soccer. So now they spend all their time reading up on books about foosball and what the best foosball strategies are.
You see where I'm going with this? No one uses algorithms in their day jobs. OK a few of you, but come on man, I make full stack web applications. As do most programmers. So why are you testing theory that has literally nothing to do with the job? Somehow someone thought this was a proxy for smart people, but I mean, if the guy who wrote this "Tech Interview Handbook" was really smart wouldn't he have spent his time writing a cool program? I mean how lame is this?
If you want to hire competent engineers tell them what you're building and ask them how they feel is best for them to show they're competent. If you have a big data pipeline in Scala ask if they can construct a data pipeline example that is cool over a few days (take home) or do something similar in the office. Some people like one, some like the other. But just communicate with the people you want to hire! And if not everyone's interview process is the same then maybe that's ok.
I mean I just got a guy who sent me some automated code interview program that had a timer that counted down from an hour at the top! WHO THINKS TREATING THEIR POTENTIAL EMPLOYEES LIKE STAR WARS DRONES IS A GOOD THING? All you have to do is treat the people you want to work with like you would want to be treated and demand that they know their shit.
Given the current state of tech interviews, they have more or less become like standardized tests, such as SAT, ACT, GMAT, GRE... with guides, cheat sheets and perhaps neighborhood coaching institutes on the horizon with instructors who have cleared tech interviews in FAANGs.
Are we going to see tech recruitment become more and more like college admissions where a top score in the interview is just one of the criteria and no longer sufficient to get a job?
Perhaps next is asking people to write essays regarding career goals, why they want to work there, extracurriculars, etc.
Y'know, as someone who went to a very competitive public school, there was a particular brand of driven, studious kids who knew exactly how to paint inside the lines, play the game, and get into the college of their choice. They understood the precise combo of grades, extracurriculars, and essays needed to get into a "good" school.
I definitely see some of those kids doing the same tactics to get into a "good" company. The steps are slightly different but the philosophy is the same: grind Leetcode, participate in tech clubs and take bullshit jobs juuust long enough to get an internship, get said internship and parlay it into a return offer. Put your head down, play the game and reap the rewards.
I mean, it's all a game, as much as it sucks. One of my best friends in high school was salutatorian because the school didn't weight grades. She took zero AP classes and very few honors classes. She knew what she was doing.
Friends in college took less challenging majors (business admin, environmental policy, etc.) to boost their GPAs for med school, and their core classes were easy, which gave them more time to focus on med school pre-reqs.
It has been and always will be a game. Few things in this world come down to meritocracy because the way we socialize and the way we interview does not beget that in absolute terms.
Tech did a better-than-most job as a meritocracy for a while, but I believe those days are done. Most areas of tech are too mainstream, with too much obvious money at stake.
That's how most kids get into prestigious universities. Rarely, if ever, do kids just "stumble" into HYPS or Oxbridge, out of sheer intellectual power and luck.
If you want to get into those schools, you gotta know how the game works, and practice specifically for that match.
I went to a school like that, and the majority of my classmates were from upper-middle to upper class families that had poured money into their education since they were young, with the specific goal of getting them into top schools.
Yeah, that's unfortunately the thing when you start throwing in a ton of different criteria / measurements. The people who know how the system works will study to maximize those points.
My honest opinion is that you kind of end up with smart people who are very good at test-taking, but who may not be the best when presented with a set of problems that have non-obvious solutions and no guides or road maps for solving them.
You joke, but I've seen several applications where I've been asked questions like "What achievement are you proudest of?" and "How would you contribute to the diversity of our team?".
Also cover letters cover some of this.
Actually, I wouldn't mind a standardized test like the GRE, where a good score might actually keep your resume from getting thrown out immediately.
Maybe I stand alone in feeling this, but I don't see those questions as absurd during an interview. Especially at smaller startups, where early hires are going to be interfacing heavily with the majority of other employees. Maybe this should be filed under things that comprise "culture fit", but I also think it's different from that.
If there were a standardized Programming GRE, it would end up increasing the competition for jobs thanks to future Kaplan/Princeton/local training centers. What would companies do then? Add another layer of programming interview on top of this programming GRE. Eventually, we would reach a point where companies don't trust such GRE scores.
Bright high school students are vastly oversupplied compared to seats in elite colleges. Graduation rates are in the high 90s. There are many more than 5,000 kids who can handle the workload; which 5,000 you pick is arbitrary.
Engineering competence is not even slightly oversupplied compared to useful engineering work. Project failures, incompetent people, and systematically incompetent orgs are still very much alive at the most selective tech employers. There are real business needs to hire better engineers.
Admissions at Georgia Tech has spoken about this online. Per their analysis, they can definitely differentiate the top 30% of applicants from the rest, and it makes a difference in performance. Within that 30% they have found no differentiator that significantly impacts academic performance.
So they have a full 30% of their applicants qualified to be there but they have to narrow it to 1%. Whatever method they choose must have the appearance of meritocracy, abide by laws and regulations, be resistant to corruption, achieve goals other than academic performance such as culture, volunteering, and sports, and so on.
I think the same must be true for companies. At some point in the elimination process everyone is technically qualified, so they might as well hire someone "because we like his face". But that's demoralizing and corrupt so they invent some criteria that on its face seems useful, even though it's truly not.
Even worse, engineering management is in desperately short supply. However, the beauty-competition nature of the project and vendor selection process is the root cause of failure, imho. And that's not fixable in tech; in fact, the problem is so meta (and metastatic) that only the PG-recommended "end-run" of providing value to end users can cut through the BS.
I used to be against algorithm tests, or anything resembling an exam.
That changed after interviewing a lot of people with my colleagues over the years. Interview processes are highly biased by the interviewer's background and values, or even mood.
Some interviewers are sloppy about interviewing: asking ill-defined questions, demanding the answers they want, or just being in a hurry to get back to work. I often feel bad and angry for interviewees; they spent time and patience preparing themselves carefully, then were treated very casually. It isn't fair at all.
I hate to say it, but although standardized tests are bad, they are better than most of today's interviews.
I went to a Cal State, and I recall that in the last few years Google has taken professors to their campus and pretty much preached Cracking the Coding Interview. So it looks like schools are already making that their standard. I have yet to see what the classes look like, but I could see there being a required technical-interview course you have to take in order to boost the school's numbers on where its alumni work.
> This course will prepare students to interview for software engineering and related internships and full-time positions in industry. Drawing on multiple sources of actual interview questions, students will learn key problem-solving strategies specific to the technical/coding interview. Students will be encouraged to synthesize information they have learned across different courses in the major. Emphasis will be on the oral and combination written-oral modes of communication common in coding interviews, but which are an unfamiliar setting for problem solving for many students.
Judging from the people I work with (FAANG), I would say that this screening is successful. If someone truly awful got in, there's still the PIP, but those are relatively uncommon.
Having hired dozens of devs, I can confidently say there is absolutely 0% chance of consistently identifying good developers in any reasonable interviewing / assessment time period.
The best way is someone brings in an existing code portfolio and discusses it.
The second best way is someone completes multiple design and development exercises of varying complexity, constraints, and use cases.
The third best way is they complete a single exercise and provide commentary on alternative designs.
There is no fourth-best way; all other approaches are essentially stochastic and select for interviewing traits, not development traits.
The actual best method I think is a 3 month probationary period which is more or less an extended interview. They're asked to contribute to existing codebases, participate in code review, go through some architecture design sessions, conduct stakeholder interviews - things that again are mostly impossible to accurately gauge in a typical candidate assessment window.
By the way a tremendous book for interviewers and hiring managers is How Judges Think by Richard Posner. A lot of it applies to hiring, and he's a great writer.
#1 is great, but impossible for a lot of people due to NDAs. The 3-month probationary period is indeed the only way to really evaluate devs, but I'm not so sure it's a good way to hire them. After 3 months, the mediocre devs have friends, etc., which makes it hard to get rid of them without creating a weird morale issue and massive team disruptions. And then you have to add in the limiting effect on the hiring pool (though I wonder if that really matters, I suspect good devs know who they are).
Number 3 is my favorite, but there's a trade-off: almost anything large enough to allow for multiple designs is probably too large to require all applicants to complete. But I also love having a candidate discuss their own code. I'm really curious what types of exercises you use for this kind of thing, and where they fit in the interview process.
> After 3 months, the mediocre devs have friends, etc., which makes it hard to get rid of them without creating a weird morale issue and massive team disruptions
I’ve seen this mistake happen so many times. Especially in early stage start ups. If you want morale issues and disruptions, you’ll achieve it 100% of the time by retaining underperforming staff. Even worse, this ultimately leads to your best team members leaving if it’s an issue you can’t solve. Fail fast applies to HR too, you need to learn to fire fast.
> The actual best method I think is a 3 month probationary period
My workplace has a six month probationary period; I brought in a mandatory three month review after watching one group screw this up badly. At the end of the six months, the employee came in to work all happy as usual thinking everything was great, and was promptly fired.
The three-month review is the point at which the employee is told that they're on track, or that they're below standard. If on track, they just keep going the same way, and if they don't get taken aside for a specific chat in the following three months, they can assume they're going to pass the probation period; we have definitely turned probation failures into probation successes via this mid-point review. It's also their opportunity to tell the company what the company is doing wrong; what the company is doing that will make them choose not to stay. This too has happened, and we have retained good employees by listening to them at the three-month point and making changes.
If they're below standard, they're told what they need to improve and are offered help to improve, or they can just sack it now and walk (or, as happened once and once only so far, they're considered unrecoverable and we take a long hard look at how that person was hired).
The principle we subscribe to is that if the employee is surprised by the results of their probation period, that employee's team lead and by association "the company" has really screwed up. If an employee doesn't know how they're doing after six months on the job, something has gone very wrong.
Nobody does probationary periods because few candidates would pick a probationary period offer over a standard job offer. And it's presumed that top level talent will have multiple offers on the table, and if you don't then there's something wrong with you, so having a probationary period selects for lower quality candidates.
I would assume that a company without probationary periods is lower quality; either the work is so uniform that anyone can do it and employees are fungible, or they simply accept bad employees and live with them. Every company I've worked for has had some kind of probationary period.
I don't live or work in the US, though; we can dismiss someone during their probationary period with ease, but after that they have workplace protections. If we could fire anyone at any time for anything we like, that would effectively make the entire employment period a probationary period. Probation never ends if you can be fired at any time.
> The actual best method I think is a 3 month probationary period which is more or less an extended interview
Most states in the US have at-will employment terms. Probationary periods are common in other countries. And yet no one seems to want to do what you're suggesting.
The problem is, most companies in the US want to hire you as a contractor, not a full-time employee, for those three months. I can't speak for everyone (but most folks, I assume), but I need medical and dental. I can't go three months for the chance of maybe getting it. I can't do it personally and I can't do that sort of thing to my family. There are other options in other countries (depending on the country). I'm a centrist by nature, but this makes a "three month interview" untenable, IMO.
How often are you hiring people whose professional work is open source? How often are you hiring people for roles where their professional work will be open source?
There is something I cannot understand and this has already become a cliche:
> Technology industry is an extremely fast-moving one. Many technologies used today didn't even exist/were popular a decade ago; in 2009, mobile app development and blockchain were pretty much unheard of. Engineers constantly need to upgrade their skills to stay relevant to the demands of the job market. Engineering is a great career for passionate individuals who like to learn.
Why would anyone give this advice? Can we stop handing out this advice and encourage everyone to stand up for their rights instead? If you think about it, by proxy this gives you the following advice: as a programmer you will have no life outside of work and you are supposed to be an idiot who spends his time working and studying outside office hours even when you could spend time with your family. So yeah, go for it and suck it up you idiot.
At least that's how most employers handle this problem. And interestingly, by comparison, no one tells an MBA diploma holder that it's a great career for passionate individuals who like to spend their whole life studying.
And before you think that I am against studying, that's not the case at all. If my company pays for it and I can kick back on a sofa during working hours to study, I am fine with it. But I cannot see any value in studying something at my own expense, on my own time, which will be outdated in 3 years anyway and so by definition only benefits my employer.
It seems relevant to getting a tech job. If one is looking for a job, chances are that things have changed since the last time they looked for a job, which on an average for most people these days is every 2-5 years.
This is not limited to engineering or tech, and extends to most specialized jobs in most industries.
> Can we stop handing out this advice and encourage everyone to stand up for their rights instead?
In the right context, absolutely. In general, it's possibly a bad idea. You could always balance it out and educate folks about the rights of an employer.
If you were on my team, I would not expect you to sacrifice any of your rights. But if you kept falling behind your peers, to the detriment of the team's performance, at some point you would be put on a performance plan. The unfortunate thing about performance plans is that by the time one is enforced, things are close to unrecoverable.
And if you did find yourself failing, I hope someone tells you that:
> The technology industry is an extremely fast-moving one. Many technologies used today didn't even exist/were popular a decade ago; in 2009, mobile app development and blockchain were pretty much unheard of. Engineers constantly need to upgrade their skills to stay relevant to the demands of the job market.
I don't know what a performance plan is. Can you elaborate? I have never worked at a company where there was such a plan.
Is this public to employees, or do you just whip everyone until they work themselves to death without telling them the reasoning?
Also, I assume you are in the US. In most European countries you can do absolutely nothing about someone who works full time on your team unless they, let's say, cause you financial losses or punch you in the face, so in other cases you can basically shove your plan up your bottom part.
I didn't read that as saying you have to spend hours outside of work to learn. Learning on the job is pretty much a given these days, no? At least in my experience. Even decades ago I was always given time to research/study new things and I really like that about our industry.
Where did you work if I may ask? In the companies where I worked there was never such a thing. I worked in 2 companies in Hungary, 2 companies in Germany, 3 companies in Japan and one in the US.
It was always basically assumed that you would study all the necessary things on your own time, and that when you sat in the office you would make productive things, a.k.a. deliver. You always had to code and show progress on a specific development task each week. There was never such a thing as time to research/study.
MBA holders are a bad example, since many of them happily work many hours of overtime, so I am pretty sure they work more hours than software engineers even if you include the time it takes to study for interviews.
> It turns out that the median number of hours racked up by an MBA in his or her first year of employment is a whopping 54 hours a week
I am confused. Why do you think software engineers stop working as much as MBAs when they land a job? That's when they have to start studying even more beside working.
54 sounds about average for an engineer anyway. In Germany, for example, it's not rare to have 9-hour working days with a 30-to-60-minute lunch break in between, which already comes to about 45 hours a week. If you study 2 hours a day (reading books, following the news and trends, watching recordings of past conferences, and other things you are pretty much expected to do as part of your job), it easily comes to 54 hours a week. So that's not high at all.
There are no employee rights in modern capitalism. In the 19th century, they worked 80 hours a week in mines, had child labor, no vacations, women were ignored, indentured servitude existed. All this and more until unions and labor laws.
Karl Marx's entire philosophy was based on the abuses of workers he observed.
Nowadays people mock progressive causes as 'socialism', mock unions and worker protections, demonize progressive politicians, and idolize oligarchs. Even though it's all against their own best interest.
Your idealistic notion of 'rights' doesn't exist and will never happen. You have no rights as long as someone controls your purse strings, and with booming inequality that describes more and more people nowadays.
I can't wait until most jobs are finally automated and we're done with this whole capitalist system entirely and have to figure out what to do next.
I agree, and I think you hit on exactly why white-boarding persists (the least of all interviewing evils, IMHO): if this industry is so fast-moving, the fundamentals are the most reliable thing we can measure that has lasting value for (what we hope) are long-term hiring decisions.
Teachers, doctors, lawyers, professional engineers, and many others get their licenses revoked by the government if they fail to go back to college and take classes every few years.
These are good questions; I like how concretely they are phrased, instead of just asking “do you have a lot of tech debt?” They’re interesting for getting to know the team and getting a piece of insight you usually wouldn’t have until a few weeks into the job.
But I would be careful how you interpret these. In fact I would almost factor in these answers in the opposite way of what I think you intended. The company that admits to the worst technical issues is at least honest and self reflective. The company that doesn’t admit to any serious issues might be just as bad or worse, but their strategy is to tell employees to lie about it rather than be open to addressing it.
For all the negativity around these types of tech interviews, they are, from what I've seen, one of the most merit-based systems out there. It is either this or we need to create some sort of national developer exam. The other alternative, used to get jobs at good companies, is that they only look at what school you went to, whether you graduated with a CS degree, what companies you worked at, etc... all things which do not guarantee merit.
It is not a perfect system, but nobody can really predict a good developer until someone is already on the job. A good developer is more than algo skills; he/she also has good communication, works well with others, etc... These are not things we can test yet. The problem with our profession is that, since it is lucrative and has no real licensing, it is a perfect breeding ground for fakers.
While this is very good to study before a technical interview, over time I can see that this alone is going to make it 40x harder to differentiate, say, 100 candidates who are all perfect at interviews in general, to the point that we are going to start asking ridiculous Oxbridge-style interview questions and expecting perfect scores to advance 'good' candidates.
Perhaps companies will start asking candidates to construct mathematical proofs of data structures, algorithms, formulas and common equations from university-level entrance examinations just to do a mobile app or a web dev job.
As soon as that happens, the 'ideal candidate' companies will be expecting to interview would be a very prodigious candidate, former math Olympiad champion and decorated with titles and research papers in their name.
You guessed it: 𝔜𝔢 𝔬𝔩𝔡𝔢 𝔩𝔢𝔤𝔢𝔫𝔡 𝔬𝔣 𝔶𝔢 10𝔵 𝔡𝔢𝔳𝔢𝔩𝔬𝔭𝔢𝔯.
Perhaps so. Those sort of questions would benefit a company working at the scale of FAANG or Microsoft and actually tackling or researching real computer science problems.
Now, would this make sense for a graduate entry-level web / mobile app developer position? Interviewers looking for such candidates need to lower their expectations a bit for positions like that.
I once interviewed at a startup for a senior engineer position and was asked “if aliens came to Earth and asked you to go into their UFO with them, would you?”...
As much shit as we give white-boarding, I would have chosen it instead if it were an option.
One time I was asked "why do frogs croak?" I gave a series of answers, like: to attract the opposite sex, plus some biological explanation of how they achieve that. But the stupid interviewer just kept asking me why.
Needless to say, after the interview ended I ran away from that deal.
What's the point of hypothesizing on things significantly less likely than someone winning the lottery (edit: while being hit by lightning, for good measure)? In fact "what would you do if you won the lottery" might actually give more interesting insight.
I just had to conduct a round of interviews in a non-SF large US city, and it was a hellish crapshoot. Resumes are meaningless, and often re-written by recruiters to match the job anyway. Everyone has the same canned answers to the stupid behavioral questions. And as for the code, we included what we thought was a trivial nested for-loop problem and virtually nobody could even get started on it.
Is this kind of code problem too complicated in your opinion? For all I join in when complaining about irrelevant algorithmic questions, I have to admit that they at least test something, even if it's just willingness to study for the interview.
Instead of reading everybody's complaints about interviewing, I'd love to hear how you think it should be done. Because I have to admit I'm pretty much lost right now.
It'd be nice to think I have some special skill here but I really don't. This is just how interviewing was done in the 90s. To some of the younger generations, I've been told it sounds crazy.
If you send me your resume I'll actually read it, carefully. If the person described in this resume fits the background experience the role needs, you get an interview.
During the interview we'll talk about all those projects you worked on that are relevant to this role. Which parts you enjoyed the best and why? Which parts were boring and why? Which parts were the most challenging and why? What you find too easy and why? What would you have done differently? Could you have? If you were to do the same project all over how would you approach it? Other open ended conversations along these lines.
I don't ask anyone to whiteboard code, that's not part of the job so it's not part of the interview. No puzzles, no trivia-pursuit style questions.
It works great. You can't BS your way through such a conversation with a senior technical peer if you didn't actually do the work described in the resume. You just can't.
It is, however, vital that the interviewer be an expert in the field.
> > No puzzles, no trivia-pursuit style questions.
I swear HN technical interview threads are the poster child of talking past one another. First, for-loop is nowhere near a trivia-pursuit question. Second, different companies of different sizes/industries/goals have different requirements. Let's all move forward with this discussion and acknowledge that we can't all use the same process because we're not all hiring for the same type of job. If the engineers you hire consider for-loops a "puzzle" that's totally fine and OP using it doesn't invalidate your process for your multi-national or startup companies.
> I can say that I've never regretted a hire I said yes to
The real question is, has anyone else regretted that hire, which unfortunately can't be answered as they may not tell you.
> To some of the younger generations, I've been told it sounds crazy.
Doesn't sound crazy at all. A single process guaranteed to work for everyone? Now that sounds crazy.
You don’t want ‘bozo cliques’ to form, so you make a semi-objective process like ‘solve this algorithmic question’ as part of the interview loop. I think execs and the founders doing a final review before hiring all engineers comes from that fear.
Then other companies cargo cult interview processes from larger companies and the trend propagates.
If you want to ‘hack hiring’ as a smaller company, you should use hard to scale processes like the one described in the parent post.
If you don't actually verify the technical problem-solving ability of the candidate in some way you're forgoing signal that can massively increase the confidence you can have in your decision.
A company trying to hire engineers can easily give a take-home programming challenge to a dozen engineer applicants and take very little time analyzing the submissions. That’s pretty unfair. But it also feels a bit untenable for a senior engineer at a hiring company to have very deep investigations of each applicant’s work history.
Another problem is objectivity. If your company’s engineering hiring process relies heavily on a senior engineer’s subjective impression of an applicant, you’re going to have big problems with your own engineers’ biases, whether subconscious or not. Expect to hear a lot of evaluations like “well, the applicant did seem to have good knowledge and experience, but I just wasn’t impressed for some reason.”
I agree with your approach and use it myself but one thing is different now and that’s the proliferation of tiny skills. Back then you would have a few big skills, you would claim to know one or two main languages, one or two databases and so on. Now people list hundreds - literally hundreds - of skills sometimes. And there’s no way to tell on reading if they really know it, or just saw it once and maybe did a tutorial or “hello world”. People now will add a skill to their CV if they’ve done it for a hour total in their entire lives! Or read a blog post about it. It is a massive time sink to pick through that.
It takes a lot more effort on the part of the interviewer and is "harder to scale", in that you can't just train people to ask canned questions. But since it's open ended, it's a lot better at finding out what the candidate is really good at, and it's very useful to have one of these in every hiring loop, usually by a senior team member or exec.
This is key. There are a lot of hiring managers masquerading as experts and are frustrated when cargo culting hiring processes falter and lack the people skills to diagnose a situation. You'd also be amazed at the quality of resumes a high, advertised salary will bring.
I don't disagree that there are a lot of fake-it-till-you-make-it developers who did a vo-tech class and are applying for jobs out of their league, but I just don't know how people don't spot them. It takes me about 5 phone calls to find a competent individual, and then I bring them in for an in-person. As a hiring manager, I don't find it cumbersome to weed through 5-10 people to find a good hire.
You think this is a claim about how good your hiring practice is, but the only way I can think to read this is as a point about how little hiring you do or how little evaluation of hires you do. In the real world, perfection isn't possible, so claims of it are a sign of inexperience or naivete.
This may just mean that you say a lot of wrong “No”s. Getting very high precision or very high recall is really easy... what you must measure is your F-score.
Have you let new hires go in the first 90 days? That's just one of the ways personal regret is probably not the best metric here.
This is one way the current status quo might be better than the past: you don't get pigeonholed so much by your past experience into being a "fit" only for similar roles. Sometimes the hiring manager is really looking for a specialist, but in general, we don't care what industry you were in or what tools you were using, as long as you can prove you're smart. Some of the most impressive people we have working on Go microservices were enterprise C# developers before.
While it may be easy to hire devs who have been on the market for 10 years, you still have to keep in mind that new developers are a very large proportion of the dev population.
I also found very few people could solve this (similar non-SF large city location). Occasionally, people who could not solve this were hired for other teams. Based on their performance, I don't think I would have been comfortable working with them.
I don't think it's unreasonable. But I'm not sure I'd use it as a screen if I was hiring now. I think I'd just have a chat and try and discuss a previous project. After that I'd move to a paid take home project (ideally representing real, useful work).
[1] Take a string, for example "ABCCABC" and count the number of times each 3 character substring occurs. In this case the answer would be 2xABC 1xBCC 1xCCA 1xCAB.
I froze up. I couldn't talk and think at the same time. I didn't have the skillset for doing this in a very intense scenario. In my case, I was homeless and needed a job ASAP. Every interview felt like life or death to me.
In the first one I was asked to reverse a string in C. I hadn't done C in a few months. I froze up on syntax and looked like an idiot who couldn't do it.
I could imagine many people who have never experienced this format (or haven't experienced it much) would easily freak out and look stupid as bricks like I did.
I've since done over 200 technical interviews (as the interviewee) and I usually sweep. I still fail at FAANG, but I always get the solutions (even the leetcode hard ones). Just not sure why I fail, but c'est la vie.
I just tried it, and it was trivial to do in a minute or two on my laptop. However, I did take note of two syntax mistakes that I made in the Python REPL that were immediately obvious there and took seconds to fix, but which I most likely would not have noticed on a whiteboard.
So there are quite a few problems where, if the acceptable "format" of the answer is "hey, I'll just pull out my laptop from the backpack and push the solution to github in 15 minutes", then it'd be okay, but it'd be hard to do while 'whiteboarding' without access to immediate feedback and easily accessible API documentation.
For example, I work in many languages, and for many APIs I can't remember whether, in a particular language, the same thing is called add or append or something else. E.g. I've worked on Java code for a dozen years and would be a quite productive Java developer, but since I haven't written any new Java code for quite some time, I can't remember off the top of my head the right boilerplate to open a text file for reading in Java. It's something a Java tutorial might cover in the first pages right after Hello World, but I'd still have to look up the incantation to pass the encoding properly; there are something like three Reader classes to instantiate and I don't recall their names.
I'd put out my fingers 3 characters wide at the first substring and step through the rest of it in 3-char 'spans' with my fingers. I'd say: "extract each substring, put it into a map with that substring as the key, and either 1 as the value if it wasn't already present, or, if it was, increment the value".
I'd probably not bother mentioning how to extract the results unless asked.
If someone said that to me it would show the solution, and I'd be 100% happy.
I'd assume that they could then render that into code - perhaps that would be a mistaken assumption though but in my experience solving the core problem is the thing I'm interested in, not the syntax. But those with more experience may say the former doesn't always imply the latter, and the code needs to be shown.
If so, then the company you're interviewing for must not be attractive to first-class CS grads (top 10-20% I think) from any non-online university.
As a comparison, any FAANG + Palantir/Jane Street/Two Sigma in the UK have tougher questions as their FIRST phone interview for INTERNSHIPS. (Palantir requires you to go through 6-8? interviews before getting an offer)
If this is a job that requires actual software engineers, I think rejecting everyone that failed this question would be perfectly reasonable.
Do you get candidates who ever ask "can I assume the strings are ASCII?" The usual solutions to this will break in amusing ways if you allow arbitrary Unicode. I know some interviewers who actually have "candidate asked about input encoding" as a hidden scoring criterion, and not asking is a fail even if the candidate correctly solves the problem for ASCII. I myself disagree with using such hidden criteria: if I'm going to score something and not tell the candidate exactly what I'm scoring, it's at least going to be something in the code, like "correctly avoided the divide-by-zero case without me pointing it out", not an expectation that the candidate read my mind about things I haven't told them. (I do tell them to try to write code without errors; I help fix up basic syntax quirks, and if I spot a typo I'll point it out. Or I ask how confident they are in their code and whether they might have in mind any edge cases to try. I'd like to get a candidate who actually writes a unit test on their own, so I always point out that JUnit is set up...)
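To make the Unicode point concrete, here's a small Python 3 sketch (my illustration, not from the thread): even in a language whose strings are sequences of code points, one user-visible character can span several code points, so a fixed-width substring window can strip an accent off a glyph.

```python
# "café" spelled with a combining accent: 4 visible characters,
# but 5 Unicode code points.
s = "caf\N{LATIN SMALL LETTER E}\N{COMBINING ACUTE ACCENT}"

print(len(s))            # 5, not 4
print(s[:4])             # "cafe": the slice drops the accent from the last glyph
print(s == "caf\u00e9")  # False: the precomposed form is a different string
```

Languages with byte or UTF-16 strings fail even earlier, by slicing in the middle of a single code point.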
You'd see my process better if you watched me write and correct this program on a computer. If my task is to think up a fully correct solution without iteratively trying incorrect solutions, you're missing an important part of my process, even with a trivial FizzBuzz. (Though with this one you could probably just act as the interpreter and point out the errors for me.)
Can be done in O(n) time with a hash table.
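A minimal sketch of that hash-table approach in Python, using the example string from the problem statement quoted above:

```python
from collections import Counter

def count_substrings(s, k=3):
    """Count every k-character substring of s in one pass.

    Each of the n-k+1 windows costs O(k) to slice and hash, so for a
    fixed k this is effectively O(n), with counts kept in a hash table.
    """
    return Counter(s[i:i + k] for i in range(len(s) - k + 1))

print(count_substrings("ABCCABC"))
# Counter({'ABC': 2, 'BCC': 1, 'CCA': 1, 'CAB': 1})
```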
- Treat recruiting in the same way as you do software development.
- Formulate a set of requirements.
- Define interview questions that give insight into whether or not the candidate meets those requirements. This is the equivalent of "tests" in the software process.
- Specific skills with your technology stack is good, but not necessarily essential.
- Ability to discuss sophisticated software concepts, and to explain software that they have built, and how they would build out ideas given to them is good.
- Evidence that this person gets stuff done is good (ref Joel Spolsky).
Coding tests are, for the most part, garbage. Not because the test is of no value, but because you the employer probably don't evaluate the result properly.
If someone could blag that while not being able to even write FizzBuzz, all I can say is well played to them.
IMHO this process works fairly well and does a good job of being economical with people's time.
No. Just programming is actually not easy and a lot of people apply for jobs they can't do.
Plus, a nonzero number of people freeze up in any interview situation.
I once failed an interview loop because I forgot how bucket sort works.
At the final interview to join the SRE team at Google I was asked to implement the kNN algorithm. I barfed at implementing a kD-tree after regurgitating the brute force solution.
Has any SRE ever had to implement a kD-tree in < 20 minutes or Google would go down?
I asked the interviewer at the end. They had never implemented one on the job.
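For contrast, the brute-force nearest-neighbour search the commenter says they did produce is only a couple of lines; it's the k-d tree that's the unreasonable ask on a 20-minute clock. A sketch (assuming points are plain tuples of numbers):

```python
import math

def knn_brute_force(points, query, k):
    """Return the k points nearest to `query` by Euclidean distance.

    Sorting makes this O(n log n) per query; a k-d tree exists to bring
    that down, at the cost of a far trickier implementation.
    """
    return sorted(points, key=lambda p: math.dist(p, query))[:k]

pts = [(0, 0), (1, 1), (5, 5), (2, 2)]
print(knn_brute_force(pts, (0, 0), 2))  # [(0, 0), (1, 1)]
```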
As long as companies insist on these inane rituals I think it’s fair game to optimize for it as an interviewee.
It’s stupid but what else can you do?
This is more true for startups, which need less "how to reverse a binary string" and more "how to properly design a database in third normal form", etc. If you're presenting an interview question, it should have actual job relevance, and your co-workers should be able to solve it in the same time as the candidate. If you're drilling people on non-job qualities (e.g. inverting a binary tree for a web dev role...), then you should expect a large difference between the audience that can pass your bad interview tests and the audience that will perform well at the actual job.
Not trying to complain. I think job interviews should focus more on the 99% of what you do in your job on a Tuesday. Poor interviews seem to lean on gotchas and algorithm tricks to disqualify candidates for roles that have little actual use for algos + data structures. The tests might seem too easy, but as a front-end engineer I would rather be with a coworker who understands the CSS box model, knows semantic markup for accessibility, and similar web things than a person who is good at creating hash tables and doubly-linked lists in JS. Leetcode probably doesn't test vertical centering techniques with CSS, but if you're applying for a web dev position you'd better know them.
I felt this was much better in that it was less stressful, yet allowed me to demonstrate both knowledge and design skills.
Another company did something similar but more thorough: They invite candidates for a full day of work where they try to solve a small problem. Then the code is reviewed and evaluated together. They also start the day with a 1-hour overview of their current architecture and you get to ask questions and talk about alternatives. I think this gives both sides a better chance of finding the right fit.
I realize this is not always reasonable.
It's possible that you asked the question poorly, or the solution wasn't as obvious as you thought.
Designing interview questions is hard[1]. I'll test out new questions on my peers at least two or three times before putting them in front of a candidate. And many don't make the cut. If a good engineer who's relaxed can't solve it easily, then a stressed out candidate will have no hope.
[1] This is why I hate seeing candidates share specific questions online. As an interviewer you'll have to scrub a good question, and switch to something you're not as familiar with. This hurts good candidates.
I think this was a great way to not only verify coding ability, but also testing team work and communication skills.
The art of behavioral questions isn’t “ask and answer” it’s the follow up questions. As you astutely point out, the questions are ‘stupid’. They might as well be “do you want a stick of gum?”
Next time you ask those questions do a couple of things. First keep in the forefront of your thoughts what information you’re trying to get out of it and keep the candidate on track answering your data point. Do that by relentlessly asking follow up questions. When you think you have everything ask more.
As an anecdote, I was being shadowed during an on-site recently. I asked some arbitrary ‘dumb’ behavioral question, went back and forth a bit, and wasn’t getting much out of it. I noticed my shadow clearly moving on to the next question in their notes, and decided to keep pushing on the original question: why did you do this, what were you trying to solve, what motivated you? Turns out the candidate did all of this to generate new revenue for the company and ended up bringing in an extra $10m a year at the small company they currently worked for. Loads of great data; I would never have gotten there if I’d settled for the canned answer the candidate had.
Behavioral questions aren’t comp-sci trivia questions; you can’t just ask the behavioral equivalent of fizzbuzz/Fibonacci/flood fill and copy down the answer (and you should never be asking those questions either, but that’s a separate rant).
Behavioral questions are stupid and to some degree that’s the point. When you ask your significant other or kids “how was your day?” — guess what, that’s a stupid question too. What matters is what follows from your line of interviewing.
If you want to get good at behavioral questions listen to Fresh Air and try to be like Terry Gross.
> Is this kind of code problem too complicated in your opinion?
Could you please post the problem so that we can provide you with meaningful feedback on it?
I agree that some coding problem needs to be used to try to answer the question, though with the right interviewer it can be answered without seeing code. The problems you use for that don't have to be at octree-collision-detection level; a trivial nested for-loop is fine, fizzbuzz level is fine. Sometimes you can rely on github or a strong internal referral to skip this, but watch out; and anyway it's worth giving your questions to people you're sure will do fine (you've timed at least yourself, right?) for the benchmark data, and because sometimes they don't do fine, in which case maybe your question is too much. E.g. Floyd-Warshall can be done simply with a few nested loops, yet I would never give it as a problem, and I'd expect nearly everyone I've worked with to flunk it given only the standard hour (which really means 45 minutes).
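For reference (my sketch, not the commenter's code), the "few nested loops" of Floyd-Warshall really are this short; the part people flub under time pressure is that the intermediate vertex has to be the outermost loop:

```python
def floyd_warshall(dist):
    """All-pairs shortest paths, updating an n x n matrix in place.

    dist[i][j] holds the direct edge weight (float('inf') if absent,
    0 on the diagonal). The intermediate vertex k must be the outer
    loop for the recurrence to be correct.
    """
    n = len(dist)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

INF = float("inf")
d = floyd_warshall([[0, 3, INF],
                    [INF, 0, 1],
                    [2, INF, 0]])
print(d[0][2])  # 4: going 0 -> 1 -> 2 beats the missing direct edge
```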
Some jobs only need basic competence, so you might want to extend an offer if you've been convinced of its presence. At my last job, which ended up being more technically challenging / interesting than my current job, I was hired after posting my resume to Craigslist which led to exchanging some emails and having lunch with the startup founder to talk about my past work and whether I would be useful for his most pressing work. At my current job, I've been part of on-sites where I've established "can you even code?" is "no". Those were costly failures of not having that answered earlier. But we also like to believe we need more than basic competence, so rejections can still occur because of a lack of "testing mindset" or certain "behavioral answers". Only once you fix your "can you even code?" filter is it even worth considering what else you might want to justify an interview pipeline with more stages than a 'phone' screen or lunch conversation.
I used to wait to the first in-person interview to try simple fizzbuzz style questions (with the candidates on a machine and a compiler/interpreter). In about a third of cases that meant we'd committed a significant chunk of time to engineers that apparently couldn't solve trivial problems.
Now it's one of the first things I check. Done right, it's a relatively small hurdle for capable people to overcome, but really helps as a filter for those who aren't suited to the role.
I recently created a service (https://candidatecode.com) to help companies manage issuing and reviewing their coding challenges; I think it's got real potential to help some people out.
But if you need to recruit someone without any such credentials, then you may need to do a simple coding aptitude test. It could be a code review or a simple exercise, but whatever you do, don’t do whiteboard coding and don’t have people recite/implement memorized CS textbook algorithms. Anyone can do that and still not be able to code.
If the person conducting the interview thinks the behavioral questions are stupid, then perhaps they are. In that case, don't ask "stupid" behavioral questions.
> Resumes are meaningless, and often re-written by recruiters to match the job anyway
Was the position entry level? Students coming right out of compsci often have little to no practical experience. They may have difficulty thinking about what to put in their resume. After one or two years of full-time experience, that should no longer be an issue.
> For all I join in when complaining about irrelevant algorithmic questions, I have to admit that they at least test something, even if it's just willingness to study for the interview.
Asking those "stupid" behavioural questions and receiving the same canned answers also demonstrates a willingness to study for an interview.
The coding problem should be testing a candidate's problem solving capabilities as practicably required by the role being interviewed for. The chosen problem should reflect the types of problems that they will actually need to solve if hired. For example, you could select a small PR from one of projects being actively developed by the company. The selected PR should involve only one or two classes (assuming a language with classes) and require improvement. You can look through the history of a PR and just pull out a segment that was selected for improvement by the reviewer(s), or have the team select it for you. Then ask the candidate:
- to conduct a code review of the PR
- to improve the code
But we emphasized repeatedly that we weren't looking for the O(n) solution; the brute-force naive solution was 100% OK. And it definitely didn't look like people were freezing up trying to figure out the optimal solution: they were struggling with the basic nested loop.
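For a sense of scale, the brute-force nested-loop level of answer that would have been 100% OK (assuming the substring-counting question quoted earlier in the thread) is roughly this, no hash map required:

```python
def count_triples_naive(s):
    """Brute-force count of each 3-character substring: for each window,
    scan the whole string again and count matches. O(n^2), i.e. the
    'trivial nested for-loop' level of answer described above."""
    seen = []
    counts = []
    for i in range(len(s) - 2):
        sub = s[i:i + 3]
        if sub in seen:
            continue  # already counted this substring
        n = 0
        for j in range(len(s) - 2):
            if s[j:j + 3] == sub:
                n += 1
        seen.append(sub)
        counts.append((sub, n))
    return counts

print(count_triples_naive("ABCCABC"))  # [('ABC', 2), ('BCC', 1), ('CCA', 1), ('CAB', 1)]
```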
For instance, a front end engineer can be expected to be able to write a to-do list or similar app in a framework (ideally the one your team uses, but not a hard no-hire if not) with minimal googling (which is fine as long as it's not excessive) in ~45 minutes.
Then you have to look at what level of experience they have. Less experience requires more mentoring generally, which may be fine depending on how much time your team budgets for that work.
Lastly, measure their body language and tone of voice to check for red flags pointing to difficult communication styles or people who treat others poorly.
If all three match, hire!
I recently had one requiring me to develop a native mobile application, which I enjoyed. It was interesting, the code is useful down the line, and if I don't land the job, it beefs up my portfolio.
Initial screening by recruiters is tough as my background's missing a degree and industry experience.
Context: self-taught, started out with game dev, tried going solo - not a runaway success. Looking to move away from the field.
I find that this is very useful especially when interviewing juniors who don't have many projects under their belt for the first part. It's also useful when a candidate has good verbalisation skills, but poor programming ones (which happens).
Could you give the exact wording of your "nested for-loop" question?
Deleted Comment
If no one can answer the question, maybe they don't understand it? Maybe there's something unclear in the way it is worded?
Which I think is why a lot of startups locate in the expensive Bay Area - not a lot of cities have a similar concentration of decent talent.
You'd be surprised (or maybe not, now) how many applicants for a senior frontend position can't build a progress bar in the phone screen.
I loathe writing algorithms on a whiteboard, especially the 'catchy' type. But I've interviewed people who can't even write a for loop... the amount of brain drain in flyover country is insane.
Could it be that you're having issues communicating the problem?
Deleted Comment
How did you run these? While it sounds like something that should be ok even on paper, you can vary comfort a lot through the medium. E.g. for the last interview I had with a substantial coding part, I think being able to do it on my personal machine made a big difference. (I obviously was told before what kind of environment I'd need to have ready)
I’ve been in a panel where I was the only person who asked a code question, the candidate flunked, and then the VP of Engineering went over my complaints and hired the guy anyways. He had been a Professor of Software Engineering and had a graduate degree from Princeton. Within three weeks, the VP of Engineering had to fire him because he couldn’t make it through a simple code review.
BUT the extremely negative sentiment here towards the technical interview process is very well-deserved.
Assessment of code (and the selection of problems) is most often no less subjective than any non-technical assessment. Sometimes the interviewer doing the grading is flat-out wrong. Several times I've been asked the famous "given an array of stock prices, find the optimal buy and sell indices for the biggest profit." One interviewer was not aware of the linear-time solution to this problem, and didn't believe me when I wrote it out and tried to explain it to him.
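For reference, the linear-time solution being described is a standard single pass that tracks the cheapest price seen so far. This is a sketch under my own conventions (function name and return tuple are assumptions, not anything from the thread):

```python
def best_trade(prices):
    """O(n) solution to 'best time to buy and sell stock'.

    One pass: remember the index of the lowest price seen so far,
    and at each day check the profit from selling at today's price.
    Returns (buy_index, sell_index, profit); profit is 0 if prices
    only fall. Returns None for an empty input.
    """
    if not prices:
        return None
    min_i = 0          # index of the lowest price seen so far
    best = (0, 0, 0)   # (buy_index, sell_index, profit)
    for j, price in enumerate(prices):
        if price < prices[min_i]:
            min_i = j
        profit = price - prices[min_i]
        if profit > best[2]:
            best = (min_i, j, profit)
    return best
```

For example, `best_trade([7, 1, 5, 3, 6, 4])` yields buying at index 1 (price 1) and selling at index 4 (price 6) for a profit of 5, with no nested loop in sight.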
But sometimes the interviewer doesn't even want you to do well. One time I was interviewing with an injury that prevented me from typing efficiently. I had a doctor's note and the injury was quite conspicuous. Nevertheless, three start-ups made me solve problems by typing on a keyboard, which guaranteed an excessively long completion time, and they held those results against me. (And it's not like there is anybody to hold those panels accountable.)
And then there are those who just don’t care. I had a phone interview with Airbnb that was literally as bad as the stories on Glassdoor: the guy answered the phone in a noisy office (not a conference room), gave no introduction, then simply stated the problem and dumped me into Coderpad. I literally thought it was a prank, since I had met with people at Airbnb face-to-face prior to the call. But the recruiters confirmed the guy was a real employee.
The root problem here is there is no feedback loop back to interviewers. The candidates get “feedback,” but people asking code questions, especially new grads, typically get zero assessment on how well they are doing as interviewers. What’s worse is that recruiters and hiring managers both have incentives to deprive ICs of such feedback, since it would invariably make ICs more aware of opportunities outside the company.
Until the incentive structure of technical interviewing changes dramatically, we’re stuck with Leetcode and hope for the best. People like Gayle Laakmann are helpful (especially when Facebook gives candidates a live hour-long session with her for free), but these people are ultimately invested in their own income and not the task of fundamentally fixing this broken process.
Dead Comment
don't give them a fizzbuzz. give them an example of a real problem your engineers need / are trying to solve. how do they respond? thoughts / intuitions / pseudo code. do they show knowledge of the problem space / domain?
or if you just want a kid who can code and they pass the fizzbuzz but fail at the real job, what does your training/culture look like? who does that really reflect on?
it seems to me that interviewing is terribly cargo cult. the problem is real; the practices ostensibly supposed to be solutions are not.
/end rant
I was like this a couple years ago. I was a self-taught, "college is a scam", "practical experience" type guy. Now, however, I do see immense value in the ability to work through these algorithm questions, especially if you ever want to do something besides web / app development.
The classic detect-a-cycle-in-a-linked-list question. The original guy took years to devise the algorithm for it. In an interview, either you've seen it before, in which case you rattle it off, or you effectively have to write a research paper in 5 minutes.
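For completeness, the algorithm being referred to is Floyd's tortoise-and-hare cycle detection, which is exactly the kind of thing that is trivial once seen and nearly impossible to re-derive on the spot. A minimal sketch:

```python
class Node:
    """A singly linked list node."""
    def __init__(self, value):
        self.value = value
        self.next = None


def has_cycle(head):
    """Floyd's tortoise-and-hare cycle detection.

    Advance one pointer by 1 and another by 2; if the list loops,
    the fast pointer eventually laps the slow one and they meet.
    O(n) time, O(1) space.
    """
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False
```

The one-line insight ("two pointers at different speeds must collide inside a loop") is memorable but not something most people would invent under interview pressure, which is the commenter's point.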
What you want is very good football (soccer) players. Unfortunately you (or upper management) may not know all of the rules to soccer. You may not know the training regime that goes into winning a good soccer game - and it's a big risk spending money training for the big game on the off-season only to lose during the Big Show. So what do you do to test potential candidates? You see if you can get along with them, if they're a team player, and then see how well they play foosball.
It's perfect! There are soccer players on the field, there's a goal, it takes skill and coordination. But oh no, it turns out in the population at large really good soccer players really sort of suck at foosball. After all, they'd rather spend their time and energy playing soccer. So now they spend all their time reading up on books about foosball and what the best foosball strategies are.
You see where I'm going with this? No one uses algorithms in their day jobs. OK a few of you, but come on man, I make full stack web applications. As do most programmers. So why are you testing theory that has literally nothing to do with the job? Somehow someone thought this was a proxy for smart people, but I mean, if the guy who wrote this "Tech Interview Handbook" was really smart wouldn't he have spent his time writing a cool program? I mean how lame is this?
If you want to hire competent engineers tell them what you're building and ask them how they feel is best for them to show they're competent. If you have a big data pipeline in Scala ask if they can construct a data pipeline example that is cool over a few days (take home) or do something similar in the office. Some people like one, some like the other. But just communicate with the people you want to hire! And if not everyone's interview process is the same then maybe that's ok.
I mean I just got a guy who sent me some automated code interview program that had a timer that counted down from an hour at the top! WHO THINKS TREATING THEIR POTENTIAL EMPLOYEES LIKE STAR WARS DRONES IS A GOOD THING? All you have to do is treat the people you want to work with like you would want to be treated and demand that they know their shit.
This. Isn't. That. Hard.
Ah, dynamic programming.
Are we going to see tech recruitment become more and more like college admissions where a top score in the interview is just one of the criteria and no longer sufficient to get a job?
Perhaps next is asking people to write essays about their career, their goals, why they want to work there, extracurriculars, etc.
I definitely see some of those kids doing the same tactics to get into a "good" company. The steps are slightly different but the philosophy is the same: grind Leetcode, participate in tech clubs and take bullshit jobs juuust long enough to get an internship, get said internship and parlay it into a return offer. Put your head down, play the game and reap the rewards.
Friends in college took less challenging majors (business admin, environmental policy, etc.) to boost their GPAs for med school; their core classes were easy, so it gave them more time to focus on the med school pre-reqs.
It has been and always will be a game. Few things in this world come down to meritocracy because the way we socialize and the way we interview does not beget that in absolute terms.
Tech did a better-than-most job as a meritocracy for a while, but I believe those days are done. Most areas of tech are too mainstream, with too much obvious money at stake.
If you want to get into those schools, you gotta know how the game works, and practice specifically for that match.
I went to a school like that, and the majority of my classmates were from upper-middle to upper class families that had poured money into their education since they were young, with the specific goal of getting them into top schools.
Yeah, that's unfortunately the thing when you start throwing in a ton of different criteria / measurements: the people who know how the system works will study to maximize those points.
My honest opinion is that you kind of end up with smart people who are very good at test-taking, but who may not be the best people when presented with a set of problems with non-obvious solutions and no guides or road maps on how to solve them.
Also cover letters cover some of this.
Actually, I wouldn't mind a standardized test like the GRE, where a good score might actually keep your resume from getting thrown out immediately.
Engineering competence is not even slightly oversupplied compared to useful engineering work. Project failures, incompetent people, and systematically incompetent orgs are still very much alive at the most selective tech employers. There are real business needs to hire better engineers.
So they have a full 30% of their applicants qualified to be there but they have to narrow it to 1%. Whatever method they choose must have the appearance of meritocracy, abide by laws and regulations, be resistant to corruption, achieve goals other than academic performance such as culture, volunteering, and sports, and so on.
I think the same must be true for companies. At some point in the elimination process everyone is technically qualified, so they might as well hire someone "because we like his face". But that's demoralizing and corrupt so they invent some criteria that on its face seems useful, even though it's truly not.
That was until I interviewed a lot of people alongside my colleagues over these years. Interview processes are highly biased by the interviewer's background, values, or even mood.
Some interviewers are too sloppy when interviewing: asking ill-defined questions, demanding the answers they want, or just being in a hurry and wanting to get back to work. I often feel bad and angry for interviewees; they spent time and patience preparing themselves carefully, then were treated very casually. It isn't fair at all.
I hate to say it, but although standardized tests are bad, they are better than most interviews nowadays.
> This course will prepare students to interview for software engineering and related internships and full-time positions in industry. Drawing on multiple sources of actual interview questions, students will learn key problem-solving strategies specific to the technical/coding interview. Students will be encouraged to synthesize information they have learned across different courses in the major. Emphasis will be on the oral and combination written-oral modes of communication common in coding interviews, but which are an unfamiliar settings for problem solving for many students.
At least personally in my hirings I'll never use or trust anything like this.
There are so many things wrong with this approach that I'm kinda speechless as to where to start.
Deleted Comment
The best way is someone brings in an existing code portfolio and discusses it.
The second best way is someone completes multiple design and development exercises of varying complexity, constraints, and use cases.
The third best way is they complete a single exercise and provide commentary on alternative designs.
There is no fourth best way; all other approaches are essentially stochastic and select for interviewing traits, not development traits.
The actual best method, I think, is a 3-month probationary period, which is more or less an extended interview. They're asked to contribute to existing codebases, participate in code review, go through some architecture design sessions, and conduct stakeholder interviews: things that, again, are mostly impossible to accurately gauge in a typical candidate assessment window.
By the way a tremendous book for interviewers and hiring managers is How Judges Think by Richard Posner. A lot of it applies to hiring, and he's a great writer.
Number 3 is my favorite, but there's a trade-off: almost anything large enough to allow for multiple designs is probably too large to require all applicants to complete. But I also love having a candidate discuss their own code. I'm really curious what types of exercises you have for this kind of thing, and where it fits in the interview process?
I’ve seen this mistake happen so many times. Especially in early stage start ups. If you want morale issues and disruptions, you’ll achieve it 100% of the time by retaining underperforming staff. Even worse, this ultimately leads to your best team members leaving if it’s an issue you can’t solve. Fail fast applies to HR too, you need to learn to fire fast.
My workplace has a six month probationary period; I brought in a mandatory three month review after watching one group screw this up badly. At the end of the six months, the employee came in to work all happy as usual thinking everything was great, and was promptly fired.
The three-month review is the point at which the employee is told that they're on track or that they're below standard. If on track, they just keep going the same way, and if they don't get taken aside for a specific chat in the following three months, they can assume they're going to pass the probation period; we have definitely turned probation failures into probation successes via this mid-point review. It's also their opportunity to tell the company what the company is doing wrong: what the company is doing that would make them choose not to stay. This too has happened, and we have retained good employees by listening to them at the three-month point and making changes.
If they're below standard, they're told what they need to improve and are offered help to improve, or they can just sack it now and walk (or, as happened once and once only so far, they're considered unrecoverable and we take a long hard look at how that person was hired).
The principle we subscribe to is that if the employee is surprised by the results of their probation period, that employee's team lead and by association "the company" has really screwed up. If an employee doesn't know how they're doing after six months on the job, something has gone very wrong.
I don't live or work in the US, though; we can dismiss someone during their probationary period with ease, but after that they have workplace protections. If we could fire anyone at anytime for anything we like, that effectively makes the entire employment period a probationary period. Probation never ends if you can be fired at any time.
Most states in the US have at-will employment terms. Probationary periods are common in other countries. And yet no one seems to want to do what you're suggesting.
Deleted Comment
Dead Comment
Why would anyone give this advice? Can we stop handing it out and encourage everyone to stand up for their rights instead? If you think about it, this advice effectively says: as a programmer you will have no life outside of work, and you are supposed to be an idiot who spends his time working and studying outside office hours even when you could be spending time with your family. So yeah, go for it and suck it up, you idiot.
At least that's how most employers handle this problem. And interestingly, by comparison, no one tells an MBA holder that it's a great career for passionate individuals who like to spend their whole lives studying.
And before you think I am against studying, that's not the case at all. If my company pays for it and I can kick back on a sofa during working hours to study, I am fine with it. But I cannot see any value in studying something at my own expense, on my own time, that will be outdated in 3 years anyway and so by definition only benefits my employer.
It seems relevant to getting a tech job. If one is looking for a job, chances are that things have changed since the last time they looked, which for most people these days is every 2-5 years on average.
This is not limited to engineering or tech, and extends to most specialized jobs in most industries.
Can we stop handing out this advice and encourage everyone to stand up for their rights instead?
In the right context, absolutely. In general, it's possibly a bad idea. You could always balance it out and educate folks about the rights of an employer, too.
If you were on my team, I would not expect you to sacrifice any of your rights. But if you kept falling behind your peers, to the detriment of the team's performance, at some point you would be put on a performance plan. The unfortunate thing about performance plans is that by the time one is enforced, things are close to unrecoverable.
And if you did find yourself failing, I hope someone tells you that:
The technology industry is an extremely fast-moving one. Many technologies used today didn't even exist or weren't popular a decade ago; in 2009, mobile app development and blockchain were pretty much unheard of. Engineers constantly need to upgrade their skills to stay relevant to the demands of the job market.
Because I will not.
Is this public to employees, or do you simply whip everyone until they work themselves to death without telling them the reasoning?
Also, I assume you are in the US. In most European countries you can do absolutely nothing about someone who works full-time on your team unless they, say, cause you financial losses or punch you in the face, so in other cases you can basically shove your plan up your bottom part.
It was always basically assumed that you would study all the necessary things on your own time, and when you sat in the office you would make productive things, a.k.a. deliver. You always had to code and show progress on a specific development task each week. There was never such a thing as time to research/study.
> It turns out that the median number of hours racked up by an MBA in his or her first year of employment is a whopping 54 hours a week
https://www.forbes.com/sites/poetsandquants/2018/03/06/the-6...
54 sounds about average for an engineer anyway. In Germany, for example, it's not rare to have 9-hour working days with 30- to 60-minute lunch breaks in between, which comes to about 45 hours a week. If you study 2 hours a day, read books, follow the news and trends, watch recordings of past conferences, etc., which are pretty standard things you are supposed to do for your job, then it easily comes to 54 hours a week. So that's not high at all.
Karl Marx's entire philosophy was based on the abuses of workers he observed.
Nowadays people mock progressive causes as 'socialism', mock unions and worker protections, demonize progressive politicians, and idolize oligarchs. Even though it's all against their own best interest.
Your idealistic notion of 'rights' doesn't exist and will never happen. You have no rights as long as someone controls your purse strings, which, with booming inequality, applies to more and more people nowadays.
I can't wait until most jobs are finally automated and we're done with this whole capitalist system entirely and have to figure out what to do next.
We have to figure that out much sooner than when all jobs are automated.
> What is the most costly technical decision made early on that the company is living with now?
> What is something you wish were different about your job?
> What has been the worst technical blunder that has happened in the recent past?
But I would be careful how you interpret these. In fact, I would almost factor in these answers in the opposite way from what I think you intended. The company that admits to the worst technical issues is at least honest and self-reflective. The company that doesn't admit to any serious issues might be just as bad or worse, but their strategy is to tell employees to lie about it rather than be open to addressing it.
Perhaps companies will start asking candidates to construct mathematical proofs of data structures, algorithms, formulas and common equations from university-level entrance examinations just to do a mobile app or a web dev job.
As soon as that happens, the 'ideal candidate' companies would expect to interview would be a very prodigious one: a former math Olympiad champion decorated with titles and research papers to their name.
You guessed it: 𝔜𝔢 𝔬𝔩𝔡𝔢 𝔩𝔢𝔤𝔢𝔫𝔡 𝔬𝔣 𝔶𝔢 10𝔵 𝔡𝔢𝔳𝔢𝔩𝔬𝔭𝔢𝔯.
Perhaps we’d produce better software if people prioritised correctness like this in practice!
Now, would this make sense for a graduate entry-level web / mobile app developer position? Interviewers looking for such candidates need to lower their expectations a bit for positions like that.
Deleted Comment
As much shit as we give white-boarding, I would have chosen it instead if it were an option.
Needless to say, after the interview ended I ran away from that deal.