A major problem with data in the job market is that there are bad actors on both sides, employer and employee, with incentives to lie about themselves; i.e., companies with toxic cultures won't reveal that, and neither will malicious or incompetent candidates. Many readers of HN know that companies can be bad, but don't recognize that, say, competent software engineers are not the median candidate applying for most roles. The other side has a filtering problem, too.
A second major problem with data is that it's often hard to know why something didn't work out. Yeah, not everybody's a good fit for a role. But where would you get an objective source of truth about an individual's performance? Or transpose it to dating: would you trust a group of ex-girlfriends or ex-boyfriends to give an accurate assessment of someone as a partner? Might they have incentives to distort the record? Where does good data come from?
So it's not just that AI lacks the data. It's that there are structural problems with ever gathering the data. It's not like the data's out there and we just don't have access. If it even exists, it's poisoned at the source.
The best hires I've seen in my life have come through word of mouth referrals. No data, no recruiters, no leetcode, no HR screenings. Good people know other good people. They trust one another, they are motivated to discuss opportunities of mutual benefit.
Of course that doesn't scale when you're trying to fill a large team quickly, but at that point you have to realize that "average" is about the best you can hope for.
> The best hires I've seen in my life have come through word of mouth referrals. No data, no recruiters, no leetcode, no HR screenings.
Nepotism. It gets a bad rap, but the reason it's effective is that you're essentially comparing hiring via a piece of paper with hiring via intimate knowledge of a candidate's actual skills and work habits. The latter clearly carries more information than any screening process could give you. Of course, this doesn't mean you should hire someone just because they're a relative or friend; that's not actually using the information gain. But we should recognize why the process is successful. If we want to build a fair system that doesn't rely on nepotism, we first have to recognize why nepotism works. I don't see these complicated hiring processes doing that.
Of course, you can also just use a noisy process. But if you're going to use a noisy process maybe don't make it so expensive (for company and candidate; money and time). If this is the method, then the process should be about minimizing the complexity and costs of hiring.
Some of the worst hires come through word-of-mouth referrals, too. That's how you get unqualified nepotism hires. In fact, networking can be thought of as a form of nepotism or cronyism (depending on how touchy a given person is about "nepotism" being applied to the manner in which they got their position). I know I'm likely alone in viewing such hiring practices - feelings over process - as a signal of an impending collapse. I think it's a sound heuristic, though, at least for established companies. If you're not connected to eminently qualified potential hires (and you're probably overestimating the quality of your network), "hire my friends" is a sure way to end up with an increasingly incompetent workforce.
These are arbitrary, subjective judgments that no amount of data aggregated by an AI, of any sample size or source, can settle. That's the fundamental flaw, and why it's all horseshit.
You know artists many years ago already covered all this.
> competent software engineers are not the median candidate applying for most roles
Most developers with a few years of experience are pretty competent. Most employers are not toxic waste dumps.
Give the worst interview candidates a try and they will do well. Give the best interview candidates a try and they will fail or leave early. Interviewing is a skill; anyone who is extremely good at it has most likely practiced a lot, and needing that much practice can itself be a signal. The more experienced the candidate and the worse the interview they give, the more likely you've discovered an unearthed gem.
I just interviewed someone who I am certain lied on their resume; they could barely write valid code during their interview, despite supposedly being a senior engineer with nearly 10 years of experience.
Randomly hiring the first people to apply (or "giving them a shot") is a terrible strategy unless you have tons of money to burn and no deadlines to meet.
We have internship positions for people willing to learn, but when we are trying to fill a senior position, we expect people to be at a certain level.
This is more subtle and profound than it might seem.
I work at a big tech company and do tons of interviews, and not even we have data about what works and what doesn't (and if that data does exist, it's a very closely guarded secret). Even if we had hard data, we would still be blindsided by our inherent biases: we know how the candidates we accepted did, but not how the candidates we rejected would have done.
I would even argue that knowing how well the accepted candidates did is also extremely difficult. Some assessment might be possible from simple metrics, but the ultimate question, 'how much value does this person add', is a hard one. I wouldn't always trust performance reviews and the like as a measure of someone's added value, since those introduce their own biases.
The data doesn't exist because it can't. There are no independent variables: even a team of one relies on good management, well-specified work assignments, and thousands of other factors. Maybe a hugely expensive unsupervised model could figure out how well one person with a ton of data on their working style would fit a specific team that has also produced a lot of data on how they work, but that doesn't seem to happen, so it's likely not more cost-effective than letting people decide.
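The selection-bias point above (we only ever see outcomes for candidates we hired) can be sketched with a toy simulation. Nothing here is real data: skill, interview score, and job performance are hypothetical Gaussian variables, chosen only to illustrate range restriction.

```python
import random
import statistics

random.seed(0)

# Toy model: each candidate has a true skill; interview score and later
# job performance are both noisy observations of that same skill.
N = 100_000
skill = [random.gauss(0, 1) for _ in range(N)]
interview = [s + random.gauss(0, 1) for s in skill]
performance = [s + random.gauss(0, 1) for s in skill]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sx, sy = statistics.pstdev(xs), statistics.pstdev(ys)
    cov = statistics.fmean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (sx * sy)

# Correlation in the full applicant pool (what we'd like to know)...
full = pearson(interview, performance)

# ...vs. among hires only (the data a company actually has).
cutoff = sorted(interview)[int(0.8 * N)]  # hire the top 20% by interview
hires = [(i, p) for i, p in zip(interview, performance) if i >= cutoff]
hired = pearson([i for i, _ in hires], [p for _, p in hires])

print(f"full pool: r = {full:.2f}, hires only: r = {hired:.2f}")
```

Even in this idealized world where the interview genuinely measures skill, the correlation you can observe in your own hiring data is far weaker than the true one, because everyone below the bar is missing from the sample.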
I have seen human recruiters and companies blatantly lie about what they want, and what they are willing to offer as well.
The bad-data problem doesn't go away because a human is doing the matchmaking manually. If anything, the process is adversarial against job seekers when it isn't done algorithmically.
Author here. I replied to a similar comment, but I'll reply to this one too because it's a really good point, and I think I should have done a better job of calling out in the post that humans suck at hiring as well.
Most recruiters are terrible, and they're kinda set up to fail because the data isn't there in exactly the same way. The difference is that good human recruiters can make some meaningful warm intros, let 2 engineers get in a room together to see if there's chemistry, and get the hell out of the way.
Similarly, I've had so many terrible experiences at job fairs for engineering students where companies only sent HR. Things like asking what stack they use and getting answers like 'Diverse and challenging technology is at the heart of our innovative agenda ...'. Even when I _know_ the insider details through friends, HR always does an appalling job of selling the place, even when it's actually good.
Humans being bad at social decision X is never an argument against humans making those social decisions. This is because of our moral status: we are responsible for our own social systems. We cannot morally evade or nullify that responsibility by asking a Magic Eight Ball to make decisions for us.
Science Fiction writers going back to Asimov have already imagined such dystopias as when machines imprison or enslave humans because humans cannot be trusted to know what is “best” for their own lives. And of course that is how despots like Putin justify themselves.
Maybe I am biased or suboptimal in my hiring, but I own my business and I will live or die by my OWN decisions, thank you. Does any CEO feel different?
AI doesn’t have the data. Neither do humans. So why are humans better again? I see it answered in another comment: humans are better at deciding who they like working with. Which is fair enough, although AI might help remove some of the biases that creates.
By the way, this is a very hard problem because it is like solving "war" in a sense. People need money to not die, so they are very motivated to get a job, and therefore any system anyone comes up with gets gamed. There are two types of skill, the ones you use on the job and the ones you use in interviews, and there is very little overlap. It will be hard to un-game something people's livelihoods depend on.
I would even question how good humans are at deciding who they'd like working with.
Even more so than war it is like dating and that is exactly why I question the above. Humans can't even tell better than random who is a good marriage candidate with nearly unlimited time to make the decision. There is a game theoretic poker match going on but we pretend as if both sides don't have cards they hold close to the vest and that there isn't this huge random element with how the cards come off the deck.
The flush draw hits and the hiring manager pretends that it was some kind of fortune telling skill they have.
It is really a great example of how we think of ourselves as a highly evolved, post-Enlightenment society while using a process practically no different from something the medieval church would have come up with.
Totally agree with this. I was once tasked with getting insights from hiring data, and everyone was disappointed we just couldn’t get anything meaningful. There were echoes of something, but no signal could rise above the noise into the realm of statistical significance. It seems we just don't know how to measure what matters.
> everyone was disappointed we just couldn’t get anything meaningful.
Nice... my cynical side thinks that most of the time this is the expected outcome, in which case it wouldn't lead to disappointment, just to the next step of finding something that could plausibly be meaningful.
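The "no signal above the noise" experience is partly just sample size. A back-of-envelope power calculation shows how large a real effect would have to be before typical hiring data could detect it; the numbers below (40 vs. 4000 hires per channel, performance-score sd of 1.0) are made up for illustration, using the standard two-sample approximation.

```python
import math

def min_detectable_effect(n_per_group, sd, z_alpha=1.96, z_power=0.84):
    """Smallest true mean difference detectable at alpha = 0.05 with
    80% power, via the usual two-sample normal approximation."""
    return (z_alpha + z_power) * sd * math.sqrt(2.0 / n_per_group)

# Hypothetical: comparing two hiring channels on a performance score.
small = min_detectable_effect(40, 1.0)
big = min_detectable_effect(4000, 1.0)
print(f"40 hires/channel:   detectable effect >= {small:.2f} sd")
print(f"4000 hires/channel: detectable effect >= {big:.2f} sd")
```

With only a few dozen hires per channel, any true difference smaller than roughly two-thirds of a standard deviation in performance is statistically invisible, which is consistent with the disappointment described above; you'd need thousands of comparable hires to see modest effects.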
Author here. 100%. Most recruiters are terrible, and they're kinda set up to fail because the data isn't there in exactly the same way.
The difference is that good human recruiters can make some meaningful warm intros, let 2 engineers get in a room together to see if there's chemistry, and get the hell out of the way.
I should make this clearer in the post. "Recruiters can't do hiring... and neither can AI. Both can't because the data's not there."
As it stands now, modern recruiting is roughly the same as an AI recruiter.
Scrape 10,000 emails and basic biographical details, then blast everyone with an exciting opportunity at an {X} company doing {Y} thing that raised {Z} money. Insert unlimited leave, work-life balance, and some other thing nobody cares about but the company.
If you wanted ideal data, you'd want an employer to hire large numbers of people for a single role, hire them as randomly as possible, and be willing to train them. You could then see what sort of candidate was successful there.
There are a handful of militaries that effectively do that, but it's pretty much impossible for a normal employer.
When I was at Google, I remember hearing that this experiment was done on interns once. It wasn’t completely random, but they let in people who failed the standard Google interview to validate the effectiveness of that interview. Since interns were hired for a limited time, there was a lower cost to a false positive hire compared to an FTE.
Which militaries? I'm just genuinely curious. To my knowledge, most militaries in the world use IQ tests to determine placement qualification which is more or less completely antithetical to this concept. It'd be interesting to compare and contrast how it works out.
AI can't do hiring because it knows the best hire is the one you don't make: why bother with a pesky human when it can do the job itself cheaper/faster/(soon-ish) better?
Aside: having a LinkedIn is a red flag in my book. I understand that desperation could lead one to try getting a job through that cesspool of anti-patterns glued together with pure spite for the user; nevertheless, a red flag. To put it another way: if having a LinkedIn is a requirement for you to get hired, your job will disappear in 5 to 7 years (due to AI or other corporate movements).
I hate that "service" as much as you do, but people use it for all sorts of reasons. So if you're going to use it as a blocking filter, well, I guess that's the kind of luxury you can treat yourself to.
Yes, sorry Mr. Dean, no job for you here. Even if he told me he wanted to work for me, I would just call an ambulance, for he must be having some kind of stroke.
As I typed the initial rant, I opened an incognito tab and searched "Andrej Karpathy LinkedIn"; I had the same curiosity as you about whether the top stars use that "service". He has an account. I clicked the link, and LinkedIn took me through a quick security check in which I had to select which bull was standing in the "right position". It then opened Mr. Karpathy's LinkedIn page, and I saw as his last activity a posted article saying he's hiring for Tesla AI, which is weird because I knew he quit. I clicked the article to read more, and LinkedIn opened a Log In modal. I closed the page and promised myself I will never again bait myself into opening LinkedIn (I'll probably break that promise in a year or two; one must check LinkedIn to see the state of the tart† in what not to do in user experience).
† wanted to use "f-art" but ChatGPT recommended "t-art" as wordplay: 'This phrase combines the concept of a "tart," which can refer to a promiscuous or morally questionable person, with the original phrase. It adds a touch of humor while implying a negative connotation. However, please note that wordplay involving potentially derogatory terms or stereotypes should be used with caution and sensitivity.'
Impressionism: https://en.m.wikipedia.org/wiki/Impressionism
I’ve reviewed hundreds of assignments in the last two years. No, that’s not true, chief.
Part of what they want to predict is your ability to complete the training programs.
https://www.linkedin.com/in/jeff-dean-8b212555/