I once took their quiz, without much thought, from a throwaway account, because I was afraid of taking it from a real account and because I had no context about what would be in it, which made it seem high risk. That made me realize it wasn't as scary as I'd feared, but it also kind of summarizes what I don't love about it.
Later, I think I took the quiz from a real email. I think I could have scheduled an in-person or something. But I was always afraid to: just like with the quiz, it felt like another layer of high risk, where being slightly off on one thing could potentially blacklist you forever. (Which sort of sounds absurd to me now, but young me was like "what if it matters?" and "it'd be humiliating to not make it through.")
I work for a company that's been invested in by reputable venture firms, so I don't think I'm bottom of the barrel as an engineer or anything. But the above anecdotes illustrate that the platform always pushed my deeply ingrained risk-aversion buttons. Which is interesting to think about.
With job interviews it's extremely annoying to prove your bona fides over and over, with every interview or application. But having the opportunity to do so only once seems even worse.
Edit: that said, I think taking a professional exam after college wouldn't have seemed risky in the same way as a ~12-question quiz and a couple of interviews being so potentially determinative.
Oooh, this lets me tell a great anecdote that didn't have a place in the post.
So it turns out you were not alone. Getting people who did well on the quiz - whose parameters were trained on interview performance, by the way, so "did well on the quiz" was synonymous with "had a good shot at the interview" - to sign up for interviews was a major problem for a while.
The solution? We told them the first booking was just a "practice" interview. And then if they did well, surprise, it was a real interview all along. (If they didn't, we'd let them try again, though it almost never changed the outcome.) IIRC this like doubled booking rates overnight. I don't really plan to do this at Otherbranch (partly because being aggressively upfront about things is a founding value, even when it comes to white lies of that sort), but it's a fun story that I don't think was actually unethical.
There's a lot I could (do [1]) say about the psychology of interviewing, and it's something I'd love to write more about down the line. Mental health - especially around anxiety and depression - is my #1 personal cause and job hunting touches so concretely on so much of it.
[1] https://old.reddit.com/r/cscareerquestions/comments/1daumg4/...
> The solution? We told them the first booking was just a "practice" interview. And then if they did well, surprise, it was a real interview all along. (If they didn't, we'd let them try again, though it almost never changed the outcome.)
Honestly, saying it's a "practice" interview, and then saying "Oh, you did so well we're just going to let you skip the real one" isn't lying. Some people might need to believe there's a safety net in order to join. Other people might really need practice.
I would love to know whether Triplebyte's data was sold off somewhere. I also had this concern (which was probably a bit overblown). One of the main scenarios I was worried about was the firm being sold off for parts.
I went through their process for frontend like 6 years ago and got to the in-person interview, and it was the most inane BS I have ever wasted time on. A bunch of questions like "list the best practices for X" and "tell me about <buzzword tech thing>".
I did the online interview and then the in-person one and those worked out ok. But, I think Triplebyte was only viable because the huge engineering shortage at the time made that high-touch and expensive process worth pursuing. Its purpose was to find talented people who otherwise would fall through the cracks. Now there are unemployed engineers with impressive resumes everywhere, so there is less demand for a deep-digging recruiting operation like Triplebyte.
It's interesting to see how much "standardized cross-company interview for software eng" has been consistently a cursed problem in the industry.
Unlike airline pilots (or I'm certain many other professions), every company in the valley insists on re-interviewing a candidate in their own custom and unique way, instead of trusting some sort of an industry-wide certificate only achievable through a standardized test. Wonder if this will ever be solved.
I remember way back in the day when it felt like Triplebyte might finally figure that one out, but unfortunately it never happened.
If you create a standardized test it will be gamed. Even with the small modicum of standardization around interview questions that we currently see, people have published books like Cracking the Coding Interview, making it easier for people who don't have the skills for a particular job to pass interviews at any place that uses standard-ish questions.
Furthermore, as an avowed enemy of “Clean Code”, I don’t want to see standardization because I fear that well-promoted ideas that I think are terrible would become required dogma. I prefer chaos over order that I don’t like.
The current system is already gamed and virtually standardized. The only difference that official standardization would present is that applicants would no longer have to go through the Leetcode gauntlet each time they want to switch jobs, which would save a breathtaking amount of time and effort currently occupied by wasteful redundancy in the interview process.
Corporations can use that standard exam/license as a baseline and then focus their interviews on domain-specific questions and the like. The existence of standardization does not negate custom processes.
How do we let someone fly hundreds of people through the upper atmosphere with a certificate, but you can't make a login page with javascript without a unique multi-day interview for each distinct company?
> If you create a standardized test it will be gamed.
Well, the medical profession has a standardized licensing process. It's not perfect, but it certainly keeps the interview process focused on (mostly) mutual interest.
I think we can learn from the medical profession here. Otherwise, "I prefer chaos" implies that the incompetent are the ones who will lose.
Just out of curiosity, what are some of the problems with "Clean Code"? I thought most of it made sense as basic guidelines. It's been a while since I read it though
Standardization reminds me of old stories about 1970s-80s blue-chip companies trying to hire programmers like they hired secretaries. They'd test applicants for things like words-per-minute typing speed and simple programming tests, hire in bulk, and then dole batches of them out to various departments. Which sounds like Triplebyte's model, the motivation behind things like Clean Code, and the webshitification of everything.
Opposite of that is the idea that work and interpersonal habits, communication skills, and domain knowledge are more important than raw programming skill for most jobs.
> Unlike airline pilots (or I'm certain many other professions), every company in the valley insists on re-interviewing a candidate in their own custom and unique way, instead of trusting some sort of an industry-wide certificate only achievable through a standardized test. Wonder if this will ever be solved.
The airplane pilot interview process on top of the standardized government certifications includes:
- On-line application (resume and cover letter)
- On-line Psychometric/Aptitude testing (sometimes this is hands-on, on site for some airlines)
- Video Interview, SKYPE or Telephone interview
- Assessment Day (includes: Technical Questions / Panel Behavioral Interview / Scenario Based Questions / Flight Planning Exercise and sometimes a Clinical Assessment)
- Sim Test
- Referee Check
- Medical Check
The exact details differ by airline and I'm assuming the risk profile of the cargo (ie: passengers or not).
Gosh, not so different from software engineers, is it? Except you also need to do a bunch of bureaucratic certifications on top of that.
Not to mention all of the licensing, regulations, and formalized training hours that you have to put in just to reach that point. It’s all substantially harder than studying LeetCode for a short slice of your life.
It’s amazing how often I hear about how easy interviews are in other professions, according to engineers who dislike coding interviews.
Then you look into those other professions and it turns out changing jobs is actually a lot harder than the internet comments would lead you to think.
It's completely different from software engineering interviews. The process you described for airline pilots gets at the actual qualifications for the job. Whereas for software engineers, literally no one needs to reverse a binary tree, yet they base the decision in large part on this sort of question. Ideally, the BS would be encapsulated in a certification so that interviews could focus on the real useful stuff.
For many roles the interview is as much a cognitive function and socialization test as it is a skills test. You can have exquisitely detailed knowledge of systems internals (skill) but if you have limited working memory (cognitive) then you will struggle to design non-trivial systems software. These are orthogonal dimensions. You might prefer someone with high cognitive aptitude and low skill, since the latter is readily trainable.
Cataloging a list of skills is insufficient to determine whether a person is suited for a role. I don't find it likely that software engineers will be subjecting themselves to a battery of standardized cognitive function tests any time soon.
Is law a good example? My understanding is if you didn’t go to a top 14 school (whoever came up with that arbitrary number) it basically forecloses on the best opportunities.
It's because companies don't want capable, experienced or well-equipped. They want genius and it is really hard to test for genius. Granted, almost nobody that gets through any process is an actual genius....
I'd say it's the exact opposite. There are hordes of unqualified people applying to every software dev role imaginable regardless of what you put in the job description or requirements. The tests are there because people are good at lying but bad at faking skills.
This process is also pretty much guaranteed never to yield mid-career geniuses at the height of their powers. Those candidates don't go looking for work at all. Work comes looking for them. Why would they go on _any_ jobs platform, ever? Effective filtering of the candidates who actually engage with the platform can, at best, accurately identify the next tier down: effective engineers in mid- and late-career, and inexperienced whiz kids. Not that this is a bad thing; that first category makes the world go 'round.
> Those candidates don't go looking for work at all. Work comes looking for them. Why would they go on _any_ jobs platform, ever?
Because I don't know what’s out there, or who will give me the best offer. If you’re skilled and in the middle of your career, it’s easy to find a job, but if your options are wide open, a matchmaking service like this with a wide pool of companies is very valuable.
The problem is that software development is less like hiring an airline pilot or a structural engineer, and more like hiring an artist. Try making up a "standard exam" that will tell you whether an artist will produce several great unique works for you in the future, so you know which one to hire...
That's an interesting point, but then one wonders: if software engineers are ultimately artists, why are we not having them work on portfolios like the other art disciplines do? Is that the fundamental problem?
Yet frankly, what most of us do is more like plumbing than art, in that we're just fitting systems together, in that it's actual skilled labor, and in that we're seen by everyone else as the ones willing to do the shitty work.
Management puts up with us and they pay us because even though they think they can do our work, they wouldn't want to.
Plumbers are licensed and unionized, two possible solutions to the problems posed in this thread.
Part of me wonders if the recruiting itself is over-engineered anyway. I mean, imagine if you just asked:
Implement Bubble Sort, in 2 Languages of your choice, with multithreading or other language-provided parallelism, with the correct number of parallel threads to be most algorithmically efficient
Would that really not weed out a lot of people? I think it would. I know the above algorithm is hardly production-ready, but the requirements are easy to understand. (It's also a bit of a trick question - there is no optimal algorithmically efficient number of threads in a bubble sort, only the number of CPU cores in the system.)
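For the curious, here's a rough sketch (mine, not from the thread or from any Triplebyte material) of the kind of answer that question seems to be fishing for: an odd-even transposition sort, the usual way to parallelize bubble-sort-style compare-and-swap passes, written in Python for illustration.

```python
# Sketch of a parallel "bubble sort": odd-even transposition, where each pass's
# compare-and-swaps are independent and can be split across workers.
from concurrent.futures import ThreadPoolExecutor
import os

def _sweep(data, parity, lo, hi):
    """Compare-and-swap pairs (i, i+1) for i in [lo, hi) with i % 2 == parity."""
    for i in range(lo + ((parity - lo) % 2), hi, 2):
        if i + 1 < len(data) and data[i] > data[i + 1]:
            data[i], data[i + 1] = data[i + 1], data[i]

def odd_even_sort(data, workers=None):
    # As noted above, the only defensible worker count is the CPU core count --
    # and in CPython these threads won't even run truly in parallel because of
    # the GIL, which is part of what makes the question a bit of a trick.
    workers = workers or os.cpu_count() or 1
    n = len(data)
    chunk = max(2, n // workers)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for pass_no in range(n):           # n alternating passes guarantee a sorted result
            parity = pass_no % 2           # even pairs on even passes, odd pairs on odd passes
            futures = [pool.submit(_sweep, data, parity, lo, min(lo + chunk, n))
                       for lo in range(0, n, chunk)]
            for f in futures:
                f.result()                 # barrier: finish this pass before starting the next
    return data

print(odd_even_sort([5, 2, 9, 1, 7, 3]))   # [1, 2, 3, 5, 7, 9]
```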
Our coding problem is easier than that (by a fair margin, it's all completely synchronous single-threaded procedural code unless you're doing something extremely weird) and it weeds out the vast majority of applicants.
The same was true of all three of the standard coding problems Triplebyte used. They're not quite literal fizzbuzz, but they require - at most - some basic critical thinking and basic language features, and that is a filter that eliminates 90+% of applicants to dev jobs. Now, granted, this is under time pressure. I imagine, given several hours, most could finish it (although maybe even that is overestimating things). But still.
There's an old Randall Munroe article quoting a physicist:
> The physicist who mentioned this problem to me told me his rule of thumb for estimating supernova-related numbers: However big you think supernovae are, they're bigger than that.
and I feel like this applies to recruiting: however bad you think the average applicant is, they're worse than that.
This sounds like what a lot of companies do - except the problem at scale with this approach (and with the certification approach of the grandparent) is that most companies want to avoid candidates who've memorized a specific solution, as then they don't get any data about whether the candidate can code anything aside from what was memorized.
The other problem is that implementing bubble sort will tell you about their skills in a particular dimension, but being a software engineer these days may look very different depending on the job.
I do a tree-search-ish thing when interviewing people. I’ll start with a super basic question about something beginner-ish on their resume. If they can’t answer that, the interview is politely wrapped up. I’ve eliminated a surprising number of people who had jQuery on their resume by asking them to write code that will make a div with the ID “mydiv” disappear if the user clicks a button with the id “mybutton”.
After that I ask a super difficult or niche trivia question like “in CSS, what two overflow properties cannot be used together?” If I’m hiring a mid-level frontend developer and they nail that one, I go “fantastic, great answer, do you have any questions for us?” And the interview can end early.
But if they miss that, no sweat, I’ll start asking more mid-level technical questions to figure out where they’re at.
It's also a mostly useless problem for determining engineer quality, in many cases.
It tests for pure coding ability, when most organizations should be optimizing for engineers that are going to be productive-if-not-spectacular, that can design and build maintainable systems.
Could I have solved the above problem back in my engineering days? Probably not, since I went years without working with threads. But I also wasn't working on problems that would ever have benefited from knowing that. Most software engineering roles are essentially building CRUD or ETL systems, maybe with a user interface. Any coding problems given should be optimized for weeding out bozos (which are still plentiful), not for weeding out the most people.
I find picking good questions is hard, and many fall into similar patterns, making them something candidates can practice for.
Even your question isn't something I'd necessarily ask on the spot. Many engineers don't use parallelism in their day-to-day work (webdevs). The part about making it efficient is interesting, but feels borderline like a trick question that a good engineer could fumble.
I don't think we are over-engineering it. You want to "weed out" everyone but the best candidate for the role, or the best candidates for your open roles. It's a very hard problem to identify the "best" person from a group of people. It would be different if all programmers that are good enough are the same, but we all know the skill ceiling on programming is very high. Selecting the best devs is a critical function for any software project.
Airline pilot is not a great choice for comparison.
Airline pilots are selected to be entrusted with many lives, with the utmost professionalism, and to perform with ability under stress.
And it shows, in the amazing track record of aviation safety.
Most software developers, on the other hand, are mainly paid to mechanically type out code, cribbing from StackOverflow or ChatGPT whenever they don't know something, to "get it done".
And it shows, with the atrocious track record of the entire industry at information security, for one obvious metric.
I was Triplebyte's first engineering placement. I still remember going to a random SoMa apartment with Harj and Ammon and Guillaume and coding up Tetris in Ruby, having no prior experience with game loops. That landed me a job with Flexport in 2016. I doubt that I would have gotten that placement without Triplebyte. So I am quite grateful that they existed, for jumpstarting my early career.
With that said, when it came time to look for a job again a few years later, I did chat with Triplebyte but ultimately took an offer through other contacts that I had built up by then.
After my (virtual) TB interview (which I barely passed), I had onsites at 5 places. After the five on-site interviews, I had 2 job offers, one of which was a company I wanted to work for since graduating college. I took the other offer.
This was preceded by a four or five month job search. I had received two offers in that time, but nothing seemed great.
I think TB's process kinda worked, but I understand your skepticism.
I feel like they didn't cover one of the massive losses of confidence in Triplebyte, which was when they opened up their database of candidates and did not really warn people ahead of time. I'm struggling to remember all the details, but I believe this happened a few years ago, and it really pissed off the HN community; a lot of people scrambled to hide the fact that they had screened with Triplebyte. I feel like, as an outside user, this was the nail in the coffin. The dark patterns just took over.
It's not covered extensively in the article, but it was alluded to at one point via the phrase "pissed them off with anti-privacy decisions". (In context, 'them' refers to the engineering side of Triplebyte's user base.)
Anti-privacy can mean "we accidentally leaked that you visited facebook.com on Nov 15" or "we allowed your employer to see that you're actively job hunting."
I mean technically the same term is OK to use for both, but it does feel like it's burying the lede.
I felt that was an odd omission in the article. Maybe it didn’t have a huge market impact outside the local crowd, but it was a big deal then. I seem to remember the CEO posting in the threads apologizing, etc.
It sounds like the core problem is that companies don't really have trouble screening large volumes of incoming low-quality resumes. It's annoying and time-consuming for the software engineers, but it's a well-known process, and the CEO/CTO/COO/VP who controls the budget can just make the engineering team do it.
The core problem that companies are willing to pay for is "top of funnel". The obviously skilled, experienced software engineer that every company wants. How do you make them interested in your company in the first place? Triplebyte did not really have a cost-effective solution for that, although that's what people wanted most when buying their service.
They can just make the engineering team do it, but that's really expensive.
A failed onsite is more expensive than if the candidate walked in the door, grabbed someone's laptop, threw it out the window, and left. That's a big deal. Even a 15-minute phone screen at typical Bay eng salaries is like a nice steak dinner (particularly if you include disruption to actually go do it).
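Back of the envelope (my numbers, not the commenter's), the steak-dinner comparison holds up:

```python
# Rough cost of a 15-minute phone screen at Bay Area engineering rates.
# All figures are assumptions for illustration, not from the thread.
fully_loaded_rate = 150.0   # $/hour including benefits and overhead (assumed)
screen_hours = 0.25         # the call itself
disruption_hours = 0.5      # context-switching before and after (assumed)
print((screen_hours + disruption_hours) * fully_loaded_rate)  # 112.5 -- steak-dinner money
```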
> How do you make them interested in your company in the first place? Triplebyte did not really have a cost-effective solution for that
They absolutely did! It was getting companies to agree to skipping straight to the onsite after a 15-minute recruiter call.
This lets you skip the back-and-forth of screen scheduling and result chats. In turn, that saves weeks of time during a job search and allows you to have all of your onsites within a short time period, so offers come in within a much narrower window of time. You can make apples-to-apples comparisons about which role you might take that way, so it's a much lower cost for a much greater payoff to spend a few hours interviewing with a company you weren't thinking of to see if it's a fit.
In my mind, this was the absolute killer feature. Seven years later I am still at a job I got through Triplebyte, at a company that impressed me via their onsite, and I would not have thought to apply if they had not been on Triplebyte's skip-to-the-onsite list.
The article says people loved Screen, and as a user myself I concur; I used it for exactly what was mentioned - screening junior candidates efficiently. Theoretically, if the company hadn't been venture-funded, they could maybe have waited for those junior candidates to turn into the senior ones that people want. I wish someone would bring back that tool.
I can't help but want to draw the parallel between Triplebyte and dating apps: both are marketplaces for two parties, or networks, and both networks are inherently two-tier: one premium tier and one, let's say, subpar tier.
For the premium tier, the problem is less so the efficiency of the matching process, because participants have powerful signals; I would say the problem is the efficiency of negotiating a fair term because both parties are not incentivized to go to a public auction, hence no market price. So the hiring process may be less about investigating potential candidates and more "let's just wait two weeks and see who's the highest bidder".
For the subpar tier, the problem is less so the efficiency of the matching process (this still is part of the problem), but more so a cost-benefit analysis: is it worth throwing X amount of resources to filter a huge pool of unknown-quality candidates?
Obviously, one more problem is that there's no easy way to use incentives to quickly bump up the supply.
So intrinsically, Triplebyte/dating apps tried to use tech to alleviate the pain point, but it is part of a bigger, convoluted mating dance, and improving a non-critical part of the whole pipeline doesn't yield a so-called step-function gain overall. That, plus venture backing, forced Triplebyte down the rabbit hole of trying to acquire as much of the lower, subpar-tier network as possible, which diluted their attention and human touch on the premium tier - a death spiral.
Sorry, I do not intend to bash the author or the founders for picking this problem/user need to solve; it just occurs to me that this heterogeneous network, with no easy way to increase supply or lower demand, is really hard for a marketplace company, and sometimes simply throwing technology at the white-hot competitive segment may not be the groundbreaking solution. I wish the author all the best with their next endeavor.
We talked about the dating-app analogy a LOT, and I think it goes quite deep. It applies to almost all competitive markets, especially matchmaking ones.
> I would say the problem is the efficiency of negotiating a fair term because both parties are not incentivized to go to a public auction, hence no market price.
Are you referring specifically to salary negotiation here? I think this is one of the things that makes a trusted intermediary useful.
During signup, or during an initial recruiter call with candidates, I ask about salary with the following script:
> Okay, what are your salary expectations? To be clear, I will NOT share this with clients unless you tell me that I can. This is just so we know what jobs you might be interested in - the only thing we'll communicate is whether you match a client's salary range or not and vice-versa.
While I doubt the incentives here align to perfect honesty, it's certainly a lot better than the status quo, in that each side gets information on the other only after revealing semi-honest preferences used to match them.
> For the subpar tier, the problem is less so the efficiency of the matching process (this still is part of the problem), but more so a cost-benefit analysis: is it worth throwing X amount of resources to filter a huge pool of unknown-quality candidates?
Yeah, this is where the pitch about centralization - the main pitch I want to make with Otherbranch, since I think the full background-blind pitch was overselling things - comes in. There's some threshold where the cost-benefit analysis is neutral for companies, in the sense that there is 0 expected value to the next marginal candidate. But if we're effectively interviewing for more than one company, that threshold is lower for us, because the cost is similar and the benefit is higher.
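To make the centralization argument concrete, here's a toy model - every number below is invented for illustration, and the framing is mine rather than anything from the post:

```python
# Toy model: the screening cost is paid once, but the chance that *some* client
# wants a passing candidate grows with the number of clients sharing the screen,
# so the break-even threshold drops for a centralized screener.
def expected_value(clients,
                   cost_per_screen=300.0,    # interviewer time per candidate (assumed)
                   pass_rate=0.10,           # share of candidates who pass (assumed)
                   match_rate=0.30,          # P(a given client wants a passer) (assumed)
                   value_per_match=5_000.0): # value of a successful match (assumed)
    p_place = pass_rate * (1 - (1 - match_rate) ** clients)
    return p_place * value_per_match - cost_per_screen

print(expected_value(1))   # -150.0: below break-even for one company screening alone
print(expected_value(5))   # ~116:   positive once the same screen serves five clients
```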
I don't expect we're going to be able to get a mediocre developer with a mediocre resume an amazing job. I'm not really trying to do that. I'm trying to get an amazing developer with a mediocre resume an amazing job, or a good developer with a mediocre resume a good job - one better than they could have gotten on their own.
> > Okay, what are your salary expectations? [...] This is just so we know what jobs you might be interested in - the only thing we'll communicate is whether you match a client's salary range or not and vice-versa.
> [...] I'm trying to get an amazing developer with a mediocre resume an amazing job, or a good developer with a mediocre resume a good job - one better than they could have gotten on their own.
Is the reason that you can get employers and workers to commit to salary ranges like this, that you're focusing on mediocre resumes?
That is: Workers with mediocre resumes can't afford to be too greedy and play their cards too close, and companies aren't too hesitant to filter out someone with a mediocre resume?
(BTW, the below sounds like it could be a good/great thing. I'm not criticizing it, just curious whether that's also why requiring salary ranges upfront would work here, when normally both parties would be resistant.)
> I'm trying to get an amazing developer with a mediocre resume an amazing job, or a good developer with a mediocre resume a good job - one better than they could have gotten on their own.
> Are you referring specifically to salary negotiation here? I think this is one of the things that makes a trusted intermediary useful.
Yeah, I agree. I do feel like you're not going to take the venture-backed route this time? Because in my opinion that's part of what drove Triplebyte down this route. Based on your writing, you probably concluded that Triplebyte's model back in its heyday could have been a decent business, just not a VC home run.
They got me a stellar job many years ago, but unfortunately for them it's hard to monetize candidates unless they return through the same pipeline. My outcome was great, but I was one and done.
I applied to TripleByte in 2019. I passed the interview, they sent me some swag, and then had my profile "go live" for a week(?). I didn't get a single interested company, and then they took my profile down and said I can try again in 6 months. I don't really have much of a takeaway from this, I guess, other than they also had other failure cases that caused them to lose money.
I went through the process and liked it too. The engineer made critiques of my code that I found silly, though, which makes me question whether I would really trust a candidate approved by them.
In my case, they didn't support my specialty (robotics/embedded) and focused the tests on DB scaling because that's what the interviewer knew. Like, I'm sure it was a good test for somebody, but it made no sense as a validation of my skills or experience.
This is a very interesting, very well written post. The part that stands out the most is:
> if you needed venture capital to produce a viable small-scale business, you didn't really have a viable small-scale business to begin with. Triplebyte's break-even status was an illusion, predicated on funding that would probably never have existed if that scale were the founders' only ambition.
I wonder if there's a way to build a viable business out of a product like FastTrack that does not depend on venture capital to get going?
> We told them the first booking was just a "practice" interview. And then if they did well, surprise, it was a real interview all along.

Maybe instead of a "practice" interview, people could just be told what to expect so they know how to prepare. Offering practice interviews just makes it sound like the interview is a performance.
> Then you look into those other professions and it turns out changing jobs is actually a lot harder than the internet comments would lead you to think.

Combine that with the fact that the upper bound on pay for SWEs is considerably higher than for pilots...
> instead of trusting some sort of an industry-wide certificate only achievable through a standardized test

It's not actually a mystery. The problem is that that sort of test is de facto illegal (in the US) due to the "4/5ths rule".
And that's a skill that's incredibly hard to test for.
There are many more job positions posted than will ever be hired for, despite the availability of suitable talent.
> going to a random SoMa apartment with Harj and Ammon and Guillaume

Pedantry: surely you didn't go to a random apartment; rather, it was their apartment.