My experience hiring machine learning talent over several years has been that people over-hype the cost of a false positive. That includes both the article's false positive (expending the cost of interviewing on someone who ultimately turns out to be the wrong fit) and a more fundamental one: actually hiring someone who would hypothetically fail a lot of these interview pipelines.<p>Discussions about making these pipelines more quantitative, with assessments and quizzes, always carry a tacit assumption that the <i>worst</i> outcome would be to actually hire someone who fails one of these interviews. Rejecting a good person sucks, as they say, <i>but not as much as hiring one of the multitude of sneaky, low-skilled fakers out there.</i><p>And of course, everybody's got their hot new take on how to spot the supposedly huge population of fakers.<p>What I have learned is two-fold:<p>(1) That person who aced all your interviews and finally looked like the perfect hire probably just spent 3-6 months utterly failing a bunch of other interviews, just to get into "interview shape": refreshing on all the nonsense hazing-style whiteboard trivia about data structures that they had never needed in years on the job, etc. So it's totally asinine to believe that someone passing through all your filters must be the sort of person <i>who would rarely fail</i> some filters. <i>That person almost surely did fail filters, and the companies where they failed believe they dodged a costly false-positive bullet, while you believe you just made an offer to the greatest engineer.</i> Hopefully you can see the myopia here.<p>(2) The cost of passing up a good-but-failed-at-interview-trivia engineer is often far greater than the cost of hiring them. For one thing, "suboptimal-at-interviews" engineers are often pretty damn good engineers, and they can do things besides esoteric algorithm trivia, such as helping your business make money.
Another thing is that many engineers can generalize what they learn, generalize from example code or templates, etc., very efficiently. So while they might reveal a weakness by failing part of an interview (and <i>everybody</i> has such weaknesses), why do you really care? They can probably become an expert on that weak topic in a matter of months if they work on it every day, or if you have existing employees who can mentor them.<p>But the biggest thing is part of what Paul Graham wrote in "Great Hackers": good engineers tend to cluster and want to work with other good engineers.<p>So if you're sitting there without a few good engineers already on your team, then mistakenly rejecting a great candidate who happened to have a bad day, or who happens to hate writing tree algorithms on whiteboards, leaves you running a huge risk of losing out on a good engineer who could help kickstart the phenomenon of attracting the next good engineer.<p>When your team is in this stage, you can absolutely manage with a few "dud" hires who need a lot of help or who have skill gaps in key areas. The cost of adding them to the team and managing their "suboptimality" is <i>far less</i> than the continued search costs brought on by rejecting good candidates with an overly risk-averse hiring threshold, leaving your team in a state where it still doesn't have a good engineer to help attract more.<p>In other words, the loss function penalizes false negatives more severely than the combined penalty from effort spent on true negatives and the suboptimality / management costs of false positives.<p>But all these skeezy interview-as-a-service businesses want you to believe that the opposite is true: that if you accidentally hire a "faker" because your hiring process was too easy, then Cthulhu is going to rise out of the sea and lay waste to your company.<p>Of course they want you to believe that. That's how they make money.
Preying on your fears over what would happen if you just unclench and treat candidates like human beings with strengths and flaws, instead of holding them up to ludicrous standards that lead to self-selecting macho 22-year-olds getting hired because they just spent 10 months on leetcode.<p>Once you realize this, it becomes obvious that onerous code tests, brainless data structure esoterica, hazing-style coding interviews, and especially businesses that offer to outsource that nonsense, like TripleByte, are all just snake oil.
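<p>To make the loss-function framing concrete, here's a toy expected-cost model in Python. Every number in it is a hypothetical assumption, not data from any real hiring process; the only point is that once you assume the false-negative penalty (a missed good hire) dominates the false-positive penalty (managing a dud), a strict filter can easily cost more per candidate than a lenient one.

```python
def expected_cost(tpr, fpr, p_good=0.5,
                  cost_fn=10.0,        # assumed penalty: rejecting a good candidate
                  cost_fp=3.0,         # assumed penalty: hiring/managing a "dud"
                  cost_interview=1.0): # assumed cost of running the interview itself
    """Expected cost per candidate screened, under an assumed loss function.

    tpr: probability the filter passes a good candidate
    fpr: probability the filter passes a weak candidate
    p_good: assumed fraction of good candidates in the pool
    """
    false_negatives = p_good * (1 - tpr)    # good candidates you reject
    false_positives = (1 - p_good) * fpr    # weak candidates you hire
    return (cost_interview
            + false_negatives * cost_fn
            + false_positives * cost_fp)

# A lenient filter that passes most good candidates (and some weak ones)...
lenient = expected_cost(tpr=0.9, fpr=0.4)
# ...versus a strict filter tuned to screen out "fakers" at all costs.
strict = expected_cost(tpr=0.5, fpr=0.05)

print(f"lenient: {lenient:.2f}, strict: {strict:.2f}")
```

Under these assumed penalties, the lenient filter comes out cheaper per candidate, because the strict one burns most of its budget on rejected good engineers. You can disagree with the specific numbers, but to defend the strict filter you have to argue that a dud hire costs more than a missed great one, and that's exactly the claim I'm disputing.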