One of the problems I've seen constantly, for 10+ years (been working for 20+, interviewing others for approx. 10+) is that there has never been a "baseline" skill level that you could always count on when you got a candidate in front of you. Any given person's programming and problem solving skill level could be absolutely anywhere on the spectrum between 1. "super proficient, fluent in multiple languages, highly structured thinking, clearly articulates a solution, can mentally visualize over a range of 20 orders of magnitude" and 2. "literally has no idea how to solve a problem, program a computer to do it, or even describe coherently what they are thinking". And these two extremes can have the same resume.<p>Some first level of the hiring funnel (HR, recruiter, or something) screens the candidate by looking at the resume and maybe doing an initial phone screen, and then this person gets dumped on the interviewer, and they 100% could be <i>anywhere</i> on that skill scale. So you always seem to have to start at "how do you turn a computer on?" and "write code that finds a substring in a string." It's really frustrating for the interviewers and I'm sure also insulting to actually good candidates.<p>I've never been in one of these businesses, but I bet if a hospital interviews a doctor, they can at least count on basic knowledge of anatomy, and if a law firm interviews a lawyer, they can assume the person has read a law book. We have no such guarantees in software.
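For scale, the baseline screening question mentioned above ("write code that finds a substring in a string") fits in about a dozen lines. A hypothetical Python sketch, not any company's actual screen:

```python
def find_substring(haystack: str, needle: str) -> int:
    """Return the index of the first occurrence of needle in haystack, or -1.

    A naive O(n*m) scan -- the kind of warm-up question interviewers fall
    back on when they can't assume any baseline skill in a candidate.
    """
    if needle == "":
        return 0  # by convention, the empty string occurs at index 0
    n, m = len(haystack), len(needle)
    for i in range(n - m + 1):
        if haystack[i:i + m] == needle:
            return i
    return -1
```

Even a question this small gives signal: does the candidate handle the empty needle, the off-by-one in the loop bound, the no-match case.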
I'd probably fail the author's interview. As soon as you hit strong cultural disagreement (e.g. "new technologies" is a minefield), the interviewer has to be a mental Hulk to proceed without bias. Goodbye diversity.<p>I've also done 100s of interviews (small and large companies). What I learnt:<p>- CVs are useless except for leveling the candidate (which is done for me),<p>- CVs are dangerous because they create a bias ("looks like an amazing candidate!" -- and then you get someone who is good, but not as amazing as you expected, and then you rate them lower than you should -- I intentionally skip the CVs as much as possible),<p>- the majority of people will solve a simple problem at the algorithmic level, and then fail to code it up,<p>- the best value is to give the candidate a <i>SIMPLE</i> coding question which produces a decent amount of code (25-50 lines), and observe them while solving it (their approach, bugs, etc.),<p>- the signals I'm looking for: (1) can the candidate code fluently, (2) do they approach the problem in a systematic way, (3) can I communicate with them without friction.
I feel like 99% of eng interview advice sounds just like this: "I've interviewed a bazillion people, everyone else is doing it wrong, here's the one good way."<p>Honestly, I think a lot of it is akin to medieval quackery. "I've treated hundreds of patients with leeches and the large majority of them eventually recovered - you should listen to me!"<p>I wish that we, as an industry, would spend less time reinventing interviewing from first principles. There are decades of research on how to interview people effectively. I highly recommend Schmidt + Hunter's meta-analysis (<a href="https://www.researchgate.net/publication/232564809_The_Validity_and_Utility_of_Selection_Methods_in_Personnel_Psychology" rel="nofollow noreferrer">https://www.researchgate.net/publication/232564809_The_Valid...</a>) as a starting point.<p>From my read of the research:<p>1. Use work-sample tests. Yes, they're annoying, but they're high-signal. Take-home tests, online tests, in-person tests, trial periods - there are plenty of options here, but it seems clear this is the most effective approach.<p>2. Use structured interviews. Engineers love to just have loose conversations and judge people based on them, but structured > unstructured. Plus (just my guess here), I suspect unstructured interviews leave a lot more room for bias.<p>3. Plenty of techniques are effective on their own (e.g. references), but add minimal effectiveness when you're already doing a more effective test (e.g. a work sample), and are therefore pretty much wasted effort.
I’ve also done 100s of interviews, mostly for mid-senior positions, and generally CV-reading plus a non-quiz intro interview talking about various technologies and experiences works great.<p>That said, I disagree about the lack of any exercises. I think practical but not too time-consuming take-home assignments are nice and not too annoying (as long as they’re not the first step! That would be wasting candidates’ time.), and I’ve rejected tons of candidates based on the results of those, which is different to the author’s experience.<p>Overall, I recommend that people really think about who they’re looking for, and heavily adjust their process to that. There’s no silver bullet, and the right process will vary greatly depending both on the seniority and on the skillset you’re looking for.
The one thing that's always missing from this conversation is that the "reviewers", "raters", or "interviewers" _themselves_ might not have the skillset to formulate even a <i>roughly</i> accurate opinion of someone's strengths and weaknesses.<p>Even worse, they might not be able to communicate the actual job requirements correctly, if they even understand the requirements and necessities themselves.<p>Even worse than that, they often aren't as strong socially as they believe themselves to be. Being able to establish a comfortable atmosphere and connect with someone quickly is a rare skill within the general population, let alone in specific domains.<p>There is a PRESUMPTION that people in the interviewer seat or a management position can and should do this but honestly, after a few decades? I don't see it.<p>We should be working to solve the "interviewer" problem first, not the other way around.
I interviewed recently at a large GPU-related company, and the interviewer set up a bunch of softball questions.<p>What's your favorite editor, what language do you like using for scripting, what's your favorite Linux distribution, do you have a homelab...<p>We were having a decent back and forth and getting along, and then he hit me with the final question: "we really aren't supposed to ask, but what do you like to do in your spare time?"<p>I mentioned that I had a 2 and 3 year old and joked that I didn't have much free time anymore, and said I liked working in the garage with my hands to get a break from computers.<p>I didn't realize I'd given the wrong answer until an hour or two later (based upon the rest of our conversation and his questions). It would have been the right time to show a few personal projects on GitHub.
IMO the best way to interview a software candidate is just to get them talking about the stuff on their resume. Ask for overviews, diagrams, justifications, etc. If you really know something, you will have a very clear mental model of it and be able to discuss it fluently.<p>Leetcode challenges are a waste of everyone's time. If you are a competent developer then you can google for algorithmic solutions if you really need to, but honestly, how often does the need come up, and is googling for suggestions really the hardest part of the project?!<p>The only thing that doing well on this type of problem proves is that you've been willing to put in the practice time to get good at them.
Interviewing is the single most divisive problem in software.<p>I agree with the author on having a plan built around what you're looking for. I don't like the general "culture" questions because they're quite biased ("oh, you're reading HN! Me too!").<p>I like behavioral questions ("tell me about a time when") with a reference scale that explains your ratings. It's not perfect, but it seems slightly more data-driven than the rest I've experimented with. I take lots of notes and try to base my verdict on them. I hate leet-code, except at the screening level, to validate that the person knows how to type on a computer. I like somewhat job-related exercises and code reviews for more senior people; with juniors, I try to teach them something new (something I've taught to multiple people before) and see if they catch on. I betcha half of the people here hate all these methods.<p>In addition to knowing what you're looking for, I advise trying to see what you miss with the people currently on your team (e.g. if they annoy you, find out why and look for what they don't have).
> I recommend that you make a plan for what you want to learn about the candidate, e.g. “are they good at acquiring new skills?” or “do they share the same values as the team?” and then structure the interview around that.<p>Generally what I want to learn about a candidate is "are they good at understanding plain-English requirements, can they talk through their thought process as they problem-solve, can they convert that solution into something that resembles working software, and can we have a decent conversation about what was built?" As it turns out, whiteboard-style algorithm and system design questions are very good at this.<p>Asking if someone keeps up with tech news (and calling it a red flag if they don't) is just a way to keep your candidates within a certain social bubble. Which, honestly, is preferable if you're a < 50 person startup. But this doesn't scale. Plenty of software devs at IBM, Amazon, etc. are just clocking in and out of their shifts and do a fine job of it. Not everyone needs to make career advancement in the tech industry their life.
And have fewer rounds. I don't really understand why companies have multiple interview rounds. One of my friends recently went through three 1-hour rounds, all technical.<p>If you're afraid of making the wrong decision, remember that the 1/125 figure assumes interviewer errors are independent and that a bad hire has to slip past every round. In practice the errors are correlated, and a good candidate is rejected if <i>any</i> round goes wrong - so with interviewers who each make a bad call 1/5 of the time, three rounds can push your error rate toward 3/5, not down to 1/125.
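The arithmetic in the comment above can be sketched as a quick back-of-envelope check. The numbers (1/5 error rate per round, three rounds) are the comment's illustrative figures, not real hiring data:

```python
p = 1 / 5      # chance a single interviewer makes a bad call
rounds = 3

# If errors were independent AND a bad hire had to fool every round,
# the bad-hire rate would multiply down:
independent = p ** rounds                    # (1/5)^3 = 1/125

# But a good candidate is rejected if ANY round goes wrong.
# Even with fully independent errors, that chance is much larger:
at_least_one_error = 1 - (1 - p) ** rounds   # 1 - (4/5)^3

# And the union bound caps it at p * rounds, the comment's 3/5 figure:
union_bound = min(1.0, p * rounds)
```

So adding rounds only drives the error rate toward 1/125 for one specific failure mode (a bad hire fooling everyone); for wrongly rejecting good candidates, more rounds make things worse, not better.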
I don’t disagree with the sentiment, but the tools aren’t the problem; it’s the delivery. A puzzle might be toxic if it’s a test, but constructive as a gimmick used to explore communication. Likewise for coding exercises.<p>It’s all pretty easy if respect for the human is genuine, and destructive for all when it’s faked.
Amen.<p>My interview questions eventually settled into having the candidate bring in a sample of their code and then teach me how it worked.<p>We'd talk about what it did, what they thought they'd change to make it better, and so on.<p>Learned a ton really quickly and didn't waste anyone's time.<p>The last tech job I interviewed for had me spend a lot of time working on a sample application. When I thought about how many person-hours were going into that across all the candidates, I immediately felt the company didn't have a sense of efficiency. Sure, it didn't cost them anything, but the fact that they were fine just wasting time like that was telling.<p>Didn't get that job. But got a better one. :)
But how do you test for psychological fit with the team?<p>You can get a candidate that nails the tests and interviews. Meets the team and first impressions are good.<p>Then 2 months in everyone realises the guy has a tantrum issue and is just an awful person when you get to know him ...
And it goes all the way up to the top. I can name CEOs who were great hires despite not being at all an obvious fit for a tech company, and I can name CEOs who certainly looked good on paper but were disasters.
Whiteboard tests are not a good way to evaluate people. They don’t tell you if someone is going to be a leader. They don’t tell you if they’re going to take initiative. They don’t tell you if they’re going to be a person who complains all the time. They don’t tell you if they’re going to be able to solve problems they haven’t seen before, which by nature will be all the problems they encounter.<p>Whiteboarding interviews are, however, a good way to make the applicant feel bad about themselves so you can lowball them on salary.
I can't shake the feeling that time invested in gaming interviews is a big sucker's bet.<p>Seems wiser to invest that time in health & entrepreneurship.