Google recruiters call me a lot. I think I'd do a good, if not stellar, job working there. I've passed multiple FAANG interviews and been very successful as a senior developer.<p>In my email I have an "interview prep packet" from them that essentially tells me to brush up on algorithms and read Cracking the Coding Interview to prepare for their interview process.<p>I'm fairly happy in my job. If they offered more money or a really interesting project, I'd consider working for them. But I'm pretty lazy about redoing a college algorithms class in my free time at home to go work there, so I probably won't.<p>There's an opportunity cost with interviews like this, where an M.S. and a long career of getting shit done count for very little, and memorization of undergrad-level topics that you could look up in two minutes in Knuth, if you ever had a problem that required them, can make or break an interview.<p>I've made a career fixing a ton of horribly shitty, inefficient code that's been produced exclusively by people who pass these interviews.
Aside from all the things mentioned in the article, this also seems like a fairly predictable application of Goodhart's Law: "Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes."<p>Once upon a time, skill at doing these sorts of problems might have correlated (imperfectly) with general aptitude as a programmer or software engineer. But the very act of trying to leverage that correlation for hiring purposes probably also made it go away. Now you've got a whole lot of people practicing hard on these sorts of problems, spending huge chunks of their free time grinding away on Project Euler and Advent of Code and HackerRank. That muddies the quality of this stuff as a proxy for what it was originally trying to detect: natural aptitude. I'm guessing having time to level grind like that also correlates inversely with other traits that are desirable in a programmer.
The less talked about absurdity is that successfully passing programming interviews is a skill itself. It's especially absurd because the time I spend developing that skill is less time spent developing skills and knowledge more directly relevant to my job. Yet, programming interview skill is more relevant to progressing my career.<p>edit: now if you'll excuse me, I need to do some dynamic programming problems.
Why do developers complain so much about hard interview questions that can supposedly be gamed by studying to the test? Every high paying industry heavily engages in gatekeeping, because the number of people who want to make 400k/year is far larger than the number of 400k/year jobs available. The traditional forms of gatekeeping involve requiring that people have the right personal/familial connections, or have an elite school pedigree (finance, consulting, law), or that they spend several years and half a million dollars in post-college schooling (medicine).<p>The tech industry's preferred form of gatekeeping is asking people to do algorithmic puzzles, which is far tamer and less exclusionary than what other industries do. If you believe that the only thing standing between you and 400k/year is a few dozen hours of practicing leetcode, why are you whining about it instead of taking advantage of the situation to wildly enrich yourself with a fairly modest amount of effort?
> However, whether or not a candidate answers a question correctly is not the only source of signal during an interview. You can also evaluate their process by, for example, observing how long it takes them to finish, how clean their code is, and how much they struggle while finding a solution. Our analysis shows that this second source of signal (process) is almost as predictive as the first (correctness).<p>I seem to do OK when an interviewer:<p>- asks me a question that sounds like a real problem, not a contrived one (although on occasion I'll have fun with a contrived puzzle if the interviewer has a sense of humor & makes the process lighthearted)<p>- doesn't push me down a path that requires me to implement a "simpler" solution I'd never consider (e.g. asking me a question that clearly wants an O(N) solution & then pushing me to try the O(2^n) solution first)<p>- talks like a person with a problem, and not as someone who clearly knows what they want & simply won't say it<p>- doesn't try to "see how I think", because I code as much in my head as I do on a screen, meaning most of the code I throw into a text editor is the latest thought in a stream of random ideas until I get to one that works<p>- doesn't constantly interrupt me<p>- states their actual expectations, such as "I don't expect you to finish, what I'm really looking for is X"
A while ago we had a job applicant who had travelled very far and long to reach us (literally from the other side of the world).<p>So, as a courtesy, we figured, why not spend a few extra hours with this applicant in the programming test. We set up a laptop with a clean Ubuntu install and devised a programming test that was quite involved. Not algorithmically hard, just more complex than what can normally be done within a 20-minute whiteboard interview. We expected it to take at least 2-3 hours. Google/Stack Overflow/etc. access was allowed and encouraged. "Just act like you normally would when solving a problem."<p>We spent like 2x4 hours devising this problem, based on our codebase (cutting out something somewhat easily digestible and making it able to run standalone).<p>It took like one hour to get productive: explaining the problem, setting up editors, compilers, etc.<p>We took turns, but most of the time someone on the interview team (of two) sat next to the guy. We did give him some alone time.<p>This is probably nothing new in terms of interviewing techniques, but to us it was such a revelation. We learned so much more about the applicant. Perhaps it worked well with this guy because he happened to be a bit more outgoing than our typical successful applicant. We'd never felt so confident about giving someone an offer before.<p>I'm really looking forward to testing out this approach with local candidates to see if we can replicate this "data gathering success".
There's a problem here. The only thing that Triplebyte can claim based on their data is that easier programming questions are more predictive of performance <i>among candidates who received an offer</i>. Since candidates who get offers are (in theory) different from candidates who don't get offers, we can't necessarily generalize from one population to the other.<p>There's also a question about how to mix question difficulty. Should you ask nothing but easy questions, or is it good to throw in a harder question or two to see how the candidate reacts to something they can't answer? I can see a good interviewer getting a lot of signal out of that, but in the hands of a bad interviewer it would not work well.
I think that asking a candidate to perform a code review can be an effective method of evaluating quite a few desirable qualities. Can they understand someone else's code? Can they engage in constructive critical discussion? Are they able to effectively refactor something to make it better? Can they spot mistakes and do they have an opinion about how to avoid such mistakes?
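As a concrete illustration, a code-review exercise can be as small as one short function with planted issues. This snippet is entirely hypothetical (the function and its bugs are invented here, not taken from any real exercise); it hides a mutable default argument and an off-by-one for the candidate to spot:

```python
# Hypothetical review exercise: two planted bugs for the candidate to find.
def collect_evens(numbers, result=[]):   # bug 1: mutable default, shared across calls
    for i in range(len(numbers) - 1):    # bug 2: off-by-one, skips the last element
        if numbers[i] % 2 == 0:
            result.append(numbers[i])
    return result
```

A candidate who notices that the default list accumulates across calls and that the loop drops the last element, and who can propose clean fixes, is showing exactly the qualities described above.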
I see one aspect of this trend of asking programming questions that require a lot of memorization: We have had for the past ~10-15 years people in the workforce (and thus acting as interviewers) who went through a public education system where heavy emphasis was placed on passing tests that required a lot of memorization.<p>I'd be interested to know how many of these interviewers actually think they're able to identify a solid candidate this way? Not to mention, are they even factoring in how many people don't test well but are otherwise superb software engineers?<p>Ultimately it seems like there is a soft element to interviewing that is being tossed out now, which is: do I think we can work with this guy/gal? Are they someone that can become part of our team on a personal level? Can they get good work done? Fizz Buzz can't tell you that. What can tell you that is experience. It's a hard-to-put-your-finger-on-it X-factor that I think companies think they can ignore.
I usually ask what your strongest language is, then ask questions about that programming language.
e.g., if it is Python:<p>> How would you explain the with statement to a junior developer?
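One possible junior-friendly answer, sketched with a toy context manager rather than Python's real file objects:

```python
# The `with` statement guarantees cleanup: __exit__ runs even if the
# body raises, which is what an explicit try/finally would otherwise do.
class Resource:
    def __init__(self):
        self.closed = False

    def __enter__(self):
        return self                # value bound by `with ... as ...`

    def __exit__(self, exc_type, exc, tb):
        self.closed = True         # cleanup always runs here
        return False               # don't swallow exceptions

r = Resource()
try:
    with r:
        raise ValueError("boom")   # cleanup still happens
except ValueError:
    pass

print(r.closed)  # True: __exit__ ran despite the exception
```

A candidate who can then connect this to `open(...) as f` closing the file has understood the point.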
then increasingly difficult questions that go into the language runtime/concepts.<p>One other favourite question of mine is:<p>> Imagine you've got a standard website that serves data from a database. When a customer types the URL into the browser bar, what needs to happen before the customer sees the website?<p>> Go as deep as you can in answering the question.<p>When you get an answer, you'll see frontend engineers explain more about the browser, while backend engineers talk more about the backend.<p>There was one very senior engineer who actually talked about the Ethernet layer; he talked for more than 15 minutes. Most mid-level engineers are done in 5 minutes. ;)
What is important to employers?<p>1. A candidate who can solve puzzles but is not willing to do the dirty work with the team: solving production issues, debugging, bug fixing, the usual stuff.
2. Another candidate, who is willing to learn, is ready to work with the team and do the dirty work.<p>I am on a hiring committee as a Tech Lead, and I always try to weed out 1.<p>Works great; we hire them as interns and then assess them. Someone from Google whom we hired full time <i>was</i> detrimental to the team's morale: grunting, complaining about code, complaining about food, and what not.<p>Another experienced smartass was self-centered about his skills and didn't want to teach junior engineers anything or admit he needed to update his skills. The moment he realized his skills had no value, he started attending pointless conferences. Now his LinkedIn profile has "aware of blockchain technology", "attended machine learning seminars".<p>I said no to interviewing at Google and FB because I don't have cycles to spend months on leetcode. Did I err? Perhaps. But I am sure neither can provide the same quality of work I execute at my current mid-size company. I regret nothing :).
The best interviews I've been a part of have been a short cultural-style interview followed by a (roughly) 2-hour paired-programming session on a relatively simple to-do application.<p>1. The problem is pretty well understood (but does offer room for interpretation).<p>2. Provides time to cover all key aspects (Frontend, Backend, Database, Networking, Debugging, Testing, Caching, etc) in at least some capacity. In particular, it shows you what areas the developers focus on.<p>3. Provides a more relaxed/realistic environment. It's also more accommodating to developers switching stacks - familiar with good programming patterns but not the specifics of the stack (e.g. "Here's how I'd do [some specific task] in [other stack]. How do I do it here?").<p>4. It's clearly a throwaway task so there's no concern about "interview labor". It can also be pre-prepped so you don't have to worry about jumping too far in.<p>5. You can cut it short with bad candidates and expand the problem for stronger candidates.
>how clean their code is,<p>I got bit in the ass by this one at Triplebyte itself. They asked me to make a tic-tac-toe game and gave me, IIRC, 30 minutes (less?) to do it. Except it wasn't "build a tic-tac-toe game"; first it was "draw a board to the console," "take user input from the console," etc.: a bunch of instructions in a convoluted path that perhaps another engineer would follow, knowing from the outset that the goal was to build a tic-tac-toe game in 30 minutes, but not me.<p>So we'd get to a portion where I'd be writing a quick test on user input, or extracting something into a function, and the interviewer would say "don't worry about that, just worry about {getting the grid to print to the console or whatever}."<p>Later on I got my feedback, and they said they were disappointed with my user input tests and the repeated, extractable code in the tic-tac-toe portion.<p>Triplebyte is trying to do good things in the interview space, but I think they're still learning. All in all my interview with them was about as positive an experience as a harried and bad interview could be, from my perspective.
At my consultancy we recently streamlined our interview process:<p>1. Phone screen which takes 15 or 20 minutes.
2. The candidate fills out an essay, including showing us some code they're proud of.
3. If the essay ticks the boxes, we conduct a 1-hour on-site interview. We use the same set of questions for every candidate, so the investment is easy to manage, and our team has a shared set of expectations on what is good or bad.
4. If the interview goes well, we give them a take-home assignment. Takes between 2 and 6 hours, depending on how experienced the candidate is. The problem is in C, Python, or both.
5. We wrap up with a 2 to 3 hour onsite interview. We walk through the assignment and have a deeper conversation about culture and fit.<p>The results have been positive for us: we've made some great hires and weeded out some candidates who weren't a good fit.<p>We've also been able to scale it down to the process we use for interns.<p>The 1 hour interview has some typical programming interview questions, but we wrap them into a real-world example. The goal isn't to prove they know how to program, but more about allowing them to show us how they think/work out a problem.
My group likes starting with easy questions and ramping up the difficulty, not to eliminate people with wrong answers, but for two other reasons: first, to see whether people understand the questions and whether they try to make up answers, ask questions, or say “I don’t know”. Second, to see what the limits & boundaries of their experience are. We know that people don’t know everything, and we measure more for potential than for knowledge, but it’s still useful to understand someone’s experience and exposure level.<p>More important than question difficulty to me is attitude, and I’d love to see whether attitude is measurable and how it compares to later performance; curiosity and optimism and communication really do go further than right or wrong on math and engineering questions for me. That point might even be tired already, I know people say it all the time, but I’m going to keep saying it because we still have blog posts on question difficulty, when easy vs. hard engineering questions are pretty low on my list of what matters when I’m hiring.
A few jobs back I was tasked with hiring new developers to bolster a thin front-end team. The job was very CSS/JavaScript heavy, so I asked questions that were pertinent to what the candidate would be doing if hired. Of the five candidates, only one answered all the questions perfectly, and he turned out to be the biggest bust for us.<p>The other candidates, after answering some of the harder questions incorrectly, seemed very upset with themselves. They knew they were cracking a bit under pressure, but actually showed that they knew the answers when we chatted further. I hired 3/4 of those people because of how well I felt they'd do given the opportunity. All three became leads within a year and a half.<p>I think personality has a lot to do with outcomes. If you are someone who shows they are hungry to learn and knows how to improve their skills, I will never dismiss you for screwing up a few coding questions.
One interview of mine asked something I didn't know, and I said there was no doubt I could figure it out via Google. The interview pretty much ended there, and I'm glad it did! Any place or interviewer that says you shouldn't use Google or Stack Overflow to get your work done is no place I want to work.
I got my yearly review recently and I got very good feedback. At the same time I have been doing Leetcode at home for fun, starting with easy problems, and I get my ass handed to me.<p>I find it hard to reconcile these two experiences. How can I thrive at a top tech company while failing to solve an 'easy' coding challenge. It makes me concerned about what would be of me if I had to look for a new job now.
This is an interesting post given that up until recently TripleByte's facebook/Twitter ads were asking very nonrealworld programming questions: <a href="https://twitter.com/minimaxir/status/1054596563585052673" rel="nofollow">https://twitter.com/minimaxir/status/1054596563585052673</a><p>Now, the ads ask simpler things like floating point precision and function variable scoping (<a href="https://www.facebook.com/triplebyte/ads/?ref=page_internal" rel="nofollow">https://www.facebook.com/triplebyte/ads/?ref=page_internal</a> ); legit problems, but not sure if they are an indicator of how good a developer they'd be in the real world working on a CRUD app.
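For reference, both ad topics are easy to demo; a sketch of the kind of behaviour such questions probe (the specific wording of Triplebyte's ads is not reproduced here):

```python
# Floating-point precision: 0.1 and 0.2 have no exact binary
# representation, so the naive equality check fails.
naive = (0.1 + 0.2 == 0.3)             # False
safe = abs((0.1 + 0.2) - 0.3) < 1e-9   # compare with a tolerance instead

# Variable scoping: closures capture the variable, not its value,
# so every lambda sees the final value of i.
funcs = [lambda: i for i in range(3)]
results = [f() for f in funcs]         # [2, 2, 2], not [0, 1, 2]
```

Legit gotchas, but as the comment says, knowing them is a weak predictor of CRUD-app productivity.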
The bigger problem is the inconsistency amongst interviewers when judging candidates. All these articles from TripleByte and Gayle (who've built businesses on the flaws of interviewing) focus on the questions instead. It doesn't matter how hard the question is if the interviewer knows what they're looking for, is experienced enough, shows no nepotism, and has good communication skills.<p>My worst interview ever was with Facebook, when a non-native-speaking new college grad gave me a Leetcode hard problem in half-broken English and went back to his work without even looking up or walking through the problem with me.
If they want something that mimics common, everyday work...<p>Give the candidate a project with 300,000 LOC and tell them to make the most local change possible that fixes the reported bug. Update the tests to reflect the new logic.<p>Bonus: discuss architectural changes that would have resolved the bug and/or improved performance.
"Hard questions do filter out bad engineers, but they also filter out good engineers (that is, they have a high false-negative rate). Easy questions, in contrast, produce fewer false-negatives but more false-positives."<p>The philosophy at Google is that it's better to filter out 3 good engineers than to let in a bad one. The consequence of this is that it's really hard to get kicked out of Google.<p>The other part (whether it's more important to use longer, easier questions to see how the candidate works on a large code base) is orthogonal reasoning, and that part may be true, depending on what type of engineers somebody is looking for.
Another solution to this problem is contract to hire. I realize this is kicking the can down the road to the contracting firm but hear me out: that's the business the contracting firm is in. They can get really good at their hiring practice since that's their core business. That's not our core business. We've been doing this for the past two years and it's worked out great. Now you can see how well people do the actual job and if you're not satisfied, which happens from time to time, just get someone else.
The thing that bothers me about these questions is that there isn't a shortage of practical questions to ask that will test if the candidate can truly contribute to the real problems you're trying to solve.<p>"Right now our roboticists use a hacked-together Qt-based GUI to manage customer robot fleet data. It takes 1-10 minutes to load on a slow network and is hard to add more features to. I know you have far less information than you'd want, but walk through your thought process for how you'd replace this system over the next 12 months. We can make assumptions."<p>And then the next 15 mins can be an organic conversation about the problem space. You can direct the conversation into the corners most relevant to their potential role: "You mentioned using web tech because we discussed how all usage is across the internet. Can you talk about the merits of HTTP vs. WebSockets?" "How would you ensure that we don't accidentally take every single customer offline if we centralised our data store?" "What kinds of UI technology would lend themselves to robot mapping? Can we just use Google Maps?"<p>If you really need to dig deeper into technical prowess, find something relevant in your conversation and dig deep into it. "We talked about saving changes to the floor layout. Can you whiteboard/laptop how you might implement undo/redo for floor elements?"
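To make that last whiteboard prompt concrete: undo/redo usually ends up as two stacks of command pairs. A minimal sketch, with all names invented here for illustration (nothing from a real robotics codebase):

```python
# Undo/redo via two stacks of (do, undo) command pairs.
class History:
    def __init__(self):
        self._undo, self._redo = [], []

    def execute(self, do, undo):
        do()
        self._undo.append((do, undo))
        self._redo.clear()              # a new action invalidates redo history

    def undo(self):
        do, undo = self._undo.pop()
        undo()
        self._redo.append((do, undo))

    def redo(self):
        do, undo = self._redo.pop()
        do()
        self._undo.append((do, undo))

# Usage: "floor elements" modeled as a plain list for the sketch.
floor = []
h = History()
h.execute(lambda: floor.append("wall"), lambda: floor.pop())
h.undo()   # floor is [] again
h.redo()   # floor is ["wall"] again
```

The follow-up discussion (memory bounds, coalescing drags into one command, undoing across saves) is where the candidate's depth shows.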
Instead of doing this, read, then follow <a href="https://sockpuppet.org/blog/2015/03/06/the-hiring-post/" rel="nofollow">https://sockpuppet.org/blog/2015/03/06/the-hiring-post/</a>
Having recently gone through some interviewing (for machine learning research), a very cynical but overlooked aspect (in this thread) is the following:<p>After receiving an offer from a big tech company, the interviewing process has already completely turned me off from the idea of working there.<p>Despite this being a dream position for many, and despite me currently having no alternative but to take it, the smug interviewers have already gotten me into the corporate mind-set: no matter the reputation and salary, treat it like any other job and do not bother being loyal; they will not be.<p>So the terrible interview process at least has the advantage of reminding future employees what they are signing up for.<p>I wonder what kind of psychological filtering is at play. Do employees feel loyalty after an interview process that is best described as hazing? Are they projecting the humiliation they experienced when interviewing future employees? That's always been my impression.
Specific examples of what classifies as a “hard” or “easy” interview question would be very helpful to have reference points and assess one’s own interview process.
You realize that this approach is flawed the moment there are blog posts / books (e.g. Cracking the Coding Interview) on how to crack it. "How to crack X" becomes a field altogether (coaching / YouTube videos / blogs / books, etc.).<p>Also, platforms like HackerRank are adding fuel to the fire. I read somewhere that the CEO wished the below "were taught in schools:<p>1) Communicating complex ideas with clarity
2) Systems thinking
3) Grunt work tasks
4) Boundaryless thinking
5) Self-awareness / EQ"<p>Please note almost all of these are not evaluated on their platform (they profit from coding tests) or during interviews, and almost all are soft/intangible skills (skills which are not immediately obvious about the candidate during a typical programming interview). [Side note: some could say that coding tests are the problem they have chosen to solve, in which case, why are they worried about these skills? Are companies seeing coders crack the tests on their platform while not performing well on the above skills post-hiring? We can only speculate.]<p>All good work is done by teams, and being effective in a team requires a lot of intangibles which aren't even assessed in a typical interview.<p>A better approach could be the one from this article: <a href="https://leerob.io/blog/technical-recruiting-is-broken/" rel="nofollow">https://leerob.io/blog/technical-recruiting-is-broken/</a><p>Or:
A couple of weeks of work, with a task assigned and the mentor or interviewer looking at how the candidate approaches the problem, whether they are able to solve it within the time constraints (an easy task shouldn't take long, and a hard problem shouldn't be short-circuited into a sub-optimal solution), and other such observable traits.<p>My two cents!<p>EDIT: Poor wording above (i.e., "couple of weeks"). A task should be assigned and evaluated after a deadline the interviewer considers reasonable for completion. No constant interaction with the candidate and no spending loads of time with them - that isn't scalable when the demand-supply equation is already imbalanced.<p>EDIT 2: The idea above is not about spending weeks on recruitment. It is about being practical about the kind of questions/tasks given during interviews (example: code a feature or fix an issue we have, as another user has suggested in the comments). Took me a while to realize the logistics discussion missed the point I was trying to convey.
I really agree. At companies I've worked at for the last 20 years (at least) the approach has been to ask pretty simple questions; people who struggle on those we don't want; people who don't struggle, even when they make silly mistakes (due to feeling interview pressure) are good.<p>Questions about the standard library of the programming language in question are good. Questions about the dusty corners of said library: bad.<p>And don't ask about floating point: most likely the candidate won't really know more than the usual things; anybody who really <i>does</i> understand them will probably give answers over the interviewer's head :-).
I like some coding in interviews. I try to make the tasks fairly real-world though. I am a UI engineer so my tasks typically fall into one of two buckets:<p>- A take home (2-3 hours) task for retrieving tabular data from an API and displaying it. Here I'm looking for general framework chops, readability, some design sense.<p>- An in-person (~45m) not-quite-pair programming task, with a real computer, tools, editor etc., for doing a typical UI operation, e.g. truncating text. Starts simple and gets more complex as time allows: make a function to truncate text to x chars; now add an ellipsis only if truncation occurred; now make sure not to truncate in the middle of a word, etc.
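The escalating truncation task could end up looking something like this; a Python sketch of one acceptable answer (not the interviewer's reference solution, and the word-boundary rule here is just one reasonable interpretation):

```python
# Truncate to max_len chars, add an ellipsis only if truncation
# occurred, and avoid cutting in the middle of a word.
def truncate(text: str, max_len: int) -> str:
    if len(text) <= max_len:
        return text                      # nothing truncated, no ellipsis
    cut = text[:max_len]
    if " " in cut:
        cut = cut.rsplit(" ", 1)[0]      # back up to the last word boundary
    return cut + "…"

print(truncate("hello world", 20))  # hello world
print(truncate("hello world", 8))   # hello…
```

Each requirement is a small diff on the previous one, which is what makes the format good at revealing how a candidate refactors.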
I've never been given a practical programming test. Nearly all are useless algorithms. Create a function that takes an integer greater than 0. Build an array equal in size to the argument. Fill the array with numbers that, when summed together, equal zero.<p>It was worded far worse than that. What exactly is that telling you about the engineer?<p>On the flip side, I'm asked to code full-fledged applications but not to spend too much time on them... okay...<p>Another time I was asked to code the Luhn algorithm. Oh, and do it while a room of people watches you on a giant screen, cause that's what your day-to-day job will look like... I failed miserably and still got the job. What?!?!?
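For what it's worth, the Luhn check fits on a napkin, which makes the giant-screen staging all the stranger; a sketch:

```python
# Luhn checksum: double every second digit from the right, subtract 9
# from any double over 9, and test that the total is divisible by 10.
def luhn_valid(number: str) -> bool:
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:          # every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("79927398713"))  # True (a standard Luhn test number)
```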
Most programming interviews are a waste of time and energy for everyone involved and the results are, for the most part, a complete facade. Asking puzzle questions gives interviewers the feeling that they're really testing hard for top talent, when in reality they're just demonstrating their lack of interviewing skills.<p>I recently interviewed managers at two different technology companies, both nationally known, for a report I am working on. Most managers admitted their technical interviews were flawed, but didn't know any other way to assess skills. They also admitted that a significant number of people they recruit refuse to even take the technical challenge and end up working elsewhere.<p>In interviewing a couple dozen engineers, I found most just don't want to waste their evenings and weekends on a technical puzzle for a job, especially when there are a lot of companies out there who don't bother with them, so they end up searching for companies that don't waste their time with technical challenges.<p>Another funny thing I discovered during my research is that just under half of the employees at both companies I've interviewed so far were not able to successfully complete their own technical challenges.<p>Another problem with technical challenges is that often times the interviewer knows less about the topic than the interviewee. I recently went through the interview process with a local technology company who uses Elixir and Go (both of which I know). During the onsite interview, the interviewer kept saying things like, "Don't forget to..." or "You forgot..." I kept explaining that I didn't need to do as he was suggesting. In the end, my code worked, my tests worked, and I passed the interview. In spite of this, I was rejected because the interviewer, "Wasn't feeling it."<p>I still have a lot of research to do, but I haven't found anything, so far, that suggests that technical interviews predictably result in top-talent getting hired. 
It seems to be the same crapshoot interviewing people without using technical challenges is, because in the end, most people decide within the first couple of minutes if they like someone and hire based on that, regardless of the rest of the interview process.
I just finished reading Bill Kilday's Never Lost Again. It was amazing that four engineers produced the ground-breaking Google Maps in 16 months. Algorithm/math questions used to be really effective at finding engineers like those four. There was a reason that Microsoft resorted to algorithm puzzles in its heyday as well.<p>It's just unfortunate that there's so much prep material online nowadays that the programming puzzles have become ineffective. It gets worse as many interviewers are not good enough to ask follow-up questions. For instance, addition with big integers is a pretty easy interview question, right? But if a candidate can go as deep as this article: <a href="https://bearssl.org/bigint.html" rel="nofollow">https://bearssl.org/bigint.html</a>, I can be pretty confident that the candidate is really, really good.<p>That said, I personally don't find it necessary to join the rat race. Instead, I'd suggest engineers just take time to thoroughly study one book on algorithm design. In fact, an introductory book, such as Kleinberg's Algorithm Design or Udi Manber's Introduction to Algorithms, will be good enough. It may not get you into Google, but it will likely get you into another damn good company. The best part of this approach is that passing the interview is really just a byproduct of you trying to become a better engineer.
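The interview-level version of big-integer addition is just schoolbook carrying over digit strings; a minimal sketch (the linked BearSSL article goes far deeper, into constant-time limb arithmetic, which this does not attempt):

```python
# Schoolbook addition of non-negative decimal strings, right to left
# with a carry, without ever converting to a native int.
def big_add(a: str, b: str) -> str:
    i, j, carry, out = len(a) - 1, len(b) - 1, 0, []
    while i >= 0 or j >= 0 or carry:
        s = carry
        if i >= 0:
            s += ord(a[i]) - ord("0")
            i -= 1
        if j >= 0:
            s += ord(b[j]) - ord("0")
            j -= 1
        carry, digit = divmod(s, 10)
        out.append(chr(digit + ord("0")))
    return "".join(reversed(out))

print(big_add("999", "1"))  # 1000
```

The follow-ups (limb size, overflow, timing side channels) are where the question stops being easy.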
Another clarification: this data only applies to general programming exercises and not domain-specific knowledge. For example, if your role requires bio-informatics knowledge (or ML or AI etc.), then by all means ask about bio-informatics—even if it means asking harder questions. (We do, however, recommend keeping the content of general programming exercises as vanilla as possible, so that you don't filter out someone because they lack mastery of a subject that you don't actually intend to measure.)
If you are asking interview questions that have one beautiful, precise answer, you are doing it wrong. A good interview question should start with something very simple that even a beginner can think about and answer, then gradually add complexity and constraints little by little.<p>Example:<p>1. Write a function that multiplies two integers.<p>2. What if these numbers were real numbers, but the computer can only operate on integers? How do we use the same number of bytes as ints to hold a real number?<p>3. What if I wanted infinite precision? What would be the run time and storage complexity of your algorithm? (Don't insist that the candidate hit the known optimum.)<p>4. Can I have complex numbers as well?<p>5. Imagine complex numbers that have not only "i" but also "j" and "k". How do we handle this?<p>It is astonishing how many candidates can't move past #2.<p>The key is to look at how the candidate approaches handling complexity, creates representations, and uses them to craft clean solutions. Whether they eventually arrive at known optimal/great answers is unimportant.
>Harder questions ultimately filter out too many qualified candidates to be optimal.<p>This was the key sentence if anyone missed it. "Optimal" for whom, exactly? Not for FAANG certainly. They don't need to worry about filtering too many candidates out, because they have a nearly infinite pool of applicants, and infinite money to conduct a search.<p>They can ask as difficult questions as they want, because they can pass hundreds and thousands of qualified candidates, and still have plenty more where that came from.<p>Edit: If you consider "optimal" to be the expected cost of a hire compared to the expected profit, it is fully plausible that if your margins are big enough, asking hard questions is the most effective way to ensure low false-positives. But as everyone knows, comments on articles about interviews are never about the economics, it's only about human ego of feeling rejected.
Is it a surprise to anybody? My favourite example of interviews gone bad is a tech company in SF asking candidates to find cross-currency arbitrage loops across all known currencies, over the phone, in the first round of hiring data engineers for the ETL team. I am glad I did not pass that round.
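For context on why that's an absurd phone-screen question: the standard trick is to model currencies as a graph with edge weight -log(rate), so that a profitable loop (rate product > 1) becomes a negative cycle, detectable with Bellman-Ford. A sketch under those assumptions (the function and example rates are made up):

```python
import math

def has_arbitrage(rates: dict) -> bool:
    """rates[(a, b)] = units of currency b per unit of currency a.
    An arbitrage loop exists iff the graph with edge weight
    -log(rate) contains a negative cycle (Bellman-Ford check)."""
    currencies = {c for pair in rates for c in pair}
    # Start every node at distance 0, equivalent to a virtual source,
    # so cycles anywhere in the graph are found.
    dist = {c: 0.0 for c in currencies}
    edges = [(a, b, -math.log(r)) for (a, b), r in rates.items()]
    for _ in range(len(currencies) - 1):
        for a, b, w in edges:
            if dist[a] + w < dist[b]:
                dist[b] = dist[a] + w
    # One extra relaxation pass: any improvement means a negative cycle.
    return any(dist[a] + w < dist[b] for a, b, w in edges)

rates = {("USD", "EUR"): 0.9, ("EUR", "GBP"): 0.9, ("GBP", "USD"): 1.3}
print(has_arbitrage(rates))  # True: 0.9 * 0.9 * 1.3 = 1.053 > 1
```

Recognizable in hindsight, but hardly something to spring on an ETL candidate in round one.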
I've seen this borne out in practice after administering dozens of phone screens and in-person interviews over the years. I started off with questions that were a little too hard and couldn't get a good read of the candidate unless they happened to have already done the problem on some interview prep site.<p>Switching to more practical, simpler problems allowed me to really observe how they work and solve a coding problem. As the article said, I was also able to add requirements or features to the problem which let me see how the candidate adapts to changing requirements, or refactors their own solution to handle a new edge case. Simpler is generally better if you are timeboxed to 45 minutes.
I changed careers from another technical, non-programming career to web development. I got a handful of interviews, but I never felt like I was showing them what I could do... learn.<p>Until one finally gave me a fairly straightforward homework-style project. It was probably more apt for a noob, but I threw myself into it, submitted my work, and explained it over the phone. I was in the office the next day, we talked about it, and I had an offer.<p>The homework-style interviews are understandably controversial, but at least I got to show my work, my thought process, and me actually doing the work, rather than a few moments at a keyboard.
I always thought that the primary quality measured by this popular interviewing style is "being like your interviewer," which leads naturally to an avalanche effect: the practice becomes ever more popular within the company and simply common sense after a while. The interesting part, to me, is how exactly alike the methodologies are across companies that differ in size and domain. Even if you assume that some flavour of technical interview worked for Microfaceboogle, it might well kill a copycat company (or be completely irrelevant to its success or failure).
I wonder if hard questions, especially hard algorithm questions, have a high false-positive rate as well. Being good at solving those questions simply requires tons of practice on LeetCode, and those willing to spend that much time are probably those who find it difficult to get jobs (or new grads). I don't think the skill of solving hard algorithm questions correlates well with actual performance at work. A new grad especially could be good at those questions yet not have a clue about how real software systems are built.
Hold on, who is qualifying the outcome?<p>At one of my jobs I nailed it as the top candidate, by far, of the 79 people they interviewed in person. The interviewers were looking for competence, freedom from frameworks, experience, and so forth. I have more than 20 years in this line of work and do it as a hobby, so I nailed the interview.<p>At the job, though, I worked with a bunch of fresh juniors who only knew how to write code the one way they learned in school. According to them I am a shitty developer because I didn’t write code in the one way they understand.<p>Who qualifies the outcome?
This article makes a lot of good points. After going through the "implement a red-black tree on this whiteboard" experience as a more junior dev, I always promised myself I would never use this kind of stupid question to hire.<p>Now, 13 years later, I mostly rely on "homework"-type exercises. I think they address most of the issues: they are more "real world," there's no time pressure, etc. However, even those are now being heavily criticized. What's left to use?
Given the 5th & 6th paragraphs about "correctness signal" and "process signal" it seems like the obvious solution would be to do both, i.e. ask questions that are easy alongside those that are hard. The easy ones are "process questions" and the hard ones are "correctness questions." Or maybe you have a range of difficulty from easy to hard, and each question has a number attached to it from 0 to 1 that reflects its intended use.
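A toy sketch of that tagging idea, with every name, question, and number invented purely for illustration: each question carries a weight from 0 (pure process signal) to 1 (pure correctness signal), and a candidate's results are split along that axis.

```python
from dataclasses import dataclass

@dataclass
class Question:
    prompt: str
    difficulty: float  # 0.0 = pure "process" signal, 1.0 = pure "correctness" signal

def signals(results):
    """Split a candidate's results (question, passed) into process
    and correctness scores, weighting each question by intent."""
    process = sum(1.0 - q.difficulty for q, ok in results if ok)
    correctness = sum(q.difficulty for q, ok in results if ok)
    return process, correctness

easy = Question("Parse a log line", 0.2)
hard = Question("Median of two sorted arrays", 0.9)
print(signals([(easy, True), (hard, False)]))  # (0.8, 0.2)
```

The point of the split is that failing the hard question costs you correctness signal but not process signal, so one score doesn't drown out the other.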
One issue I've noticed is that the elite companies are trend setters, and their interview methods are being copied by non-elite players.<p>Perhaps Facebook needs and wants engineers who can bust out A* on the spot, but I doubt Nordstrom or Starbucks needs that level of talent.<p>This has changed the field: now, to get even an average job at an average company, you need to study to the new normal, whose interview ideas were designed to find the top 1% of the field.
I totally agree with this. You shouldn't ask a question where some amount of luck of figuring out the problem is necessary. Instead ask something practical and relevant to your domain. Also, leave room for exploration if they sail through it and recovery for someone that goes down a bad path.<p><a href="https://github.com/spullara/interviewcode" rel="nofollow">https://github.com/spullara/interviewcode</a>
I’d agree, but on the condition that the interviewer actually knows what they are doing.<p>As I gained more experience (300 interviews and counting, baby) I realized that I pick up more on candidate skills that are not directly related to their performance on the particular question. At this point the question itself is just a conversation starter, so it’s better if it’s simpler and broader, because that leaves many more avenues for the conversation to go down.
I ask what they've built, and how it was done.<p>This shows you way more than what a technical question can answer.<p>There are edge cases to this. Expertise positions specifically.
Another use of these kinds of interviews is age discrimination: filtering out older candidates who don't have the time or motivation to prepare for this BS.
This is just another fad. In the late '90s there was a fad for MSFT-style puzzle interviews; then it went out of fashion. This one will also go out of trend. Just wait a couple more years.
"Easier interview questions are also less stressful, which is an important advantage. Stress causes candidates to under-perform. But, on the other hand, when candidates are more comfortable, they perform their true best, which actually makes interviews more predictive."<p>I cannot believe that <i>any</i> interview situation is comfortable.
Interviews are like taking the SAT or ACT. For better or worse they’re highly standardized and you just need to prepare. There are a bunch of different prep service like InterviewCake.com, Pramp.com, and PracticeCodingInterview.com that are good if you feel like you need more than just leetcode grinding.
Where I earn my money is thinking of a solution to a hard problem while not actually at work. It probably only comes up a few times in a year.<p>Questions you can answer immediately simply don't test if you can do this. I have no idea how you would test to see if you can do this.
Isn't this just saying that easier questions are more correlated with overall interview performance? That might be a good thing (a sign of consistency), but it could just as easily be a bad thing (i.e. it doesn't provide unique signal).
I took a technical interview today over the phone, and you know what, they were pretty fair with their questions. I don't know what kind of competition I'm up against for the position, but very fair questions.
I've worked for several companies who made up tests that nobody on staff could pass, including the authors. But the test did imply that the existing staff was superhuman, since no applicant could pass.
Interesting post! I appreciate this being shared.<p>Would Triplebyte mind sharing the data? I'd love if the numbers could speak for themselves, rather than having to rely on an interpretation of the numbers.
It seems like this article would be stronger with some examples. What are some examples of good and overly-difficult questions, according to TripleByte?
Let it stay here <a href="https://twitter.com/dhh/status/834146806594433025?lang=en" rel="nofollow">https://twitter.com/dhh/status/834146806594433025?lang=en</a><p>"Hello, my name is David. I would fail to write bubble sort on a whiteboard"
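For anyone who hasn't seen it since school, the bubble sort that the tweet refers to fits in a few lines, which is rather the point of the joke:

```python
def bubble_sort(xs: list) -> list:
    """Repeatedly swap adjacent out-of-order pairs; each pass
    bubbles the largest remaining element to the end."""
    xs = list(xs)  # sort a copy, leave the input untouched
    for n in range(len(xs) - 1, 0, -1):
        for i in range(n):
            if xs[i] > xs[i + 1]:
                xs[i], xs[i + 1] = xs[i + 1], xs[i]
    return xs

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```

Trivial to look up, rarely useful in production, and yet it ends careers on whiteboards.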
That's because professional outcomes for programmers aren't related to aptitude. Nobody ever got promoted for writing good code. (But the reverse is very common.)
Because they're designed that way. It all comes down to:<p>* Rejecting far more candidates than you need to -- so you can feel like you're hiring "the top 1 percent"<p>* Giving yourself the feeling that you have an objective hiring process (when really you don't)<p>* Making your own team members feel like they're super brilliant and special when really they're not<p>That's what the modern hiring process is designed to do. And in fact it works quite well, to serve this purpose.
I don't give questions like this, primarily because:<p>- Workers are going to be around for 15 months or less, and they already have domain expertise on one stack; I don't need to screen for how they would hypothetically function across all stacks<p>- A worker's process and resource-finding skills are more indicative of the time they will spend on a task<p>- A worker's process includes collaborative use of version control and code reviews; if they pass the screening but can't really integrate on these things, then that's what will get them booted from the team<p>It isn't always more expensive to have a not-great developer. Look in your organization, see if what I experience is true for you, and you'll save everyone a lot of time.