I'm glad to see this getting roasted in the comments, as it's a really good example of how companies put out self-serving pseudo-statistical nonsense in an effort to promote themselves.

There's no effort to quantify what "technical" or "communication" skills are - these are left to the interpretation of the interviewer. It makes no effort to show where these engineers are going, what the interview process they're completing looks like, what impact demographics had, etc.

I find this stuff repugnant. It perpetuates the myth that there's something really special about Silicon Valley engineers, while making only lazy and perfunctory efforts to examine any alternative explanation to "this is where the rockstar ninja coders work." Shameful.
> There’s also the issue of selection bias. Maybe we’re just getting people who really feel like they need practice and aren’t an indicative slice of engineers at that company.

Or that your interview preparation platform prepares candidates better for Dropbox's interview process than for Microsoft's. Or that the people who were confident in their interview skills for Facebook decided not to use your platform. Or that these companies have different interview processes and selection criteria (they obviously do), so ranking "best" based on performance on different tests doesn't tell you much.

There are hundreds of different ways to slice this data to come up with different hypotheses about what's actually occurring.
Dropbox has been known as a hard place to interview since 2014. A few years back, the tech companies with the hardest technical interviews were Quora, Palantir, and Dropbox (honorable mention: Fog Creek), though things may have changed since. Just because a company makes it extremely difficult to get in does not mean the company is all-around awesome, pays great, or employs the greatest engineers. It optimizes for people who come straight out of an elite CS program with those concepts fresh in their mind, and for people who grind out LeetCode or are great at interviewing. Of the three companies I mentioned above, I would not work for any of them now.
Kudos to interviewing.io for sharing this analysis. I agree with the many issues in methodology and analysis that others have raised here, and I agree there's a risk that a face-value reading of the blog post is highly misleading. But this is true for all data, and pooh-poohing the analysis without crediting the sharing just leads to less sharing. To be clear, I'm supportive of the criticism, but let's also give credit where it's due.

Technical interview performance is a high-stakes field in which almost all data is cloistered in individual hiring companies or secretive gatekeepers. In my mind, all efforts, even imperfect ones, to share data are a great step here. We should encourage them to continue to share, including pursuing options to anonymize the raw data for open analysis. The field deserves more transparency and open discussion to push us all to be better.
This is super interesting! Worth noting that another possible title is "... where the best performers were trying to leave during the study timeframe".

Really interesting to see Dropbox so high - would be curious to see some other data to corroborate that they (at least used to) employ the best engineers.

From my time interviewing, I've seen clusters of very good candidates often be more reflective of which top companies were having a hard time, internally or externally. There was a while where my company hired a lot of people from Uber; right now we're getting Amazon and Facebook/Meta.
"best performers work" With a title like that you really just cannot take this study seriously lol. Not to say it's not interesting but that is one crazy claim, at best title should be "most effective interviewees." Also, I work in FANG but signed up for this website and can't even participate, so how you chose all these candidates is also questionable.
As an ex-Dropboxer, I can say Dropbox asks legitimately tough questions. I only got in because I happened to be asked the exact set of questions I could figure out the answers to; once I joined and went through interview training, I realized I would have failed about half of the questions that Dropboxers ask.

I'm also not sure how it is at other companies (I'm at Google but haven't gone through interview training yet), but Dropbox's rubrics are also pretty strict. Doing "well" on a question normally requires getting through multiple parts with close to zero bugs that you don't catch yourself.
I think people put too much time and effort into this whole interview business.
My suggestion, based on my experience, is to spend a reasonable amount of time, like a month, brushing up on the most common data structures and algorithms.

Then just take your chances. Rinse and repeat.
Does this happen in any other field? If I were a doctor and wanted to work at some other company, would I need to study for the MCAT every year in order to pass a screening interview based on one possible question? What's the equivalent in other fields? The closest I can think of is an acting audition, but even then they give you the script beforehand. I'm beginning to think that the industry settled on this approach not so much as a skills-verification process but because making the process so onerous on the candidate makes talent retention a lot easier.
I cannot take any of this seriously.

- First of all, it assumes that `interviewing.io` is some sort of certification standard (which I'm willing to bet is the actual point of publicizing these 'studies' in the first place - it's 'fact' manufacturing).

- Then there's selection bias from engineers actually using one platform vs. the other.

- Touting the data set size to give the 'study' some credibility is a red flag for me. You can analyze millions of runs of the same technical interview and deduce all sorts of conclusions.

- The use of 'best performers' is deceitful. It means 'best performers in the interview context', but pairing it with where they work implies something like 'the best-performing engineers are at company X'. Which is bullshit. More like 'the engineers best trained to pass these interviews work at company X'.

Garbage. I'm flagging this as it's nothing more than self-serving marketing.
"Of course, the really interesting question in all of this is the holy grail of technical recruiting: Does performance in interviews reliably predict on-the-job performance? "<p>And until you can really say for sure this is the case, any speculation about the value of technical interviews other than just being a barrier to entry is really moot.
Going to be real interesting to see how I do at technical interviews whenever I decide to jump back into the job market.

At the job I'm leaving tomorrow, I just did a 2-hour-long video session / training where I started by teaching how to read call graphs, and that led to over an hour of me trying to sort out a horrible performance issue in real time.

The problem, the solution, and the iterated debugging to deal with all the edge conditions that the extensive unit tests called out (and I wrote all the unit tests that blew up, so I get to take credit for all that -- although I also wrote the bug I fixed) should show that I'm a very high-functioning engineer. I had identified the problem previously at a higher level and had a fix that papered over it, but during the video I correctly figured out that the real source of the problem was deeper in the code and had existed before the change which surfaced it, and I managed to do a data-driven analysis to track down the perf bug and go from 15% of CPU time in one subsystem to 1% of CPU time in the same subsystem, for a 15x speedup on my problem (and probably closer to a 90x speedup for the customers who were reporting it -- including a large customer everyone here is familiar with due to headlines they're involved in).

Meanwhile, I forgot that it was obj.send(:method, *args) in Ruby and tried obj.call(:method, *args) and had to look that up because my brain was derping a bit, and the night before I forgot it was JSON.generate in Ruby and not encode/decode. In general my brain is a mash of too many different programming language syntaxes. At one point I caught myself trying to use `%` for comments because I had been writing an iterated Chebyshev-Picard method IVP ODE solver in Matlab the prior weekend. If I can't work with the command line or an IDE and with Google, I'm just going to be a mess of trivial mistakes due to crossed wires.

I've also never reversed a linked list in my life, and the correct answer to that question is probably to never use a linked list, due to TLB cache thrashing at the very least.
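For reference, the canonical in-place reversal that question is after is only a few lines anyway; a rough Python sketch (illustrative only, not anyone's graded interview answer):

    # Classic interview exercise: reverse a singly linked list in place.
    class Node:
        def __init__(self, value, next=None):
            self.value = value
            self.next = next

    def reverse(head):
        prev = None
        while head is not None:
            nxt = head.next      # remember the rest of the list
            head.next = prev     # point this node backwards
            prev, head = head, nxt
        return prev              # prev is the new head

Which is exactly the kind of thing you can derp on a whiteboard and nail in five minutes with an IDE.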
Interesting how everyone here reads the title differently than I do.

I read “where the best performers work” not as “where do the best coders/employees/whatever work” but as “where do the people who perform best work”. And in the context of interviewing, I don't find that a particularly weird thing to look at. Interviews are a performance.

That said - I'm apparently a weirdo!

Also, limiting this to companies with 50+ employees who have used the platform is definitely going to let only very large companies into the analysis. Even where I'm at now (1500 eng), I would be surprised if 50+ had used the platform, because 75%+ were hired in the last 1-2 years.

Also - the way everyone here critiques the rating system - what a joke. You think the rubric at these big companies is really any better? You just find a way to shove your gut feeling into the rubric, and that's it.
Many others have commented on possible biases here. That's true, but I also find this a really interesting observation based on a huge dataset. Once you can correlate it with actual job performance (performance reviews, whether you were fired or not, maybe even salary), it might be one of the best ways to see whether technical interviews actually measure anything significant.
I would never want the best developers. Imagine the giant pain in the backside of these prima donnas always thinking they can get paid more at their next job, and how you should give them this or that, or how you should change your development stack, language, processes, chair color, etc. I only need a very few geniuses and a lot of normal ones. 80/20, remember?!
This just sounds like Dropbox and Google have interviews similar to interviewing.io's. So if you're able to clear the Dropbox interview bar, you'll do well on their test. It doesn't really say anything about general work performance, only performance on a specific style of interview.
I always enjoy your work, @leeny. I remember quite a few years ago you were working as a recruiter and I liked working with you then, although we didn't wind up with a placement for me.

Best wishes!
It would be interesting to show error bars on the graphs. Seeing 95% confidence intervals would give a much better idea of whether we're looking at meaningful signal or just noise.
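As a rough illustration of what that would take (a sketch in Python; it assumes access to the raw per-interview ratings, which we don't have, and the sample numbers are made up):

    import numpy as np

    rng = np.random.default_rng(0)

    def bootstrap_ci_95(scores, n_resamples=10_000):
        # 95% bootstrap confidence interval for the mean of `scores`.
        scores = np.asarray(scores, dtype=float)
        means = [rng.choice(scores, size=scores.size, replace=True).mean()
                 for _ in range(n_resamples)]
        return np.quantile(means, [0.025, 0.975])

    # Hypothetical 1-4 ratings for one company's interviewees:
    print(bootstrap_ci_95([4, 3, 4, 2, 4, 3, 3, 4]))

With only a handful of interviews per company, those intervals would likely overlap heavily, which is exactly the point.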
The fallacy in these findings and conclusions is that you're measuring processes of your own brand and flavor; the biases are so deep you likely don't recognize them. 'Technical', 'problem solving', 'communication': these are all of your own construction, setting, presentation, and scoring. Look at the failings of the IQ test, for example.

I'm curious whether people just really need to learn the talk and the walk of a Silicon Valley employee to land a job at a FAANG.
All this post says is that among the candidates who scored highest on this model, a higher proportion worked at these companies.

But there's no inverse analysis: of the people who worked at these companies, how predictive was that, overall, of a higher score on this particular assessment?

'Our five highest scores ever were all people who wanted to leave FooCo' tells you little about the overall quality of FooCo employees. Maybe the rest of them are terrible and these five needed to get away?
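Put in probability terms with entirely made-up numbers: the post is reporting something like P(works at FooCo | high score), while the inverse question is P(high score | works at FooCo), and the two can tell very different stories:

    # Toy numbers, invented purely for illustration.
    fooco_among_top_scorers = 5      # FooCo employees in the top-score bucket
    total_top_scorers = 50
    fooco_candidates_total = 400     # all FooCo employees who interviewed

    p_fooco_given_top = fooco_among_top_scorers / total_top_scorers       # 0.10
    p_top_given_fooco = fooco_among_top_scorers / fooco_candidates_total  # 0.0125
    print(p_fooco_given_top, p_top_given_fooco)

FooCo can dominate the top-scorer list while the typical FooCo candidate still scores poorly.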
If Dropbox engineers are so good across the board, why are they interviewing at all? Surely the desire to move away from this kind of technical nirvana would be almost zero?
Despite all the caveats and problems with statistical significance and methodology, one key takeaway:

“Unemployed” engineers communicate better than those at Uber, Twitter, Amazon, and Google.

:)
I honestly don't care which company retains the best performers. It's none of my business. Maybe if you are an investor in these companies it might be a useful signal to know such trivia. But from a candidate's standpoint, this page is super useful:
https://interviewing.io/recordings
This reminds me, a friend of mine recently indicated that Dropbox has become an Amazon graveyard due to the mass exodus of talent from Dropbox to other companies that feel like Dropbox did about 5 years ago. Without their perks and fancy food, Dropbox is a boring product company with not much upside at this point.
I have worked with a few developers who went on to work for FAANG companies. They were OK but not the best I have worked with. They did, however, have an inflated opinion of their own skills. Perhaps that helps when interviewing?
"If things go well, they skip right to the technical interview at real companies (which is also fully anonymous)."<p>So the most I'll get out of this is to skip the initial screening interview and jump straight to the "real" interview? Or do I even get that?
It's interesting that Google was not higher for communication. Aren't they known as the big tech company with the "writing culture"? I guess 'communication' means something different here.
How is it that Amazon was better than both Facebook and Google overall, but worse when comparing Technical, Problem Solving, and Communication? What are the other undisclosed ratings that make Amazon better overall?
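One purely hypothetical way this can happen: if "overall" is its own rating rather than the mean of the three published categories (or is weighted differently), the two orderings can disagree. A toy example with invented numbers:

    # Invented ratings on a 1-4 scale. Here "overall" is a separate rating,
    # not the mean of the other three, so the rankings can flip.
    amazon = {"technical": 3.0, "problem_solving": 3.0,
              "communication": 3.0, "overall": 3.4}
    google = {"technical": 3.1, "problem_solving": 3.1,
              "communication": 3.1, "overall": 3.3}

    print(amazon["technical"] < google["technical"])  # True: worse per category
    print(amazon["overall"] > google["overall"])      # True: better overall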
I wonder if people give folks high communication scores as a consolation prize. Like, “he didn’t do well on anything else but he can talk well”.

Anyway, interesting results. Let me do some Dropbox outreach and see how they are.
Rather than showing the top ten for statistical significance, wouldn't it make more sense to use PCA on the ratings and show each component's top 10 instead?
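A sketch of what I mean, assuming access to the raw per-interview rating matrix (which isn't public; the data below is randomly generated as a stand-in):

    import numpy as np
    from sklearn.decomposition import PCA

    # Hypothetical data: one row per interview, one column per rating axis
    # (technical, problem solving, communication).
    rng = np.random.default_rng(0)
    ratings = rng.uniform(1, 4, size=(1000, 3))

    pca = PCA(n_components=2)
    scores = pca.fit_transform(ratings)  # per-interview score on each component

    # Companies could then be ranked by mean score on each component,
    # giving a top 10 per component instead of per raw rating.
    print(pca.explained_variance_ratio_)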
My conclusion is that Dropbox has a lot of smart people who want to leave. Higher numbers of interviewees mean more folks want to leave and are practicing to do so.
> If you’ve hired engineers from some of the companies in this post, have they performed better than others?

It should be: have you hired engineers from interviewing.io?

This author is sadly blinded by the echo chamber of their own creation.