The original article - <a href="http://mobile.nytimes.com/2013/06/20/business/in-head-hunting-big-data-may-not-be-such-a-big-deal.html" rel="nofollow">http://mobile.nytimes.com/2013/06/20/business/in-head-huntin...</a>
Distraction-free reading, without all the annoying cruft of Quartz.<p>Fascinating use of "Big Data" to cut through the bullshit. I wonder if it will change anything. I suspect the "tough" interview plays well into a company's PR.
I don't understand people's problem with estimating. It's a useful skill. Perhaps it would be better if the questions actually related to technology, rather than golf balls - but the principle is the same.<p>For instance - "how many hard drives does Gmail need?" requires a rough guess of how many users Gmail has (if you're interviewing at Google, you should know it's 1e8-1e9), how much space each one takes (probably nowhere near a gigabyte on average - let's say 1e8 bytes), and the current capacity of a hard drive (about 1e12 bytes).<p>Then you can say that they probably need 1e5 hard drives, link it to redundancy, availability, deduplication, backups etc. You can comment that it's feasible to build a datacenter with that many hard drives.<p>No one cares that the actual number is 12,722 - but you've demonstrated a broad set of knowledge about the current state of technology. Saying "dunno - a billion?" is not going to get you anywhere, and with good reason.<p>The Monopoly question is crap, though.<p>I'd like to know how useful <a href="http://google-tale.blogspot.com/2008/07/google-billboard-puzzle.html" rel="nofollow">http://google-tale.blogspot.com/2008/07/google-billboard-puz...</a> was.
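The arithmetic in an estimate like this is trivial by design; a throwaway sketch, using the same rough orders of magnitude as above (these are illustrative guesses, not real Gmail figures):

```python
# Fermi estimate: hard drives needed for Gmail.
# Every input is a rough order of magnitude, not a measured figure.
users = 1e9            # Gmail users, upper end of the 1e8-1e9 guess
bytes_per_user = 1e8   # average mailbox size, ~100 MB
drive_capacity = 1e12  # one commodity hard drive, ~1 TB

drives = users * bytes_per_user / drive_capacity
print(drives)  # 100000.0 - i.e. 1e5 drives, before redundancy and backups
```

The point, as above, is the chain of reasoning, not the final number.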
It's a crutch. Nobody knows how to interview. Interviewing properly is a lot of work. There are two kinds of people who can do interviews - people who have knowledge of the job and people who have time to interview - and they are so infrequently the same people. These sorts of things were appealing because they were easy: a way to not spend a lot of time on interviewing, or a way to not need a lot of knowledge about the job.<p>And these things are important, because job candidates are not people, they are OEM replacement parts being ordered from Pep Boys. Call up the recruiter and requisition a J6-252: Programmer, seasoned 5 years, with degree from MIT. Oh, those are too expensive. Guess I'll take the knock-off version, but I refuse to pay full price!<p>Hopefully, because it's Google saying it, everyone will cargo-cult onto this bandwagon too.
From the original New York Times article that Quartz has linkspammed here: "On the hiring side, we found that brainteasers are a complete waste of time. How many golf balls can you fit into an airplane? How many gas stations in Manhattan? A complete waste of time. They don’t predict anything. They serve primarily to make the interviewer feel smart."<p>Long before this was reported in the New York Times, this was the finding of research in industrial and organizational psychology. A valid hiring procedure is a procedure that actually finds better workers than some different procedure, not a hiring procedure that some interviewer can make up a rationale for because it seems logical to the interviewer. We have been discussing home-brew trick interview questions here on Hacker News for more than a year now.<p><a href="https://news.ycombinator.com/item?id=4879803" rel="nofollow">https://news.ycombinator.com/item?id=4879803</a><p>Brain-teaser or life-of-the-mind interview questions do nothing but stroke the ego of the interviewer, without doing anything to identify job applicants who will do a good job. The FAQ on company hiring procedures at the Hacker News discussion linked here provides many more details about this.
There are questions that are actually fun and I can sort of see them starting a conversation with the right kind of interviewer that tells both parties a lot about who they're dealing with. From the article:<p><pre><code> > How much should you charge to wash all the windows in Seattle?
</code></pre>
Basic economics estimating - probably not that useful and a bit dull, but hey why not. At least the problem has several angles to it that might be fun to explore.<p><pre><code> > Design an evacuation plan for San Francisco
</code></pre>
That's a nice one. Kind of open-ended, a lot of things to consider, a lot of ideas to be had.<p><pre><code> > How many times a day does a clock’s hands overlap?
</code></pre>
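For what it's worth, the count falls out of one observation: the minute hand moves at 6°/min and the hour hand at 0.5°/min, so the minute hand gains 5.5°/min and laps the hour hand every 720/11 minutes. A sketch counting the overlaps in one 24-hour day, using exact integer arithmetic to avoid floating-point edge cases:

```python
# Overlaps occur at t = 720*k/11 minutes after midnight, k = 0, 1, 2, ...
# Count the solutions with 0 <= t < 1440 (one full day).
# The comparison 720*k < 1440*11 is the inequality t < 1440 cleared of
# the division, so everything stays in integers.
overlaps = sum(1 for k in range(1000) if 720 * k < 1440 * 11)
print(overlaps)  # 22
```

Which is exactly the problem: once you have 22, there is nowhere left for the conversation to go.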
Why? What happens to the interview after you've counted them (possibly on a whiteboard)? It's a dead end, and the question is dull.<p><pre><code> > A man pushed his car to a hotel and lost his fortune. What happened?
</code></pre>
Now this has the potential to be great or absolutely horrible, depending on the intent behind the question and the nature of the interviewer. If it's taken as a "fill in the blanks" kind of challenge, it would be a fun way to explore the candidate's imagination. But I'm guessing it's not. It's probably one of those "clever" questions with only one "right" answer, which makes no real sense except to create a few moments of uncomfortable silence.<p><pre><code> > You are shrunk to the height of a nickel and your mass is proportionally reduced so
> as to maintain your original density. You are then thrown into an empty glass blender.
> The blades will start moving in 60 seconds. What do you do?
</code></pre>
Again, this could be a fun physics and chemistry question, and I see a couple of possible solutions that might or might not work out - it might be fun exploring them. But again, it <i>sounds</i> more like a trick question with one standardized answer. Bad.<p>The problem with trick questions and standardized answers is that the nature of the question makes the candidate uneasy, and even if they eventually figure it out, nobody will have learned anything during the process. It's more like hazing than a hiring interview.
I've never seen any citation that Google <i>ever</i> used these kinds of question. Especially the idiotic one about pushing a car to a hotel. I think it was just an urban legend and a good piece of linkbait.<p>There must be thousands of people on HN who interviewed at Google over the years. Did anyone ever get a question like this?
When I interviewed at Google 5 years ago they weren't using those brainteasers.<p>There are many posts online about the actual, CS-y questions that you can expect in a Google interview, I had just assumed that the mentions of brainteasers were merely urban legend.
I was contacted by a Google recruiter a few months ago. I had no intention of changing my day job at the time, but for shits and grins I went through a couple of phone interviews. The position they were hiring for wasn't an area I have any experience in (the recruiter had made a mismatch), but I thought the questions were reasonable for somebody who works in that field, and they were kind of fun. They were quizzy, but could be practical. It was a management position, so there weren't any coding questions - just things like basic cost estimating, that sort of thing.<p>I had fun and wouldn't mind doing it again. It didn't feel like a bunch of stupid random brain teasers like I've experienced before (how many t-shirts would it take to make a seaworthy sail? why are manholes round?), etc.
This topic/discussion reminds me of a movie I saw recently called "That Guy... Who Was in That Thing". It's a documentary about working actors - not big-time superstars like Tom Cruise, but the small-time 'character' actors.<p>Anyway, there was one part in the movie where they start talking about auditions. All four or five of the actors they interviewed for the movie unanimously spoke badly about the typical audition process. Some quotes from memory:<p>"I love acting, but I hate auditioning"<p>"You've seen my demo reel, you've seen me when I was on Star Trek, you know I can act, so why not just give me the part? Why make me go through this tedious audition process?"<p>"90% of acting is reacting. You can't fully demonstrate your acting abilities when you're standing in front of a panel of producers 'acting' out a scene that consists of 5 lines of dialog"<p>What the actors were saying about the audition process reminded me a lot of my frustrations with hiring in tech interviews. Making an engineer do puzzles like FizzBuzz is a lot like making an actor act out a 20-second scene without any time to prepare or a proper "scene partner" to act alongside.<p>I wish I could link to a YouTube copy of the movie, but I can't find one. It's on Netflix, though.
They aren't using the brain teasers right. The idea is not to create a barrier to entry, nor is it to stress the candidate. The objective of a brain teaser is to make the candidate think slowly enough that the interviewer can observe how he approaches a problem.<p>It's hard, when using problems that are common, to really understand how the candidate gets to the answer. Often, he's building on pre-solved subproblems he encountered in his professional life, so the resolution process didn't even occur at the interview.<p>I personally don't use brain teasers, because they stress out valid candidates who do not work well under pressure. However, I think teasers, when properly used, are valid tools in an interviewer's toolbox.
I took an I/O psychology course in school, and a chunk of it dealt with interviewing and finding the best candidates (from both an employer standpoint and an equity standpoint), as lots of people who took the course tend to pursue education with the idea of obtaining an HR-related certificate.<p>The comment about brainteasers vs. structured rubrics is sort of surprising to me, given Google's reputation for quantitative data. Speaking from a very high level, structure was really what was emphasized for interviews. It's interesting how culture can get in the way of proven 'fact,' and I love that Google is using its own (much larger) data sets to make these improvements and to validate or invalidate other research.
How to drive clicks in four steps:<p>1. Invent a bunch of silly riddles that a non-technical reader might accept as tech interview questions.<p>2. Pull a major tech company out of a hat (today it's Google), and claim with no evidence that their interviews are based around silly riddles. The article will be cited for years as proof that people working at $COMPANY are weird and obtuse.<p>3. Wait a couple years. Ignore all evidence that $COMPANY does not use silly riddles in interviews.<p>4. Once traffic on the original article dies down, write another article claiming $COMPANY has "admitted" silly riddles aren't useful for interviews.
A problem I see with many of these sorts of questions is that they often require the candidate to have some supposedly common knowledge which is not required for the job itself. Cryptic word games surely are much more difficult for a non-native speaker of the language in use. Questions related to facts about cities probably require local geographic knowledge. Surely the evacuation plan for SF must consider the capacity of various bridges? Someone who has lived in northern CA for most of his life would have a much easier time thinking through the logistics of moving people off a peninsula. And, of course, there's the Monopoly question (which I had to Google).<p>I like estimation questions in general for many of the reasons other commenters have cited. However, I wish those using them would consider the knowledge implicitly required of a candidate.
> <i>Years ago, we did a study to determine whether anyone at Google is particularly good at hiring. We looked at tens of thousands of interviews, and everyone who had done the interviews and what they scored the candidate, and how that person ultimately performed in their job. We found zero relationship.</i><p>Can we see the study?<p>Also note that performance on the job is a noisy measurement, because people who get to work on impactful projects (through luck or people skills) get rated higher than others. I wouldn't be surprised if interview scores were a better measurement of "true" skills.
Sounds great, although like with any retraction I doubt this will be enough to stop the spread of interview puzzles. Even I'm guilty of asking my share before I realized that the only thing that matters about the candidate is whether they can sit down and start writing code (and the quality of said code).
The best book on hiring, no doubt, is Who: <a href="http://www.amazon.com/Who-The-A-Method-Hiring/dp/0345504194" rel="nofollow">http://www.amazon.com/Who-The-A-Method-Hiring/dp/0345504194</a><p>We used it to build our hiring process for <a href="http://www.thinkful.com/" rel="nofollow">http://www.thinkful.com/</a> and it consistently proves valuable.<p>We also use it to help our students prepare for job interviews.
"How many gas stations in Raleigh?"<p>I had a couple questions like this at a couple of interviews more than a few years back now. In both cases, I sat for a minute, and asked a few questions back, like "do you mean the city limits of Raleigh, or the metro area?", "how do you define gas station - do we include public-only, or private fueling places?", etc. Part of this was buying some time, because the question caught me off guard, but I think my questions back caught him off guard a bit too.<p>That interviewer told me I was the only person who asked clarifying questions before blurting out an answer or walk through. Another one was "take this marker and design a house on the whiteboard for me". So I took the marker and asked questions like "how many people will live here, do you want one or two story, do you need a garage/shed/basement, etc?" And again, was told I was the only person who'd asked questions before starting to draw.<p>I don't think the intention behind those brain teasers was necessarily to determine how you react to those sorts of problems, but it may have been a useful determining factor for some interviewers nonetheless.
Every time I click any link on HN that points to qz.com I get QZ without any reference to the article in question. Currently it points to "Why Tesla wants to get into the battery-swapping business that’s failing for everyone else"... in Chrome. Firefox seems to work. Terrible website.
These sorts of questions didn't start with Google. They're known as Fermi Problems for a reason: they're named after Enrico Fermi, the physicist.<p><a href="http://en.wikipedia.org/wiki/Fermi_problem" rel="nofollow">http://en.wikipedia.org/wiki/Fermi_problem</a><p>Knowing how to quickly estimate something <i>is</i> useful.<p>I imagine that Larry Page does a few quick estimates every day. How many Loon balloons would it take to bring Internet to 90% of Africa?<p>But not everybody at Google has a job like Larry Page. It's gotten to be a big company full of accountants, HR people, and other jobs that don't require much thinking in unfamiliar territory.<p>In other words, guesstimation is a useful skill, but not for every Google employee, so it's not going to show up as useful on average.
Some of the more flippant-sounding ones could be useless, but I thought the idea of the simpler ones (how many golf balls, etc.) is to get a feeling for how people's minds work and whether they can make sensible best guesses in the absence of concrete facts, and then make judgements based on those guesses. Weed out the ones who have no appreciation for how the volume of a golf ball relates to the size of a bus.<p>Good logical thinking shown here could indicate an ability to rapidly prototype systems without getting hung up on too-fine detail.
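The golf-ball version really is just a few lines of arithmetic. A sketch, where every number is a loose assumption (a bus modeled as a rectangular box, a standard ~4.3 cm golf ball, and a random-close-packing fraction for spheres):

```python
import math

# Fermi estimate: golf balls in a school bus.
# All inputs are loose assumptions chosen for round arithmetic.
bus_volume = 2.5 * 2.5 * 12                  # m^3, bus as a 2.5 x 2.5 x 12 m box
ball_volume = 4 / 3 * math.pi * 0.0215 ** 3  # m^3, radius from a ~4.3 cm diameter
packing_fraction = 0.64                      # random close packing of equal spheres

balls = bus_volume * packing_fraction / ball_volume
print(balls)  # on the order of a million
```

The number itself hardly matters; what the question probes is whether the candidate thinks to model the bus, account for packing, and sanity-check the order of magnitude.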
I think you'll find this response by @gayle to be spot on: "Sorry, Folks: Google Hasn't Changed Their Interview Questions", by Gayle McDowell, ex-Google engineer and hiring committee member. <a href="http://blog.geekli.st/post/53477786490/sorry-folks-google-hasnt-changed-their-interview" rel="nofollow">http://blog.geekli.st/post/53477786490/sorry-folks-google-ha...</a>
Insofar as this interview speaks to the relevance of brainteasers to actual software development / engineering, it fails to provide a meaningful topic of conversation. It surprises me that nobody's pointed out that at best the conclusions are relevant to engineering "leadership" performance, rather than -- as I expected for "Google" and "head-hunting" -- coding performance. Sure, people skills and team skill are important, but if you're going to get good at selecting for leadership and ignore selecting for productivity, to the extent they're not related you're not going to be very good at creating and maintaining software. Although software isn't 100% of Google's success and coding productivity isn't 100% of software success, it's pretty important.
Why are they talking about "Big Data" rather than just "data"? I doubt the data sets they used were so large that they could not be easily analysed on a cheap laptop using normal statistical packages.<p>When trying to work out what best predicts job performance, the quality of your data is by far the most important thing to focus on. I would very much like to know more about the details of their internal studies. There are a lot of difficult problems in trying to use statistics to improve interview processes. One of the big problems is that you will always have a truncated sample of only those people who were selected: you would then expect the importance of certain variables, such as GPA or test scores, to be lowered because those who scored lower on such metrics will have had compensating characteristics...
If you're a hiring manager who uses these things, you should know that I try to train/prepare my students to answer them. I do think there is some utility in watching how people approach an unconventional problem, but don't be too impressed with people who can solve them easily, compared to those who don't do well the first time they see them. I see a huge improvement in the quality of answers from most students once they know it is a gag and once they've been shown how to estimate things. Most students are constrained by having been in a learning environment that provides them with well-defined boundaries within which to form their answers. IMO, failing to perform well on these problems is not always a failing of the student as much as it is of their educators.
The article also mentions:<p>"It’s also giving much less weight to college grade point averages and SAT scores"<p>In 2004 I interviewed for a Creative Maximizer position. I received a glowing review from my brother, who was a Googler. I had studied all the ins and outs of AdWords back then, and the British interviewer confirmed: "You did very good on the assessment" (which was working through real ads that needed to be maximized). My opinionated experience has been that in these kinds of situations, Brits embellish less than Americans.<p>However, she told me that my college GPA was "a major question mark" because it was 2.99 and Google only hired people with 3.0 and above (I didn't know what I wanted to do in college). Looking back, I'm glad I was never hired, but that burned me badly for a while.
The only true way to tell whether an interviewee will be a good employee is actual work/product output with the right amount of responsibility. Hire product-focused people, not just coders looking to show off lots of tricks to compete.<p>Contract-to-hire is one way; using what they have done previously as a predictor is another. It is a risk for sure, but that really is the only true way in the end.<p>Plenty can be gained from just letting the interviewee talk, and maybe looking at some code they have written previously while they talk about it. Whiteboard coding should not apply, as it is completely out of their element for many coders.<p>The type of person they are can't really be detected correctly until they are on the team and delivering, because everyone is selling themselves in an interview.
I think companies would be better off hiring people not based on their IQ or skill level, but by hiring people who love what they do, have done side projects, and achieve flow in their work. People who achieve flow in their work will work harder and are more creative than others, because they enjoy the process of solving problems. So the interview process should try to identify how often the given candidate achieves flow (as defined by Mihaly Csikszentmihalyi).<p><a href="http://www.ted.com/talks/mihaly_csikszentmihalyi_on_flow.html" rel="nofollow">http://www.ted.com/talks/mihaly_csikszentmihalyi_on_flow.htm...</a>
>> After two or three years, your ability to perform at Google is completely unrelated to how you performed when you were in school, because the skills you required in college are very different. You’re also fundamentally a different person. You learn and grow, you think about things differently.<p>While the analysis corrects some beliefs about interviewing techniques, do I sense them drawing a conclusion again not supported by data? How did they conclude that the lack of correlation is "because" the skills required are different and people think differently a few years out of college?
This is great news both for Google and for candidates - as long as the behavioral indicators for the competencies are defined right, according to the actual goals and tasks of the job.
So how useless exactly were they? As long as you are looking for a "right" answer rather than a correct one, they are a very good metric for testing problem-solving skills.
This always seemed so overhyped to me. I did hundreds of interviews at Google and I never once asked anyone a question anything like the ones described. It was generally stuff like "oh hey, you're going to do deep work on our unix systems? What is the difference between kill and kill -15?" We also didn't care about GPA. This all seems like super old information if it was ever true at all.
I'm still a student and like to interview at a lot of places, shop around, and keep practicing my interviewing skills.<p>I STILL go to interviews where I am ONLY asked these kinds of questions... It's embarrassing. If you ask me these questions for a 2-hour-long interview, then I'm not going to work for you... it's that simple.
Why so serious? Isn't hiring about maxing out the potential of a company?<p>Anyone can help max it out. I know for myself that having a 'clue' reduces self-esteem, which can be balanced by having the right co-workers. End result: maxing out the potential.<p>A (technical intelligence) + B (social intelligence) = innovative potential