A major problem with data in the job market is that there are bad actors on both sides, employer and employee, with incentives to lie about themselves; i.e., companies with toxic cultures won't reveal that, nor will malicious or incompetent job candidates. Many readers of HN know that companies can be bad, but don't recognize that, say, competent software engineers are not the median candidate applying for most roles. The other side has a filtering problem, too.<p>A second major problem with data is that it's often hard to know why something didn't work out. Yeah, not everybody's a good fit for a role. But where would you get an objective source of truth about an individual's performance? Or, to transpose it to dating: would you trust a group of ex-girlfriends or ex-boyfriends to give an accurate assessment of someone as a partner? Might they have incentives to distort the record? Where does good data come from?<p>So it's not just that AI lacks the data. It's that there are structural problems with ever gathering the data. It's not as if the data's out there and we just don't have access. If it even exists, it's poisoned at the source.
This is more subtle and profound than it might seem.<p>I work at a Big Tech company and do tons of interviews, and even we don't have the data about what works and what doesn't (and if that data does exist, it's a very closely guarded secret). Even if we had hard data, we would still be blindsided by our inherent biases, i.e. we know how the candidates we accepted did, but not how the candidates we rejected would have done.
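As a rough illustration of that blind spot, here is a toy simulation (entirely synthetic data, made-up numbers) of what only observing accepted candidates does to the signal a company can actually see:

    # Toy simulation of selection bias in hiring data (all numbers are invented).
    import numpy as np

    rng = np.random.default_rng(3)
    n = 10_000

    interview = rng.normal(size=n)
    performance = 0.5 * interview + rng.normal(size=n)   # interviews do carry signal here

    hired = interview > 1.0                              # only the top slice gets hired

    full = np.corrcoef(interview, performance)[0, 1]
    observed = np.corrcoef(interview[hired], performance[hired])[0, 1]
    print(f"true correlation: {full:.2f}, correlation among hires only: {observed:.2f}")
    # Range restriction: the second number is much smaller, yet it's the only one a
    # company ever gets to see, since rejected candidates are never observed on the job.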
I have seen human recruiters and companies blatantly lie about what they want, and about what they are willing to offer as well.<p>The bad data problem doesn't go away just because a human is doing the matchmaking manually. If anything, the process is adversarial toward job seekers when it isn't done algorithmically.
AI doesn’t have the data. Neither do humans. So why are humans better again? I see it answered in another part: humans are better at deciding who they like working with. Which is fair enough, although AI might help remove some of the biases that creates.<p>By the way, this is a very hard problem because it is like solving “war” in a sense. People need money to not die. They are very motivated to get a job. Therefore any system anyone comes up with gets gamed. There are two types of skill: the kind you use on the job and the kind you use in interviews, and there is very little overlap. It will be hard to un-game something people need this badly.
Totally agree with this. I was tasked with getting some insights from hiring data once, and everyone was disappointed that we just couldn’t get anything meaningful. There were echoes of something. But no signal could rise above the noise into the realm of statistical significance. It seems we just don’t know how to measure what matters.
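For a sense of scale, here is a minimal sketch (hypothetical effect size and sample size, not drawn from any real hiring dataset) of why a weak signal in hiring data tends to stay below statistical significance:

    # Minimal sketch: a small true effect on a realistic hiring sample rarely reaches p < 0.05.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    n_hires = 80                      # roughly a year of hiring at a mid-size company (assumed)
    effect = 0.15                     # assumed small true effect of interview score on performance

    interview_score = rng.normal(size=n_hires)
    performance = effect * interview_score + rng.normal(size=n_hires)  # mostly noise

    r, p_value = stats.pearsonr(interview_score, performance)
    print(f"correlation={r:.2f}, p={p_value:.2f}")  # typically p > 0.05: echoes, but no usable signal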
If you wanted ideal data, you'd want an employer to hire large numbers of people for a single role, hire them as randomly as possible, and be willing to train them. You could then see what sort of candidate was successful there.<p>There are a handful of militaries that effectively do that, but it's pretty much impossible for a normal employer.
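A rough sketch of that ideal experiment, on entirely simulated data: hire a large cohort at random, record pre-hire attributes, then see which ones actually predicted success. With random hiring the estimates are unbiased; with selective hiring they would be shrunk or distorted.

    # Simulated "random cohort" experiment (all relationships here are assumptions).
    import numpy as np

    rng = np.random.default_rng(1)
    n = 500                                   # large random cohort, military-academy style

    # Pre-hire attributes and a post-training outcome with a known relationship.
    aptitude = rng.normal(size=n)
    credential = rng.normal(size=n)           # looks impressive, contributes nothing in this toy setup
    success = 0.6 * aptitude + rng.normal(size=n)

    # Because hiring was random, a plain least-squares fit recovers the true weights.
    X = np.column_stack([aptitude, credential, np.ones(n)])
    weights, *_ = np.linalg.lstsq(X, success, rcond=None)
    print(weights)                            # roughly [0.6, 0.0, 0.0]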
AI can't do hiring because it knows the best hire is the one you don't make: why bother with a pesky human when it can do the job itself cheaper/faster/soon-ish better?<p>Aside: having a LinkedIn profile is a red flag in my book. I understand that desperation could lead someone to try getting a job through that cesspool of anti-patterns glued together with pure spite for the user; nevertheless, it's a red flag. Or to put it another way: if having a LinkedIn profile is a requirement for getting hired, your job will disappear in 5 to 7 years (due to AI or other corporate movements).
The type of ML algorithm that would do "automated hiring" has nothing to do with LLMs and the current AI mania.<p>Classification algorithms work well in objective contexts where the data contains fundamental and stable associations between attributes and the property one wants to predict.<p>When applying this stuff to human affairs, whether that is credit scoring or dating apps or employee scoring, you must be comfortable with very poor and unstable performance, extreme biases, and gaming behavior.<p>The alternative is of course the HR department, so pick your Poisson.
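To illustrate the instability, here is a small sketch on synthetic data (the drift between cohorts is an assumption, not measured from anywhere) of a classifier that looks fine on last year's hiring outcomes and collapses once the attribute-outcome relationship shifts:

    # Sketch of concept drift breaking an "automated hiring" classifier (synthetic data).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)

    def cohort(n, weight):
        X = rng.normal(size=(n, 2))                     # e.g. two resume-derived features
        logits = weight[0] * X[:, 0] + weight[1] * X[:, 1]
        y = (logits + rng.normal(size=n) > 0).astype(int)
        return X, y

    X_old, y_old = cohort(2000, weight=(1.5, 0.0))      # what mattered last year
    X_new, y_new = cohort(2000, weight=(0.0, 1.5))      # what matters now (drift / gaming)

    clf = LogisticRegression().fit(X_old, y_old)
    print("accuracy on old cohort:", clf.score(X_old, y_old))   # looks respectable
    print("accuracy on new cohort:", clf.score(X_new, y_new))   # near coin-flip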
"AI" (LLM) hype is so great, people are faulting it for not solving issues they overlook. Which means the solution isn't in the data that it's trained on.<p>Most people don't want to hire an employee that is competent. That alone is the truth. They value other non-technical factors highly, even when those factors might negatively affect performance.<p>As I learnt from a senior developer, the best developers are the people who have horrible impostor syndrome and constantly downplay their own abilities. These guys usually turn out to be geniuses, no matter where they've been (or haven't been) employed before.<p>Plus, when selecting for teamwork, instead of following the usual dogma, it's always better to hire someone that's been though shit rather than hiring someone who's always been cushy. People that have been through the ringer know how to cooperate well and even better, know <i>why</i> such a thing is necessary in the first place. The other kind of people usually are horrible backstabbers. Yes, they've had extensive corporate experience, that's usually not a good sign.
But it will. It doesn't matter whether it's fundamentally better than a good recruiter if it's orders of magnitude cheaper. If you can have it pursue far more leads, the outcomes may well be the same or better. And what if you used it to replace a bad or mediocre recruiter? In any case, you might not care: hiring is a crapshoot anyway, and AI is saving you millions of dollars.<p>You want to weed out people who are clearly unqualified, but that's not rocket science. Beyond that, every company has a different hiring bar, a different process... and approximately zero data showing that their approach works better than anybody else's. Interview performance is a poor predictor of job performance. Whether the bar is high or comparatively relaxed, around 70% of the people you hire will be good, and the rest will underperform, leave after a couple of months, have difficult personalities, and so forth.
I've never been interested in the automated-matching aspect of HR, but this piece has me inspired to attempt the challenge.<p>To add a point, my favorite reason to use recruiters is having them handle the negotiation process for me.<p>High-quality post as usual.
It doesn't lack data, it lacks cultural context. Nuances change rapidly; AI would need data that captures those nuances, which is why it's better suited as an assistant. You'd be surprised how much we have to adapt to cultural changes when hiring. Things that were acceptable last year are no longer acceptable today.
From: <a href="https://interviewing.io/blog/why-ai-cant-do-hiring#user-content-fn-4" rel="nofollow">https://interviewing.io/blog/why-ai-cant-do-hiring#user-cont...</a><p>"The future of technical assessments, in the age of generative AI is a noble and difficult topic, though, and it’s something we’ll tackle in a future post."<p>This is exactly what we have built (we are now applying to YC as well).<p>I can share more details soon (we are giving our site some much-needed love), but we are:<p>- FAANG + Startup + Fintech founders<p>- super technical; our expertise is in Product Engineering + ML-for-products<p>- keeping everything 3.5-turbo-based free; we'll probably have to charge for 4<p>Our early users (Google/Meta SWEs and a few small AI startups) are pretty shocked by the results they are seeing, which they consider "as good if not better than human interviews + feedback".<p>I'll post here next week with a link to try!
The largest problem with data is that it can be falsified. If you look at job postings on sites like Glassdoor, the listed pay is well under the going rate. This is what's used to suppress the job market.
Fortunately, employers are now tracking pretty much everything: every link we click, every time we move a mouse, every key we type, every second we are logged on... also every IM/Teams chat we send and receive, every phone call we make, every email we send, every sentence we typed out and then deleted because it sounded wrong... so they will have the data pretty soon.
There's always a lot of talk about meritocracy, but I want to convince you that it's unachievable. We often hear "it doesn't matter your university's prestige/your credentials, just the quality of your work." As if the two don't strongly correlate.<p>But the question is why they correlate. Social networks matter, a lot. If you're at a top-10 university, your career fair is going to be filled with top companies. That isn't true for less prestigious universities. If you're a grad student, you also know that the connections your advisor has strongly correlate with your ability to get a job (nepotism playing a significant role here).<p>We also see everyone using LeetCode to filter candidates, but there's plenty of evidence that it's just a noisy filter. Not only do people cheat and succeed[0], but consider how it's a meme that it doesn't correlate with the actual job. Our whole community has the position "study for the test, then forget it". Honestly, it feels insane to keep doing this, and to keep pumping money into propping the system up. We also know that grades are noisy[1], and does anyone remember those brain teasers Google used to do (also [1])? The ones people still use?<p>So how do we hire? To answer that, we need to look at what the job actually requires. But the job will change. We also need to know how well a candidate will work with the team. These are really difficult questions to answer, and it would be insane to think that a resume or an in-person interview could be a non-noisy process (or at least one where noise doesn't dominate). Nepotism has succeeded because it is a decent filter. It has a lot of false rejects, but it works because you are outsourcing a lot of questions to someone who has intimate knowledge. It's easy to see this logically: you have three candidates who all perform equally well on their resumes, interviews, and whatever else. For candidate A, that's all the information you have. Candidate B was recommended by a close friend you trust. Candidate C also knows a close friend, but that friend dislikes them. Who are you going to hire? Why? The nepotism is actually giving you more information. This is not an argument for nepotism but rather an illustration of how the blind interview process is highly noisy and how much information is still missing.<p>CS seems to be very weird in a lot of these respects. In a standard engineering job (e.g. aerospace, electrical, mechanical) you generally send in your resume, talk with a few people (2-3 interviews) where a few technical questions will be asked, and that's about it. Resume -> phone screen (~30 minutes) -> in-person interview (30-90 min). No whiteboard problems/puzzles. No take-home tests/projects. In-person interviews involve several engineers from the hiring team, who ask behavioral questions as well as a few relevant technical ones. At most you'd have to do a back-of-the-napkin calculation. Why do these firms do it this way? They've recognized that the system is noisy: you apply a few filters and then just hire the person. If they don't work out within 3-6 months, you let them go. If the hiring process is easy/cheap, you can also turn over bad hires easily. It also isn't uncommon to get a call 3-6 months after your final interview, so you don't always have to repeat the whole process, because there are often still available candidates.
I've met plenty of people with FAANG jobs who brag about working 20 hours a week for $150k/yr. I've never met an engineer in those other fields like that. Maybe that means something, maybe it doesn't.<p>[0] <a href="https://news.ycombinator.com/item?id=35885342" rel="nofollow">https://news.ycombinator.com/item?id=35885342</a><p>[1] <a href="https://www.cnet.com/culture/google-gpas-are-worthless/" rel="nofollow">https://www.cnet.com/culture/google-gpas-are-worthless/</a>