I don't know if my current company does, but when I first implemented them for a company I worked for ~15 years ago we definitely did.<p>At that company (a privately held software company with ~200 engineers) we found a few things:
- in person tests were less predictive than take home tests.
- tests that did not provide automated test cases as examples were less predictive than those that did.
- there was virtually no predictive power to 'secret test cases' that we ran without providing to the candidate.
- no other part of the interview pipeline was predictive at all. Not whiteboarding, not presenting, not personality interviews, not culture fit testing, not credentials, or where experience came from, nothing. That was across all interviewers and candidates.<p>A few caveats about this:
- this was before take-home testing had become widespread and before many companies screwed it up. At the time, it was seen by candidates as novel and interesting, not as just one more painful hoop to jump through.
- we never interviewed enough candidates to reach true statistical significance.
- false negatives were our biggest concern; they are extremely hard to measure (and measuring them can potentially open you up to lawsuits). The best we managed was making our pipeline less selective to account for them. This did not seem to reduce employee quality.<p>In a more meta sense, that experience led me to believe that strict hiring pipelines are largely not useful. Bad candidates still get through and good candidates still get filtered out. Also, many other things have a far bigger impact on productivity than whether a candidate was 'good'. It turns out humans do not produce at consistent levels all the time, and factors outside of what you can interview for matter more: company process, employee health, life events, etc. all have far more impact on employee productivity than their 'score' at interview time.