2013-06-23

Interviewers don't know jack

At least, that's the impression given by Google's HR head Laszlo Bock:

Years ago, we did a study to determine whether anyone at Google is particularly good at hiring. We looked at tens of thousands of interviews, and everyone who had done the interviews and what they scored the candidate, and how that person ultimately performed in their job. We found zero relationship. It’s a complete random mess, except for one guy who was highly predictive because he only interviewed people for a very specialized area, where he happened to be the world’s leading expert.

For anyone who's done any substantial amount of interviewing, this will ring a bell. Without follow-up data on how hired candidates actually perform in their jobs, there's very little feedback that the individual interviewer receives on their interviewing. If you consistently have 4+ people interviewing the same candidate on closely-related topics, you can compare their scores and identify interviewers whose scores diverge - but then you have no idea whether the outlying interviewer was right or not. I sometimes wonder whether in these situations companies should hire a small fraction of candidates who score highly with the divergent interviewer but moderate-to-low with the rest. Of course, that's an expensive way of gathering stats.
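
To make the score-comparison idea concrete, here's a minimal sketch in Python (the interviewer names and scores are invented for illustration) of how you might flag interviewers who are consistently out of step with the rest of a panel - nothing cleverer than comparing each interviewer's score to the panel average for the same candidate:

    # Hypothetical data: candidate -> list of (interviewer, score out of 4).
    from collections import defaultdict
    from statistics import mean

    interviews = {
        "candidate-1": [("alice", 3.5), ("bob", 3.4), ("carol", 1.5), ("dave", 3.6)],
        "candidate-2": [("alice", 2.0), ("bob", 2.2), ("carol", 3.9), ("dave", 2.1)],
    }

    deviations = defaultdict(list)
    for candidate, panel in interviews.items():
        panel_avg = mean(score for _, score in panel)
        for interviewer, score in panel:
            deviations[interviewer].append(score - panel_avg)

    # A large average deviation means an interviewer consistently disagrees with
    # the rest of the panel - but, as above, it doesn't tell you who is right.
    for interviewer, devs in sorted(deviations.items()):
        print(f"{interviewer}: mean deviation from panel {mean(devs):+.2f}")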

Megan McArdle analyzes the interview and draws conclusions that I think are slightly off:

Resume and past work history are much better predictors of future performance [than brainteasers]. The problem is that in most fields, these are hard to ascertain unless you're pretty prominent.

I take a little more hope from Bock's analysis. I'd agree with McArdle (and Bock) about the relative uselessness of brainteasers. I would disagree to some extent with resume and work history as predictors. What a resume and work history really give you as an interviewer is a baseline for what to expect of an interviewee's performance, and a pointer to the work-related questions to ask.

Example: the resume (CV) claims that the interviewee has experience building distributed systems and has 8 years of Perl development. Immediately you, as hiring manager, know that one of your interviewers should throw a (company-standard) distributed systems development question at them, and expect them to nail most of the high points. All interviewers should expect them to be able to write some Perl on demand that parses, uses modern idioms, and employs efficient and suitable constructs. Falling short on any of these indicates either that the resume is "generous" with the facts or that the interviewee does not learn and increase their ability with experience as quickly as could be expected.

To make use of these facts, as Bock notes, you need to have standardised assessments of your candidates - a smallish bank of interview questions with a calibrated range of possible responses. This may well not be the world's best predictor of ability in a job, but at the very least it's a reliable way of screening out the under-performers and the outright resume fabricators.
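
To make "a calibrated range of possible responses" concrete, here's one possible sketch in Python - the question text and scoring anchors are invented, not anything Bock describes - showing the sort of rubric that lets different interviewers score the same question on the same scale:

    # Hypothetical calibrated rubric for a single question from the bank.
    RUBRIC = {
        "question": "Design a job queue that survives the loss of any single node.",
        "anchors": {
            0: "No workable design.",
            1: "Single-node design; failure modes not considered.",
            2: "Replicates state but hand-waves over failover and duplicate delivery.",
            3: "Covers replication, failover and at-least-once delivery trade-offs.",
            4: "All of the above plus operational concerns: monitoring, backpressure, upgrades.",
        },
    }

    def record_score(level: int) -> int:
        """Interviewers pick the anchor that best matches what they saw, rather
        than inventing their own scale - that's what keeps scores comparable
        across interviewers and candidates."""
        if level not in RUBRIC["anchors"]:
            raise ValueError(f"score must be one of {sorted(RUBRIC['anchors'])}")
        return level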
