Fighting bias with structured interviews

Hiring for Tech
June 29, 2020

Concrete pillars

A solid structure helps reduce bias in your hiring process. Photo by Mirko Blicke on Unsplash

Algorithmic interview questions have objective criteria: either the candidate produces the correct algorithm, or they don’t. These interviews, for all their faults, are an example of structured interviews. As Google puts it, structured interviews use a predetermined set of questions and evaluate candidates independently against uniform criteria.

For this reason, I was really happy to see an emphasis on structured interviews in Meghan Langill’s post, How do I recruit diverse candidates? Her outline for how to fight bias in the recruiting process has a consistent theme: structure.

I recommend you read her post, as it has actionable takeaways for improving your hiring process. But I had a few follow-up questions, which Meghan was kind enough to answer.

Before the interview

How does sourcing fit into the structured interview process?

Sourcing is crucial to the recruiting lifecycle. Understanding these biases and bringing them to your consciousness allows the person sourcing to take a more diverse perspective. An example would be searching exclusively for a backend engineer with a degree from MIT or Caltech. A better approach is to identify the key skill sets needed for the position (e.g., large-scale distributed systems and Java) and search based on those core criteria.

Sourcing is not the only avenue for introducing diversity into the recruiting lifecycle. Ensuring your job descriptions are not heavily gendered or tailored to a certain demographic is just as important.

What are we still missing?

Large tech companies already employ structured interviews: candidates are asked to solve whiteboarding problems with well-defined solutions. Still, the data shows a lack of diversity in tech roles. How do biases enter these structured processes?

Unconscious biases enter the interview process immediately. Everyone has a locus of inference that shapes how they see the world. Viewing the world through this lens causes a person to prefer what they already believe to be true before they are able to acknowledge it.

For example, if a former Delta Delta Delta fraternity brother interviews another brother from the same fraternity, the interviewer will be biased toward that candidate because of the visceral feelings attached to shared experiences. The interviewer will try to find commonality through this bias in order to build rapport, especially during soft-skills (aka culture or cross-functional collaboration) interviews.

This point aligns especially well with something Meghan mentioned in her post: “Create a more diverse group of people making final hiring decisions. This can help omit the chances of stereotyping affecting final decisions.”

Toward better hiring

Based on Meghan’s answers and what I’ve been talking about in my writing, there are some key takeaways:

  1. We need to double down on structured interviews, even if the questions need to change. Areas like sourcing and soft-skill interviews still need a structured approach.

  2. Avoid bias built into the structure itself. This is the point I’ve focused on in the past: algorithmic questions may be objective, but they select for only certain types of people.

  3. A lack of diversity perpetuates itself. In addition to being aware of our own biases, we need diverse decision makers to reduce the prevalence of these biases in the first place.

I’ve talked quite a bit about the last two points, but expect some posts in the near future about the first.

This post was sent out on the June 29, 2020 edition of the Hiring For Tech newsletter. Subscribe to get future editions sent to you by email, once a week.
