Updated: Apr 10
My name is Alex. I am an ex-lawyer, turned entrepreneur, and co-founder of CodersFirst. We help programmers discover their engineering strengths in an unbiased way and route them to startups where they can add the most value and spread their wings.
A few years ago, I didn’t see anything wrong with a recruitment process based on credentials. What’s more, my CV was my obsession. Sobriety came when I applied for a prestigious scholarship and was rejected because of my gender. The interview didn’t have any formal structure*. I was asked random questions by an interviewer who was, and is, a misogynist** (both male and female alumni have publicly called him that). The scoring system was his opinion.
At first, this situation drew my attention to the problem of gender inequality in the labor market; over time, however, I noticed that the problem is much bigger than gender discrimination alone.
At the time, I worked at a startup where I was responsible for literally everything, even hiring programmers. We followed a standard recruitment process: a CV screening, a take-home task, a pair-programming session, and a culture-fit meeting. It works for others, so why shouldn’t it work for us, right? But it didn’t work. The team was constantly fighting over whom to hire, as we had nothing but gut feeling to go on. I felt it wasn’t right, so I started mapping our hiring process, analyzing FAANG hiring processes, and reading papers on recruitment and assessment.
The answer was clear and distinct: we needed to reduce human bias*** and find an interview format that extracts data as clean as possible. This is how the idea of CodersFirst was born.
Since then, we have convinced 47 tech companies to use our credentials-blind, skills-focused recruitment process. We have interviewed 630 programmers and counting. 52% of final interviews through CodersFirst result in an offer, while the industry standard is 20%.
Changing the hiring process and the status quo is challenging because people are complicated. Being biased doesn’t always mean conscious discrimination. Imagine that you’ve just walked past a woman in a hijab. Do you think she is a mother? A refugee? A victim of oppression? Or do you think she is a cardiologist? A barrister? Maybe your local politician? (…) You don’t need to be a racist to be biased. We are all biased against what differs from our social norms: how we think, what we do****.
Here we are, constantly experimenting with the interview format and questions to find the best way to extract data as clean as possible while assessing programmers, so that we can significantly reduce human bias. It’s worth the effort, because it allows anyone, from anywhere, to be assessed on real abilities rather than gender, origin, or education while searching for a dream job. Companies, in turn, get access to better programmers and can build great products!
If this topic is close to your heart too, we would love for you to join codersfirst.com on our mission toward a more diverse tech environment, and to hear your feedback on our hiring approach.
*I heard that a year after this incident, the institution mentioned here changed its approach and worked out a new, unbiased way of selecting people. I am not following them, so I can’t verify this, but I am keeping my fingers crossed for the people who noticed this unfair situation and decided to change it.