- 60% of candidates wouldn’t have been hired through a typical CV sift.
- 28% of candidates were of ‘non-white ethnicity’
Applied is a hiring platform designed by behavioural and data scientists to remove bias and improve the predictive validity of hiring decisions; it is the first spin-out of the Behavioural Insights Team. The platform removes the opportunity for unconscious bias through an algorithm that anonymises candidate applications, chunks them into comparable sections, and randomises the order in which reviewers see them. By reducing reviewers’ access to information we know is not predictive of on-the-job performance, organisations can focus on what matters: the ability to do the job.
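The anonymise-chunk-randomise pipeline described above can be sketched in a few lines of Python. This is a minimal illustration, not Applied's actual implementation: the field names (`name`, `school`, `q1`, …) and the per-question chunking scheme are assumptions made for the example.

```python
import random

# Hypothetical identifying fields; Applied's real schema is not public.
IDENTIFYING_FIELDS = {"name", "email", "school", "photo"}

def anonymise(application):
    """Drop fields that could reveal identity or background."""
    return {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}

def chunk(application, answer_fields):
    """Split an application into per-question chunks so each answer
    is scored on its own merits, not alongside the rest of the application."""
    return [
        {"applicant_id": application["applicant_id"], "question": q, "answer": application[q]}
        for q in answer_fields
        if q in application
    ]

def randomise(chunks, seed=None):
    """Shuffle chunk order so ordering effects and reviewer fatigue
    don't systematically penalise the same candidates."""
    rng = random.Random(seed)
    shuffled = list(chunks)
    rng.shuffle(shuffled)
    return shuffled

# Usage: build an anonymised, chunked, shuffled review queue.
apps = [
    {"applicant_id": 1, "name": "A. Smith", "school": "X", "q1": "answer one", "q2": "answer two"},
    {"applicant_id": 2, "name": "B. Jones", "school": "Y", "q1": "another answer", "q2": "more text"},
]
review_queue = randomise(
    [c for app in apps for c in chunk(anonymise(app), ["q1", "q2"])],
    seed=42,
)
```

Reviewers then score each chunk independently, which is what makes the approach robust to halo effects from any single impressive (or unimpressive) detail.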
A 2016 validation study showed marked differences in candidate shortlisting under this methodology: over half the graduates who were ultimately hired would not have been if CVs had been used. Focusing on skill, not pedigree, meant candidates from a wider array of sociodemographic and educational backgrounds were hired. Organisations ranging from UK government departments to Penguin Random House, Berwin Leighton Paisner, and LEK Consulting are now using the platform to ensure fairer, more evidence-based hiring.