Gender bias in hiring decisions remains an issue in the workplace. Moreover, current gender-balancing techniques are poorly supported scientifically and can lead to undesirable results, sometimes even activating the very stereotypes they aim to counter. While hiring algorithms could offer a solution, they are still often regarded as tools that amplify human prejudice: talent specialists tend to prefer recommendations from experts, and candidates question the fairness of such tools, particularly because of a lack of information about, and control over, the standardized assessment. However, there is evidence that building algorithms on gender-blind data such as personality, which has been shown to be largely similar across genders and to be predictive of job performance, could help reduce gender bias in hiring. The goal of this study was therefore to test the adverse impact of a personality-based algorithm across a broad array of occupations.
The study analyzed 208 predictive models designed for 18 employers. These models were tested on a global sample of 273,293 potential candidates for their respective roles.
We observed mean weighted impact ratios of 0.91 (Female-Male) and 0.90 (Male-Female). Results were similar when impact ratios were analyzed separately for 21 distinct job categories.
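For readers unfamiliar with the metric, an adverse-impact ratio compares the selection rate of one group to that of another; under the widely used four-fifths rule, ratios below 0.80 flag potential adverse impact. The sketch below, with purely hypothetical counts (not data from this study), illustrates the computation:

```python
def impact_ratio(selected_a, total_a, selected_b, total_b):
    """Adverse-impact ratio: selection rate of group A divided by
    the selection rate of group B."""
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return rate_a / rate_b

# Hypothetical example: 45 of 100 women selected vs. 50 of 100 men.
ratio = impact_ratio(45, 100, 50, 100)
print(round(ratio, 2))  # 0.9, above the 0.8 four-fifths threshold
```

A ratio of 0.9 here would not trigger the four-fifths rule, mirroring the study's finding that the observed ratios (0.90-0.91) sit above that threshold.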
Our results suggest that personality-based algorithms could help organizations screen candidates in the early stages of the selection process while mitigating the risk of gender discrimination.