Online hiring is often assumed to reduce biases based on gender, age or race because job applicants do not usually reveal such information explicitly. But a new study shows that such biases are alive and kicking, thanks to subtle cues in applicants' names and photos.
Jason Chan and Jing Wang looked at gender bias, which offline is well documented to tilt against women. Their surprising finding is that the opposite happens online: employers tend to favor female applicants. This bias is more pronounced among employers who are less experienced with online hiring, among female employers, for workers from developing countries, and for feminine-typed jobs such as administrative support (although there is no corresponding bias favoring male applicants in masculine-typed jobs).
“By 2020, online marketplaces are expected to be a US$16 billion industry. Given the enormous scale, hiring decisions by online employers will have significant social and economic consequences for millions of workers worldwide. This makes it imperative to consider how employers’ hiring preferences disproportionately affect workers’ access to job opportunities and to examine systematic biases that may affect hiring outcomes in this new labor market,” the authors said.
Their findings were based on an examination of 264,875 job postings and more than 5.7 million applicants on a large online labor platform in 2012-13. Using machine learning techniques for name and picture recognition, they inferred workers' gender for their empirical analysis.
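The authors' classification pipeline is not published with the article, but as a rough illustration of what name-based gender inference looks like, the Python sketch below assigns a gender label by looking up an applicant's first name in a small dictionary. The lookup table, function name and fallback label are illustrative assumptions only; the study's actual approach also drew on picture recognition.

```python
# Illustrative sketch only: the authors' actual pipeline is not public, so this
# shows the general idea of name-based gender inference using a hypothetical
# lookup table rather than their trained classifiers.

from collections import Counter

# Hypothetical mapping from first names to likely gender labels. A real system
# would use large name dictionaries and fall back to image recognition.
NAME_GENDER_LOOKUP = {
    "jason": "male",
    "jing": "female",
    "maria": "female",
    "david": "male",
}


def infer_gender(full_name: str) -> str:
    """Guess a worker's gender from the first token of their display name."""
    first_name = full_name.strip().split()[0].lower()
    return NAME_GENDER_LOOKUP.get(first_name, "unknown")


if __name__ == "__main__":
    applicants = ["Jason Chan", "Jing Wang", "Maria Lopez", "Alex Kim"]
    labels = [infer_gender(name) for name in applicants]
    print(list(zip(applicants, labels)))
    print(Counter(labels))  # e.g. Counter({'female': 2, 'male': 1, 'unknown': 1})
```

In a full pipeline, names the dictionary cannot resolve would presumably be passed to an image-based classifier or excluded from the analysis.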
They also conducted tests on smaller datasets to understand the nature and pathways of the biases they detected. The tests examined six worker dimensions – knowledge, professionalism, job fit, trustworthiness, cooperativeness and attractiveness – of which the latter three are unrelated to competency. The authors found that these non-competency traits nonetheless had a greater influence on hiring decisions.
“We reason this is related to the fact that employers online are able to assess the competency of applicants through online ratings and they use stereotypical cues to infer subtle interpersonal traits, such as trustworthiness. While such a bias may appear counter-intuitive in light of offline markets, the finding is not entirely unexpected to scholars. Trust is a crucial factor in enabling relationships and transactions online and female applicants tend to be deemed more trustworthy and co-operative,” the authors said.
They suggested the gender bias could be mitigated if platform owners allowed workers to use pseudonyms or avatars on publicly viewable worker profiles, while collecting real names for internal authentication. Employers could also bear in mind how gender bias plays out when making hiring decisions.
“Although online marketplaces offer benefits of reduced search costs and efficient matching, which are largely welcomed by both employers and workers, the expansion of these markets also ushers in social and economic challenges observed in traditional marketplaces. Our study helps to address some of these issues by shedding light on the prevalence and nature of hiring bias in online labor markets,” they said.