Artificial Intelligence (AI) is one of the biggest technological trends of 2022. AI has the potential to deliver enormous business value for organizations. One department where AI is increasingly being used is recruiting (HR) (Montesa, 2022). AI adds value for recruiters by making them more efficient, personalizing the recruitment process, and helping them make data-informed decisions (Montesa, 2022).
One aspect that has been addressed in the news and on social media is the bias present in recruiters and in the recruitment of new employees. Candidates being selected, or deliberately rejected, because of race, gender, or age is a serious problem that needs to be fixed. Men are still preferred over women because they are perceived as more skilled, and applicants with white-sounding names were 50% more likely to receive an interview request (Hinde, 2021).
AI is now being used to remove bias from recruitment processes and give all potential new employees an even and fair chance. AI is designed to focus solely on whether someone is qualified for the job, without taking race, gender, or age into account. AI learns from datasets, on the basis of which it makes decisions about a potential employee. For the AI to be unbiased, the datasets it learns from should themselves be unbiased (Innodata Inc., 2022).
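To make this concrete, below is a minimal sketch of the kind of check such a dataset would need before any model is trained on it: comparing selection rates between groups in the historical hiring data. The column names ("gender", "hired") and the records are hypothetical, and a real audit would cover more attributes and far larger samples.

```python
# Minimal dataset audit sketch: compare selection rates across groups
# in historical hiring data before training a screening model.
# Column names and records are hypothetical.
import pandas as pd

# Toy historical hiring data (illustrative only).
history = pd.DataFrame({
    "gender": ["M", "M", "F", "F", "M", "F", "M", "F"],
    "hired":  [1,   1,   0,   1,   1,   0,   1,   0],
})

# Selection rate per group: the share of applicants in each group who were hired.
rates = history.groupby("gender")["hired"].mean()
print(rates)

# Demographic-parity gap: a large gap suggests the labels already encode bias,
# and a model trained on them would likely reproduce it.
gap = rates.max() - rates.min()
print(f"selection-rate gap: {gap:.2f}")
```

In practice, a check like this would be run for every protected attribute, and the data would be rebalanced or relabeled before the AI ever learns from it.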
But new research shows that these promising AI systems are still failing to reduce recruitment bias or improve diversity within companies (Vallance, 2022). Cambridge researchers argue that AI cannot be trained to select employees solely on job-related characteristics, because what we believe are essential components of being a good employee is itself tied to gender and race (Vallance, 2022). It was also found that wearing glasses or a headscarf still made an applicant less likely to be selected for the job.
I believe that AI is a very promising digital trend and that it could, to a large extent, help solve bias in the recruitment process. But the implementation of AI brings risks, including ethical ones. Datasets should be built and continuously improved so that AI functions better and bias in the recruitment process is reduced further.
We all deserve a fair chance in life after all.
Bibliography
Hinde, G. (2021, May 10). 4 Hiring Bias Study Statistics That May Shock You. IQ PARTNERS. Retrieved October 13, 2022, from https://www.iqpartners.com/blog/4-hiring-bias-study-statistics-that-may-shock-you/
Innodata Inc. (2022, August 3). Eliminating Bias from Hiring Using Artificial Intelligence. Retrieved October 13, 2022, from https://innodata.com/eliminating-bias-from-hiring-using-artificial-intelligence/
Montesa, M. (2022, April 11). AI Recruiting in 2022: The Definitive Guide. Phenom | Intelligent Talent Experience. Retrieved October 13, 2022, from https://www.phenom.com/blog/recruiting-ai-guide
Vallance, C. (2022, October 13). AI tools fail to reduce recruitment bias – study. BBC News. Retrieved October 13, 2022, from https://www.bbc.com/news/technology-63228466
Hi Miguel, very interesting post on a quite relevant topic! I remember a use case of AI at Amazon during their hiring process which had a strong bias towards men. This was considered very unethical and Amazon got rid of the AI model as soon as they found out. As you mentioned, the dataset should be unbiased, which is completely true. AI itself is not the one failing when it judges incorrectly or with bias, because it is fed biased data. The failure lies with whoever feeds the AI that data, and that is most often humans. I do think that AI will become part of the hiring process, but not in the coming five years; to me, it is still too inaccurate and unethical because of the incorrect use of datasets. In my opinion, this problem could be solved by an overarching regulatory framework on the ethics of datasets used by AI models.
Hello Miguel (and Jordi),
Interesting topic indeed. Let me first say, it is already part of the hiring process at some big companies. My sister did her PhD in this field, and I got some insights from her experience at AB InBev (a large Belgian beer producer). AI has a prominent role in the selection process (although there are indeed discussions on the social/ethical decision-making aspects). The selection process generally consists of multiple rounds (5-7), of which the first two or three are conducted solely by the AI. Think of cognitive games and digital AI interviews. They may implement a feedback loop, through which the outcome of the final selection is fed back as training data for the AI. This way, the AI ‘better understands’ what type of people the company ends up hiring, and therefore what type of people it should consider more favorable. This may be one reason why it is difficult to solve bias in the recruitment process. One of the considerations of the research was that, in the end, a human should decide who to hire and who not, not the AI. That being said, the AI can play an important role in the first stages of the application process by filtering out, without bias (if there is no feedback loop), candidates that do not fit the role.
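To make the feedback-loop concern concrete, here is a rough toy simulation of how such a loop could behave (my own assumption, not how AB InBev's actual system works): two equally qualified groups, a screening score that starts with a small advantage for one group, and each round's hires fed back as the signal the model trains on. The small initial bias compounds over the rounds.

```python
# Toy simulation of a hiring feedback loop (assumed dynamics, illustrative only):
# a small initial scoring advantage for group A grows each round because the
# hiring outcome is fed back into the scoring.
import random

random.seed(0)
bias = 0.05                                    # initial scoring advantage for group A (assumed)
for round_no in range(1, 6):
    hires = {"A": 0, "B": 0}
    for _ in range(1000):                      # 1000 applicants per round
        group = random.choice(["A", "B"])      # equally sized, equally qualified groups
        score = random.random() + (bias if group == "A" else 0.0)
        if score > 0.9:                        # screening threshold
            hires[group] += 1
    # Feedback loop: the model "learns" that group A looks like a good hire,
    # so the advantage grows in proportion to the hiring imbalance.
    total = hires["A"] + hires["B"]
    bias += 0.1 * (hires["A"] - hires["B"]) / max(total, 1)
    print(f"round {round_no}: hires={hires}, bias={bias:.3f}")
```

Even with equally qualified applicants, the imbalance widens round after round, which is exactly why breaking or auditing the feedback loop matters.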
Hi Miguel and Ian,
Interesting to hear that bigger companies already use AI in the application process. I also agree that AI can be a very helpful tool for addressing bias in the recruitment process. I am a corporate recruiter at TOPdesk, a software consultancy company, so I am very familiar with the selection process. Although I believe that AI can be a good tool, I do think it really depends on the company whether it is desirable. At TOPdesk, we think it is very important that candidates have a good match with the company culture – the first interview is solely about whether there is a fit between the candidate and the company culture. Since you mention that AI can be helpful in the first few rounds, for example for digital AI interviews, I have a hard time believing that this could be implemented at TOPdesk. I think it is very difficult to teach AI to see whether there is a fit with a company’s culture. However, I do believe that AI could be helpful at companies that select on hard skills in the first few rounds. It all really depends on a company’s selection process, I would say!
Hi Miguel, thanks for sharing this interesting topic. I know some large advisory firms are indeed using AI in the recruitment process. I expect that bias is hard to solve completely with AI, since an unbiased training dataset is truly not easy to build, and a poorly trained model could even mess up the recruitment. However, I still think it is more efficient to apply a well-trained AI in the first stage of recruitment, such as filtering resumes. Further interviews would be difficult to replace with AI, since whether a person’s characteristics match the current team is really essential, and that is hard to tell without real interaction.