The role that AI plays in diversity in recruitment is still being debated.
Critics of AI are worried it will replicate human biases that already exist. Supporters believe AI can help us avoid unconscious bias and increase diversity in recruitment.
I’m biased (of course), but I’m a supporter of using AI in recruiting.
Here’s why: knowing AI has the potential to replicate an existing bias means we can monitor for it. And if a bias does occur, we know how to remove it.
Here are 4 promising ways in which AI is being used to help diversity in recruitment.
1. Using AI to reduce biased language in job descriptions
Research has found that the wording of job descriptions can be a barrier to attracting a diverse range of candidates.
For example, studies have shown job descriptions that include masculine-type words such as “dominant” and “challenging” are less likely to attract female applicants.
Thanks to a technique called natural language processing (NLP), AI has become much better at understanding what we say and write.
AI-powered virtual assistants like Siri, Alexa, and Google Assistant are prime examples of this improvement.
AI can help you identify potentially discriminatory language and suggest more inclusive alternatives to help attract a more diverse talent pool.
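To make this concrete, here's a minimal sketch of the keyword-flagging idea in Python. The word list and suggested alternatives are illustrative assumptions on my part; production tools rely on much larger, research-backed lexicons and full NLP models rather than simple pattern matching.

```python
import re

# Illustrative word list only; real tools use large, research-backed
# lexicons of masculine- and feminine-coded language.
MASCULINE_CODED = {
    "dominant": "collaborative",
    "challenging": "rewarding",
    "competitive": "motivated",
    "rockstar": "skilled professional",
}

def flag_biased_language(job_description):
    """Return (flagged word, suggested alternative) pairs found in the text."""
    findings = []
    for word, suggestion in MASCULINE_CODED.items():
        if re.search(rf"\b{word}\b", job_description, re.IGNORECASE):
            findings.append((word, suggestion))
    return findings

text = "We want a dominant engineer who thrives in a challenging environment."
for word, suggestion in flag_biased_language(text):
    print(f"Consider replacing '{word}' with '{suggestion}'")
```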
2. Using AI to avoid unconscious bias during screening
AI promises to help us avoid unconscious bias during candidate screening.
Biases related to demographic information such as race, gender, and age can be triggered by information on a resume such as the candidate’s name, the schools they’ve attended, and the dates they’ve held previous positions.
AI can be programmed to avoid these types of biases by ignoring this information when screening resumes. In theory, this means AI will help improve diversity in recruitment.
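As a rough sketch of what ignoring that information could look like, the snippet below strips proxy fields from a structured resume record before it reaches a screening model. The field names are hypothetical, and real systems have to parse unstructured resumes first, which makes the redaction step considerably harder.

```python
# Hypothetical structured resume; field names are illustrative.
resume = {
    "name": "Jordan Smith",
    "school": "State University",
    "graduation_year": 1998,
    "skills": ["python", "sql", "project management"],
    "years_experience": 12,
}

# Fields that can act as proxies for race, gender, or age.
DEMOGRAPHIC_PROXIES = {"name", "school", "graduation_year"}

def redact(record):
    """Keep only fields that aren't demographic proxies."""
    return {k: v for k, v in record.items() if k not in DEMOGRAPHIC_PROXIES}

print(redact(resume))
# {'skills': ['python', 'sql', 'project management'], 'years_experience': 12}
```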
However, AI also has the potential to learn biases that might already exist in your data.
For example, if your company has historically recruited candidates from a handful of colleges, then AI trained on this data may learn to rank graduates from these schools as more qualified candidates.
The nice thing about AI is that, unlike human biases, its biases are much easier to audit and reduce.
AI can be tested for bias by examining the demographic breakdown of the candidates it screens in versus screens out. If the AI is disproportionately excluding a group of people, this exclusion can be corrected.
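One common heuristic for this kind of audit is the EEOC's four-fifths rule: if a group's selection rate is less than 80% of the highest group's rate, that's a red flag for adverse impact. Here's a minimal sketch of that check; the candidate data and field names are made up for illustration.

```python
def selection_rates(candidates):
    """Selection rate (advanced / applied) per self-reported group."""
    applied, advanced = {}, {}
    for c in candidates:
        applied[c["group"]] = applied.get(c["group"], 0) + 1
        if c["advanced"]:
            advanced[c["group"]] = advanced.get(c["group"], 0) + 1
    return {g: advanced.get(g, 0) / applied[g] for g in applied}

def four_fifths_check(rates):
    """Flag any group whose rate is below 80% of the top rate."""
    top = max(rates.values())
    for group, rate in rates.items():
        if rate < 0.8 * top:
            print(f"Possible adverse impact against group {group}: "
                  f"{rate:.0%} vs. top rate {top:.0%}")

candidates = [
    {"group": "A", "advanced": True}, {"group": "A", "advanced": True},
    {"group": "A", "advanced": False},
    {"group": "B", "advanced": True}, {"group": "B", "advanced": False},
    {"group": "B", "advanced": False}, {"group": "B", "advanced": False},
]
four_fifths_check(selection_rates(candidates))
# Possible adverse impact against group B: 25% vs. top rate 67%
```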
3. Using AI to find diverse candidates during sourcing
The flip side of AI ignoring demographic information is that AI can be used to find diverse candidates.
A lack of workplace diversity is often blamed on the pipeline problem: not enough diverse candidates entering the hiring process in the first place. In recruiting, the top of that pipeline is sourcing.
An example of how AI can be used to find diverse candidates is using it to predict candidates’ gender and race based on their first and last names.
These predictions can be used to help companies source candidates from specific groups in order to fulfill their diversity goals.
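To show the basic mechanics, here's a toy version of the name-lookup approach. The frequency table and probabilities below are invented for illustration; real tools train on census or population-scale name data. And because these predictions are probabilistic and error-prone, they're best suited to aggregate sourcing analytics, not decisions about individuals.

```python
# Invented probabilities for illustration only; real systems use
# large census or population-level name datasets.
NAME_FREQUENCIES = {
    "maria":  {"female": 0.96, "male": 0.04},
    "james":  {"female": 0.02, "male": 0.98},
    "taylor": {"female": 0.55, "male": 0.45},
}

def predict_gender(first_name, threshold=0.9):
    """Return the most likely label, or 'unknown' below the threshold."""
    freqs = NAME_FREQUENCIES.get(first_name.lower())
    if freqs is None:
        return "unknown"
    label = max(freqs, key=freqs.get)
    return label if freqs[label] >= threshold else "unknown"

for name in ["Maria", "James", "Taylor", "Ngozi"]:
    print(name, "->", predict_gender(name))
# Maria -> female, James -> male, Taylor -> unknown, Ngozi -> unknown
```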
4. Using AI to communicate more objectively with candidates
Whenever we talk to anyone, cognitive biases come into play.
For example, the similarity-attraction effect is our tendency to like people who are similar to us.
In recruiting, this bias leads us to prefer candidates who attended the same university we did, play the same sport, or come from the same hometown.
Another common bias in recruiting is the halo effect.
This is when we assume that because someone is good at activity A, they'll also be good at activities B, C, and D. This effect comes into play when we like a candidate and then assume they'll be good at the job.
One way these types of biases can be mitigated is by using chatbots.
Similar to using AI for sourcing and screening, AI-powered chatbots can interact with candidates far more objectively by ignoring their race, gender, age, and other personal information that can be used to discriminate against them.
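Here's a minimal sketch of what a structured screening chatbot might look like. The questions and scoring rubric are hypothetical, but the design point is the one that matters: every candidate gets the same job-relevant questions, and the score never sees a name, age, photo, or hometown.

```python
# Hypothetical questions and rubric; job-relevant criteria only.
QUESTIONS = [
    ("years_sql", "How many years of SQL experience do you have?"),
    ("can_relocate", "Can you work from our Denver office? (yes/no)"),
]

def score_answers(answers):
    """Score candidates on their answers alone; no personal data."""
    score = min(int(answers["years_sql"]), 5)  # cap experience credit
    if answers["can_relocate"] == "yes":
        score += 2
    return score

def screen_candidate():
    answers = {}
    for key, question in QUESTIONS:
        answers[key] = input(question + " ").strip().lower()
    return score_answers(answers)

if __name__ == "__main__":
    print("Candidate score:", screen_candidate())
```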
Summary: 4 ways AI is helping diversity in recruitment
By adding AI to the top of the recruiting funnel, we can help remove bias and improve our chances of increasing diversity in 4 major ways:
- Removing biased language in job descriptions
- Avoiding unconscious bias during screening
- Finding diverse candidates during sourcing
- Communicating more objectively with candidates