
A New Standard for Ethical Hiring: Combining Human Judgement with AI Fairness Tools


The ability of AI-powered systems to make decisions based on complicated sets of data has brought speed and efficiency to hiring, loan approval, fraud detection, and more. But has it enhanced accuracy? Many would argue no, especially those affected by biased hiring algorithms, whether applicants or employers.

One of the most elusive conundrums is how to fight age-based bias by AI-powered hiring systems. For the 2024 IEEE International Conference on Big Data and Smart Computing (BigComp), the University of Northern Colorado’s Christopher G. Harris recently took on the task of studying two potential solutions: human-in-the-loop (HITL) systems and AI fairness toolkits. By examining how each performs, both in isolation and together, Harris provides recommendations regarding how organizations can use them to reduce age-based bias and implement ethical AI hiring practices.

How AI Algorithms Introduce Age Bias in the Hiring Process

AI algorithms introduce age bias in the hiring process because of the data they’re trained on and the keywords they look for.

For example, historical hiring data often reflects decisions made by biased humans. A hiring manager may see that someone graduated from university in 1997 and immediately start looking for age-based weaknesses in their CV. These biased decisions get baked into the training data, leading to biased AI algorithms.

There’s also the issue of training data not including enough older applicants. If an algorithm gets trained based on data from mostly younger applicants, it may be more likely to recommend interviewing applicants with similar attributes.

In addition, keyword and skill matching can be problematic: an algorithm may look for specific phrases that are more common among younger applicants, especially in CVs for tech jobs. The absence of these phrases doesn't mean a candidate is less qualified, but the algorithm may make that assumption.
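One way to surface the kind of age bias described above is to compare selection rates across age groups in a screening system's output. The sketch below is a hypothetical illustration, not code from Harris's study: it applies the "four-fifths" disparate-impact test, with invented candidate records and an assumed age-40 cutoff for the protected group.

```python
# Hypothetical illustration: compare selection rates by age group and apply
# the "four-fifths" disparate-impact test. Records and the 40+ cutoff are
# invented for this sketch.

def disparate_impact(candidates, protected=lambda c: c["age"] >= 40):
    """Ratio of the protected group's selection rate to the unprotected
    group's rate. Values below 0.8 are conventionally flagged as biased."""
    groups = {True: [], False: []}
    for c in candidates:
        groups[protected(c)].append(c["selected"])
    rate = lambda xs: sum(xs) / len(xs)
    return rate(groups[True]) / rate(groups[False])

candidates = [
    {"age": 29, "selected": True},
    {"age": 34, "selected": True},
    {"age": 26, "selected": False},
    {"age": 31, "selected": True},
    {"age": 45, "selected": False},
    {"age": 52, "selected": True},
    {"age": 58, "selected": False},
    {"age": 47, "selected": False},
]

ratio = disparate_impact(candidates)
print(f"disparate impact ratio: {ratio:.2f}")  # 0.25 / 0.75 ≈ 0.33, well below 0.8
```

In this toy data, older applicants are selected at a third of the rate of younger ones, the sort of disparity a fairness audit is designed to catch.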

Two Ways to Fight Bias

Human-in-the-loop (HITL) systems and AI fairness tools are two very different methods for reducing the risk of an algorithm making a biased decision.
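To make the contrast concrete, here is a minimal sketch of the HITL idea: rather than letting every automated decision stand, low-confidence decisions are routed to a human reviewer. The threshold, field names, and records are assumptions for illustration, not details from the study.

```python
# Hypothetical HITL gate: automated screening decisions with low model
# confidence are queued for human review instead of being applied directly.
# The 0.75 threshold and record fields are assumptions for this sketch.

REVIEW_THRESHOLD = 0.75  # confidence below this triggers human review

def route(decision):
    """Return 'auto' for high-confidence decisions, 'human_review' otherwise."""
    return "auto" if decision["confidence"] >= REVIEW_THRESHOLD else "human_review"

decisions = [
    {"candidate": "A", "recommend": True,  "confidence": 0.92},
    {"candidate": "B", "recommend": False, "confidence": 0.55},
    {"candidate": "C", "recommend": False, "confidence": 0.80},
]

for d in decisions:
    print(d["candidate"], route(d))  # A auto / B human_review / C auto
```

Fairness toolkits, by contrast, work on the model and its outputs directly, measuring and correcting statistical disparities like the one computed above.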

Human in the Loop (HITL)
