Avoiding Data Bias When Hiring

One of the significant trends in hiring is using artificial intelligence (AI) to streamline some of the menial parts of the process. While AI can make a big difference in the efficiency of the hiring process, it can also introduce a problematic side effect: data bias.

Employers that decide to use AI technology to drive their hiring processes must take care to avoid data bias. The first step is understanding how using AI in hiring can lead to bias and discrimination in the first place.

It’s no secret that discrimination in the workplace is common. According to a 2019 survey conducted by Glassdoor, 61 percent of workers in the United States report that they have “witnessed or experienced discrimination based on age, gender, race, or LGBTQ status in the workplace.” The survey also indicated that, while workplace discrimination is an issue everywhere (Glassdoor also surveyed respondents from the United Kingdom, France, and Germany), the U.S. is the worst offender: 55 percent of workers from the U.K. said that they had witnessed or experienced discrimination in the workplace, compared to 43 percent in France and just 37 percent in Germany.

Discrimination in hiring is illegal, and many employers include disclaimers on their websites or job applications (or both) that speak to a commitment to equal employment opportunity standards. At backgroundchecks.com, we educate our customers about the Fair Credit Reporting Act, the Equal Employment Opportunity Commission, and other resources and regulations that can help employers to avoid bias or discrimination in a background screening.

Despite these factors, discrimination can sneak into the hiring process—sometimes due to the use of AI in hiring. This revelation is surprising to many employers. A common refrain among hiring managers is that they implement AI in hiring in part to eliminate unconscious bias in the hiring process.

For example, a common form of AI in the hiring process is an automated resume sorting system. By teaching an AI program or algorithm to sort resumes based on specific criteria, the employer can, in theory, judge resumes objectively and remove any bias that a hiring manager might bring into the equation. The problem is that, while the software is technically sorting the resumes on its own, it is doing so based on criteria set by the hiring manager. As such, the technology framework can absorb and perpetuate the same gender biases, racial biases, and other biases of the hiring manager.
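To make the mechanism concrete, here is a minimal sketch of a keyword-based resume scorer. The keywords and weights are invented for illustration only; they stand in for criteria derived from an employer's past "successful" hires. If those past hires skewed toward one group, proxies for that group end up rewarded or penalized even though they say nothing about job performance:

```python
# Hypothetical illustration: a keyword-based resume scorer whose criteria
# were derived from a historically skewed pool of past hires.
# All keywords and weights below are invented for demonstration.

def score_resume(text, weights):
    """Sum the weight of every criterion keyword found in the resume text."""
    text = text.lower()
    return sum(w for keyword, w in weights.items() if keyword in text)

# Criteria "learned" from past hires. Because those hires skewed male,
# a proxy for gender (the word "women's") ends up penalized.
biased_weights = {
    "python": 2.0,
    "10 years experience": 1.5,
    "women's": -1.0,  # a proxy for gender, not for job performance
}

resume_a = "Python developer, 10 years experience"
resume_b = "Python developer, 10 years experience, women's chess club captain"

print(score_resume(resume_a, biased_weights))  # 3.5
print(score_resume(resume_b, biased_weights))  # 2.5 -- same skills, lower score
```

The software is "objective" only in the sense that it applies its criteria consistently; if the criteria themselves encode a bias, the tool applies that bias consistently too.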

Data bias is not new, minor, or invisible. Amazon famously abandoned an AI recruiting tool after discovering in 2015 that it had a gender bias. The software was reviewing resumes and vetting candidates based on criteria that favored male applicants. This bias was not purposeful: it reflected the fact that the tech industry at the time was dominated by men. The AI was perpetuating that gender imbalance rather than helping Amazon avoid it when filling technical roles.

How can employers avoid data bias in the hiring process? The first option is not to rely on AI for hiring at all. This option isn’t ideal for two reasons: hiring managers aren’t free of biases themselves, and AI does offer real benefits in streamlining the hiring process.

A better option is for employers to use software to aid their hiring process but not to trust that it is free of flaws. When purchasing applicant tracking systems or other new software tools for hiring, employers should vet the software carefully. From reading reviews to asking the vendor questions about the data that the software considers, pre-purchase scrutiny can help employers identify potential data biases before they impact an organization.

It is also critical for employers to test new software both before and after it goes into use. Before you implement a new software framework, run a mock trial. For a resume sorter, you might start by running a batch of resumes as a test to see whether the results indicate a bias. Even if the software passes your mock trial, you will want to audit your software after each hiring process—at least to start—to watch for any signs of applicant discrimination.
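One way to structure such an audit, both for a mock trial and for post-hiring reviews, is the "four-fifths rule" from the EEOC's Uniform Guidelines on Employee Selection Procedures: if any group's selection rate falls below 80 percent of the highest group's rate, the result may indicate adverse impact and warrants closer review. The sketch below applies that check to invented trial numbers (the group labels and counts are hypothetical):

```python
# A minimal sketch of a bias audit using the EEOC's "four-fifths rule":
# flag any group whose selection rate is below 80% of the top group's rate.
# Group labels and counts are invented for illustration.

def selection_rates(outcomes):
    """outcomes maps group -> (number selected, total applicants)."""
    return {g: selected / total for g, (selected, total) in outcomes.items()}

def four_fifths_check(outcomes):
    """Return True for each group that passes the four-fifths threshold."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: rate / top >= 0.8 for g, rate in rates.items()}

# Example: results from a mock trial of a resume-sorting tool.
results = {
    "group_a": (30, 100),  # 30% selected
    "group_b": (18, 100),  # 18% selected
}

print(four_fifths_check(results))
# group_b's rate (18%) is only 60% of group_a's (30%), below the threshold
```

A failed check does not prove discrimination on its own, but it is exactly the kind of early warning signal an audit should surface before the tool influences real hiring decisions.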

AI in hiring is an effective way to drive smarter hiring decisions while saving time and labor. However, understanding that data bias does exist—and knowing how to combat it—is a must.


Michael Klazema

About the author

Michael Klazema is the lead author and editor for Dallas-based backgroundchecks.com, with a focus on human resource and employment screening developments.
