Governments Face Challenges in Regulating AI and Hiring Algorithms for Background Screening

For many employers, two priorities take center stage during hiring: finding the best candidate and finding that candidate quickly. Lengthy hiring processes are expensive and leave a business less equipped to meet its needs. In competitive and crowded job markets, finding good candidates quickly is easier said than done, so technology has always been an appealing option. As advanced algorithms and low-level artificial intelligence applications reach the general public, many companies are asking how they might use these technologies to improve background screening.

Despite many misgivings from the public, some employers have confidently expressed a desire to turn some or all of the hiring process over to computers. In such a system, an advanced algorithm or AI scans resumes and cover letters and elevates candidates for consideration when they meet certain criteria. Hypothetically, an employer could tell its AI to find someone suitable for a specific job; in minutes, the system might return a shortlist of five candidates drawn from hundreds or thousands of applications never seen by human eyes. An AI could even someday assess the results of background reports.

For a business, the advantage is obvious: push a button and get candidates. The reality, however, is far from simple and trouble-free. There is concrete evidence that such systems can contain inherent biases, and sometimes they can even "learn" to bake racial and social bias into their decisions. In New York City, lawmakers passed an ordinance several years ago that would bar employers from using such tools until the tools passed a "bias audit" conducted by an independent agency.

A fierce lobbying effort against the law has put its implementation on hold, with many employer-led groups fighting to make the scope of the rule as narrow as possible. Advocates for fairness in hiring argue that these efforts have produced a toothless and confusing law that will be difficult or impossible to enforce, potentially leaving many applicants unfairly on the cutting room floor. The debate has been so fierce that the city has postponed implementation of the rule yet again as it continues to assess public comments.

Even the federal government has taken note of the potential dangers of this new technology. As of 2023, the Equal Employment Opportunity Commission (EEOC) plans to include new enforcement actions targeting AI hiring tools in its next set of regulations. The EEOC is expected to formulate rules that better define how and where AI is acceptable in hiring and what constitutes discriminatory conduct.

For now, the traditional hiring process remains effective, especially when robust employment background screening solutions make quick work of some of the most common hiring bottlenecks. Big regulatory changes could be on the way as local governments grapple with the technology and the federal government formulates new guidelines. Employers should carefully monitor these developments, especially while background screening companies continue exploring such technology.


Michael Klazema


Michael Klazema is the lead author and editor for Dallas-based backgroundchecks.com, with a focus on human resource and employment screening developments.
