AI in the hiring process is already controversial, but less attention has been paid to the growing intersection of AI and tenant screening. As companies rush to integrate AI tools into their workflows, many employers and property managers have begun investigating the technology as well. It is easy to see the potential problems with removing more of the human element from hiring, but what about housing?
To understand the current state of the field and its impact on tenancy, it's important to understand two things: first, how AI works in screening, and second, how AI tools can introduce unintended bias into the process.
What Is the Big Deal About AI?
Finding suitable, reliable tenants who can provide a steady revenue stream is a significant challenge for property owners. The task is even harder in crowded metropolitan areas, where competition for every open unit can be intense. AI background checks and tools that automatically analyze a tenant's application could, in theory, help managers speed up screening and place new tenants faster.
AI's strength lies in its ability to learn and adapt as it processes data. With natural language capabilities, it can parse an application, order a credit report, or request a background check. Some developers envision tools that analyze these reports directly and give property managers a simple yes-or-no recommendation. One day, voice and chat AI may even contact references on a manager's behalf.
How Can an AI Have Bias?
AI isn't true artificial intelligence; it is a system for making predictions and decisions based on prior data. How an AI performs is closely tied to its training data, which means the inherent biases of AI developers can find their way into the training process. Depending on the quality of that training, AI may draw unfair conclusions or reach discriminatory outcomes.
We've already seen examples of this in early AI chatbots that quickly began to espouse racist or sexist views when prompted. The same problems could affect tools used to evaluate tenant background checks, raising concerns that AI could become a serious barrier to fair housing and employment. Government agencies, including the EEOC and the FTC, are already formulating proposals for AI regulation.
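One way regulators and auditors screen for the kind of discriminatory outcomes described above is the "four-fifths rule" from EEOC selection guidelines: no group's selection rate should fall below 80% of the highest group's rate. The sketch below is purely illustrative; the group labels, approval counts, and function names are hypothetical, not part of any real screening product.

```python
# Illustrative sketch: the "four-fifths rule" from EEOC selection
# guidelines is one common screen for disparate impact. All numbers
# and group labels below are hypothetical.

def selection_rate(selected, total):
    """Fraction of applicants in a group who were approved."""
    return selected / total

def four_fifths_check(rates):
    """Return True if every group's selection rate is at least 80%
    of the highest group's rate (a rough disparate-impact screen)."""
    highest = max(rates.values())
    return all(rate / highest >= 0.8 for rate in rates.values())

# Hypothetical approval counts from an automated screening tool:
# (approved, total applicants) per group
approvals = {"group_a": (45, 100), "group_b": (30, 100)}
rates = {g: selection_rate(s, t) for g, (s, t) in approvals.items()}

print(rates)                     # {'group_a': 0.45, 'group_b': 0.3}
print(four_fifths_check(rates))  # 0.3 / 0.45 ≈ 0.67 < 0.8 → False
```

A check like this only detects one narrow form of bias in aggregate outcomes; it says nothing about why the tool's scores diverge, which is why the independent audits discussed below go further.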
How Can Those Using AI Tools Work Around Bias Issues?
For those interested in bringing AI into the process, the best practice today is to move cautiously. Be aware of local restrictions on how you can conduct a real estate background check, as some jurisdictions have already begun to restrict the use of these tools. New York, for example, won't allow such tools until they have passed an independent audit for bias.
For property owners and employers, it is vital to keep a human element in the process. Automating more of the workflow is smart and can save time and money, but the final decision about an applicant should always rest with a person. This step provides the opportunity to catch flaws in the AI's reasoning, to think critically about an applicant's suitability, and to reach an independent (if AI-supported) conclusion.
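The human-in-the-loop arrangement described above can be made concrete in code: the automated tool's output is recorded as advisory context, while the reviewer's decision is the one that counts. This is a minimal sketch under assumed field names and scoring; none of it reflects a real screening system's API.

```python
# Minimal human-in-the-loop sketch. The score, labels, and field names
# are hypothetical; the point is that the automated result is advisory
# and a person records the final, authoritative decision.

from dataclasses import dataclass

@dataclass
class Application:
    applicant_id: str
    ai_score: float          # advisory score from an automated tool
    ai_recommendation: str   # e.g. "approve" or "decline"

def final_decision(app: Application, reviewer_decision: str, notes: str) -> dict:
    """The human reviewer's decision is authoritative; the AI output is
    stored only as context so its reasoning can be audited later."""
    return {
        "applicant_id": app.applicant_id,
        "ai_recommendation": app.ai_recommendation,
        "decision": reviewer_decision,  # the human choice wins
        "overrode_ai": reviewer_decision != app.ai_recommendation,
        "notes": notes,
    }

app = Application("A-1001", ai_score=0.42, ai_recommendation="decline")
record = final_decision(app, "approve", "Credit issue was a resolved medical bill.")
print(record["overrode_ai"])  # True: the reviewer overruled the tool
```

Keeping the AI's recommendation alongside the human decision also creates the audit trail needed to spot patterns of flawed machine reasoning over time.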
Looking to the Future
Ultimately, the long-term impacts of AI are hard to predict. Although the gears of government regulation turn slowly, they are moving toward what could be a new regulatory environment for these tools. That day, however, could be years away.
For now, employers and property managers alike should approach these tools with caution and beware of "silver bullet" claims from developers. AI and tenant screening may yet go hand in hand, but it is important to be mindful of the technology's growing pains and to maintain a human element throughout the process.
About Michael Klazema
Michael Klazema is the lead author and editor for Dallas-based backgroundchecks.com, with a focus on human resource and employment screening developments.