Department of Justice Shows Interest in Algorithm-Based Tenant Selection Case

Landlords and property managers increasingly use algorithms to select tenants. These algorithms use data such as credit scores, criminal records, and rental history to assess a potential tenant’s suitability.

One common algorithm used in tenant selection is the “Tenant Score.” This algorithm assigns points based on such factors as income, rental history, and credit score, and then generates a score that indicates the likelihood of the applicant being a good tenant.
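As a rough illustration of how such a points-based score can work, the sketch below assigns points for income, credit score, and rental history and sums them into a single number. The factor names, weights, and thresholds are hypothetical and invented purely for illustration; they do not reflect any vendor’s actual model.

```python
# Illustrative sketch only: a hypothetical points-based tenant score.
# All factors, weights, and cutoffs below are invented for illustration.
from dataclasses import dataclass


@dataclass
class Applicant:
    monthly_income: float                    # dollars
    credit_score: int                        # e.g., 300-850
    years_positive_rental_history: float     # years


def tenant_score(applicant: Applicant) -> int:
    """Assign points per factor and sum them into a single score."""
    points = 0

    # Income: more points as income rises (hypothetical cutoffs).
    if applicant.monthly_income >= 5000:
        points += 40
    elif applicant.monthly_income >= 3000:
        points += 25
    elif applicant.monthly_income >= 1500:
        points += 10

    # Credit score: scaled contribution above a floor (hypothetical weighting).
    points += max(0, applicant.credit_score - 500) // 10

    # Rental history: points per year of positive history, capped.
    points += min(20, int(applicant.years_positive_rental_history * 5))

    return points


if __name__ == "__main__":
    sample = Applicant(monthly_income=3200, credit_score=680,
                       years_positive_rental_history=3)
    print(tenant_score(sample))  # higher score = presumed "better" tenant
```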

While these algorithms may help landlords make more objective decisions, they are not without controversy. Some argue they may perpetuate biases and lead to discrimination against certain groups. This concern is the basis of Louis et al. v. SafeRent et al., a lawsuit currently pending in the U.S. District Court for the District of Massachusetts. The suit alleges that the defendants’ use of an algorithm-based scoring system to screen tenants discriminates against Black and Hispanic rental applicants in violation of the Fair Housing Act (FHA). The Department of Justice (DOJ) and HUD have now filed a Statement of Interest in which the agencies explain how the FHA applies to algorithm-based tenant screening systems.

The suit was filed on behalf of two plaintiffs. Both are Black rental applicants who use housing choice vouchers to pay part of their rent. The plaintiffs applied for rental housing but allege they were denied due to their “SafeRent Score,” a score derived from algorithm-based screening software.

According to the suit, the underlying algorithm relies on factors that disproportionately disadvantage Black and Hispanic applicants, such as credit history and non-tenancy-related debts, while failing to consider one highly relevant factor – the use of housing choice vouchers. These HUD-funded vouchers make tenants more likely to pay their share of the rent, because a tenant who falls behind risks losing the voucher.
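One common way analysts quantify whether a screening criterion "disproportionately disadvantages" a group is to compare approval rates across groups. The sketch below shows that comparison; the numbers are invented for illustration and are not data from this case, nor is this presented as the legal test the court will apply.

```python
# Illustrative sketch only: comparing approval rates across two groups.
# All figures are invented and do not come from the Louis v. SafeRent case.

def approval_rate(approved: int, applied: int) -> float:
    """Fraction of applicants in a group who were approved."""
    return approved / applied


# Hypothetical outcomes under a score-based screening cutoff.
group_a_rate = approval_rate(approved=720, applied=1000)  # hypothetical group A
group_b_rate = approval_rate(approved=450, applied=1000)  # hypothetical group B

# Ratio of the less-favored group's rate to the more-favored group's rate.
# A ratio well below 1.0 suggests the criterion falls more heavily on group B.
impact_ratio = group_b_rate / group_a_rate

print(f"Group A: {group_a_rate:.2f}, Group B: {group_b_rate:.2f}, "
      f"ratio: {impact_ratio:.2f}")
```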

Through the Statement of Interest, DOJ and HUD seek to assist the court by correcting two questions of law that are erroneously presented in the defendants’ motion to dismiss the case. First, the statement sets out the appropriate standard for pleading disparate impact claims under the FHA; second, it clarifies that the FHA’s text and case law support the statute’s application to companies providing residential screening services.

While landlords may use algorithm-based scoring systems as part of an application approval process, final decisions on applicant approval must be made by humans – not artificial intelligence. Landlords cannot avoid liability under the FHA by arguing that an algorithm-based scoring system, rather than the landlord, rejected an applicant. These scoring systems do not make rental decisions – landlords do.