InfoBytes Blog

Financial Services Law Insights and Observations

DOJ, HUD say Fair Housing Act extends to algorithm-based tenant screening

Federal Issues | Courts | DOJ | Fair Housing Act | Artificial Intelligence | HUD | Algorithms | Discrimination | Disparate Impact

On January 9, the DOJ and HUD announced they filed a joint statement of interest in a pending action alleging discrimination under the Fair Housing Act (FHA) against Black and Hispanic rental applicants based on the use of an algorithm-based tenant screening system. The lawsuit, filed in the U.S. District Court for the District of Massachusetts, alleged that Black and Hispanic rental applicants who use housing vouchers to pay part of their rent were denied rental housing due to their “SafeRent Score,” which is derived from the defendants’ algorithm-based screening software. The plaintiffs claimed that the algorithm relies on factors that disproportionately disadvantage Black and Hispanic applicants, such as credit history and non-tenancy related debts, and fails to consider that the use of HUD-funded housing vouchers makes such tenants more likely to pay their rent. Through the statement of interest, the agencies seek to clarify two questions of law they claim the defendants erroneously represented in their motions to dismiss: (i) the appropriate standard for pleading disparate impact claims under the FHA; and (ii) the types of companies that fall under the FHA’s application.

The agencies first argued that the defendants did not apply the proper pleading standard for a claim of disparate impact under the FHA. Explaining that in order to establish an FHA disparate impact claim, “plaintiffs must show ‘the occurrence of certain outwardly neutral practices’ and ‘a significantly adverse or disproportionate impact on persons of a particular type produced by the defendant’s facially neutral acts or practices,’” the agencies disagreed with the defendants’ assertion that the plaintiffs “must also allege specific facts establishing that the policy is ‘artificial, arbitrary, and unnecessary.’” This contention, the agencies said, “conflates the burden-shifting framework for proving disparate impact claims with the pleading burden.” The agencies also rejected arguments that, in order to allege a statistical disparity, the plaintiffs must challenge the entire “formula” of the scoring system rather than just one element, and must provide “statistical findings specific to the disparate impact of the scoring system.” According to the agencies, the plaintiffs adequately identified an “essential nexus” between the algorithm’s scoring system and the disproportionate effect on certain rental applicants based on race.

The agencies also explained that residential screening companies, including the defendants, fall within the FHA’s purview. While the defendants argued that the FHA does not apply to companies “that are not landlords and do not make housing decisions, but only offer services to assist those that do make housing decisions,” the agencies contended that this argument misconstrues the clear statutory language of the FHA, and presented case law affirming that FHA liability reaches “a broad array of entities providing housing-related services.”

“Housing providers and tenant screening companies that use algorithms and data to screen tenants are not absolved from liability when their practices disproportionately deny people of color access to fair housing opportunities,” Assistant Attorney General Kristen Clarke of the DOJ’s Civil Rights Division stressed. “This filing demonstrates the Justice Department’s commitment to ensuring that the Fair Housing Act is appropriately applied in cases involving algorithms and tenant screening software.”