Maggie Martin

In 2019, Carmen Arroyo (“Arroyo”) and the Connecticut Fair Housing Center filed suit against CoreLogic Rental Property Solutions, LLC (“CoreLogic”) on the basis that CoreLogic violated the Fair Housing Act (“FHA”).[1]  Arroyo sued on behalf of her son Mikhail, who was previously “injured in an accident . . . which left him unable to speak, walk or care for himself.”[2]  Arroyo attempted to transfer Mikhail to a new apartment owned by WinnResidential.[3] 

For its tenant applications, WinnResidential outsourced tenant screenings to CoreLogic, which conducted screenings through its algorithm CrimSAFE.[4]  These screenings “use[d] an algorithm to interpret an applicant’s criminal record and provide housing providers with a decision on whether the applicant qualifies for housing.”[5]  Ultimately, the CrimSAFE algorithm concluded that Mikhail was “not qualified for tenancy,” even though he was “never . . . convicted of a crime.”[6] 

Arroyo is not alone in her experience.  Housing law attorneys explain that “thousands of people . . . are mistakenly flagged” by this software.[7]  Further, this risk has only “accelerated over the last two decades as the rental market has increased and the . . . real estate analytics market has boomed.”[8]  Accordingly, applicants now face a market where almost all landlords utilize this software “to find who they consider to be the highest-quality tenants.”[9]  In effect, screening companies are now “functionally mak[ing] rental decisions on behalf of landlords” simply through an algorithm.[10]

Notably, these algorithms often determine that consumers of color are not the “highest-quality tenants.”[11]  Because these algorithms flag arrests without requiring an ultimate conviction, they disproportionately screen out African American and Latino applicants: “arrests merely show that a police officer suspects criminal activity” and are not “actual proof of misconduct.”[12]  These algorithms therefore track established trends in racial profiling, in which law enforcement officers arrest people of color solely on the basis of race.[13]

Consistent with these national trends, the plaintiffs alleged that CoreLogic violated the FHA by discriminating on the basis of race and national origin.[14]  Fundamentally, the FHA was passed to stop discriminatory housing practices,[15] and it provides that it is an unlawful discriminatory act to “make unavailable or deny, a dwelling to any person because of race, . . . or national origin.”[16]  In line with FHA requirements, the plaintiffs argued that CoreLogic’s algorithm has a disparate impact on African Americans and Latinos.[17]

After surviving a motion for summary judgment, the case went to a bench trial on March 14, 2022.[18]  The U.S. District Court for the District of Connecticut now has the opportunity to interpret the FHA broadly and capture within its purview what is simply a technologically sophisticated form of housing discrimination.  If the FHA applies to tenant screening companies, these algorithms could no longer “use search criteria that are ‘proxies’ for race and gender, such as criminal history and eviction data.”[19]  Accordingly, this private action raises the possibility that housing practices may change radically for the wellbeing of Latino and African American renters.

Cases such as CoreLogic provide a chance for the courts to see the need for further action.  While these companies “are subject to some regulation under federal law, more needs to be done to ensure that these reports are accurate, unbiased, and compliant with state and local efforts to reduce the stigma associated with and barriers created by a criminal record.”[20]  Moreover, the continued presence of “[d]iscrimination in the credit and housing markets has led to disparities in wealth, homeownership, and economic opportunity for consumers of color . . . .”[21]  Therefore, without taking steps to prevent discrimination in tenant screening, fundamental goals of equality will be frustrated.

Like most interactions in 2022, modern discrimination in housing seems to occur more often behind a computer than face-to-face.  Accordingly, permitting tenant screening services to evade federal law, perpetuate racial inequality, and heighten the stigma surrounding criminal histories violates the stated goals of the FHA.  Although tenant screening companies are already covered by the Fair Credit Reporting Act,[22] stories such as the Arroyos’ demonstrate that this statutory scheme is not enough to protect renters.  The current framework allows an unseen, intermediary entity to continually disadvantage African American and Latino renters, and it forces individuals to seek their own judicial recovery rather than rely upon the federal government to challenge discriminatory housing practices.

Ultimately, the current success of the FHA is largely attributable to private enforcement actions,[23] and this case may extend the FHA’s application to account for the current housing market.  Although the future of the CoreLogic case is still unknown, the plaintiffs rightly warn the “housing industry [to pay] attention to this trial, as it will set an important precedent for tenant-screening technologies moving forward.”[24]

[1] Connecticut Fair Housing Center v. CoreLogic Rental Property Solutions, LLC, 369 F. Supp. 3d 362, 366 (D. Conn. 2019).

[2] Id. at 367.

[3] Id.

[4] Id.

[5] Id.

[6] Id. at 367–68.

[7] Cyrus Farivar, Tenant Screening Software Faces National Reckoning, NBC News (Mar. 14, 2021, 7:00 AM),

[8] Id.

[9] Id.

[10] Connecticut Fair Housing Center, et al. v. CoreLogic Rental Property Solutions, Cohen Milstein (last visited Apr. 8, 2022).

[11] See Farivar, supra note 7.

[12] Tex Pasley et al., Screened Out: How Tenant Screening Reports Deprive Tenants of Equal Access to Housing, Shriver Ctr. on Poverty L. 1, 16–17 (Jan. 2020),

[13] Racial Profiling: Definition, ACLU (last visited Apr. 8, 2022).

[14] CoreLogic, 369 F. Supp. 3d at 369.

[15] Shivangi Bhatia, To “Otherwise Make Unavailable”: Tenant Screening Companies’ Liability Under the Fair Housing Act’s Disparate Impact Theory, 88 Fordham L. Rev. 2551, 2556 (2020) (citing Tex. Dep’t of Hous. & Cmty. Affairs v. Inclusive Cmtys. Project, Inc., 135 S. Ct. 2507, 2521 (2015)).

[16] 42 U.S.C. § 3604.

[17] CoreLogic, 369 F. Supp. 3d at 369.

[18] Emma Whitford, Trial to Ask: Was Tenant Screener’s Conduct Discriminatory?, Law360 (Mar. 11, 2022, 7:46 PM),

[19] Bhatia, supra note 15, at 2582.

[20] Pasley et al., supra note 12, at 20.

[21] Nat’l Consumer L. Ctr., Comment Letter on Request for Information and Comment on the Financial Institutions’ Use of Artificial Intelligence, Including Machine Learning, at 3 (July 1, 2021),           

[22] Tenant Screening and Selection: How it Works in the Twin Cities Metro Area and Opportunities for Improvement, Fam. Hous. Fund and Hous. Just. Ctr. 1, 17 (Mar. 2021),

[23] Deborah Kemp, The 1968 Fair Housing Act: Have Its Goals Been Accomplished?, 14 Real Est. L.J. 327, 343 (1986).

[24] Housing Discrimination Trial Against Tenant-Screening Firm Begins in Connecticut, Black Star News (Mar. 14, 2022),

Photo by Pixabay from Pexels