In the News

Class Action Lawsuit on AI-Related Discrimination Reaches Final Settlement

AP

November 20, 2024

Mary Louis’ excitement to move into an apartment in Massachusetts in the spring of 2021 turned to dismay when Louis, a Black woman, received an email saying that a “third-party service” had denied her tenancy.

That third-party service included an algorithm designed to score rental applicants, which became the subject of a class action lawsuit, with Louis at the helm, alleging that the algorithm discriminated on the basis of race and income.

A federal judge approved a settlement in the lawsuit, one of the first of its kind, on Wednesday, with the company behind the algorithm agreeing to pay over $2.2 million and roll back certain parts of its screening products that the lawsuit alleged were discriminatory.

. . .

The lawsuit alleged SafeRent’s algorithm didn’t take into account the benefits of housing vouchers, which the plaintiffs said was an important detail for a renter’s ability to pay the monthly bill, and that it therefore discriminated against low-income applicants who qualified for the aid.

The suit also accused SafeRent’s algorithm of relying too much on credit information. The plaintiffs argued that credit history fails to give a full picture of an applicant’s ability to pay rent on time and unfairly dings applicants with housing vouchers who are Black and Hispanic, in part because they have lower median credit scores, attributable to historical inequities.

Christine Webber, one of the plaintiffs’ attorneys, said that even when an algorithm or AI is not programmed to discriminate, the data an algorithm uses or weights could have “the same effect as if you told it to discriminate intentionally.”

. . .

Louis’ attorneys, along with the U.S. Department of Justice, which submitted a statement of interest in the case, argued that SafeRent’s algorithm could be held accountable because it still plays a role in access to housing. The judge denied SafeRent’s motion to dismiss on those counts.

The settlement stipulates that SafeRent can’t include its score feature on its tenant screening reports in certain cases, including if the applicant is using a housing voucher. It also requires that if SafeRent develops another screening score it plans to use, the score must be validated by a third party that the plaintiffs agree to.

Read Class Action Lawsuit on AI-Related Discrimination Reaches Final Settlement.