January 12, 2023
US Attorney says algorithms are susceptible to human biases
Two Black women from Massachusetts are at the center of what could become a landmark federal case about whether software that screens potential tenants is illegally biased against Black and Hispanic applicants.
Rachael Rollins, the US attorney for Massachusetts, weighed in on the case, Louis v. SafeRent Solutions, in a court brief this week, arguing that the technology used by tenant screening companies must comply with anti-discrimination rules. “Algorithms are written by people. As such, they are susceptible to all of the biases, implicit or explicit, of the people that create them,” Rollins said in a statement. Rollins said her filing “recognizes that our 20th century civil rights laws apply to 21st century innovations.”
SafeRent Solutions is a tenant screening company used by landlords to vet potential tenants. It assigns rental applicants a risk score based on their credit history, other credit-related information including non-tenancy debts, and their eviction history.
Mary Louis, 54, of Malden, and Monica Douglas, 65, of Canton, both Black women with subsidized housing vouchers, are the plaintiffs, along with the Community Action Agency of Somerville, which helps people with vouchers find housing. They sued SafeRent and Metropolitan Management Group, a Boston-based apartment management company, in August in US District Court in Massachusetts. Both women were denied apartments by Metropolitan because of their SafeRent scores. The plaintiffs are represented by attorneys from Greater Boston Legal Services, the Washington, DC-based firm Cohen Milstein Sellers & Toll, and the Boston-based National Consumer Law Center.
The lawsuit alleges that SafeRent “assigns disproportionately lower SafeRent Scores to Black and Hispanic rental applicants compared to White rental applicants,” partly because it measures credit history, which includes non-tenant-related debt.
Read the article on CommonWealth.