SafeRent, the company behind an AI tenant-screening tool used by landlords, will no longer use AI-powered “scores” to evaluate whether someone using housing vouchers would make a good tenant. On Wednesday, US District Judge Angel Kelley issued final approval for a roughly $2.3 million settlement that bars SafeRent from discriminating against tenants based on income and race.

The settlement stems from a 2022 class action lawsuit filed in Massachusetts. The suit alleged that SafeRent’s scoring system disproportionately harmed people using housing vouchers, particularly Black and Hispanic applicants. The complaint accused SafeRent of violating both Massachusetts law and the federal Fair Housing Act, which prohibits housing discrimination.

As outlined in the initial lawsuit, SafeRent’s scoring algorithm uses factors like credit history and non-rental-related debts to assign a SafeRent Score to potential tenants. Landlords can then use this score to decide whether to accept or deny a rental application. The lawsuit claimed the process isn’t transparent, since SafeRent doesn’t tell landlords how it arrives at a person’s score, and that the system unfairly assigned lower scores to Black and Hispanic applicants, as well as to people who use housing vouchers, leading landlords to deny their housing applications.
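For illustration, here is a minimal Python sketch of what such an opaque score can look like from a landlord’s side. The `Applicant` fields, the weights, and the `opaque_tenant_score` function are all hypothetical; SafeRent’s actual model is undisclosed, which is precisely the transparency problem the suit describes.

```python
# Hypothetical sketch only -- factor names and weights are invented.
# SafeRent's real model is not public; that opacity is the lawsuit's point.
from dataclasses import dataclass

@dataclass
class Applicant:
    credit_score: int       # traditional credit history
    non_rental_debt: float  # e.g. medical or auto debt, unrelated to renting

def opaque_tenant_score(a: Applicant) -> int:
    """Collapse several inputs into one number, returning no breakdown."""
    score = a.credit_score * 0.5
    # Debt unrelated to renting still drags the score down (capped here).
    score -= min(a.non_rental_debt / 1_000, 200)
    # The landlord sees only this final number, not which factor drove it.
    return round(score)

print(opaque_tenant_score(Applicant(credit_score=640, non_rental_debt=25_000.0)))
```

Because only a single number comes back, a landlord relying on it has no way to tell whether a denial traces to rental history or to, say, an unrelated medical bill, which is the kind of opacity the complaint objected to.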

  • nondescripthandle@lemmy.dbzer0.com:

    Do credit scores next if we’re worried about non-transparent algos that hurt the poor and minorities the most and use unrelated debts to deny people opportunities.