Correctional Offender Management Profiling for Alternative Sanctions

  • 2025-09-17
  • Alias: COMPAS

The problem:

Gebru saw the consequences of [systemic biases] in her academic research. She came across an investigation into COMPAS, software used in the US criminal justice system that judges and parole officers consulted when making decisions about bail, sentencing, and parole.

COMPAS used machine learning to assign risk scores to defendants: the higher the score, the more likely the defendant was judged to reoffend. The tool gave high scores to Black defendants far more often than to white defendants, but its predictions were frequently wrong. According to a 2016 ProPublica investigation, which examined seven thousand risk scores given to people arrested in Florida and checked whether those people were charged with new offenses over the next two years, COMPAS was twice as likely to be wrong about future criminal behavior by Black defendants as it was for white ones. The tool was also more likely to misjudge as low-risk white defendants who went on to commit other crimes. America's criminal justice system was already skewed against Black people, and that bias looked set to continue with the use of inscrutable AI tools.
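The disparity ProPublica described is a difference in group-wise error rates: false positives (scored high risk, did not reoffend) versus false negatives (scored low risk, did reoffend). Below is a minimal sketch of that kind of comparison, not ProPublica's actual code; the column names (`race`, `high_risk`, `reoffended`) are hypothetical stand-ins for whatever schema the real dataset uses.

```python
import pandas as pd

def error_rates_by_group(df: pd.DataFrame) -> pd.DataFrame:
    """Compare false positive / false negative rates across groups.

    Assumed (hypothetical) columns:
      race       - defendant's recorded race
      high_risk  - True if the tool scored the defendant as high risk
      reoffended - True if charged with a new offense within two years
    """
    rows = []
    for race, g in df.groupby("race"):
        # False positive rate: labeled high risk among those who did NOT reoffend
        fpr = (g["high_risk"] & ~g["reoffended"]).sum() / (~g["reoffended"]).sum()
        # False negative rate: labeled low risk among those who DID reoffend
        fnr = (~g["high_risk"] & g["reoffended"]).sum() / g["reoffended"].sum()
        rows.append({"race": race,
                     "false_positive_rate": fpr,
                     "false_negative_rate": fnr})
    return pd.DataFrame(rows)
```

In these terms, the finding above, that COMPAS was twice as likely to be wrong about Black defendants, shows up as a roughly doubled false positive rate for that group, while the tendency to misjudge reoffending white defendants as low-risk shows up as a higher false negative rate.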