Cross Post: Biased Algorithms: Here’s a More Radical Approach to Creating Fairness
Written by Dr Tom Douglas
Our lives are increasingly affected by algorithms. People may be denied loans, jobs, insurance policies, or even parole on the basis of risk scores that they produce.
Yet algorithms are notoriously prone to bias. For example, algorithms used to assess the risk of criminal recidivism often have higher error rates for minority ethnic groups. As ProPublica found, the COMPAS algorithm – widely used to predict re-offending in the US criminal justice system – had a higher false positive rate for black people than for white people; black people were more likely to be wrongly predicted to re-offend.