Fall 2021
For this assignment, you’ll read a report from ProPublica about COMPAS, a software system made by a company called Northpointe that is widely used in the US criminal justice system. The report is detailed, and the details matter once you get to the core of the issue. Try to read it with a focus on the technical aspects: in particular, look out for the different choices of evaluation metrics and how “accuracy” and discrimination are defined.
Read the article. The ProPublica article is here: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing. You may want to (but are not required to) check out their more detailed writeup of their methodology and/or look at the Jupyter notebook containing the code for their analysis.
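If you do dig into the notebook, it helps to have a concrete sense of what “different definitions of accuracy” can mean. The sketch below is not ProPublica’s code, and the column names (is_high_risk, reoffended, race) are hypothetical; it simply shows how one might compute, per group, the error rates (false positive and false negative rates) that ProPublica’s critique emphasizes alongside the positive predictive value that figures in Northpointe’s calibration-style defense.

```python
import pandas as pd

def group_metrics(df: pd.DataFrame, group_col: str = "race") -> pd.DataFrame:
    """Per-group FPR, FNR, and PPV for a binary risk label.

    Assumes hypothetical 0/1 columns 'is_high_risk' (the tool's prediction)
    and 'reoffended' (the observed outcome).
    """
    rows = []
    for group, g in df.groupby(group_col):
        tp = ((g["is_high_risk"] == 1) & (g["reoffended"] == 1)).sum()
        fp = ((g["is_high_risk"] == 1) & (g["reoffended"] == 0)).sum()
        fn = ((g["is_high_risk"] == 0) & (g["reoffended"] == 1)).sum()
        tn = ((g["is_high_risk"] == 0) & (g["reoffended"] == 0)).sum()
        rows.append({
            group_col: group,
            # error-rate view: how often non-reoffenders are flagged high risk
            "FPR": fp / (fp + tn) if (fp + tn) else float("nan"),
            # and how often reoffenders are missed
            "FNR": fn / (fn + tp) if (fn + tp) else float("nan"),
            # calibration-style view: of those flagged high risk, how many reoffended
            "PPV": tp / (tp + fp) if (tp + fp) else float("nan"),
        })
    return pd.DataFrame(rows)
```

A system can look “accurate” on one of these metrics for every group while the error rates still differ sharply across groups, which is exactly the tension at the center of the dispute.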
Pre-discussion prompt. Submit to Canvas a short paragraph (3 or 4 sentences at most) addressing the following:
What is the purpose of the COMPAS system?
Did you identify any weaknesses in ProPublica’s argument that the system is biased against black defendants?
What is your position on the use of a system like this? This could be something like one of the following claims, or something else:
In-class discussion. Come to class on Wednesday, 12/1 prepared to discuss both the technical and ethical aspects of COMPAS and ProPublica’s report. At the end of class, you will complete a short in-class writing assignment that will be handed in for credit.
As usual, this assignment will be graded on effort, thoughtfulness, and clarity. If you produce thoughtful and clearly-written responses, you will receive full credit.
If you want to read more about bias in machine learning systems, here are some other articles you may find interesting.