Deconstructing Spatial Clustering Algorithms To Explore Biases in Crime Analysis


Faculty Mentor: Shion Guha

Crime analysis has become extremely popular in urban areas across the United States, and Milwaukee is no exception. Crime labs all across the country are receiving significant infusions of federal and private funding to upgrade their facilities, personnel, and infrastructure as data-driven crime analysis using state-of-the-art machine learning algorithms becomes a daily practice. Naturally, questions of gender, racial, ethnic, and other demographic bias arise, because very few people understand or know how crime analysis is actually done.

In 2016, the ACLU of Wisconsin brought a case before the state Supreme Court, Wisconsin v. Loomis, whose central thesis was that algorithms developed by a private third party were being used to make sentencing decisions about defendants. In response, our lab has been studying the criminal justice pipeline and trying to answer questions at the very beginning of that process: What are the processes and practices of mapping and analyzing crime? What are the ethical implications of improper crime analysis?

In this summer project, we will focus on deconstructing and visualizing popular spatial clustering algorithms to understand and explain human-centered bias introduced by crime analysts. We have built an alpha version of an interactive, web-based system populated with Milwaukee crime data from 2005-2016 that enables us to do this. The REU student will extend this system from alpha to pre-beta while conducting visualization experiments. To do this, the student needs introductory programming experience (at the level of COSC 1010 or equivalent). Familiarity with popular clustering algorithms and web programming is a plus, but not required. Based on our findings, we will jointly write a paper for submission to CHI 2019, the top conference in HCI; the deadline is in September 2018.
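To give a concrete sense of the kind of spatial clustering algorithm the project deconstructs, below is a minimal, pure-Python sketch of DBSCAN, a density-based algorithm commonly used for crime hotspot detection. This is an illustrative sketch, not the project's actual system: the coordinates are made up, and real crime analysis would work on projected geographic coordinates at a much larger scale. The key point for bias analysis is that the analyst's choices of `eps` (neighborhood radius) and `min_pts` (density threshold) directly determine which incidents form a "hotspot" and which are dismissed as noise.

```python
from math import hypot

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns a cluster label per point, -1 for noise."""
    labels = [None] * len(points)

    def neighbors(i):
        # All points within eps of point i (including i itself).
        return [j for j, q in enumerate(points)
                if hypot(points[i][0] - q[0], points[i][1] - q[1]) <= eps]

    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1  # noise (may later become a border point)
            continue
        labels[i] = cluster  # i is a core point: start a new cluster
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point reachable from a core point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbors(j)
            if len(jn) >= min_pts:
                seeds.extend(jn)  # j is also a core point: expand the cluster
        cluster += 1
    return labels

# Two tight groups of (hypothetical) incident coordinates plus one outlier.
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
       (5.0, 5.0), (5.1, 5.0), (5.0, 5.1),
       (10.0, 10.0)]
labels = dbscan(pts, eps=0.5, min_pts=3)
```

With these parameters the first three points form one cluster, the next three another, and the outlier is labelled noise; shrinking `eps` or raising `min_pts` would dissolve both hotspots, which is exactly the kind of analyst-driven sensitivity the project aims to visualize.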