Ngozi Okidegbe is an assistant professor of law at Cardozo School of Law. She researches and writes in the areas of criminal procedure, evidence, technology, and racial justice. Her recent work explores the ways in which the use of predictive technologies in the criminal justice system impacts racially marginalized communities. Before joining Cardozo, Professor Okidegbe served as a law clerk for Justice Madlanga of the Constitutional Court of South Africa and for the Justices of the Court of Appeal for Ontario. She also practiced at CaleyWray, a labor law boutique in Toronto.

Talk: Discredited Data

Zoom ID: 916 2186 2931 / pw: techlaw

Abstract: Jurisdictions are increasingly employing pretrial algorithms as a solution to the racial and socioeconomic inequities in the bail system. But in practice, pretrial algorithms have reproduced the very inequities they were intended to correct. Scholars have diagnosed this problem as the biased data problem: pretrial algorithms generate racially and socioeconomically biased predictions because they are constructed and trained with biased data.

This talk contends that biased data is not the sole cause of algorithmic discrimination. Another reason pretrial algorithms produce biased results is that they are exclusively built and trained with data from carceral knowledge sources – the police, pretrial services agencies, and the court system. Redressing this problem will require a paradigmatic shift away from carceral knowledge sources toward non-carceral knowledge sources. This talk explores knowledge produced by communities most impacted by the criminal legal system ("community knowledge sources") as one category of non-carceral knowledge sources worth utilizing. Though data derived from community knowledge sources have traditionally been discredited and excluded from the construction of pretrial algorithms, tapping into them offers a path toward developing algorithms that have the potential to produce racially and socioeconomically just outcomes.