Document Type: Book Chapter
Publication Date: 2024
Editor(s): Meg Leta Jones and Amanda Levendowski
ISBN: 9780520388543
Publisher: University of California Press
Language: en-US
Abstract
We live in an age of predictive algorithms.1 Jurisdictions across the country are using algorithms to make or influence life-altering decisions in a host of governmental decision-making processes—criminal justice, education, and social assistance, to name a few.2 One justification given for this algorithmic turn concerns redressing historical and current inequalities within governmental decision-making.3 The hope is that the predictions produced by these systems can correct this problem by providing decision-makers with the information needed to make fairer, more accurate, and more consistent decisions.4 For instance, jurisdictions claim that their turn to risk assessment algorithms in bail, sentencing, and parole is meant to de-bias decisions made in these areas. However, this hope has not been borne out in practice. Rather than de-biasing decision-making, algorithms have tended to reinforce existing biases.5 A primary reason is that these systems tend to produce disparate predictions that track existing social inequities and facilitate harmful outcomes for marginalized communities, particularly racially and otherwise politically oppressed communities.6 To compound the issue, because these systems tend to be applied to an entire sector, the predictions they produce operate to maintain existing inequities, social hierarchies, and the resulting political, economic, and social oppression of our current moment.7 Professor Safiya Umoja Noble’s work has provided us with a language and a framework to understand this state of affairs. She employs the term “algorithmic oppression” to refer to how algorithms “serv[e] up deleterious information about people” and thereby “reinforce oppressive social and economic relations.”8 By cementing existing political, social, and economic hierarchies, these algorithmic systems—as Professor Dorothy Roberts explains—exacerbate marginalized communities’ vulnerability to state-sanctioned violence, resource deprivation, and other precarious outcomes that hamper their ability to exercise full citizenship in this country.9 When viewed in tandem, the multifaceted effects of algorithmic oppression threaten to “lock in” our unequal status quo into the future.10
Recommended Citation
Ngozi Okidegbe, Chapter 16: Revisioning Algorithms as a Black Feminist Project, in Feminist Cyberlaw 200 (Meg Leta Jones and Amanda Levendowski eds., 2024). Available at: https://scholarship.law.bu.edu/faculty_scholarship/3576
Included in: Civil Rights and Discrimination Commons, Law and Race Commons, Law and Society Commons, Science and Technology Law Commons
Comments: e-book is open access