When Justice is Blind to Algorithms: Multilayered Blackboxing of Algorithmic Decision Making in the Public Sector
Open access
Publication History
Received: June 30, 2022
Revised: February 28, 2023; September 5, 2023; October 30, 2023; February 9, 2024
Accepted: February 12, 2024
Published as Forthcoming: August 6, 2024
Published as Articles in Advance: September 24, 2024
Published in Issue: December 1, 2024
https://doi.org/10.25300/MISQ/2024/18251
This work is licensed under a Creative Commons Attribution 4.0 International License.
Abstract

Both research and public discourse have recently drawn attention to the downsides of algorithmic decision-making (ADM), highlighting how it can produce biased and discriminatory outcomes and also pose threats to social justice. We address such threats that emanate from but also go beyond algorithms per se, extending to how public agencies and legal institutions respond or fail to respond to the consequences of ADM. Drawing on a case study of the use of an ADM system in public school administration, we explore the practices through which public institutions avoided engagement with the detrimental consequences of ADM, leading to injustice. We provide a conceptual model outlining how organizational ignoring practices can lead to social and institutional blackboxing of an ADM system, engendering both social and legal injustice. Our work paves the way for interdisciplinary research on the multilayered blackboxing of ADM. We also extend algorithmic injustice research to include a legal dimension and provide practical implications in the form of a legal framework for ADM in the public sector.
Additional Details
Authors: Charlotta Kronblad, Anna Essén, and Magnus Mähring
Year: 2024
Volume: 48
Issue: 4
Keywords: Algorithmic decision making, social justice, organizational ignoring, institutional blindness, multilayered blackboxing, legal justice, public sector
Page Numbers: 1637-1662