
Fair equality of chances for prediction-based decisions

Published online by Cambridge University Press:  09 November 2023

Michele Loi*
Affiliation:
Politecnico di Milano, Milano, MI, Italy
Anders Herlitz
Affiliation:
Institute for Futures Studies, Stockholm, Sweden Department of Philosophy, Lund University, Lund, Sweden
Hoda Heidari
Affiliation:
Machine Learning Department, Carnegie Mellon University, Pittsburgh, PA, USA
Corresponding author: Michele Loi; Email: m.loi@icloud.com; URL: www.micheleloi.eu

Abstract

This article presents a fairness principle for evaluating prediction-based decision-making: a decision rule is unfair when individuals who are directly impacted by the decisions, and who are equal with respect to the features that justify inequalities in outcomes, do not have the same statistical prospects of being benefited or harmed by the rule, irrespective of their socially salient, morally arbitrary traits. The principle can be used to evaluate prediction-based decision-making from the point of view of a wide range of antecedently specified substantive views about justice in outcome distributions.
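The principle in the abstract admits a simple computational reading: group individuals by the features that justify unequal outcomes, and check whether, within each such stratum, all socially salient groups have the same empirical rate of benefit. The sketch below is illustrative only, not the article's formalization; the function names, the tuple-based data layout, and the tolerance parameter are all assumptions made for the example.

```python
# Illustrative sketch of the fairness check described in the abstract.
# All names and the data layout are hypothetical, not from the article.
from collections import defaultdict

def benefit_rates(records):
    """records: iterable of (justifying_feature, group, benefited) tuples.
    Returns {justifying_feature: {group: empirical rate of benefit}}."""
    counts = defaultdict(lambda: defaultdict(lambda: [0, 0]))  # [benefited, total]
    for feature, group, benefited in records:
        cell = counts[feature][group]
        cell[0] += int(benefited)
        cell[1] += 1
    return {f: {g: b / n for g, (b, n) in groups.items()}
            for f, groups in counts.items()}

def fair_equality_of_chances(records, tol=0.0):
    """True iff, within every stratum of the justifying feature, all
    socially salient groups have (approximately) equal benefit rates."""
    rates = benefit_rates(records)
    return all(max(r.values()) - min(r.values()) <= tol
               for r in rates.values())
```

For instance, if high-scoring members of group A are benefited more often than equally high-scoring members of group B, the check fails for that stratum regardless of how the rule treats other strata.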

Information

Type
Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press

Table 1. Confusion table and predictive accuracy


Table 2. Confusion table of the grader’s performance in the Buddhist population


Table 3. Confusion table of the grader’s performance in the Muslim population
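The tables above compare a grader's confusion tables and predictive accuracy across two populations. As a minimal sketch of how such group-wise comparisons are typically computed (the function name is hypothetical, and the counts in the usage example below are made up, not the article's data):

```python
def confusion_rates(tp, fp, fn, tn):
    """Standard rates from the four cells of a 2x2 confusion table:
    tp/fp/fn/tn = true/false positives, false/true negatives."""
    total = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / total,        # share of correct predictions
        "fpr": fp / (fp + tn),                # false-positive rate
        "fnr": fn / (fn + tp),                # false-negative rate
    }

# Hypothetical counts for two populations; comparing the resulting
# rates across groups is how disparities in the tables are read off.
rates_pop1 = confusion_rates(tp=40, fp=10, fn=10, tn=40)
rates_pop2 = confusion_rates(tp=30, fp=20, fn=5, tn=45)
```

Two populations can share the same overall accuracy while differing sharply in false-positive or false-negative rates, which is exactly the kind of disparity the article's tables are set up to exhibit.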