
“Hey SyRI, tell me about algorithmic accountability”: Lessons from a landmark case

Published online by Cambridge University Press:  10 January 2023

Maranke Wieringa*
Affiliation:
Governing the Digital Society, Utrecht University, Utrecht, The Netherlands Parell Datavision, Arnhem, The Netherlands

Abstract

The promised merits of data-driven innovation in general and algorithmic systems in particular hardly need enumeration. However, as decision-making tasks are increasingly delegated to algorithmic systems, questions about accountability arise. These pressing questions of algorithmic accountability, particularly with regard to data-driven innovation in the public sector, deserve ample scholarly attention. Therefore, this paper brings together perspectives from governance studies and critical algorithm studies to assess how algorithmic accountability succeeds or falls short in practice, and analyses the Dutch System Risk Indication (SyRI) as an empirical case. Dissecting a concrete case teases out the degree to which archetypical accountability practices and processes function in relation to algorithmic decision-making processes, and which new questions concerning algorithmic accountability emerge therein. The case is approached through the analysis of “scavenged” material. It was found that while these archetypical accountability processes and practices can be incredibly productive in dealing with algorithmic systems, they are simultaneously at risk. The current accountability configurations hinge predominantly on the ex ante sensitivity and responsiveness of the political fora. When these prove insufficient, mitigation in medias res/ex post is very difficult for other actants. In part, this is not a new phenomenon, but it is amplified in relation to algorithmic systems. Different fora ask the actor different kinds of medium-specific questions, from different perspectives and with varying power relations. These algorithm-specific considerations relate to the decision-making around an algorithmic system, its functionality, and its deployment. Strengthening the sensitivity of ex ante political accountability fora to these algorithm-specific considerations could help mitigate this.

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press

Policy Significance Statement

As we delegate tasks to algorithmic systems, we need to find new ways to hold them accountable. This article studies how such accountability practices for algorithmic systems work and where they fall short. SyRI functions as a landmark case involving algorithmic decision-making within the Dutch public sector, abroad, and in academic literature. By analyzing accountability practices related to SyRI, we can discern what new, medium-specific, algorithmic accountability considerations arise. We found that ex ante political accountability in particular is vital but simultaneously precarious. When ex ante political accountability fails, it is hard to fully mitigate the ensuing accountability gaps. Political accountability can potentially be strengthened by paying extra attention to the decision-making around a system, its functionality, and its deployment.

1. Introduction

Heralded as being more efficient and efficacious, algorithmic systems are increasingly implemented in public sector organizations as part of their data-driven innovation strategies. However, as many incidents have shown, they can have disastrous effects (e.g., Eubanks, Reference Eubanks2018). Algorithmic delegation raises questions about remedying such algorithmic harm and ensuring accountability. Such pressing questions are increasingly attracting scholarly attention (e.g., Meijer et al., Reference Meijer, Lorenz and Wessels2021; Neyland, Reference Neyland2016; Pasquale, Reference Pasquale2015), but many questions remain as to how different accountability practices around algorithmic systems are done, and how they interrelate, in situ. This paper brings together perspectives from governance studies and critical algorithm studies (CAS) and analyses the Dutch System Risk Indication (SyRI) as an empirical case.

SyRI was an algorithmic system used by the Dutch State and local governments from 2015 onward to detect an increased risk of potentially fraudulent behavior among people receiving welfare benefits. SyRI was used exclusively in neighborhoods with significant rates of poverty, crime, unemployment, and welfare beneficiaries. Such neighborhoods are referred to by the State as “problem neighborhoods.” The system was leveraged by the State as an instrument to detect discrepancies in the data of residents on social benefits and to enhance the efficacy of the State’s legitimate aim to combat fraud by shortlisting people for investigation. For several years, SyRI was the topic of much tumultuous public debate, both on the national and the international level. The upheaval culminated in a lawsuit, in which the Court eventually overturned the legislation underlying the system due to conflict with higher law: the lack of transparency and accountability played a major role in its verdict.

Within the Dutch public sector, and even abroad, SyRI functions as a key incident in the public sector’s struggle to implement algorithmic systems while complying with legal and social norms for transparency and accountability (e.g., Bekker, Reference Bekker, Otto, Werner and Wessel2021; Gantchev, Reference Gantchev2019; Vetzo, Reference Vetzo2021).Footnote 1 It is helpful to investigate the accountability practices that surrounded such a key case: to investigate how we eventually ended up taking an algorithmic system to court. Insight into SyRI’s legal proceedings and the accountability practices coupled to it can highlight how to strengthen existing accountability practices and mitigate future accountability gaps.

In this paper, we will use the SyRI case to illustrate a discrepancy inherent in algorithmic accountability. Dissecting a concrete case teases out the degree to which archetypical accountability practices and processes function in relation to algorithmic decision-making processes, and which new medium-specific considerations (Thon, Reference Thon, Ryan, Marie-laure and Lori2014) emerge therein. On the one hand, we see that algorithmic accountability denotes a “kind of accountability relationship where the topic of explanation and/or justification is an algorithmic system” (Wieringa, Reference Wieringa2020). In such instances, traditional accountability practice and theory are beneficial. On the other hand, we see that algorithmic accountability, due to its nature, also comes with new, algorithm-specific, considerations (ibid.). In media theory, such characteristics and practices that are unique to a given medium are denoted with the term “medium-specific” (see Carroll, Reference Carroll, Carrol, Di Summa and Loht2019 for an in-depth discussion of the term). In examining this discrepancy between accountability as a process or practice, on the one hand, and the medium-specific content of the associated account, on the other, we bring together two perspectives: accountability theory and CAS. The case study enriches accountability theory with an empirical investigation into accounting for algorithmic systems, and simultaneously brings accountability theory as a useful lens to CAS and related fields. Thus, taking SyRI as a case in point, this paper asks to what extent existing accountability practices still suffice and which medium-specific considerations surface in algorithmic accountability practices. If accountability falls short, how can accountability gaps be mitigated? The case is approached through a qualitative analysis of “scavenged” material (Gusterson, Reference Gusterson1997; Seaver, Reference Seaver2017, pp. 6–7).

To answer these central questions, we will first briefly discuss relevant literature, then introduce the archetypical forms of accountability relevant to the SyRI case: administrative, political, social, legal/judicial, and mediatized accountability, and their respective characteristics. Subsequently, we introduce SyRI. The accountability practices around this case are then analyzed. We will highlight accountability risks, gaps, and relevant mitigation strategies, and conclude with what this case can teach us about strengthening future algorithmic accountability practices.

2. On Investigating Algorithms

Algorithms are at heart instructions to solve a given problem (e.g., “bubble sort”). They need not be computational, yet we often understand them as such; a minimal sketch of what such instructions look like in code follows below. This paper departs from a sociotechnical perspective on algorithms. That is, we do not look solely at the technical instructions and their implementation, but rather at the algorithmic mishmashes of technology, social practice, and culture that we find, encounter, and engage with in situ. Taking such a sociotechnical stance, we aim to add to CAS and the investigation of algorithmic harm with an analytical, normative praxiography of accountability relations around a harmful system. In other words: how is an algorithmic system that perpetuates and exacerbates historical and societal inequity held to account, by whom, and in what way? In bringing together accountability theory and CAS, this paper thus provides actionable insights about accountability practices in our future dealings with harmful systems. In the following, we will first briefly introduce CAS and its sociotechnical approach to algorithmic systems, and the importance of such an approach in identifying and preventing algorithmic harm. We will then introduce our main interest in this paper: algorithmic accountability.
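To make the term concrete: bubble sort is simply a finite list of instructions that a person could carry out with pen and paper just as well as a machine could. The following Python sketch is our own illustration of that point, not material drawn from the SyRI case:

```python
def bubble_sort(items):
    """Repeatedly walk through the list and swap adjacent elements
    that are out of order, until a full pass needs no swaps."""
    items = list(items)  # work on a copy, leave the input untouched
    for n in range(len(items) - 1, 0, -1):
        swapped = False
        for i in range(n):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:  # already sorted: stop early
            break
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```

The same procedure could be executed by hand, which is precisely why algorithms need not be computational.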

In CAS, algorithmic systems are investigated as social concerns, bringing together disciplines such as computer science, science and technology studies (STS), sociology, anthropology, media studies, law, communication studies, and many more (Seaver, Reference Seaver, Vertesi and Ribes2019; The Social Media Collective, n.d.). Key to the CAS discussion of algorithmic systems is that algorithms are taken to be not merely technical constructs that exist in isolation, but rather are viewed as sociotechnical systems (e.g., Seaver, Reference Seaver2017; Wieringa, Reference Wieringa2020). That is, algorithms are “always-already” enmeshed with cultural and social norms, and are then deployed within social work practices, which are situated in culture. Moreover, an algorithm is done (“enacted”) differently by different people in different contexts (for more on enactment see the work of Mol, Reference Mol2002). These different enactments are called “algorithmic multiples” (Seaver, Reference Seaver2017). As we will see later in our discussion of the SyRI case, some actants would enact the system as a functional, technical tool, whereas others enacted it as a sociotechnical intervention undermining some of Dutch society’s core beliefs, such as the presumption of innocence.

Concerns such as these are not only heard in society; they are also articulated and investigated by CAS scholars, usually under the moniker of “algorithmic harm” (e.g., Malik et al., Reference Malik, Viljanen, Lepinkäinen and Alvesalo-Kuusi2021; Marjanovic et al., Reference Marjanovic, Cecez-Kecmanovic and Vidgen2022). That is: how do algorithmic systems disproportionately impact, or exacerbate existing harm to, especially marginalized communities and individuals (e.g., Buolamwini and Gebru, Reference Buolamwini and Gebru2018; Eubanks, Reference Eubanks2018; Noble, Reference Noble2018; O’Neil, Reference O’Neil2016) because of how they are designed, deployed, or leveraged? Much of the work in the field of fairness, accountability, and transparency of algorithms, ML, and AI––where many CAS scholars operate––is focused on finding technical solutions to such problems, but, while such solutions are part of the answer, the underlying problematic assumptions, injustices, and power imbalances remain unaddressed (Birhane, Reference Birhane2021). While technical solutions such as explainable AI (XAI) are definitely ingredients of more just and fair algorithmic systems, they are not a technological catch-all for many of the fundamental problems which are exacerbated by algorithmic systems. Another approach that is gaining traction in controlling and assessing algorithmic systems is AI auditing (e.g., Raji et al., Reference Raji, Smart, White, Mitchell, Gebru, Hutchinson, Smith-Loud, Theron and Barnes2020). AI auditing can be done internally (ibid.) or by hiring an impartial third-party auditor (Costanza-Chock et al., Reference Costanza-Chock, Raji and Buolamwini2022). However, as Costanza-Chock et al. (ibid.) note, there are as of yet no set definitions, practices, standards, or guidelines for such audits. At present, this line of accountability practice is thus promising but also still very much developing.

The question thus remains: how can we do meaningful accountability for harmful algorithmic systems at present? In many cases, ethics alone is insufficient as it needs “teeth,” for example, regulation, to be enforced (Yeung et al., Reference Yeung, Howes and Pogrebna2019). Explanations such as those provided by XAI initiatives are not enough in themselves, as transparency does not equal accountability (e.g., Ananny and Crawford, Reference Ananny and Crawford2018; Kemper and Kolkman, Reference Kemper and Kolkman2018). An algorithm-specific form of auditing is still inventing itself. Instead, we propose to investigate accountability practices in situ, as they are currently done and adapted to facilitate the inquiry into algorithmic systems.

This brings us to our central concern: algorithmic accountability. The use of computer and algorithmic systems (Dekker, Reference Dekker2018, p. 3), governmental or otherwise, is not new, nor are the calls for accountability around such systems (e.g., Friedman and Nissenbaum, Reference Friedman and Nissenbaum1996; Johnson and Nissenbaum, Reference Johnson and Nissenbaum1995; Lessig, Reference Lessig1999; Nissenbaum, Reference Nissenbaum1994; Pasquale, Reference Pasquale2015; Rosenblat, Kneese, and Boyd, Reference Rosenblat, Kneese and Boyd2014). However, there has been renewed and intensified attention to this topic in recent years. “Algorithmic accountability” became the rallying term under which this renewed interest was articulated (e.g., Diakopoulos, Reference Diakopoulos2015), and particularly the accountability model of public administration scholar and political scientist Mark Bovens (Bovens, Reference Bovens, Ferlie, Lynn and Pollitt2007b, Reference Bovens2007a, Reference Bovens2010) became dominant in the field (Cooper et al., Reference Cooper, Moss, Laufer and Nissenbaum2022) after a literature review using it (Wieringa, Reference Wieringa2020), though other takes on accountability are also being explored (Kacianka and Pretschner, Reference Kacianka and Pretschner2021). One of the difficulties that academics face is how to operationalize accountability and make it work in practice (e.g., Cobbe et al., Reference Cobbe, Lee and Singh2021; Kroll, Reference Kroll2021). This paper adds to this strand of research through the analysis of accountability practices around a real-world case.

Taking together the acknowledgement that algorithmic systems are multiple and potentially harmful, we turn to accountability practices as a possible avenue for responsible algorithm usage. Moreover, we are interested in the multiplicity of accountability across the different enacted algorithmic systems. We propose to do an analytical “praxiography” of accountability practices, and their gaps, around such an algorithmic multiple. That is: we are interested in how accountability around an algorithmic system is done. Where do accountability gaps become visible and what is done to mitigate these gaps? How far does classic accountability theory take us, and what new––algorithm-specific––considerations come to the fore?

3. On Accountability

To analyze to what extent traditional forms of accountability still suffice when dealing with algorithms in the public sector, we need to develop a conceptual understanding of what accountability is, which different types can be distinguished and how it “works.” Accountability, as a “process-related” value, encompasses the weighing of driving values (e.g., efficiency) and anchoring values (e.g., privacy) as well as the justification thereof (Wetenschappelijke Raad voor het Regeringsbeleid (WRR), 2011). As such, it is a crucial element of the pressing debates about responsible and value-sensitive “algorithmization” (Meijer and Grimmelikhuijsen, Reference Meijer, Grimmelikhuijsen, Schuilenburg and Peeters2020).

Accountability comes in many shapes and sizes, but at the basic level it entails “a relationship between an actor and a forum, in which the actor has an obligation to explain and to justify his or her conduct, the forum can pose questions and pass judgment, and the actor may face consequences” (Bovens, Reference Bovens2007a).Footnote 2

Accountability applied to algorithmic systems is termed “algorithmic accountability” (e.g., Diakopoulos, Reference Diakopoulos2015). This, according to Wieringa (Reference Wieringa2020), “concerns a networked account for a sociotechnical algorithmic system, following the various stages of the system’s lifecycle.” Algorithmic accountability comes with an inherent discrepancy. On the one hand, it encompasses algorithmic systems merely figuring as the topic of traditional accountability practices. On the other hand, algorithmic systems also come with algorithm-specific considerations in light of accountability, such as that of a networked account for the algorithmic system, which is distributed amongst many different actors and fora (ibid.). That is, the processes of accountability may largely be similar whether they concern a nuclear power plant or an algorithmic fraud risk detection system, but the kinds of questions figuring in the account will be wildly different.

3.1. Accountability types

This article empirically inquires how far algorithms figuring as the topic of accountability practices take us, and what medium-specific considerations figure in the account. It assesses the networked account throughout the algorithmic system’s “life” (Kopytoff, Reference Kopytoff and Appadurai1986), which accountability deficits can be identified, and how these can be mitigated. Taking SyRI as a case study, we will examine the different kinds of accountability types in play and assess their strength and their function. These archetypical forms of accountability will help to analyze how, for instance, power is distributed amongst actants. The analysis mobilizes five accountability types identified in the accountability literature (Bovens, Reference Bovens2007a; Jacobs and Schillemans, Reference Jacobs and Schillemans2016):

  • Administrative accountability;

  • Political accountability;

  • Social accountability;

  • Legal accountability;

  • Mediatized accountability.

Administrative accountability refers to “a wide range of quasilegal forums, exercising independent and external administrative and financial supervision and control” (Bovens, Reference Bovens2007a, p. 456). Administrative fora are, for instance, Data Protection Authorities and ombudsmen. Political accountability can be said to be the inverse and direct consequence of delegation (Bovens, Reference Bovens2007a, p. 455). Examples of such fora are the House of Representatives, the Senate, and municipal councils. Social accountability can take the form of “more direct accountability relations between public agencies, on the one hand, and clients, citizens and civil society, on the other hand” (Bovens, Reference Bovens2007a, p. 457). Legal accountability is a kind of scrutiny “based on detailed legal standards, prescribed by civil, penal or administrative statutes, or precedent” (ibid., p. 456). Courts, for instance, are legal fora. Finally, mediatized accountability is a relationship wherein “[m]edia can stimulate actors to reflect on their behaviour, trigger formal accountability by reporting on the behaviour of actors, amplify formal accountability as they report on it or act as an independent and informal accountability forum” (Jacobs and Schillemans, Reference Jacobs and Schillemans2016).Footnote 3 Examples of its fora are newspapers, television, and radio.

3.2. Characteristics of accountability types

Each of these accountability types comes with specific characteristics. In the following analysis we discuss the formal/informal nature of the relationship, whether or not it is public, which perspective it mobilizes and what power relation it entails (see also Table 1).

Table 1. Accountability types and their characteristics

Formal accountability practices “consist of measured outcomes, codified outcome standards, and certain consequences for reaching or for not reaching the standards” (Hoffer, Reference Hoffer and Wagner2013, p. 530). It is often reserved for institutions such as parliaments and regulators (Jacobs and Schillemans, Reference Jacobs and Schillemans2016). In brief, formal accountability “is a set of institutional arrangements (rules and procedures) that are created, communicated and enforced by the state or state bodies such as constitutions, statutes, laws, regulations, courts, legislatures, and bureaucracies” (Helmke and Levitsky, Reference Helmke and Levitsky2004 in Vu and Deffains, Reference Vu and Deffains2013, p. 333). Informal accountability is not shaped by such a measured or codified approach. Instead, it rests on “shared norms and facilitative behaviours” that aim to ensure “collective outcomes,” drawing on informal punishments (e.g., diminished reputation) and rewards (e.g., favors) that aim to stimulate particular kinds of behavior (Romzek et al., Reference Romzek, Leroux and Blackmar2012). It is often practiced in the “shadow of hierarchy” (Scharpf, Reference Scharpf1994), and frequently anticipates potential escalation to superiors (Schillemans, Reference Schillemans2008). Within the SyRI case, this distinction can help us to identify a phase of accountability practices that is situated nearly exclusively within various institutions.

Just as accountability can be formal or informal, it can also be public or non-public. Public accountability is transparent and open to all, engages with an object of public importance, and is required in the public interest. Other accountability relationships, such as accounting for one’s actions to one’s parents, are non-public (Bovens et al., Reference Bovens, Schillemans, Goodin, Bovens, Goodin and Schillemans2014). As will become clear throughout this paper, many accountability practices around SyRI have been public, but they have not per se managed to attract public attention and/or public debate. Thus, not all public accountability practices are equally public. In the remainder of this paper, public accountability can help us to distinguish a second phase in the SyRI case: that of public debate and attention.

Another characteristic that is important to consider is the power relation in a given accountability relationship. Bovens (Reference Bovens2007a, p. 460) notes three different kinds of accountability, based on the nature of the power relation between the actor and the forum: vertical accountability, horizontal accountability, and diagonal accountability. In vertical accountability, “the forum formally wields power over the actor” (Bovens, Reference Bovens2007a, p. 460). On the other end of the spectrum stands horizontal accountability: an accountability relation based more on a moral imperative than on a formal, codified requirement. Diagonal accountability is an in-between form, in which the forum has little or no formal power over the actor. It is quite often found in administrative accountability settings, for instance in relation to ombudsmen or auditors (Bovens, Reference Bovens2007a, p. 460). As will become clear throughout the analysis, the power relations between various fora are crucial factors in understanding how accountability plays out in situ.

Finally, accountability may serve different functions; that is, there may be different reasons for accountability to begin with, and different expectations as to what “good” accountability entails. Willems and Van Dooren (Reference Willems and Van Dooren2012) distinguish three different functions of accountability: constitutional, democratic, and performance functions. The first function deals with the prevention of abuse of power. The second deals with the representation of citizens. The third deals with “what government actually accomplishes” (p. 1023; emphasis in the original). Willems and Van Dooren note there need not be a “unidimensional relationship” between the forum and the function of accountability (p. 1026; emphasis in the original). Though every forum has a “central” function, they argue that fora can draw upon several functions if need be, and that each function can draw upon several fora (p. 1027). As we will highlight later in the analysis, these functions or perspectives (constitutional, democratic, performance) have ramifications for how different fora can replace or complement one another across the five archetypes (administrative, political, social, legal, mediatized).

We follow Bovens (Reference Bovens2007a), Jacobs and Schillemans (Reference Jacobs and Schillemans2016), and Willems and Van Dooren (Reference Willems and Van Dooren2012) in their identifications of the characteristics of the various accountability types (Table 1). This multifaceted view on accountability serves as a framework through which the SyRI case can be assessed for its accountability practices. It will help to see to which degree algorithmic systems figure as the topic of “same old, same old” accountability types, or whether new considerations come into play when dealing with algorithmic systems. Moreover, the division into these five types will help to see which kinds of accountability practice put what kinds of questions and concerns on the table. Before we can move on to the analysis, we will first sketch succinctly what SyRI is and the historical developments surrounding the system. After introducing the case, we will analyze the interwoven accountability practices around SyRI. The analysis will touch upon the aspects of the system which are central to these accountability types and their respective actants.

4. SyRI

SyRI is a tool developed by the Dutch government and leveraged by municipalities between 2015 and 2019 to detect indications of possible fraud in data of welfare beneficiaries.Footnote 4 SyRI is, according to the State, “a simple decision tree”Footnote 5 that checks for “discrepancies in the data.”Footnote 6 However, as Van Bekkum and Zuiderveen Borgesius (Reference Van Bekkum and Zuiderveen Borgesius2021) note, the SUWI legislation underlying SyRI facilitates other algorithmic techniques as well, and it remains unclear precisely which algorithmic technique was used. Within SyRI, various data sources are coupled, as it tries to employ nearly all the data the government has about its citizens. Concretely, it could draw on data related to the following (a deliberately hypothetical sketch of such a decision-tree check follows the list):Footnote 7

  • Employment;

  • Penalties and sentences;

  • Taxes;

  • Properties;

  • The denial of welfare benefits;

  • Residency;

  • Identity;

  • Integration;

  • Compliance with law/regulation;

  • Education;

  • Pension;

  • Reintegration;

  • Debts;

  • Welfare benefits;

  • Permits and exemptions;

  • Health care insurance.
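What such a system’s logic might look like can only be illustrated hypothetically, since the actual model, risk indicators, and thresholds were never disclosed (a point the Court would later criticize). The following Python sketch is therefore entirely our own invention: it merely shows, in principle, how a “simple decision tree” could flag “discrepancies” across coupled data sources; none of its field names, rules, or thresholds are known to correspond to the real system.

```python
# Deliberately hypothetical illustration: SyRI's actual model, risk
# indicators, and thresholds were never disclosed. Every field name,
# rule, and number below is invented to show what "a simple decision
# tree" over coupled data sources could look like in principle.

def flag_discrepancy(record: dict) -> bool:
    """Toy decision tree flagging 'discrepancies' between coupled
    registry data. Not based on the real, undisclosed SyRI rules."""
    # Invented rule 1: benefits received while registered income
    # from a (hypothetical) employment registry suggests work.
    if record.get("receives_benefits") and record.get("registered_income", 0) > 15_000:
        return True
    # Invented rule 2: registered residency differs from the address
    # implied by a (hypothetical) utility-usage registry.
    if record.get("registered_address") != record.get("utility_address"):
        return True
    return False

# A record "coupled" from several invented registries:
resident = {
    "receives_benefits": True,
    "registered_income": 20_000,
    "registered_address": "address A",
    "utility_address": "address A",
}
print(flag_discrepancy(resident))  # True: rule 1 fires
```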

The tool was part of a neighborhood-centered approach aimed at increasing livability in what the State terms “problem neighborhoods”: those with high rates of poverty, crime, and welfare beneficiaries. SyRI has exclusively been applied in such neighborhoods.Footnote 8 SyRI is the successor to two earlier algorithmic systems used for similar purposes: Waterproof (2004–2007) and Black Box (2008–2014).

Before its implementation, in 2014, the decree that facilitated SyRI was heavily critiqued by both the Data Protection Authority and the Council of State for its inadequate proportionality and subsidiarity.Footnote 9 Codified in the Dutch ABBB (General Principles of Good Governance) and the Awb (General Administrative Law Act), proportionality is the principle that a decision or measure aimed to benefit the public good should not disproportionately affect or harm stakeholders. Subsidiarity, in turn, means that given several options, the least impactful or “heavy” option should be selected. The DPA noted, for instance, that the explanatory note accompanying the decree failed to make an adequate case for subsidiarity, as it compared unequal situations and thus did not provide a reasonable argument for SyRI being the least intrusive option possible.Footnote 10 Despite these critiques, the decree passed through both the House of Representatives and the Senate without debate: a so-called hammer piece.Footnote 11 Afterward, SyRI was mobilized five times by four different municipalities, with varying levels of success. In 2015, Eindhoven was the first municipality to leverage SyRI.Footnote 12 Rotterdam followed in 2016, but canceled the project after one month due to lack of capacity.Footnote 13 Capelle aan den IJssel kicked off a third SyRI project in April 2016.Footnote 14

At the end of 2016, a coalition of civil society organizations and individuals (“the Privacy Coalition”) convened to file a freedom of information (FOI) request with the minister of Social Affairs and Employment, in which they asked several questions about the workings and use of SyRI (see also Van Bekkum and Zuiderveen Borgesius, Reference Van Bekkum and Zuiderveen Borgesius2021).Footnote 15 After the FOI request was partially granted, the Privacy Coalition argued that they had received little to no explanation of the workings and use of the system.Footnote 16 The coalition took further action in March 2018, suing the State for its use of SyRI. At this point, two SyRI projects had been finalized (Capelle aan den IJssel and Eindhoven), and two had just started: one in Haarlem, and yet another project in Rotterdam, focusing on different residential areas than its earlier canceled project.Footnote 17 In the beginning of June 2018, MPs Verhoeven (D66) and Buitenweg (GL) filed a motion to make SyRI transparent; this was denied by the State Secretary.Footnote 18

Later in 2018, the first SyRI report, from Eindhoven, was presented.Footnote 19 From this report it became clear that the project team experienced great trouble with SyRI, and that for half of the investigation they by and large had to work around the system, as it did not function as expected due to, among other things, data quality and data combination problems. For the other half of the investigation, they did not use SyRI at all. In October, the end report of the project in Capelle aan den IJssel was presented. It read that SyRI did not lead to any insights that were not already found via other methods, and that none of the leads it produced were pursued.Footnote 20 Based on the latter report, MP Buitenweg (GroenLinks) asked the minister various questions about the usage of SyRI in the House of Representatives.Footnote 21

SyRI was increasingly scrutinized and the ongoing projects had to deal with setbacks. From March until July 2019, SyRI was an intermittent topic of debate in the municipal council of Rotterdam, where one of the projects was situated.Footnote 22 The Rotterdam scrutiny intensified when, in June, several inhabitants of Rotterdam’s targeted residential areas Hillesluis and Bloemhof joined a union in protests against the municipality.Footnote 23 In July, mayor Aboutaleb of Rotterdam announced that the SyRI project concerning Bloemhof and Hillesluis was canceled.Footnote 24 A bit earlier, in May, it had become known that the SyRI project in Haarlem was canceled as well, but for a different reason: Haarlem cited lack of time.Footnote 25 Days before the hearing in October 2019, the UN special rapporteur on extreme poverty and human rights presented the Court with an amicus brief in which he underlined many of the points of the Privacy Coalition.Footnote 26

After the hearing, in November, MP Buitenweg again questioned the State Secretary about SyRI and the situation in Capelle aan den IJssel.Footnote 27 MP Buitenweg, amongst other things, inquired into when the State Secretary had been made aware of the, now public, evaluation of Capelle aan den IJssel in which SyRI proved to be unproductive. While Buitenweg awaited answers from the State Secretary, SyRI won the Big Brother Award later that month.Footnote 28 This award is a Dutch satirical prize for privacy violators, handed out by privacy organization Bits of Freedom. The State Secretary, Tamara van Ark, answered Buitenweg’s questions on December 20, 2019. Yet MP Buitenweg remained unsatisfied with what, according to her, were incomplete and evasive answers. On January 21, 2020, she again asked the State Secretary for clarification.

On February 5, 2020, the judges rendered their verdict. In it, the Court acceded to the most important points the Privacy Coalition had put forth, while also acknowledging the pressing need of the State to use such systems to this end. However, the Court argued, there needs to be a “fair balance” in the system, which so far had been missing.Footnote 29 Because of this, the Court ruled that the SyRI chapter of the SyRI-enabling decree was non-binding due to conflict with article 8 of the European Convention on Human Rights (ECHR), the right to privacy (for an in-depth discussion of the judgment see Van Bekkum and Zuiderveen Borgesius, Reference Van Bekkum and Zuiderveen Borgesius2021).Footnote 30 On April 23, 2020, the Ministry of Social Affairs and Employment announced that it would not appeal the verdict but wanted to learn from the SyRI case and create a new system.Footnote 31

5. A Note on Method: Scavenging as a Method of Data Gathering

This paper aims to provide an empirical understanding of the algorithmic accountability practices surrounding SyRI. In order to inform the in-depth analysis of the algorithmic accountability of the SyRI case, we had to adopt “ethnographic tactics” to obtain the needed material and information (Seaver, Reference Seaver2017). Specifically, we used “scavenging” as a way to gather our data. Scavenging is a practice that has been successfully applied by ethnographers who studied secluded communities such as nuclear physicists (Gusterson, Reference Gusterson1997). It is a very pragmatic and eclectic way of data gathering, in which researchers can work around rigid access barriers, and it has been suggested as an ethnographic tactic useful in studying algorithmic systems. As Seaver (Reference Seaver2017, p. 7) writes: “A great deal of information about algorithmic systems is available to the critic who does not define her object of interest as that which is off limits or intentionally hidden.” While we are thus not privy to the intimate details of SyRI’s operations, we can scavenge a lot of information about the accountability practices enmeshed with it.

SyRI is an intentionally opaque system, and getting access––particularly under the circumstances of an ongoing lawsuit and general upheaval––proved difficult. Circumventing this, we thus “scavenged” our material in heterodox sites and in various formats (Gusterson, Reference Gusterson1997; Seaver, Reference Seaver2017, pp. 6–7). Concretely, our scavenging consisted of document research and in-person observations. To give some examples: we analyzed seminars where relevant actants presented, combed through dozens of documents released through the FOI request by the Privacy Coalition, attended the legal proceedings, gathered relevant newspaper clippings, and worked through political inquiries and reports from organizations such as the Data Protection Authority, and so forth (see Table 2 for an overview of all scavenged material). Together, this scavenged material can help us reconstruct events (see the data availability statement at the end of this paper for a timeline), so that we can shed light on how accountability practices around SyRI took shape, as we analyze them through the accountability lens presented earlier in this paper.

Table 2. Overview of “scavenged” material

Our methodology thus differs from Mol’s approach to praxiography. We were not always able to follow the doing in action, as she was able to do when observing how atherosclerosis was done in different places in the hospital. Though we were able to observe the hearing and the media attention, other accountability practices were reconstructed through reports after the fact. While we were present in some of the rooms where accountability was done, we were not present in all of them. Nevertheless, we find scavenging a suitable and pragmatic approach for our present purposes, as accountability itself necessarily comes in different shapes, forms, timelines, and sizes, some of which are public and others not. Scavenging allows for following the traces left by accountability practices and reconstructing these in order to analyze them.

Concretely, the analysis was done by reconstructing a timeline (see the data availability statement at the end of this paper) and writing a “biography” (Kopytoff, Reference Kopytoff and Appadurai1986) of SyRI’s “life” as a sociotechnical system and the accountability practices involved. This was subsequently analyzed in light of accountability theory (e.g., type of accountability relation, actor, forum, consequences, account, function), after which an interpretive approach was used to further analyze the case study (e.g., the division into phases, the focus of the account).

6. Assessing Traditional Accountability Practices

Leveraging the traditional accountability archetypes set out in Section 3, we will describe and assess the accountability practices around the SyRI case (Section 4). The data underlying this analysis is “scavenged” (Section 5), and ranges from political inquiries in the House of Representatives to a court hearing, and from talks by Privacy Coalition members to FOI documents. As we will see in the following sections, the SyRI case is complex and comprises a vast number of actants. The scope of this paper does not allow for a discussion of each detail of the case. Instead, we will focus our attention on two phases in SyRI’s “life” (Kopytoff, Reference Kopytoff and Appadurai1986). We term these the institutional phase and the public phase. Initially, accountability practices were located within formal settings (i.e., the DPA, the House of Representatives), and though public, there was no public attention to them. This changed with the FOI request of the Privacy Coalition and the subsequent lawsuit. This public attention to the case is what distinguishes the second phase.

6.1. The institutional phase (±2012–2016)

At SyRI’s conception, its accountability practices were firmly rooted in an institutional setting: that is, a formal and largely public kind of accountability. SyRI was a case of public importance, argued to be in the public’s interest, yet initially there was little public attention to it. At the time, its prime fora––the actants weighing an actor’s conduct––were the administrative fora of the Data Protection Authority (DPA) and the Council of State, and the political fora of the House of Representatives and the Senate, which “hammered” the legislation enabling SyRI into existence.Footnote 32 This early period in SyRI’s “life” (Kopytoff, Reference Kopytoff and Appadurai1986), which lasted up to the FOI request, we will call “the institutional phase.”

6.1.1. Administrative accountability

Administrative accountability practices involved the Data Protection Authority and the Council of State. The DPA focused at times on functionality and deployment, particularly with regard to SyRI’s direct predecessors, and paid additional attention to the decision-making around the system.Footnote 33 The Data Protection Authority serves as an administrative forum that can impose diagonal consequences on the actor, in this case the government.

The DPA did two kinds of work: ex post assessment/critique of the system (predominantly for SyRI’s predecessors), and ex ante consultation on the legal expansion facilitating SyRI in 2012 (cf. Wieringa, Reference Wieringa2020). Their ex post work touched upon SyRI’s direct predecessors, Waterproof and Black Box. The DPA’s critiques of Waterproof (lack of anonymization) and Black Box (lack of a legal ground for the system) spurred the subsequent redevelopment and optimization of the system in the guise of SyRI. In their ex ante activities, the DPA was joined by the Council of State. Both organizations issued negative advice to the government about the proposed legal changes.Footnote 34 For the DPA, the problem resided in proportionality, subsidiarity, and insufficient purpose limitation. Moreover, they highlighted that citizens placed on the “potentially risky” list need to be informed thereof.Footnote 35 The Council of State formulated similar remarks: the proposed decree was not specific enough.Footnote 36 In 2013, the DPA was asked to review the revised legal expansion proposal. Though some of their 2012 concerns were resolved, others, such as those regarding proportionality and subsidiarity, were not addressed.Footnote 37

6.1.2. Political accountability

The administrative accountability efforts focused predominantly on the constitutional character of algorithmic systems. Political accountability, on the other hand, deals with the democratic character thereof (Willems and Van Dooren, Reference Willems and Van Dooren2012). Within the SyRI case, we see that in the period prior to the FOI request there was little such attention to the system. This is perhaps connected to the feeling at the time that systems like SyRI were instrumental and “uncontroversial”; that is, the feeling that such systems were not political per se, but rather administrative tools to ease governmental tasks.Footnote 38

The legal expansion enabling SyRI, which had been critiqued twice by the DPA and once by the Council of State, passed through the House of Representatives without a debate.Footnote 39 The second democratic hurdle, the Senate, similarly passed the expansion without debate.Footnote 40 As a “hammer piece,” the acceptance of the expansion was a mere formality.Footnote 41 Despite the warnings of the administrative fora that the proposed legislation which would sign SyRI into effect was not proportional or subsidiary, the House of Representatives and the Senate did not pay attention to the potential ramifications of the system they effectively signed into effect, even though it is their core function to scrutinize new legislation and approve it if it passes their democratic test. That is, the politicians were responsible for the democratic test of the new legislation, which through its effects made the design, deployment, and use of SyRI possible. Regardless of whether the politicians lacked literacy as to the ramifications of data-driven governance, were incapable of picking up the signs, or felt the need for SyRI was indeed so “self-evident” as not to warrant a debate on any potential drawbacks, the problem remains that neither the House of Representatives nor the Senate reviewed the legal expansion in depth.Footnote 42

6.1.3. Analysis of the institutional phase

Looking back at this institutional phase, formal accountability produced meager results. The fora in charge of assessing the legal expansion, that is, the House of Representatives and the Senate, by their own account failed to recognize the significance of the proposed changes, and barely responded.Footnote 43 Diagonal accountability relationships between the State, the Data Protection Authority, and the Council of State produced some changes, but fundamental problems such as the system’s proportionality and subsidiarity remained, to a large degree, unaddressed. As it stands, the stronger forum, that is, the political one, failed to augment the accountability efforts of the weaker, administrative fora, which played an advisory and administrative role (Willems and Van Dooren, Reference Willems and Van Dooren2012).

As for the content of the account itself: we found that the administrative fora touched upon three different considerations. Especially with regard to SyRI’s direct predecessors (Waterproof and Black Box), they focused on functionality and deployment considerations. Functionality considerations involve addressing how the system works: how it is tested, constructed, operates, and so forth. Deployment considerations focus on how the system operates in context. Especially with regard to SyRI itself, the DPA and the Council of State also focused on the decision-making around the system. Their negative advice to the Minister as to the legal expansion, due to problems with proportionality, subsidiarity, and insufficient purpose limitation, is a case in point.

6.2. The public phase (±2017–2020)

The FOI request made by the Privacy Coalition marked a turning point, slowly bringing SyRI to the public’s attention through the media. It provided the basis for a new impetus for accountability, no longer solely institutionally driven, but rather predominantly framed as res publica: a public affair of public importance, to be discussed in the open and in the public interest (Bovens et al., Reference Bovens, Schillemans, Goodin, Bovens, Goodin and Schillemans2014), and for which there is public attention. Here, we enter “the public phase.”

Note that while the institutional phase indeed similarly busied itself with this matter of public importance, there was little public attention to the case. There are thus two key changes between the institutional phase and the public phase. First, the latter ensured public attention to the case through the media.Footnote 44 For instance, there were many newspaper articles about SyRI, members of the Privacy Coalition were invited to talk shows, and so forth. Second, we see that accountability efforts were no longer solely of a formal, institutional nature. That is, the “public does not only consist of individual citizens but also of all kinds of other societal actors (journalists, societal groups, intermediaries, etc.)” (Meijer and Bovens, Reference Meijer, Bovens, Palmirani, Van Engers and Wimmer2003). As such, there is not only an increase in accountability actants, but these also “increasingly add dynamic, informal and nonhierarchical accountability relations to the existing accountability relations through formal institutions” (Meijer and Bovens, Reference Meijer, Bovens, Palmirani, Van Engers and Wimmer2003). In the SyRI case, we see that the Privacy Coalition, itself a social forum, leveraged judicial and mediatized accountability, which in turn spurred renewed political accountability. That is, this relatively weak social forum directly or indirectly leveraged other actants that stood in firmer power relations to the government (i.e., the Court, the House of Representatives). Simultaneously, we see that the Privacy Coalition deployed a media offense so as to educate the public, stimulate reflection, and trigger and amplify formal accountability, and that these media started to act as fora themselves too (Jacobs and Schillemans, Reference Jacobs and Schillemans2016). Such informal accountability also triggered institutional accountability, for instance in the form of MPs asking questions about the system.

6.2.1. Social accountability

In 2017, the Privacy Coalition made an FOI request about the system’s workings. From there on out, they became a serious public, social forum that leveraged other fora when their own power proved too limited to scrutinize the intentionally opaque algorithmic system. Even though this social forum sometimes exerted power by proxy, the impetus was very much rooted in and spurred by citizen engagement, and a drive to better public governance. As such, it is a type of social accountability. Social accountability can come in two flavors: tactical or strategic (Fox, Reference Fox2015). The tactical strand mainly focuses on increasing the “voice” of citizens and the amount of available information. The strategic flavor employs “multiple tactics” and argues for collective action, and for synergy between citizen initiatives and reforms in the public sector. Moreover, it argues that aside from voice, one also needs teeth (Fox, Reference Fox2015). The citizen initiative of the Privacy Coalition is rooted in this latter, strategic paradigm.

As said, the FOI request marked the beginning of this public phase in the SyRI case. While FOI requests themselves are not “a direct tool for accountability,” they are “a means by which information can be obtained, and used, by accountability mechanism[s]” (Worthy, Reference Worthy2010, p. 568). Thus, FOI requests can be seen as a prelude to accountability efforts under the right circumstances (Meijer, Reference Meijer, Bovens, Goodin and Schillemans2014). While FOI Acts do increase the transparency of government in general (Grimmelikhuijsen et al., Reference Grimmelikhuijsen, John, Meijer and Worthy2019), in this instance the crucial information needed to evaluate the proportionality of the system, such as audit reports and privacy impact assessments (PIAs), was withheld.Footnote 45 The acceptability of such a––potentially quite invasive––algorithmic system rests to a large extent on the validity of the models, the balancing of false positives and negatives, and how costs and benefits are balanced, not just on a financial level but also in light of public values.

It is precisely this balancing of particular values (e.g., putting public funds to good use versus privacy, and securing the modus operandi versus openness) in SyRI that is tilted in such a way that it is particularly hard to challenge the system, as the claimants later also argued in their plea.Footnote 46 As ensuring legitimacy in the social security system––argued to be possible only by withholding the system’s modus operandi––was given so much more priority than transparency, it became nearly impossible to hold the State accountable, as one could not inform oneself properly due to the black-boxed nature of the system.

As the FOI request did not, according to the coalition, result in enough transparency, they eventually filed a lawsuit as part of their strategic social accountability scheme. Within this strategy, the coalition appealed not only to the Court, but also to society at large, among other things by making use of the media. By sparking a societal debate about this system, and the implications thereof, they also managed to leverage the political forum.

6.2.2. Judicial accountability

The strategy that the Privacy Coalition used to ensure accountability was thus twofold. On the one hand, they made use of strategic litigation, thus leveraging a judicial forum; on the other hand, they coupled this to a media offense. Let us first look at the strategic litigation prong. Strategic litigation is a type of lawyering for change. More specifically, Ramsden and Gledhill (Reference Ramsden and Gledhill2019) note that strategic litigation aims to have a legacy beyond the specific case at hand. They describe it as a “method of advocacy,” the objectives of which extend beyond the judicial forum. Strategic litigation, as a form of legal empowerment, overlaps with and complements social accountability efforts rather well, even though they come from different traditions (Ezer et al., Reference Ezer, McKenna and Schaaf2015, pp. 2–3). As Joshi (Reference Joshi2017, p. 160) writes, social accountability and legal empowerment “have much in common—a strategy of awareness raising and mobilization, an orientation toward state-granted rights, and a concern with improving services, creating active citizens, and establishing sustainable changes in governance structures.” The integration of social accountability and legal empowerment can chart new routes for social accountability, and provide the “teeth” of litigation toward systemic change (Fox, Reference Fox2015; Joshi, Reference Joshi2017).

Social fora such as the Privacy Coalition needed to leverage other mechanisms and accountability structures so as to enforce an account. In this specific case, the social forum employed FOI mechanisms. When that did not produce adequate results, they turned to the legal forum of the Court. The goal of the lawsuit was not so much to get a verdict; rather, as one of the coalition’s lawyers noted, the main purpose was to spur public debate and “make an impact.”Footnote 47

In the case of SyRI, the judges found, first, that there was indeed a large intrusion into the private lives of citizens. Second, they concluded that there was a legitimate goal underlying the deployment of SyRI. Finally, the judges argued that the State should be able to use new technology for combatting fraud; however, as the repercussions and implications of using such new technology might not be clear from the outset, the State is tasked with being extra careful and needs to shoulder a heavier responsibility than normal.Footnote 48 The verdict noted that the State did not adequately balance the transparency principle, the purpose limitation principle, and the data minimization principle in such a way that SyRI was proportional and necessary (for an in-depth discussion of the judgment see Van Bekkum and Zuiderveen Borgesius, Reference Van Bekkum and Zuiderveen Borgesius2021).Footnote 49

The Court, for instance, noted that “the SyRI legislation does not provide information on the functioning of the risk model, for instance, the type of algorithms used in the model, nor does it provide information on the risk analysis method as applied by the Social Affairs and Employment Inspectorate.” Moreover, the Court argued, “the SyRI legislation does not afford insight into the validation of the risk model and the verification of the risk indicators.” As such, the Court was unable to verify how SyRI’s decision tree was “generated and of which steps it is comprised.” The judges continued to argue that such transparency is necessary not only to be able to inform one’s defense, but also to verify the findings and the model itself.Footnote 50 The Court thus ruled that the State needs to account for the model and the algorithms: the State needed to explain how the algorithmic system was designed, tested, and applied, and how it operates. Regarding the fundamental values at stake, the Court noted that it is not its duty to attach significance or value to the interests at stake. Implicitly, it did acknowledge the need thereof, but its forum is––unlike, for instance, the political one––not equipped to decide on that matter; its prerogative does not stretch that far, the judges seemed to argue.Footnote 51

6.2.3. Mediatized accountability

Coupled to the lawsuit, the Privacy Coalition started a media offense.Footnote 52 The media attention pulled the SyRI case out of obscurity and into the public eye. In order to leverage the media effectively, the Privacy Coalition deliberately included Maxim Februari and Tommy Wieringa in the lawsuit. The coalition believed that adding these prominent writers as plaintiffs to the case––even though their personal stakes in the matter were admittedly limited––would enhance the effectiveness of this “method of advocacy.” Their inclusion made it possible “that they could go to things like De Wereld Draait Door,” a Dutch prime time television talk show, on the coalition’s behalf.Footnote 53

News media reported on the formal events related to the SyRI court case but, more importantly, they were facilitators in maintaining public attention, shaping public opinion, and raising literacy and awareness.Footnote 54 Wieringa and Februari regularly appeared in the media in light of the case, as did several academics and experts. Such media appearances did not only explain why the Privacy Coalition took to the court. They also educated citizens as to why SyRI, according to the Privacy Coalition and academics/experts, signified a larger problem with potentially far-reaching repercussions: namely, the kind of algorithmic society we desire to live in as humans.Footnote 55 Moreover, this kept public attention on the case.

The media also amplified social accountability efforts, for instance by documenting and magnifying the 2019 demonstrations by citizens of two neighborhoods in which SyRI was deployed.Footnote 56 The protests against the system were one thing in themselves; yet their power extended even further when magnified by a media forum.Footnote 57

Yet the media also played a role in controlling the government, as the famous Fourth Estate (Curran, 1991), or watchdog of the government (e.g., Jacobs and Schillemans, 2016). They did so, for instance, when De Volkskrant announced that SyRI had not detected a single case of fraud.Footnote 58 Here, the media fulfilled a Fourth Estate function, acting as an independent party scrutinizing government (Hampton, 2010, p. 3). Yet in some cases, they went even further. An editorial comment in NRC Next and NRC Handelsblad, several days after the hearing, argued that the State had been using a system that was not proportional, carried a risk of discrimination, and left little room to effectively appeal any decision.Footnote 59 In such commentary, the media acted as fora themselves.

Mediatized accountability is, as Jacobs and Schillemans (2016) write, partially dependent on other fora and accountability relationships, for example, the proceedings of the legal case. Yet it is also incident driven. The media leverage formal accountability practices and incidents not only to report and play a Fourth Estate function, but also to educate their public and thereby strengthen future accountability efforts (be they social, political, or otherwise). They thus play a crucial role in sparking and fueling public debate and attention to the matter.

6.2.4. Political accountability

In response to the lawsuit and the media attention for the case, the political fora (both the House of Representatives and municipal councils) proceeded to ask questions of the government and the municipal executives as well.Footnote 60 There is thus a kind of snowball effect to the social accountability strategy: the lawsuit, a form of strategic litigation, triggered a public accountability process, harnessing media attention to the case and the fundamental concerns that go with it. In turn, this sparked renewed political accountability.

Whereas there was little to no political attention for SyRI in the institutional accountability phase before the FOI request, there was now ample debate about the system. This renewed political accountability started to probe the fundamental questions underlying the system (on the local level), strove to make the system more transparent (on both levels), and scrutinized the government's accounts of the system (on the national level). Yet we see that both the national and local levels were also intertwined with the other accountability practices. Media and societal pressure were informally cited as some of the reasons why the SyRI project in Rotterdam was canceled. On the national level, we see an increase in sensitivity about these kinds of projects, and politicians were called on to account for their ineffectiveness in scrutinizing SyRI in its early stages.

6.2.5. Analysis of the public phase

In the public accountability phase, as opposed to the institutional phase, we do see successful agenda-setting and productive accountability relations. This phase hinged on the Privacy Coalition's actions as a democratically oriented, social forum. Recognizing that their own power over the State was limited, they leveraged other actants and accountability practices to add "teeth." By starting a lawsuit, they involved the Court as a judicial forum with a constitutional function, and by coupling this to a media offensive, they involved the media and simultaneously brought the democratic matter to the public's attention. This two-pronged strategy also sparked political debate, both on the national and the local level.

While this accountability phase was thus more productive, we need to look more closely at the discrepancies between the foci and perspectives of the various actants involved. The lawsuit is a case in point. When the judges inquired into the aims of the lawsuit, a discrepancy surfaced between the views of the State and the Privacy Coalition. For the State, SyRI was purely a new instrument to support its existing work practices, whereas for the Privacy Coalition and the Court it fundamentally altered the relation between the citizen and the State, for instance because the subject may not know if or why they were singled out by a system, possibly because of a discriminatory practice (see also Van Schendel, 2019). Consequently, this resulted in different questions posed and answers given. Whereas the Privacy Coalition wondered about public values, the raison d'être of the system, and the way in which it was used, the State emphasized its technical simplicity and instrumental nature. That is, the social forum was predominantly interested in questions around decision-making and deployment, subordinating functionality considerations to these, whereas the State countered from a functional standpoint. Meanwhile, the Court could only test the legislation underlying the system, not the system itself, with a focus on functionality and deployment.Footnote 61 It could not comment, for instance, on the desirability of such a system.

The media predominantly focused on considerations regarding the decision-making around the system and its deployment. Due to the opacity of the system, their focus was not so much on its functionality. The media fora thus provided a complementary accountability practice, in that they augmented the Court's focus on functionality and deployment with considerations about the decision-making around the system. Yet, just like the social forum itself, they had little power to enforce anything, though they did facilitate public attention to the case. Eventually, the political fora got involved as well. These fora focused on all three considerations (decision-making around the system, functionality, and deployment) and have "teeth." As fora that can potentially play a large ex ante role, they also open avenues for mitigating accountability gaps, as we will see below.

7. Accountability Gaps and Mitigation

So far we have assessed how accountability was practiced. As we saw above, traditional forms of accountability can be incredibly productive in dealing with algorithmic systems but are simultaneously at risk. To understand these risks, we need to examine where accountability fell short. While SyRI was successfully halted in the end, we can identify two accountability "gaps" in this case, which we can use to strengthen future algorithmic accountability practices.

First, there is the sensitivity of the political fora, on which ex ante accountability hinges.

In the institutional phase, we saw that especially the political fora (House of Representatives and Senate) were, as they themselves acknowledged, too apathetic about the legislation that enabled SyRI, despite extensive warnings from administrative fora (DPA and Council of State).Footnote 62 This resulted in an accountability gap. To be more precise, it created a democratic accountability gap around the considerations of the decision-making around the system (i.e., why this system, why is this proportional), its functionality (i.e., how does it work), and its deployment (i.e., how is this system used and leveraged). The political fora responded in medias res/ex post to public pressure. Their ex ante response, absent such pressure, was deemed insufficient in hindsight by the fora themselves, by the Privacy Coalition, and by the media. Central to their unresponsiveness seemed to be the idea that SyRI was "merely" an instrument to make the execution of government tasks more effective and efficacious. Framing algorithmic systems as mere instruments for the execution of government tasks may thus lead to accountability gaps.

Second, there is a discrepancy between the grounds on which the system was eventually overturned and the impetus of the social forum which tried to halt it. The FOI request of the Privacy Coalition marked the start of the public phase. Here we see successful attempts by the Coalition to put their concerns about this democratic accountability gap regarding SyRI on the agenda. They leveraged both the media and the Court for this agenda-setting, which in turn sparked renewed political interest in the system. However, while this may superficially seem like an adequate mitigation strategy, as it produced the desired outcome for the social forum, there is a catch. The Court, with its constitutional focus, did not operate on the same democratic grounds as the Privacy Coalition, nor on those of the political fora.

Moreover, we found that the administrative, political, and social fora focus on different kinds of questions in their accountability practices. Three types of algorithm-specific considerations were relevant to, and raised by, the various fora:

  • the decision-making around a system: for example, why is the system designed this way, and according to which (public) values?

  • its functionality: for example, how does the algorithm work; how is it tested and constructed; how does it operate; how are public values technically ensured in the design?

  • its deployment: for example, how is the system leveraged in specific contexts to do particular things?

To illustrate: the Court focused on the technicalities of SyRI (e.g., the risk model, analysis method, validation) as well as its deployment (in "problem neighborhoods"), but ultimately its prerogative seemed tied to the underlying legislation, which it was asked to test against the ECHR. Fundamental questions about what kind of algorithmic society we want to live in are not its prerogative, the Court argued; yet such decision-making around the system was very much of interest to the Privacy Coalition (public values and raison d'être). The media did not focus on the functionality of SyRI, but instead on the decision-making around the system and its deployment (i.e., amplifying the efforts of the Privacy Coalition). The interests and foci of the respective accounts thus deviated across the different kinds of accountability practices. We thus see that not every forum will require the same kind of account or answers (for a more in-depth overview of the kinds of algorithm-specific questions that could figure in these accountability practices, see Wieringa, 2020), nor might all of the questions raised be answerable by expert witnesses, as the algorithmic system is multiple, deployed in different contexts, and/or evolving over time. When designing social accountability strategies, this is something actants need to be mindful of.

8. Conclusion

In this praxiography of the SyRI case, we see various interwoven accountability practices. Our research question asked to what extent existing accountability practices still suffice, which algorithm-specific considerations surface in algorithmic accountability practices, and, if accountability falls short, how accountability gaps can be mitigated. Our interest was to see how algorithmic accountability was done in situ and whether it was done well. Harking back to our analysis, we saw that in the SyRI case, politicians initially failed to recognize the repercussions of such a system, despite warnings from the DPA and the Council of State.Footnote 63 Eventually, a civil society coalition ("the Privacy Coalition") attempted to mitigate this by engaging in strategic litigation and a media offensive. Triggered by these events, political interest in the case was renewed. Ultimately, the Court overturned the legislation enabling SyRI, albeit on different grounds than those of the social forum.

We thus see that none of these accountability efforts stand alone; rather, they respond to, react to, or trigger one another. This makes accountability complex. However, not only are the respective fora's accountability practices interwoven, the fora themselves need not necessarily serve only one function. While the Privacy Coalition's central function was of a democratic nature, they also posed questions related to performance, for instance. This was reflected in their algorithm-specific engagement with questions of decision-making, deployment, and functionality. In this light, some scholars speak of "multiple accountabilities disorder (MAD)" (Koppell, 2005) as a way to describe the complex increase in accountability expectations, which might even conflict. Willems and Van Dooren (2012, p. 1028) note that some scholars do not necessarily see this complexity as a negative phenomenon, and some even see multiplicity as advantageous. Drawing on the SyRI case, we take the latter position.

As the case shows, multiple accountabilities can be used to mitigate existing accountability gaps (albeit on different terms), and they introduce different foci in light of algorithm-specific considerations. As was demonstrated, the different accountability archetypes bring different perspectives and considerations to the table (Table 3). This combination of various accountability practices mitigates potential accountability gaps and can circumvent "the weaknesses of hierarchical accountability with the strengths of horizontal accountability and vice versa" (Braithwaite, 2008 in Willems and Van Dooren, 2012). However, it is important that any algorithmic accountability inventory also pays attention to the medium specificity of the account. We have identified three kinds of algorithm-specific considerations of various actants: the decision-making around a system, its functionality, and its deployment. When assessing accountability relationships for algorithmic systems, paying attention to the interdependencies, the foci, the perspectives, and these medium-specific considerations may help to tease out how algorithmic accountability is done differently in different settings by different actants, and where subsequent accountability deficits arise.

Table 3. Accountability practices around SyRI

a The reports of the Council of State were made public but with a delay of several months.

b The Court executes a legal test on the legislation that enables SyRI rather than on the system itself; it is this legislation that is evaluated, though the Court does ask questions about the functionality of the system as well as its subsequent deployment.

Acknowledgments

The author wishes to thank José van Dijck, Albert Meijer, Gijs van Maanen, Isabelle Fest, Daan Kolkman, Annemarie Balvert, Aviva de Groot, Marvin van Bekkum and the anonymous reviewers for their helpful and insightful comments.

Funding Statement

None.

Competing Interests

The author declares no competing interests exist.

Author Contribution

MW: Conceptualization, data curation, funding acquisition, investigation, methodology, project administration, visualization, writing—original draft, and writing—review and editing.

Data Availability Statement

The reconstructed timeline can be found on Zenodo: https://doi.org/10.5281/zenodo.5788502.

Footnotes

1 E.g., Field notes, 2020-2-10; Field notes, 2020-3-12; Expertsessie Eerlijke Algoritmen, 2020-2-26, Amsterdam: Pakhuis de Zwijger.

2 Please note that “actor” here is not used in a Latourian sense. The “actor,” in accountability theory, is the actant who needs to account for their conduct. The “forum” is the actant questioning and evaluating the actor’s conduct.

3 Note that this type of accountability deals with the traditional media, not social, new, and/or alternative media.

4 SyRI came into being in 2014 but is in fact the third in a line of incremental fraud risk indication systems (its predecessors were named "Waterproof" and "Black Box," respectively). Due to the scope of this paper, we focus here on the accountability practices around SyRI; for a more technical description, see for instance the work of Van Bekkum and Zuiderveen Borgesius (2021).

6 Bitter CM (2019) Pleitnota in zake Staat der Nederlanden (ministerie van Sociale Zaken en Werkgelegenheid)/Nederlands Juristen Comité voor de Mensenrechten (NJCM). https://www.nieuwsszw.nl/download/787836/pleitnotasyri-776124.pdf.

7 Ministerie van Sociale Zaken en Werkgelegenheid. Besluit van 1 september 2014 tot wijziging van het Besluit SUWI in verband met regels voor fraudeaanpak door gegevensuitwisselingen en het effectief gebruik van binnen de overheid bekend zijnde gegevens met inzet van SyRI. Staatsblad van het Koninkrijk der Nederlanden. https://zoek.officielebekendmakingen.nl/stb-2014-320.html.

8 On a functional level, SyRI was not designed to be used exclusively in this fashion, but it has only been deployed in what the State terms "problem neighborhoods." This may in part be explained by the fact that the system is part of the neighborhood-centered approach (WGA), which exclusively focuses on such neighborhoods.

9 College Bescherming Persoonsgegevens (2012) Advies inzake effectiever gebruik van gegevens. The Hague; Autoriteit Persoonsgegevens (4 June 2012) CBP adviseert over effectiever gebruik gegevens in de sociale zekerheid. Autoriteit Persoonsgegevens. https://autoriteitpersoonsgegevens.nl/nl/nieuws/cbp-adviseert-over-effectiever-gebruik-gegevens-de-sociale-zekerheid (25 February 2020); Raad van State (2012) Voorstel van wet tot wijziging van de Wet structuur uitvoeringsorganisatie werk en inkomen en enige andere wetten in verband met fraudeaanpak door gegevensuitwisselingen en het effectief gebruik van binnen de overheid bekend zijnde gegevens, met memorie van toelichting. The Hague; Autoriteit Persoonsgegevens (18 February 2014) CBP adviseert over Besluit SyRI. Autoriteit Persoonsgegevens. https://autoriteitpersoonsgegevens.nl/nl/nieuws/cbp-adviseert-over-besluit-syri (26 February 2020); College Bescherming Persoonsgegevens (2014) Advies conceptbesluit SyRI. The Hague.

10 College Bescherming Persoonsgegevens (2014) Advies conceptbesluit SyRI. The Hague.

11 Asscher LF (2013) Wijziging van de Wet structuur uitvoeringsorganisatie werk en inkomen en enige andere wetten in verband met fraudeaanpak door gegevensuitwisselingen en het effectief gebruik van binnen de overheid bekend zijnde gegevens. Kamerstuk 33,579-7. https://zoek.officielebekendmakingen.nl/kst-33579-7.html; Tweede Kamer der Staten Generaal. (12 September 2013) Plenair verslag. https://www.tweedekamer.nl/kamerstukken/plenaire_verslagen/detail/bf54f9d7-cab9-4039-b119-0aa8a8a2dacf; Eerste Kamer de Staten Generaal. (1 October 2013). Hamerstukken. https://www.eerstekamer.nl/behandeling/20131001/stemming_hamerstuk/document3/f=/vjdvfpx5z3st.pdf.

12 Verzoek om toepassing Systeem Risico Indicatie (SyRI), LSI 2015/39, document opened up by FOI request.

13 Projectplan Adresfraude Afrikaanderwijk Rotterdam, LSI 2015/62, document opened up by FOI request; Annotatie Voorbereidingsgroep LSI van 18 maart 2016, document opened up by FOI request.

14 VNG Kenniscentrum Handhaving and Naleving (2018) Eindrapport Wijkgerichte Aanpak: Kwetsbare buurten in Capelle aan den IJssel.

15 The various parties were: Stichting Platform Burgerrechten, Nederlands Juristencomité voor de Mensenrechten, Stichting Privacy First, Stichting KDVP, De Landelijke Cliëntenraad, FNV, and two individuals, Maxim Februari and Tommy Wieringa.

16 Bij voorbaat Verdacht (2018) Wat u niet mag weten over hoe SyRI u profileert. https://bijvoorbaatverdacht.nl/wob-verzoek/.

17 Ministerie van Sociale Zaken en Werkgelegenheid (2018) Mededeling van de Staatssecretaris van Sociale Zaken en Werkgelegenheid van 23 februari 2018, nr. 2018-0000028402, betreffende de Aanvangsdatum van het interventieteamproject Haarlem Schalkwijk. https://zoek.officielebekendmakingen.nl/stcrt-2018-12088.html; Ministerie van Sociale Zaken en Werkgelegenheid (2018) Mededeling van de Staatssecretaris van Sociale Zaken en Werkgelegenheid van 23 februari 2018, nr. 2018-0000028472, betreffende de vaststelling van de aanvangsdatum van het interventieteamproject WGA Rotterdam Bloemhof & Hillesluis. https://zoek.officielebekendmakingen.nl/stcrt-2018-12083.html.

18 Van Ark T (2018) Reactie op de motie van de leden Verhoeven en Buitenweg over openbaarmaking van databestanden, algoritmes en analysemethodes van SyRI. Kamerstuk 32,761-122. https://www.tweedekamer.nl/kamerstukken/detail?id=2018D33004.

19 N.A. (2018) Eindrapport project GALOP II: Gerichte Aandacht Leefbaarheid Ondernemerschap Participatie.

20 VNG Kenniscentrum Handhaving and Naleving (2018) Eindrapport Wijkgerichte Aanpak: Kwetsbare buurten in Capelle aan den IJssel.

21 Buitenweg K (15 October 2018) Vragen van het lid Buitenweg (GroenLinks) aan de Ministers van Sociale Zaken en Werkgelegenheid en voor Rechtsbescherming en de Staatssecretaris van Sociale Zaken en Werkgelegenheid over het gebruik van SyRI in Capelle aan den IJssel. https://zoek.officielebekendmakingen.nl/kv-tk-2018Z18418.html.

22 E.g., Roozen V and Aboutaleb A (2019) Beantwoording van de schriftelijke vragen van het raadslid A. van Zevenbergen (SP) over “Stop risicoprofilering bewoners van Bloemhof en Hillesluis.”

23 Heerekop A (2019) FNV en bewoners Rotterdam bieden burgemeester Aboutaleb fraudeboek aan. FNV. Date accessed 5 December 2019. https://www.fnv.nl/nieuwsbericht/sectornieuws/zorg-welzijn/2019/06/fnv-en-bewoners-rotterdamse-wijken-hillesluis-en-b.

24 Huisman C (3 July 2019) Rotterdam stopt omstreden fraudeonderzoek met SyRI. De Volkskrant. https://www.volkskrant.nl/nieuws-achtergrond/rotterdam-stopt-omstreden-fraudeonderzoek-met-syri~becb336a/.

25 Bij Voorbaat Verdacht (2019) SyRI-onderzoek in Haarlem voortijdig beëindigd. Bij voorbaat verdacht. Date accessed: 4 December 2019. https://bijvoorbaatverdacht.nl/syri-onderzoek-in-haarlem-voortijdig-beeindigd/.

26 Alston P (2019) Brief by the United Nations Special Rapporteur on extreme poverty and human rights as Amicus Curiae in the case of NCJM c.s./De Staat der Nederlanden (SyRI) before the District Court of The Hague (case number: C/09/550982/HA ZA 18/388).

27 Buitenweg K (8 November 2019) Vragen van het lid Buitenweg (GroenLinks) aan de Staatssecretaris van Sociale Zaken en Werkgelegenheid over SyRi. https://zoek.officielebekendmakingen.nl/kv-tk-2019Z21611.html.

28 Metselaar D (29 November 2019) “Minister Dekker en SyRI grootste privacyschenders van 2019” NRC. https://www.nrc.nl/nieuws/2019/11/29/minister-dekker-en-syri-grootste-privacyschenders-van-2019-a3982175.

29 The Hague Court of Justice (5 February 2020) ECLI:NL:RBDHA:2020:865. https://uitspraken.rechtspraak.nl/inziendocument?id=ECLI:NL:RBDHA:2020:865.

30 As an aside, the Dutch constitution cannot be used in fundamental rights cases "as it prohibits constitutional review of Acts of Parliament" (Vetzo, 2021), which is why the coalition drew on the ECHR.

31 Van Ark T (23 April 2020) Kamerbrief naar aanleiding van vonnis rechter inzake SyRI. Ministerie van Sociale Zaken en Werkgelegenheid. https://www.rijksoverheid.nl/ministeries/ministerie-van-sociale-zaken-en-werkgelegenheid/documenten/kamerstukken/2020/04/23/kamerbrief-naar-aanleiding-van-vonnis-rechter-inzake-syri (28 April 2020).

32 A "hamerstuk" [tr. hammer piece] denotes the mere formality of passing through the House of Representatives and the Senate, without any debate or voting. The term originates from the chair's hammer, which is used to formally effect a decision.

33 College Bescherming Persoonsgegevens (2007) Bevindingen ambtshalve onderzoek Waterproof. Den Haag: College Bescherming Persoonsgegevens; College Bescherming Persoonsgegevens (2010) Rapport van definitieve bevindingen: Onderzoek van het College Bescherming Persoonsgegevens naar bestandskoppelingen door de SIOD voor de ontwikkeling van risicoprofielen. The Hague.

34 College Bescherming Persoonsgegevens (2012) Advies inzake effectiever gebruik van gegevens. The Hague; Raad van State (2012) Voorstel van wet tot wijziging van de Wet structuur uitvoeringsorganisatie werk en inkomen en enige andere wetten in verband met fraudeaanpak door gegevensuitwisselingen en het effectief gebruik van binnen de overheid bekend zijnde gegevens, met memorie van toelichting. The Hague.

35 College Bescherming Persoonsgegevens (2012) Advies inzake effectiever gebruik van gegevens. The Hague; Autoriteit Persoonsgegevens (4 June 2012) CBP adviseert over effectiever gebruik gegevens in de sociale zekerheid. Autoriteit Persoonsgegevens. https://autoriteitpersoonsgegevens.nl/nl/nieuws/cbp-adviseert-over-effectiever-gebruik-gegevens-de-sociale-zekerheid (25 February 2020).

36 Raad van State (2012) Voorstel van wet tot wijziging van de Wet structuur uitvoeringsorganisatie werk en inkomen en enige andere wetten in verband met fraudeaanpak door gegevensuitwisselingen en het effectief gebruik van binnen de overheid bekend zijnde gegevens, met memorie van toelichting. The Hague.

37 Autoriteit Persoonsgegevens (18 February 2014) CBP adviseert over Besluit SyRI. Autoriteit Persoonsgegevens. https://autoriteitpersoonsgegevens.nl/nl/nieuws/cbp-adviseert-over-besluit-syri (26 February 2020); College Bescherming Persoonsgegevens (2014) Advies conceptbesluit SyRI. The Hague.

38 Pelgrim C (28 October 2019) Er heerste een sfeer van hard, harder, hardst. De Volkskrant.

39 Asscher LF (2013) Wijziging van de Wet structuur uitvoeringsorganisatie werk en inkomen en enige andere wetten in verband met fraudeaanpak door gegevensuitwisselingen en het effectief gebruik van binnen de overheid bekend zijnde gegevens. Kamerstuk 33,579-7. https://zoek.officielebekendmakingen.nl/kst-33579-7.html; Tweede Kamer der Staten Generaal (12 September 2013) Plenair verslag. https://www.tweedekamer.nl/kamerstukken/plenaire_verslagen/detail/bf54f9d7-cab9-4039-b119-0aa8a8a2dacf.

40 A “hamerstuk” is a piece passed without debate.

41 Eerste Kamer de Staten Generaal (1 October 2013). Hamerstukken. https://www.eerstekamer.nl/behandeling/20131001/stemming_hamerstuk/document3/f=/vjdvfpx5z3st.pdf.

42 Bitter CM (2019) Pleitnota in zake Staat der Nederlanden (ministerie van Sociale Zaken en Werkgelegenheid)/Nederlands Juristen Comité voor de Mensenrechten (NJCM); Pelgrim C (28 October 2019) Er heerste een sfeer van hard, harder, hardst. De Volkskrant.

43 Pelgrim C (28 October 2019) Er heerste een sfeer van hard, harder, hardst. De Volkskrant.

44 Whereas the res publica was classically a matter for the Senate or Forum, nowadays the media play an instrumental role.

45 The FOI request initially comprised 113 documents. It opened up 64 documents, either partially (52) or completely (12). The usefulness of these documents varies, as in some cases the entire document is redacted, leaving little more than a title. Forty-nine documents were withheld. Of these, six were already public or made public in other documents; 19 were determined to have a different topic and thus fell outside the scope of the FOI request; 13 were withheld to protect the internal deliberations of civil servants; five were withheld for operational reasons; four were withheld because they contained personal opinions; and, finally, two metadata files were withheld because they contained personal information.

47 Fieldnotes, 2019-12-13; Driessen C (28 October 2019) “Willen we dat de overheid zo met burgers omgaat?” NRC Next.

48 Appelman N (26 February 2020) Samenvatting SyRI uitspraak. Eerlijke Algoritmen expert sessie. Municipality of Amsterdam: Amsterdam; The Hague Court of Justice (5 February 2020) ECLI:NL:RBDHA:2020:865. https://uitspraken.rechtspraak.nl/inziendocument?id=ECLI:NL:RBDHA:2020:865.

49 The Hague Court of Justice (5 February 2020) ECLI:NL:RBDHA:2020:865. https://uitspraken.rechtspraak.nl/inziendocument?id=ECLI:NL:RBDHA:2020:865.

50 The Hague Court of Justice (5 February 2020) ECLI:NL:RBDHA:2020:1878. https://uitspraken.rechtspraak.nl/inziendocument?id=ECLI:NL:RBDHA:2020:1878.

51 One of the reviewers pointed out that the Court—depending on its nature and the circumstances—could comment on this. However, in this particular instance the Court explicitly refrained from commenting on this aspect.

52 Aside from the attention to the case in traditional media, there was also ample attention to the system on social media such as Twitter. However, due to practical limitations in the research design, we only have anecdotal material about such discussions, and so this "Fifth Estate" (Dutton, 2009) is left out of the present discussion.

53 Fieldnotes, 2019-12-13.

54 E.g., ANP (13 January 2018) Overheid gedaagd wegens vergaren persoonsgegevens. Trouw; NA (13 January 2018) Privacygroepen klagen Staat aan. Algemeen Dagblad; Van Lonkhuyzen L (13 January 2018) Rechtszaak tegen staat om profileren burgers. NRC Next; Van Teeffelen K (29 October 2019) Frauderisicosysteem SyRI schendt privacy niet, zegt de staat. Trouw; Huisman C (29 October 2019) Rechter buigt zich over omstreden snuffelprogramma dat fraude via computers moet ontdekken. De Volkskrant; Van Gils S (29 October 2019) Mag de staat gluren in de watermeter? fd.nl; Redactie (29 October 2019) Rechtszaak tegen IT-systeem dat burgers als "riskant" aanmerkt, Rotterdam gebruikte het voor wijken op Zuid. Algemeen Dagblad; Scholten L (29 October 2019) "Burgers zijn bij voorbaat al verdacht." Algemeen Nederlands Persbureau (ANP); Driessen C (30 October 2019) "SyRI is eerste stap naar controlesamenleving." NRC Next; Huisman C (30 October 2019) SyRI: systeem dat stigmatiseert of fraudeurs vangt? De Volkskrant; Van Teeffelen K (30 October 2019) Geheim sleepnet of handige tool tegen fraude? Het Parool.

55 E.g., Custers B (20 July 2019) Fraudebestrijding mag niet uitmonden in “Kafka.” Rotterdams Dagblad; Cath-Speth C and Dobbe R (14 August 2019) Verwacht geen wonderen van artificiële intelligentie. De Volkskrant; Van Teeffelen K (27 August 2019) Een eerlijk algoritme? Dat is niet zo makkelijk te maken, maar het kán wel. Trouw; Naafs S (24 July 2019) Help, de overheid discrimineert!; Maken algoritmes de ongelijkheid in de samenleving groter? Knack Magazine.

56 Van Staalduine J (20 June 2019) De Rotterdamse huishoudens die verdacht worden van fraude zijn diep beledigd: “Het lijkt wel ‘40-‘45”. Trouw.

57 E.g., Sitalsing S (28 June 2019) Verwondering. De Volkskrant.

58 Huisman C (27 June 2019) Fraudesysteem overheid faalt. De Volkskrant.

59 Redactie (2 November 2019) SyRI moet stoppen in afwachting van betere waarborgen. NRC Next; ibid. NRC Handelsblad.

60 Buitenweg K (15 October 2018) Vragen van het lid Buitenweg (GroenLinks) aan de Ministers van Sociale Zaken en Werkgelegenheid en voor Rechtsbescherming en de Staatssecretaris van Sociale Zaken en Werkgelegenheid over het gebruik van SyRI in Capelle aan den IJssel. https://zoek.officielebekendmakingen.nl/kv-tk-2018Z18418.html; Braun I (4 July 2018) High-risk citizens. AlgorithmWatch; Verhoeven K and Buitenweg K (6 June 2018) Motie van de leden Verhoeven en Buitenweg. Kamerstuk 32,761-118. https://zoek.officielebekendmakingen.nl/kst-32761-118.html; Buitenweg K (8 November 2019) Vragen van het lid Buitenweg (GroenLinks) aan de Staatssecretaris van Sociale Zaken en Werkgelegenheid over SyRi. https://zoek.officielebekendmakingen.nl/kv-tk-2019Z21611.html; Buitenweg K (21 January 2020) Vragen van het lid Buitenweg (GroenLinks) aan de Staatssecretaris van Sociale Zaken en Werkgelegenheid over SyRi. https://zoek.officielebekendmakingen.nl/ah-tk-20192020-1926.html; Van Ark T (20 December 2018) Kamervraag/vragen van het lid Buitenweg (GroenLinks). https://www.rijksoverheid.nl/documenten/kamerstukken/2018/12/20/beantwoording-kamervragen-over-het-gebruik-van-syri-in-capelle-aan-den-ijssel; Van Ark T (8 June 2018) Brief van de Staatssecretaris van Sociale Zaken en Werkgelegenheid. Kamerstuk 32,761-122. https://zoek.officielebekendmakingen.nl/kst-32761-122.html; El Hamidi L (19 June 2019) Inspecteur Algoritme. NRC Next: "We get no information at all. Now it starts small in two neighborhoods, but this will become much bigger. People do not seem to realize the impact of this" [tr.]; Ritman M (19 March 2019) "Digi-opsporing is een sleepnet." De Telegraaf.

61 While we ourselves take a sociotechnical standpoint, following Seaver and Wieringa, we wish to highlight with phrases such as "technical standpoint" and "technical simplicity" how SyRI is done and accounted for by actants. With these phrases, we trace their own accountability practices.

62 Pelgrim C (28 October 2019) Er heerste een sfeer van hard, harder, hardst. De Volkskrant.

63 Pelgrim C (28 October 2019) Er heerste een sfeer van hard, harder, hardst. De Volkskrant.

References

Ananny, M and Crawford, K (2018) Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media and Society 20(3), 973–989. https://doi.org/10.1177/1461444816676645
Bekker, S (2021) Fundamental rights in digital welfare states: The case of SyRI in the Netherlands. In Spijkers, O, Werner, WG and Wessel, RA (eds), Netherlands Yearbook of International Law 2019. T.M.C. Asser Press, The Hague, pp. 289–308.
Birhane, A (2021) Algorithmic injustice: A relational ethics approach. Patterns 2(2), 100205. https://doi.org/10.1016/j.patter.2021.100205
Bovens, M (2007a) Analysing and assessing accountability: A conceptual framework. European Law Journal 13(4), 447–468. https://doi.org/10.1111/j.1468-0386.2007.00378.x
Bovens, M (2007b) Public accountability. In Ferlie, E, Lynn, LE and Pollitt, C (eds), The Oxford Handbook of Public Management. Oxford University Press, Oxford, pp. 182–208.
Bovens, M (2010) Two concepts of accountability: Accountability as a virtue and as a mechanism. West European Politics 33(5), 946–967. https://doi.org/10.1080/01402382.2010.486119
Bovens, M, Schillemans, T and Goodin, RE (2014) Public accountability. In Bovens, M, Goodin, RE and Schillemans, T (eds), The Oxford Handbook of Public Accountability. Oxford University Press, Oxford, pp. 1–22.
Braithwaite, J (2008) Regulatory Capitalism: How it Works, Ideas for Making it Work Better. Edward Elgar, Cheltenham.
Buolamwini, J and Gebru, T (2018) Gender shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of Machine Learning Research 81, 1–15.
Carroll, N (2019) Medium specificity. In Carroll, N, Di Summa, LT and Loht, S (eds), The Palgrave Handbook of the Philosophy of Film and Motion Pictures. Palgrave Macmillan, Cham, pp. 29–47. https://doi.org/10.1007/978-3-030-19601-1_1
Cobbe, J, Lee, MSA and Singh, J (2021) Reviewable automated decision-making: A framework for accountable algorithmic systems. In FAccT 2021: Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency. ACM, New York, pp. 598–609. https://doi.org/10.1145/3442188.3445921
Cooper, AF, Moss, E, Laufer, B and Nissenbaum, H (2022) Accountability in an algorithmic society: Relationality, responsibility, and robustness in machine learning. In 2022 ACM Conference on Fairness, Accountability, and Transparency (FAccT '22). ACM, New York, pp. 864–876. https://doi.org/10.1145/3531146.3533150
Costanza-Chock, S, Raji, ID and Buolamwini, J (2022) Who audits the auditors? Recommendations from a field scan of the algorithmic auditing ecosystem. In 2022 ACM Conference on Fairness, Accountability, and Transparency (FAccT '22). ACM, New York, pp. 1571–1583. https://doi.org/10.1145/3531146.3533213
Curran, J (1991) Rethinking the media as a public sphere. In Dahlgren, P and Sparks, C (eds), Communication and Citizenship. Routledge, London/New York.
Dekker, S (9 October 2018) Transparantie van algoritmes in gebruik bij de overheid. Letter to the House of Representatives. Ministry of Justice and Safety, The Hague.
Diakopoulos, N (2015) Accountability in algorithmic decision-making: A view from computational journalism. ACM Queue 13, 1–24.
Dutton, WH (2009) The fifth estate: Democratic social accountability through the emerging network of networks. Prometheus 27(1), 1–15. https://doi.org/10.1080/08109020802657453
Eubanks, V (2018) Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. St. Martin's Press, New York.
Ezer, T, McKenna, R and Schaaf, M (2015) Expert Meeting on Social Accountability and Legal Empowerment: Allied Approaches in the Struggle for Health Rights. Open Society Foundations, New York.
Fox, JA (2015) Social accountability: What does the evidence really say? World Development 72, 346–361. https://doi.org/10.1016/j.worlddev.2015.03.011
Friedman, B and Nissenbaum, H (1996) Bias in computer systems. ACM Transactions on Information Systems 14(3), 330–347. https://doi.org/10.1145/249170.249184
Gantchev, V (2019) Data protection in the age of welfare conditionality: Respect for basic rights or a race to the bottom? European Journal of Social Security 21(1), 3–22. https://doi.org/10.1177/1388262719838109
Grimmelikhuijsen, S, John, P, Meijer, A and Worthy, B (2019) Do freedom of information laws increase transparency of government? A replication of a field experiment. Journal of Behavioral Public Administration 2(1), 1–10. https://doi.org/10.30636/jbpa.12.34
Gusterson, H (1997) Studying up revisited. PoLAR: Political and Legal Anthropology Review 20(1), 114–119.
Hampton, M (2010) The fourth estate ideal in journalism history. In Allan, S (ed), The Routledge Companion to News and Journalism. Routledge, London/New York.
Helmke, G and Levitsky, S (2004) Informal institutions and comparative politics: A research agenda. Perspectives on Politics 2(4), 725–740. https://doi.org/10.4337/9781781001219.00011
Hoffer, TB (2013) Accountability in education. In Wagner, RB (ed), Handbook of the Sociology of Education. Springer, New York, pp. 529–543. https://doi.org/10.4324/9781315021348
Jacobs, S and Schillemans, T (2016) Media and public accountability: Typology and exploration. Policy and Politics 44(1), 23–40. https://doi.org/10.1332/030557315X14431855320366
Johnson, DG and Nissenbaum, H (1995) Computers, Ethics & Social Values. Prentice-Hall, Upper Saddle River.
Joshi, A (2017) Legal empowerment and social accountability: Complementary strategies toward rights-based development in health? World Development 99, 160–172. https://doi.org/10.1016/j.worlddev.2017.07.008
Kacianka, S and Pretschner, A (2021) Designing accountable systems. In FAccT 2021: Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency. ACM, New York, pp. 424–437. https://doi.org/10.1145/3442188.3445905
Kemper, J and Kolkman, D (2018) Transparent to whom? No algorithmic accountability without a critical audience. Information, Communication and Society 22, 2081–2096. https://doi.org/10.1080/1369118X.2018.1477967
Koppell, JGS (2005) Pathologies of accountability: ICANN and the challenge of multiple accountabilities disorder. Public Administration Review 65(1), 94–108.
Kopytoff, I (1986) The cultural biography of things. In Appadurai, A (ed), The Social Life of Things. Cambridge University Press, Cambridge, pp. 64–91.
Kroll, JA (2021) Outlining traceability: A principle for operationalizing accountability in computing systems. In FAccT 2021: Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency. ACM, New York. https://doi.org/10.1145/3442188.3445937
Lessig, L (1999) Code: And Other Laws of Cyberspace. Basic Books, New York.
Malik, HM, Viljanen, M, Lepinkäinen, N and Alvesalo-Kuusi, A (2021) Dynamics of social harms in an algorithmic context. International Journal for Crime, Justice and Social Democracy 10(4), 182–195. https://doi.org/10.5204/IJCJSD.2141
Marjanovic, O, Cecez-Kecmanovic, D and Vidgen, R (2022) Theorising algorithmic justice. European Journal of Information Systems 31(3), 269–287. https://doi.org/10.1080/0960085X.2021.1934130
Meijer, A (2014) Transparency. In Bovens, M, Goodin, RE and Schillemans, T (eds), The Oxford Handbook of Public Accountability. Oxford University Press, Oxford. https://doi.org/10.1093/oxfordhb/9780199641253.013.0043
Meijer, A and Bovens, M (2003) Public accountability in the information age. In Palmirani, M, Van Engers, T and Wimmer, MA (eds), E-Government: Workshop in Conjunction with JURIX 2003. International Federation for Information Processing, Laxenburg, pp. 16–28.
Meijer, A and Grimmelikhuijsen, S (2020) Responsible and accountable algorithmization: How to generate citizen trust in governmental usage of algorithms. In Schuilenburg, M and Peeters, R (eds), The Algorithmic Society: Technology, Power, and Knowledge. Routledge, London, pp. 53–66.
Meijer, A, Lorenz, L and Wessels, M (2021) Algorithmization of bureaucratic organizations: Using a practice lens to study how context shapes predictive policing systems. Public Administration Review 81(5), 837–846. https://doi.org/10.1111/puar.13391
Mol, A (2002) The Body Multiple: Ontology in Medical Practice. Duke University Press, Durham.
Neyland, D (2016) Bearing account-able witness to the ethical algorithmic system. Science, Technology, & Human Values 41(1), 50–76. https://doi.org/10.1177/0162243915598056
Nissenbaum, H (1994) Computing and accountability. Communications of the ACM 37(1), 72–80. https://doi.org/10.1145/175222.175228
Noble, SU (2018) Algorithms of Oppression. New York University Press, New York.
O'Neil, C (2016) Weapons of Math Destruction. Crown Publishing Group, New York.
Pasquale, F (2015) The Black Box Society: The Secret Algorithms that Control Money and Information. Harvard University Press, Cambridge/London.
Raji, ID, Smart, A, White, RN, Mitchell, M, Gebru, T, Hutchinson, B, Smith-Loud, J, Theron, D and Barnes, P (2020) Closing the AI accountability gap: Defining an end-to-end framework for internal algorithmic auditing. In FAT* 2020: Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency. ACM, New York, pp. 33–44. https://doi.org/10.1145/3351095.3372873
Ramsden, M and Gledhill, K (2019) Defining strategic litigation. Civil Justice Quarterly 38(4), 407–426.
Romzek, BS, Leroux, K and Blackmar, JM (2012) A preliminary theory of informal accountability among network organizational actors. Public Administration Review 72(3), 442–453. https://doi.org/10.1111/j.1540-6210.2011.02547.x
Rosenblat, A, Kneese, T and Boyd, D (2014) Algorithmic Accountability. The Social, Cultural & Ethical Dimensions of "Big Data." Accessed 19 December 2022. https://datasociety.net/pubs/2014-0317/AlgorithmicAccountabilityPrimer.pdf
Scharpf, FW (1994) Games real actors could play: Positive and negative coordination in embedded negotiations. Journal of Theoretical Politics 6(1), 27–53. https://doi.org/10.1177/0951692894006001002
Schillemans, T (2008) Accountability in the shadow of hierarchy: The horizontal accountability of agencies. Public Organization Review 8(2), 175–194. https://doi.org/10.1007/s11115-008-0053-8
Seaver, N (2017) Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society 4(2). https://doi.org/10.1177/2053951717738104
Seaver, N (2019) Knowing algorithms. In Vertesi, J and Ribes, D (eds), DigitalSTS. Princeton University Press, Princeton/Oxford, pp. 412–422. https://doi.org/10.2307/j.ctvc77mp9.30
The Social Media Collective (n.d.) Critical Algorithm Studies: A Reading List. Available at https://socialmediacollective.org/reading-lists/critical-algorithm-studies/ (accessed 28 July 2022).
Thon, J-N (2014) Mediality. In Ryan, M-L, Emerson, L and Robertson, BJ (eds), The Johns Hopkins Guide to Digital Media. Johns Hopkins University Press, Baltimore, pp. 334–337.
Van Bekkum, M and Zuiderveen Borgesius, F (2021) Digital welfare fraud detection and the Dutch SyRI judgment. European Journal of Social Security 23(4). https://doi.org/10.1177/13882627211031257
van Schendel, S (2019) The challenges of risk profiling used by law enforcement: Examining the cases of COMPAS and SyRI. In Reins, L (ed), Regulating New Technologies in Uncertain Times. Springer, The Hague, pp. 225–240. https://doi.org/10.1007/978-94-6265-279-8_12
Vetzo, MJ (2021) The Netherlands: Algorithmic fraud detection system violates human rights. The case of SyRI. Public Law 3, 650–652.
Vu, TT and Deffains, B (2013) Formal and informal mechanisms of accountability in local governance: Towards a new authoritarian governance model. Business Systems Review 2(2), 330–367. https://doi.org/10.7350/BSR.V17.2013
Wetenschappelijke Raad voor het Regeringsbeleid (WRR) (2011) iOverheid. https://www.wrr.nl/binaries/wrr/documenten/rapporten/2011/03/15/ioverheid/ioverheid.pdf
Wieringa, MA (2020) What to account for when accounting for algorithms: A systematic literature review on algorithmic accountability. In ACM Conference on Fairness, Accountability, and Transparency (FAT* '20). ACM, New York, pp. 1–18. https://doi.org/10.1145/3351095.3372833
Willems, T and Van Dooren, W (2012) Coming to terms with accountability: Combining multiple forums and functions. Public Management Review 14(7), 1011–1036. https://doi.org/10.1080/14719037.2012.662446
Worthy, B (2010) More open but not more trusted? The effect of the Freedom of Information Act 2000 on the United Kingdom central government. Governance: An International Journal of Policy, Administration, and Institutions 23(4), 561–582. https://doi.org/10.1111/j.1468-0491.2010.01498.x
Yeung, K, Howes, A and Pogrebna, G (2019) AI governance by human rights-centred design, deliberation and oversight: An end to ethics washing. SSRN Electronic Journal, 1–27. https://doi.org/10.2139/ssrn.3435011
