
Administrative law and the machines of government: judicial review of automated public-sector decision-making

Published online by Cambridge University Press:  09 July 2019

Jennifer Cobbe*
Affiliation: Compliant and Accountable Systems Group, Department of Computer Science and Technology, University of Cambridge, UK

Abstract

The future is likely to see an increase in the public-sector use of automated decision-making systems which employ machine learning techniques. However, there is no clear understanding of how English administrative law will apply to this kind of decision-making. This paper seeks to address the problem by bringing together administrative law, data protection law, and a technical understanding of automated decision-making systems in order to identify some of the questions to ask and factors to consider when reviewing the use of these systems. Due to the relative novelty of automated decision-making in the public sector, this kind of study has not yet been undertaken elsewhere. As a result, this paper provides a starting point for judges, lawyers, and legal academics who wish to understand how to legally assess or review automated decision-making systems and identifies areas where further research is required.

Type: Research Article
Copyright © The Society of Legal Scholars 2019

Footnotes

Many thanks to Jat Singh, Sam Smith, Joe Tomlinson, Swee Leng Harris, Jon Crowcroft, Lauren Downes, Dave Michels, John Morison, Daithí Mac Síthigh, Ross Anderson, and others for advice and for comments on drafts of this paper. Thanks also to the anonymous reviewers.

References

1 See eg D Keats Citron and FA Pasquale ‘The scored society: due process for automated predictions’ (2014) 89 Washington Law Review; R Binns ‘Data protection impact assessments: a meta-regulatory approach’ (2017) 7 International Data Privacy Law 1; F Doshi-Velez et al ‘Accountability of AI under the law: the role of explanation’ (2017) Harvard Public Law Working Paper No 18-07, available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3064761 (last accessed 17 June 2019).

2 C Coglianese and D Lehr ‘Regulating by robot: administrative decision making in the machine-learning era’ (2017) 105 Georgetown Law Journal 1147.

3 A Le Sueur ‘Robot government: automated decision-making and its implications for parliament’ in A Horne and A Le Sueur (eds) Parliament: Legislation and Accountability (Oxford: Hart Publishing, 2016) p 183.

4 M Oswald ‘Algorithm-assisted decision-making in the public sector: framing the issues using administrative law rules governing discretionary power’ (2018) 376 Philosophical Transactions of the Royal Society 2128.

5 Throughout, this paper uses the term ‘public body’, or ‘public bodies’, to refer to ministers, public authorities, local authorities, health authorities, chief constables, reviewable tribunals, regulators, and any other decision-maker which is subject to judicial review when acting in a public law capacity. Note that the Data Protection Act 2018 (DPA 2018) uses its own definition of ‘public body’ for the purposes of GDPR (DPA 2018, s 7).

6 L Dencik et al ‘Data scores as governance: investigating uses of citizen scoring in public services’ (2018) p 3, available at https://datajusticelab.org/data-scores-as-governance (last accessed 17 June 2019).

7 Dencik et al, above n 6.

8 For more in-depth but legally accessible discussion of how machine learning systems operate see D Lehr and P Ohm ‘Playing with the data: what legal scholars should learn about machine learning’ (2017) 51 UC Davis Law Review 653; for a deeper dive into machine learning research, see P Domingos ‘A few useful things to know about machine learning’ (2012) 55 Communications of the ACM 10.

9 S Barocas and AD Selbst ‘Big data's disparate impact’ (2016) 104 California Law Review 671; D Boyd and K Crawford ‘Critical questions for big data: provocations for a cultural, technological, and scholarly phenomenon’ (2012) 15 Information, Communication and Society 5; V Eubanks Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (Macmillan, 2018).

10 J Burrell ‘How the machine “thinks”: understanding opacity in machine learning algorithms’ (2016) 3(1) Big Data & Society; JA Kroll et al ‘Accountable algorithms’ (2017) 165 University of Pennsylvania Law Review 633; F Pasquale The Black Box Society: The Secret Algorithms That Control Money and Information (Cambridge, Mass: Harvard University Press, 2015).

11 R van den Hoven van Genderen ‘Privacy and data protection in the age of pervasive technologies in AI and robotics’ (2017) 3 European Data Protection Law 3; Council of Europe ‘Algorithms and Human Rights: Study on the human rights dimensions of automated data processing techniques and possible regulatory implications’ (2017) Council of Europe study DGI(2017)12, available at https://edoc.coe.int/en/internet/7589-algorithms-and-human-rights-study-on-the-human-rights-dimensions-of-automated-data-processing-techniques-and-possible-regulatory-implications.html (last accessed 17 June 2019).

12 Primarily in the ‘FAT-ML’ – Fairness, Accountability, and Transparency in Machine Learning – research community; see https://www.fatml.org/.

13 Burrell, above n 10.

14 R Guidotti et al ‘A survey of methods for explaining black box models’, available at https://arxiv.org/abs/1802.01933 (last accessed 17 June 2019).

15 The benefits of transparency have their limits: see M Ananny and K Crawford ‘Seeing without knowing: limitations of the transparency ideal and its application to algorithmic accountability’ (2016) 20(3) New Media & Society 973; L Edwards and M Veale ‘Enslaving the algorithm: from a “right to an explanation” to a “right to better decisions?”’ (2018) 16 IEEE Security & Privacy 3.

16 F Poursabzi-Sangdeh et al ‘Manipulating and measuring model interpretability’ (2018), available at https://arxiv.org/abs/1802.07810 (last accessed 17 June 2019).

17 The need for useful tools for those involved in operating or assessing ADM systems has been recognised elsewhere: see M Veale et al ‘Fairness and accountability design needs for algorithmic support in high-stakes public sector decision-making’ (2018) Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI'18), available at https://arxiv.org/abs/1802.01029 (last accessed 17 June 2019).

18 In another common law jurisdiction, the Australian Government's best practice principles for ADM emphasise that decisions made by or with the assistance of ADM must comply with administrative law (Australian Government Automated Assistance in Administrative Decision-Making: Better Practice Guide (2007) p ix, available at https://www.oaic.gov.au/images/documents/migrated/migrated/betterpracticeguide.pdf (last accessed 17 June 2019)).

19 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1.

20 As well as providing for clarifications, qualifications, and exemptions from GDPR where permitted, DPA 2018 also extends GDPR to many circumstances where automated decision-making by public bodies is not otherwise covered by GDPR because their activities lie outside the scope of EU law (see DPA 2018, Pt 2 Ch 3; Pt 3; Pt 4).

21 That is, any information relating to an identified or identifiable natural person (GDPR, Art 4(1)).

22 The natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of processing (GDPR, Art 4(7)). Where the purposes and means of processing are determined by an enactment, the data controller will be the person on whom the obligation to process the data is imposed by that enactment (DPA 2018, s 6(2)) – this will most likely be the public body in question.

23 GDPR, Art 4(8).

24 GDPR, Art 5; see also Recital 39.

25 GDPR, Art 5(2).

26 Processing means ‘any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction’ (GDPR, Art 4(2)).

27 GDPR, Arts 13–14.

28 The existence, extent, and usefulness of this right are much debated. See eg B Goodman and S Flaxman ‘European Union regulations on algorithmic decision-making and a “right to an explanation”’ (2016) 2016 ICML Workshop on Human Interpretability in Machine Learning (WHI 2016), available at https://arxiv.org/abs/1606.08813 (last accessed 17 June 2019); S Wachter et al ‘Why a right to explanation of automated decision-making does not exist in the General Data Protection Regulation’ (2017) 7 International Data Privacy Law 2; AD Selbst and J Powles ‘Meaningful information and the right to explanation’ (2017) 7 International Data Privacy Law 4; G Malgieri and G Comandé ‘Why a right to legibility of automated decision-making exists in the General Data Protection Regulation’ (2017) 7 International Data Privacy Law 4; L Edwards and M Veale ‘Slave to the algorithm? Why a “right to an explanation” is probably not the remedy you are looking for’ (2017) 17 Duke Law & Technology Review 18.

29 See eg P Cane ‘Understanding judicial review and its impact’ in M Hertogh and S Halliday (eds) Judicial Review and Bureaucratic Impact (Cambridge: Cambridge University Press, 2008); M Elliott and T Thomas ‘Tribunal justice and proportionate dispute resolution’ (2012) 71 Cambridge Law Journal 2.

30 J Singh et al ‘Responsibility & machine learning: part of a process’ (2016), available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2860048 (last accessed 17 June 2019).

31 LJ Skitka et al ‘Does automation bias decision-making?’ (1999) 51 International Journal of Human-Computer Studies 5.

32 Council of Civil Service Unions v Minister for the Civil Service [1984] 3 All ER 935; see also Associated Provincial Picture Houses v Wednesbury Corporation [1947] 2 All ER 680.

33 See eg R v Lord Chancellor, ex p Witham [1997] 2 All ER 779.

34 Note that DPA 2018 makes specific provision for law enforcement (Pt 3), intelligence services (Pt 4), and other processing which would normally be outside the scope of GDPR (Pt 2 Ch 3).

35 A natural person who can be identified, directly or indirectly, from personal data (GDPR, Art 4(1)).

36 GDPR, Art 22; Recital 71; see also Article 29 Data Protection Working Party ‘Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679’ (2018a) 17/EN WP251rev.01, p 19, available at http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=612053 (last accessed 17 June 2019).

37 Article 29 Data Protection Working Party, above n 36, p 20.

38 The Article 29 Data Protection Working Party was an EU advisory body which consisted of representatives of the Data Protection Authorities of each Member State, the European Data Protection Supervisor, and the European Commission. It provided official guidance on the interpretation and application of EU data protection law. It was replaced by the European Data Protection Board (which adopted the work published by the Article 29 Data Protection Working Party) in May 2018.

39 Article 29 Data Protection Working Party, above n 36, p 21.

40 GDPR, Art 35; Recitals 84, 91–94; Article 29 Data Protection Working Party, above n 36, p 21. Data controllers (including public bodies where ADM involves personal data) are required to undertake a DPIA in advance of any processing which is likely to pose a high risk to individuals, and particularly that which involves automated processing which produces legal or similarly significant effects (although note that DPA 2018 does not require necessity and proportionality assessments in DPIAs for processing undertaken for law enforcement purposes (s 64)).

41 GDPR, Art 22(1).

42 Article 29 Data Protection Working Party, above n 36, p 21.

43 Article 29 Data Protection Working Party, above n 36, p 21.

44 GDPR, Recital 71.

45 Article 29 Data Protection Working Party, above n 36, p 21.

46 GDPR, Art 22(2)(a); while public bodies are unlikely to enter into contracts with individuals who are using their services, they may do so in the context of employment decisions, for example.

47 GDPR, Art 22(2)(b).

48 GDPR, Art 22(2)(c).

49 DPA 2018, s 14.

50 ‘Special category data’ is personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, or the processing of genetic data, biometric data for the purposes of uniquely identifying an individual, data concerning health, or data concerning an individual's sex life or sexual orientation (GDPR, Art 9(1)).

51 GDPR, Art 22(4).

52 GDPR, Art 9(2)(a).

53 GDPR, Art 9(2)(g); see DPA 2018, s 10, including, in particular, s 10(3) – processing under GDPR, Art 9(2)(g) will be lawful only where it meets a condition set out in DPA 2018, Sch 1 Pt 2. Note also that DPA 2018, s 14 places certain requirements on data controllers which rely on Art 9(2)(g) in making a solely automated decision which produces legal or similarly significant effects.

54 GDPR, Art 4(11); see also Recital 32; Article 29 Data Protection Working Party ‘Guidelines on consent under Regulation 2016/679’ (2018b) 17/EN WP259 rev.01, available at http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=623051 (last accessed 17 June 2019); Information Commissioner's Office Lawful Basis for Processing: Consent (2018), available at https://ico.org.uk/media/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/consent-1-0.pdf (last accessed 17 June 2019).

55 GDPR, Art 7(4); Recital 43.

56 GDPR, Recital 43.

57 GDPR, Art 22(3)–(4); see also Recital 47.

58 GDPR, Art 9(2)(g).

59 Arising from the fact that these grounds only permit processing where it is necessary.

60 See Article 29 Data Protection Working Party, above n 36, p 23.

61 Article 29 Data Protection Working Party, above n 36, p 23; see also European Data Protection Supervisor Assessing the necessity of measures that limit the fundamental right to the protection of personal data: A Toolkit (2017), available at https://edps.europa.eu/sites/edp/files/publication/17-04-11_necessity_toolkit_en_0.pdf (last accessed 17 June 2019).

62 GDPR, Art 5(1)(a).

63 GDPR, Art 21.

64 DPA 2018, s 15.

65 GDPR, Art 6(1); note that public bodies may not rely on the ‘legitimate interest’ grounds set out in Art 6(1)(f).

66 GDPR, Art 6(1)(a).

67 GDPR, Art 6(1)(b).

68 GDPR, Art 6(3); see DPA 2018, s 8; this ground can only be relied upon if the processing is undertaken pursuant to EU or domestic law which meets an objective in the public interest and is proportionate to the aim pursued.

69 GDPR, Art 9(2)(h); see also Recital 53; DPA 2018, ss 10–11; depending on the circumstances, public bodies may be able to process special category data where it is necessary for a variety of healthcare purposes.

70 See eg Noon v Matthews [2014] EWHC 4330 (Admin); R v London Borough of Tower Hamlets, ex p Khalique [1994] 26 HLR 517.

71 Carltona Ltd v Commissioners of Works [1943] 2 All ER 560 (CA).

72 H Lavender & Son v Minister of Housing and Local Government [1970] 1 WLR 1231.

73 Ellis v Dubowski [1921] 3 KB 621.

74 Mills v London County Council [1925] 1 KB 213.

75 Le Sueur, above n 3, pp 188–189; see Social Security Act 1998, s 2.

76 Article 29 Data Protection Working Party, above n 36, p 21.

77 This should be reflected in the public body's DPIA if the decision involves personal data or concerns a natural person.

78 See eg Padfield v Minister of Agriculture, Fisheries and Food [1968] 1 All ER 694; British Oxygen Co Ltd v Minister of Technology [1971] AC 610; R v Warwickshire County Council, ex p Collymore [1995] ELR 217; R (Gujra) v Crown Prosecution Service [2012] UKSC 52.

79 See eg R (BBC) v Secretary of State for Justice [2012] EWHC (Admin); R (GC) v Commissioner of Police for the Metropolis [2011] UKSC 21.

80 Australian Government, above n 18, p viii, p 37; see also Le Sueur, above n 3, pp 196–197.

81 See eg R (Lumba) v Secretary of State for the Home Department [2011] UKSC 12; Nzolameso v City of Westminster [2015] UKSC 22.

82 Le Sueur, above n 3, p 198.

83 R (Gallaher Group Ltd) v The Competition and Markets Authority [2018] UKSC 25 at [24]–[30].

84 See eg R v Minister of Agriculture, Fisheries and Food, ex p Padfield [1968] 1 All ER 694; R v Secretary of State for Foreign and Commonwealth Affairs, ex p World Development Movement [1994] EWHC 1 (Admin); and Porter v Magill [2001] UKHL 67.

85 GDPR, Art 5(1)(b); see also Recital 50.

86 GDPR, Art 5(2).

87 R v Secretary of State for the Home Department, ex p Doody [1993] 3 WLR 154.

88 R v Civil Service Appeal Board, ex p Cunningham [1991] 4 All ER 310.

90 Stefan v General Medical Council [1999] UKPC 10, [2002] All ER (D) 96.

91 R v Secretary of State for the Home Department, ex p Fayed [1996] EWCA Civ 946, [1998] 1 WLR 763.

92 R v Higher Education Funding Council, ex p Institute of Dental Surgery [1994] 1 All ER 651.

93 See the requirements for reasons set out in South Buckinghamshire District Council v Porter (No 2) [2004] 1 WLR 1953 at [36].

94 Guidotti et al, above n 14.

95 See eg R (Nash) v Chelsea College of Art and Design [2001] EWHC (Admin) 538 at [34]; see also Re Brewster's Application [2017] UKSC 8 at [50]–[52] (although this was a case heard on appeal from Northern Ireland).

96 R v Higher Education Funding Council, ex p Institute of Dental Surgery [1994] 1 All ER 651 at [665]–[666].

97 As they would be entitled to conclude if the decision was made by a human: see R v Minister of Agriculture, Fisheries and Food, ex p Padfield [1968] 1 All ER 694 at [1053]–[1054]; R v Secretary of State for Trade and Industry and another, ex p Lonrho plc [1989] 2 All ER 609 at [620].

98 For example, as permitted by Deregulation and Contracting Out Act 1994, Pt II or by secondary legislation made under that Act.

99 For which the public body would act as a data controller.

100 GDPR, Arts 24–36; see also Recitals 81–83; Information Commissioner's Office ICO GDPR guidance: Contracts and liabilities between controllers and processors (2017) draft, available at https://ico.org.uk/media/about-the-ico/consultations/2014789/draft-gdpr-contracts-guidance-v1-for-consultation-september-2017.pdf (last accessed 17 June 2019).

101 R Clayton ‘Accountability, judicial scrutiny and contracting out’ (2015) UK Constitutional Law Blog, available at https://ukconstitutionallaw.org/2015/11/30/richard-clayton-qc-accountability-judicial-scrutiny-and-contracting-out (last accessed 17 July 2018).

102 GDPR, Art 5(2); Art 24.

103 See eg R v Servite Houses and Wandsworth LBC, ex p Goldsmith [2001] LGR 55 (QBD).

104 GDPR, Art 28; Recital 81; this is a new requirement which did not exist in previous legislation.

105 Clayton, above n 101.

106 Arguments for other approaches in relation to other forms of outsourced public decision-making have also been proposed: see eg C Scott ‘Accountability in the regulatory state’ (2000) 27 Journal of Law and Society 1.

107 See R v Panel on Take-overs and Mergers, ex p Datafin [1987] 1 All ER 564.

108 See eg Anisminic Ltd v Foreign Compensation Commission [1968] 2 WLR 163.

109 See eg Associated Provincial Picture Houses v Wednesbury Corporation [1947] 2 All ER 680; R v Somerset County Council, ex p Fewings [1995] 1 WLR 1037; R (Venables) v Secretary of State for the Home Department [1998] AC 407.

110 GDPR, Art 5(1)(d).

111 GDPR, Art 5(2).

112 See eg CE Brodley and MA Friedl ‘Identifying mislabeled training data’ (1999) 11 Journal of Artificial Intelligence Research 131.

113 GDPR, Art 5(1)(c).

114 D Colquhoun ‘An investigation of the false discovery rate and the misinterpretation of p-values’ (2014) Royal Society Open Science, available at https://royalsocietypublishing.org/doi/full/10.1098/rsos.140216 (last accessed 17 June 2019).

115 GDPR, Art 5(1)(d).

116 GDPR, Art 5(1)(c).

117 R (Gallaher Group Ltd) v The Competition and Markets Authority [2018] UKSC 25 at [24]–[41].

118 Equality Act 2010, Pt 2 Ch 2.

119 The protected characteristics are age, disability, gender reassignment, marriage and civil partnership, pregnancy and maternity, race, religion or belief, sex, and sexual orientation (Equality Act 2010, ss 4–12).

120 Equality Act 2010, s 13.

121 Equality Act 2010, s 19.

122 See eg M Veale and R Binns ‘Fairer machine learning in the real world: Mitigating discrimination without collecting sensitive data’ (2017) 4(2) Big Data & Society.

123 Association Belge des Consommateurs Test-Achats and Others v Conseil des ministres (C-236/09) ECLI:EU:C:2011:100, [2012] 1 WLR 1933.

124 See eg B Friedman and H Nissenbaum ‘Bias in computer systems’ (1996) 14 ACM Transactions on Information Systems 3, available at http://www.nyu.edu/projects/nissenbaum/papers/biasincomputers.pdf (last accessed 17 June 2019); Barocas and Selbst, above n 9; Eubanks, above n 9.

125 Where a protected characteristic is involved, this could potentially also constitute unlawful discrimination.

126 Davidson v Scottish Ministers [2004] UKHL 34 at [6]; although note that this was a case heard on appeal from Scotland.

127 See eg R Courtland ‘Bias detectives: the researchers striving to make algorithms fair’ (2018) 558 Nature, available at https://www.nature.com/articles/d41586-018-05469-3 (last accessed 17 June 2019).

128 Courtland, above n 127.

129 See eg J Kleinberg et al ‘Inherent trade-offs in the fair determination of risk scores’ (2016), available at https://arxiv.org/abs/1609.05807 (last accessed 17 June 2019); R Berk et al ‘Fairness in criminal justice risk assessments: the state of the art’ (2017), available at https://arxiv.org/abs/1703.09207 (last accessed 17 June 2019); S Corbett-Davies et al ‘Algorithmic decision making and the cost of fairness’ (2017), available at https://arxiv.org/abs/1701.08230 (last accessed 17 June 2019).

130 R v Secretary of State for the Environment, ex p Kirkstall Valley Campaign [1996].

131 Re Medicaments and Related Classes of Goods (No 2) [2001]; see also Lawal v Northern Spirit [2004].

132 R v Local Commissioner for Administration in North and North East England, ex p Liverpool City Council [1999] All ER (D) 155.