REGULATING USE BY LAW ENFORCEMENT AUTHORITIES OF LIVE FACIAL RECOGNITION TECHNOLOGY IN PUBLIC SPACES: AN INCREMENTAL APPROACH
The Cambridge Law Journal [2023]

Abstract Amid the growing calls for the complete prohibition of the use by law enforcement authorities of live facial recognition (LFR) technology in public spaces, this article advocates for an incremental approach to regulating the use of the technology. By analysing legislative instruments, judicial decisions, deployment practices of UK law enforcement authorities, various procedural and policy documents, as well as available safeguards, the article suggests incremental adjustments to the existing legal framework instead of sweeping regulatory change. The proposed approach calls for adopting national legal rules governing watch lists and introducing spatial, temporal and contextual limitations on the deployment of technology based on the assessment of proportionality and necessity. To enhance the effectiveness of overt surveillance using LFR, the article recommends adopting a transparency procedure that promotes accountability without undermining the objectives of law enforcement. Alternatively, the overt use of the technology should be limited to deterring the commission of crimes and safeguarding public safety, where transparency does not undermine its effectiveness. Limiting the scope of overt use of LFR technology entails that law enforcement agencies primarily utilise covert surveillance, with prior judicial approval, except in urgent cases, as this would improve effective criminal investigation and public safety. The legal adjustments proposed in this article can be implemented through flexible secondary legislation or local policies, rather than rigid statutory rules.

The scene ends with the kidnapping of the female subject, leading the police to issue an arrest warrant for the alleged aggressor. A further investigation reveals that the CCTV footage depicted a partially fabricated event, as the victim had actually boarded the bus and headed home, as confirmed by CCTV footage from the bus.
The story is from the BBC's science-fiction TV series, The Capture, which portrays a world of ubiquitous facial recognition CCTV cameras. 1 In The Capture, a live CCTV feed is tampered with and smoothly switches from an actual event to a fabricated one in a split second. The intelligence community uses artificial intelligence (AI) software to fabricate evidence (using a method known as "correction"). 2 The technology is then used to broadcast to the public a live TV interview with a "deep fake" 3 version of a high government official as though it were an actual interview. 4 The Capture dramatises and amplifies the potential dangers of growing surveillance using live facial recognition (LFR) technology.
The first ever case to consider the legality of the use of LFR technology in the UK, R. (on the application of Bridges) v Chief Constable of South Wales Police, involved the deployment of LFR by South Wales Police (SWP). 5 In submissions reminiscent of The Capture, the appellant's counsel invoked the potential of pervasive CCTV cameras across the nation tracking people's movements to explain the illegality of the SWP's use of LFR. 6 The Court of Appeal declined to engage with the claimant's hypothetical scenario, which it acknowledged "may arise in the future", choosing instead to focus solely on addressing the issue of privacy violation based on the facts presented in the case. 7 However, academics and advocacy groups have shown themselves to be more concerned with dystopian visions, such as those depicted in The Capture, and this has led them to insist on radical solutions, including a moratorium on the use of LFR or its complete prohibition. 8
This article argues for incremental regulation of the use of LFR technology in law enforcement, where the current legal framework is progressively adjusted in the light of the potential risks and evidence of actual harm from using LFR. In the aftermath of the UK Government's publication of a White Paper setting out the country's approach to regulating AI technologies, 9 it is crucial to have clarity on regulating the use of LFR technology in law enforcement. The White Paper adopts a pro-innovation stance towards regulating AI using a sectoral approach. The UK's regulatory trajectory suggests that the Government favours a light-touch approach to the existing relevant sectoral laws rather than introducing a sweeping regulatory framework that creates similar regulatory rules and standards for AI systems across all sectors. In line with this, the White Paper envisions regulators implementing five principles in their sectors as they interpret the existing laws: safety, security and robustness; appropriate transparency and explainability; fairness; accountability and governance; and contestability and redress. 10
There are conflicting judicial decisions regarding the need to reform the current legal framework governing the deployment of LFR in public spaces. The Divisional Court in Bridges held that, despite LFR being a new technology, it does not require the introduction of new laws. 11 The Court of Appeal disagreed with this assessment. 12 Meanwhile, existing studies that propose radical solutions take little interest in looking holistically at the evidence emerging from the actual use of the technology in the UK, and the relevant policies, procedures and safeguards for its deployment. A comprehensive analysis of the available evidence is essential for assessing the suitability of the current legal framework or the extent of legal reform needed to address the challenges that the technology presents.
This article critically analyses the existing literature, laws, codes of practice, policies and procedures followed by UK law enforcement authorities in deploying LFR and Equality Impact Assessment documents, along with the decision in Bridges. The article primarily focuses on UK law, occasionally referencing the upcoming European Union (EU) AI Act. 13 The European Parliament's (EP) compromise text outright prohibits using LFR, 14 whereas the European Commission's initial proposal permitted a regulated use of the technology. 15 Despite the Commission's proposal not being adopted by the EP, its rules on facial recognition systems used in law enforcement remain a valuable source of insight.
The rest of the article is divided into four sections. Section II provides an overview of LFR technology and its risks and benefits. Section III analyses the legal framework that governs the use of LFR technology by law enforcement authorities in the UK and explains the areas that do not require legal reform. Section IV examines areas where legal adjustment is necessary, advocates for incrementalism by setting out its normative framework and offers specific recommendations. Section V concludes the article by highlighting the limits of incrementalism in addressing legal challenges beyond the deployment of LFR technology.

II. LFR TECHNOLOGY IN THE CONTEXT OF LAW ENFORCEMENT
A. The Distinctive Features of LFR

Today, facial recognition technology finds extensive application across both the public and private sectors. Amongst other things, it is used for unlocking mobile phones and tablets, eliminating the need to input passwords manually, authenticating individuals for banking services and conducting automated checks at the border. 16 It is a process by which AI software can identify or recognise a person using their biometric facial features extracted from a photo or video. 17 The software compares the features captured by a camera against an existing biometric facial image to estimate the degree of similarity between two templates. 18 LFR involves six essential steps: compiling/using an existing image, image acquisition, face detection, feature extraction, comparison and matching. 19 Generally, the last five stages, namely image acquisition to matching, should occur instantaneously, 20 although there could be some delays in the process due to a software flaw or intentional design aimed at circumventing specific rules applicable to LFR. To address this, the EU AI Act recognises that real-time remote biometric identification systems may experience minor delays during analysis, but that such delays do not negate their real-time nature; it should be noted, however, that terms such as "significant delay" and "minor delay" involve subjective assessment that could cause practical challenges. The UK police compare the facial features of individuals against images of persons in the police database (a so-called "watchlist"). 21
LFR has at least three distinctive features that present opportunities and risks which must be carefully addressed.
First, the instantaneous matching of facial features is one of its unique characteristics. Non-real-time facial recognition, or a "post" system, by contrast, involves facial image analysis after the event. 22 In practical terms, this requires an investigating officer to obtain a facial image of the subject from a source such as CCTV and subsequently to compare their facial biometric features with facial biometric data held in a database. The instantaneous identification process may have its downside, as the LFR Operator and LFR Engagement Officer might need to decide to intervene quickly, such as by arresting the subject, creating room for engagement with mistakenly identified persons. However, under the practices of the Metropolitan Police and the SWP established by their Standard Operating Procedures (SOPs), engagement is preceded by human review, where the Engagement Officer, who is notified of an alert by the Operator, has the authority to decide whether to engage with the subject. 23 The EU AI Act mandates that at least two persons must conduct a human review in the case of biometric identification systems, without which an action cannot be taken. 24
Second, LFR software processes biometric data, which is defined as "personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of an individual, which allows or confirms the unique identification of that individual, such as facial images or dactyloscopic data". 25 Video footage or a photo alone does not qualify as biometric data without there being a technical analysis that results in data that uniquely identifies the individual. 26 Such technical analysis aims to extract features including "the width of nose, wideness of the eyes, the depth and angle of the jaw, the height of cheekbones, and the separation between the eyes". 27 As this kind of processing is intrusive, it is considered to be sensitive processing, which thereby requires adherence to strict data protection standards.
Third, LFR technology can process personal data on a mass scale as it is deployed in a publicly accessible space where the faces of thousands of people can be seen and captured.The scale of facial feature analysis by this technology is significant, making it a unique form of surveillance technology.
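The comparison and matching steps described above can be illustrated with a minimal sketch. This is not any vendor's actual algorithm (the internals of systems such as NEC NeoFace are proprietary); it simply assumes, for illustration, that extracted facial features are numeric vectors, that similarity is measured by cosine similarity against watchlist templates, and that an alert is raised only above a configurable threshold, with engagement still conditional on human review.

```python
import math

def cosine_similarity(a, b):
    # Estimate the degree of similarity between two biometric templates,
    # here represented as plain numeric feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.8):
    """Return (identity, score) for the best watchlist match at or above
    the threshold, or (None, best_score) when no candidate qualifies."""
    best_id, best_score = None, -1.0
    for identity, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = identity, score
    if best_score >= threshold:
        # An alert, not an arrest: engagement follows only after human review.
        return best_id, best_score
    return None, best_score

# Hypothetical templates and probe, for illustration only.
watchlist = {"subject-001": [0.9, 0.1, 0.3], "subject-002": [0.2, 0.8, 0.5]}
probe = [0.88, 0.12, 0.31]  # features extracted from a live camera frame
print(match_against_watchlist(probe, watchlist))
```

The threshold is the operational lever: raising it reduces false alerts at the cost of missed identifications, which is one reason deployments pair automated matching with human review rather than automatic engagement.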

B. Benefits of Law Enforcement Authorities' Use of LFR
Through aiding law enforcement authorities in identifying and locating suspects and vulnerable persons, such as missing children, LFR technology significantly contributes to criminal investigation and public safety.
Facial recognition technology has proven effective in preventing potential threats to public safety, including terrorism. In 2019, an individual left two electric rice cookers in a New York subway station, causing public panic and the evacuation of the transportation hub. 28 A New York Police Department (NYPD) detective used facial recognition technology to identify the suspect using an image from a CCTV camera compared against photos in the police database. By relying on possible matches suggested by the technology and additional manual scrutiny, the NYPD successfully identified and arrested the suspect before it was determined that the rice cookers were harmless. 29 Although this was not a real-time system, it demonstrates that LFR can help avert a threat to public safety through its capability to enhance the swift identification of persons of interest.
During a deployment by the SWP at a public event in 2018, SWP "identified a person who had made a bomb threat at the very same event the previous year and who had been subject to a (suspended) custodial sentence". 30 Besides apprehending criminals, the technology is also used to locate witnesses and the associates of suspects. 31 Another proven benefit of LFR is the apprehension of individuals with outstanding arrest warrants or those suspected of having committed crime. In nine 32 deployments between February 2020 and July 2022, the London Metropolitan Police Service (MPS) presumably 33 arrested nine subjects. 34 In 2017, the SWP deployed LFR in 13 instances, leading to 15 arrests. 35 In a single SWP deployment in 2022, two cases of arrest/disposal were reported. 36 Another potential benefit of LFR is the ability to locate vulnerable persons, including missing children and those presenting a risk of harm to themselves or to others. 37 Although there is no data on whether vulnerable persons have been located using LFR in the UK, non-real-time facial recognition has successfully been used to reunite missing children with their parents in India, which demonstrates its potential to help locate vulnerable persons. 38
An independent study of 11 deployments by the SWP between 2017 and 2018, conducted by Bethan Davies, Martin Innes and Andrew Dawson at Cardiff University, concluded that: "The evidence clearly supports the conclusion that AFR [Automated Facial Recognition] processes and systems can contribute to police identifying persons of interest that they would not otherwise have been able to do so." 39
The advantages of LFR outlined above present a compelling case for using this technology for law enforcement purposes. The key focus should be on putting in place a suitable legal framework that enables the safe and responsible use of the technology whilst mitigating its potential risks.

C. Risks of LFR in Law Enforcement
LFR technology carries several risks that could undermine the rights of individuals and the welfare of citizens. Two commonly cited concerns are: the potential for inaccuracy stemming from biased model training datasets; 40 and privacy intrusion and surveillance. 41

Inaccuracy
Studies have found that facial recognition algorithms can be less accurate when identifying certain faces. 42 In a 2018 study, Joy Buolamwini and Timnit Gebru found that commercial gender classification facial analysis algorithms had higher accuracy rates with male faces compared to female faces (8.1 per cent to 20.6 per cent difference in error rate) and performed even better on lighter-skinned faces than darker-skinned faces (11.8 per cent to 19.2 per cent difference in error rate). 43 The algorithms had the poorest performance on darker-skinned female faces (20.8 per cent to 34.7 per cent error rate). 44
There could be several reasons for the inaccuracy of facial recognition algorithms. The most widely accepted cause of inaccuracy is the limited composition of model training data. 45 If the machine learning system is trained primarily on facial images of lighter-skinned faces, its accuracy decreases when presented with images of people with darker skin. This is commonly attributed to biased data selection, also known as statistical bias, 46 which occurs when data from certain groups are underrepresented in the training dataset.
The implications of inaccuracies in LFR technology could be severe. In the US, law enforcement authorities have used non-real-time facial recognition, leading to numerous mistaken identifications and wrongful arrests of Black Americans. For instance, Nijeer Parks, a Black American, was wrongly identified by a non-real-time facial recognition system as a suspect for shoplifting, resisting arrest and attempting to hit a police officer with a vehicle, amongst other things, leading to his wrongful incarceration for 11 days in New Jersey. 47 In February 2023, Detroit Police arrested a pregnant Black woman, Porcha Woodruff, after she was wrongly identified by the police using facial recognition. 48 The incidents of wrongful arrests based on misidentification by facial recognition in the US are troubling. However, these incidents could be partly attributable to the fact that the regulatory standards of policing in the US are lower compared to the UK and that there is a private prison system in the US, which could potentially lead to higher levels of police misconduct and wrongful arrests. 49
The National Physical Laboratory independently tested two facial recognition software systems for accuracy, including NEC NeoFace V4, which is used by the MPS and the SWP forces, and published the result in 2023. The study is based on data collected by deploying LFR for the purpose of the study, simultaneously (in the same locations) with five deployments by the MPS and the SWP in London and Cardiff respectively. 50 The test confirmed that the LFR systems had the poorest performance on Black female faces, but that the discrepancy in accuracy rates across demographics is statistically insignificant. 51 Whilst the result is encouraging, law enforcement authorities should continuously monitor the accuracy of the software they use, alongside putting in place safeguards to address eventual risks of inaccuracy to the public, including minority groups. In this regard, the MPS's existing practice, including the Equality Impact Assessment, 52 shows an encouraging awareness of the potential disproportionate impact that LFR technology could have on specific demographics and efforts to address it. These efforts include: understanding possible discrepancies in the technology's accuracy across demographics; subjecting automated identification to human review; and halting operation in case of suboptimal performance.
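The kind of continuous demographic monitoring recommended here can be sketched as a simple audit computation. The figures below are invented for illustration and are not NPL or Buolamwini and Gebru data; the sketch assumes audit records of the form (demographic group, identification correct?) and reports per-group error rates and the worst-to-best gap, which an operator could compare against a policy tolerance before continuing a deployment.

```python
def error_rates_by_group(records):
    # records: iterable of (group, correct) pairs from an accuracy audit.
    totals, errors = {}, {}
    for group, correct in records:
        totals[group] = totals.get(group, 0) + 1
        if not correct:
            errors[group] = errors.get(group, 0) + 1
    return {g: errors.get(g, 0) / totals[g] for g in totals}

def accuracy_disparity(rates):
    # Gap between the worst- and best-served demographic groups.
    return max(rates.values()) - min(rates.values())

# Invented audit data: 5% errors for group A, 20% for group B.
records = ([("group-A", True)] * 95 + [("group-A", False)] * 5
           + [("group-B", True)] * 80 + [("group-B", False)] * 20)
rates = error_rates_by_group(records)
print(rates)                      # per-group error rates
print(accuracy_disparity(rates))  # worst-to-best gap
```

A halting rule of the kind the MPS describes could then be expressed as "if the disparity exceeds a tolerance, suspend deployment", with the tolerance set by policy rather than by the software.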
Whilst authorities should always prioritise the responsible use of the technology, as of now, no alarming cases of wrongful arrest or abusive engagements with individuals based on misidentifications by a facial recognition system have been reported in the UK.

Privacy intrusion and surveillance: room for compromise
The use of LFR in law enforcement could potentially violate the right to privacy. In Bridges, the Divisional Court and the Court of Appeal agreed that the SWP's use of LFR interfered with the appellant's right to private life recognised under Article 8 of the European Convention on Human Rights (ECHR) (see Section III(A) below). 53 In addition, it is feared that the extensive use of LFR by law enforcement authorities could expand a government's surveillance power beyond law enforcement and have a chilling effect on other civil and political rights. Such effects include, amongst others, the stifling of freedom of expression 54 and the suppression of peaceful demonstration and assembly. 55 Along this line, Nathalie Smuha et al. expressed the concern that permitting the use of remote biometric identification systems could lead to the installation of a permanent surveillance infrastructure that could be operational at any moment, leading to privacy interference and a chilling effect on the exercise of democratic rights, including freedom of expression and assembly. 56
The fear of the potential misuse of LFR technology to suppress democratic rights arises from existing practices in other regions. For instance, in a legal challenge involving an allegation that the use of facial recognition in public spaces led to the arrest of a political protestor, the European Court of Human Rights (ECtHR) found Russia to be in violation of its obligation under the ECHR. 57 In 2022, the Cambodian Government was accused of using drones to intimidate and possibly profile peaceful protestors under the guise of ensuring security. 58 Such practices raise concerns about the potential chilling effect that the deployment of LFR technology for law enforcement purposes might have on democratic rights.
Given the existing precedents elsewhere, it would be imprudent to disregard completely the possibility of mass surveillance using LFR technology meant for law enforcement in the UK. Nevertheless, it is equally crucial to see this in the overall context of the country, particularly the prevailing rule of law that establishes the limits of surveillance. This contextual assessment is essential to understanding the actual capacity of LFR technology to amplify surveillance capabilities.
The paramount aspect to consider is that, within the current legal framework, law enforcement authorities cannot establish an enduring AI-powered surveillance infrastructure in publicly accessible areas that could be activated at will, as such measures would likely be deemed disproportionate under the Human Rights Act 1998 (see Section III(A) for further details). Furthermore, the deployment of the technology is subject to limitations regarding time, location and purpose.
Currently, the police deploy LFR by mounting a camera on a marked police vehicle, a pole or other structure, which is then removed after the intended duration. 59 The temporariness and spatial limits of deployment are rooted in Article 8(2) of the ECHR, under which an infringement of privacy should be proportionate to the aim pursued, which limits the power to install live surveillance cameras at whim.
Nevertheless, privacy law alone may not be sufficient to tackle the effect of LFR technology on other civil and political rights. If the threat of the chilling effect of the technology is considered real and imminent, one way to address this would be to create a legal restriction on using the technology in the context of critical civil and political activities, such as demonstrations against the Government or other socio-economically and politically significant events. Such a restriction would tackle censorship and the potential chilling effect of the technology, whilst allowing law enforcement authorities to utilise it in non-political contexts, such as significant sporting events, where credible security threats exist (see Section IV(C)(2) below). These kinds of restrictions should not take the form of a blanket prohibition, as a proportionality assessment needs to be conducted in the given circumstances. Whilst the details must be carefully thought out, the general approach allows for a reasonable compromise.

III. THE EXISTING LEGAL FRAMEWORK IN THE UK
The use of LFR technology by law enforcement authorities in the UK is governed by various legal frameworks. Most of these frameworks are capable of effectively addressing the challenges that the technology presents. This section analyses privacy, data protection, equality and civil liability laws.

A. Privacy Law Under the ECHR
As is well known, following the enactment of the Human Rights Act, public authorities must act in a manner which is compatible with the rights recognised in the ECHR. 60 These rights include the right to respect for private and family life, as well as home and correspondence, enshrined in Article 8 of the Convention. 61 The 1998 Act requires courts or tribunals to take into account the jurisprudence of the ECtHR in determining a question which has arisen in connection with a Convention right. 62
The ECtHR has interpreted Article 8 broadly in a number of decisions 63 to include a person's physical, psychological and social identities, such as race, gender and sexual identities. 64 Furthermore, the court has made it clear that the right to privacy extends to acts in public spaces, 65 including the use of LFR by law enforcement authorities. In cases, such as Bridges, that involve the processing of personal data, the violation of the right to privacy under Article 8 of the Convention can be established if the processing is conducted in such a way that it intrudes into the individual's private life. Previous decisions have found that the recording of personal data in a permanent form 66 or its mere storage can interfere with private life within the meaning of Article 8(1). 67 In this regard, courts consider the individual's reasonable expectation of privacy to be "a significant though not necessarily conclusive factor". 68
In Bridges, SWP argued that there cannot be a reasonable expectation of privacy in a public space, where an image of a person is recorded and processed near-instantaneously, without recording biometric data and making it available to a human. 69 This was rejected by the Divisional Court, which held that the automated capturing of a person's facial image and the extraction of biometric features for comparison with existing biometric data fall under Article 8(1) of the Convention. 70 The Divisional Court apparently recognised a reasonable expectation of privacy in this instance. Although the Court of Appeal did not engage with this question, as it merely summarised the Divisional Court's findings on the issue, it proceeded on the basis that the use of LFR by the SWP interfered with the appellant's right to privacy within the meaning of Article 8(1) of the Convention.
The other controversial issue is whether the interference with privacy falls under the exception in Article 8(2). To fall under this exception, the justification for interfering with private life would need to be (1) in accordance with the law, (2) in pursuit of a legitimate aim and (3) necessary in a democratic society. 71 In Bridges, the Divisional Court and the Court of Appeal disagreed on this question.
The Divisional Court found that the deployment of LFR met all three requirements of Article 8(2), 72 but the Court of Appeal ruled that the "in accordance with the law" requirement was not met due to a lack of clarity in the legal basis for deploying LFR. 73 The justifications for the police's use of LFR technology are detecting, investigating or prosecuting crimes and preserving public safety. Because these are considered to be legitimate aims, it is generally not difficult to meet the second condition. 74 The necessity and proportionality of using the technology are assessed on a case-by-case basis, weighing the aim pursued against the effect on privacy. 75 Both courts agreed that the SWP's use of LFR was in pursuit of a legitimate aim and fulfilled the requirements of necessity and proportionality. However, the Court of Appeal held that it failed to meet the "in accordance with the law" requirement for interference with privacy rights (the first prong). 76
An interference with private life is considered to be in accordance with the law if it fulfils several requirements that the author organises into three general categories. First, the measure must have a basis in domestic law. 77 Second, the law must be compatible with the rule of law requirements of accessibility and foreseeability. 78 Third, the law must afford adequate legal protection against arbitrariness by sufficiently defining the scope and manner of the exercise of discretion by public authorities. 79 The last two requirements are dubbed the "quality of law requirement" in the case law of the ECtHR. 80 A rich body of jurisprudence explains each component of the three general requirements. Of interest to the core argument of this article is identifying why the Court of Appeal concluded that the "in accordance with the law" requirement was not met.
First, the legal basis for privacy-intrusive measures under Article 8(2) need not be statutory law; it can also be a code of conduct or administrative guideline, as long as there are effective means of enforcing them. 81 On this issue, the Divisional Court's conclusion that "[t]he legal framework within which AFR Locate operates comprises three elements or layers (in addition to the common law), namely: (a) primary legislation; (b) secondary legislative instruments in the form of codes of practice issued under primary legislation; and (c) SWP's own local policies", 82 was accepted by the Court of Appeal. 83
The common law confers upon the police the power to collect and retain information for the purpose of maintaining public order and detecting or preventing crimes without a specific search warrant, as long as it does not involve intrusive surveillance. 84 In addition, Part 3 of the Data Protection Act (DPA) 2018, which implemented into UK law the EU Law Enforcement Directive (LED), 85 provides grounds for sensitive personal data processing. The processing of personal data in the case of LFR is considered sensitive as it is aimed at uniquely identifying an individual using biometric data. Although some contentions were made, there was no disagreement that Part 3 of the DPA provides a legal basis for using LFR as long as the relevant conditions were met (see Section III(B) below). Similarly, the Home Office's Surveillance Camera Code of Practice, 86 which covers certain aspects of deploying LFR technology, was considered relevant secondary legislation. 87 Last, the SWP's local policies containing the SOP and Sensitive Data Processing Policy were considered to be part of the legal basis for interfering with the appellant's right to privacy, as long as these are published. 88
Although the aforementioned legal frameworks provide the basis for using LFR, the Court of Appeal ruled that they fail to meet the "quality of law" requirement. Specifically, the local policies of the SWP granted the police unlimited discretion in determining the individuals to be added to the watch list and the deployment locations of LFR. 89 The court emphasised that the "other persons where intelligence is required" category mentioned in the Data Protection Impact Assessment (DPIA) gave the police unlimited discretion in creating the watch list. 90 Regarding the issue of deployment locations, the court pointed out that the lack of a normative requirement or limit on where the LFR technology could be deployed was a flaw. 91 Whilst stressing that the task of designing specific policies is out of its purview, the Court of Appeal concluded that "the current policies do not sufficiently set out the terms on which discretionary powers can be exercised by the police, and for that reason do not have the necessary quality of law". 92
The Court of Appeal, having concluded that the interference with the appellant's private life was not in accordance with the law, engaged with the argument on proportionality (although it did not have to do so) on principle. On this issue, the decision of both courts in Bridges largely hinged on whether the use of LFR technology by the SWP met two of the four requirements of the Bank Mellat test for proportionality. 93 These requirements are whether a less intrusive measure could have been used without unacceptably compromising the objective of the measure pursued and whether, having regard to the severity of the consequences of the measure, amongst other things, "a fair balance has been struck between the rights of the individual and the interests of the community". 94 The Divisional Court concluded that the two deployments of LFR by the SWP were proportionate in the light of, amongst other things, the aim pursued, the limited duration and the transparency of the deployments, public engagement, limited personal data processing and the outcomes of the deployments (specifically, the successful arrests of wanted suspects). 95 The Court of Appeal affirmed this conclusion. 96
In summary, Bridges demonstrates that Article 8 of the ECHR adequately addresses the privacy challenges posed by the use of LFR technology by the police. Interfering with private life requires a clear legal basis that meets the standards of the rule of law and limits discretion. This is further analysed in Section IV(C)(1). Courts also need to engage with the proportionality assessment, even if the legal basis for the interference is clear or the objective of the interference is legitimate. There are no inherent characteristics of LFR technology that undermine the effectiveness of Article 8 of the Convention.

B. Data Protection Law
In law enforcement, data processing is governed by the LED, which was implemented into UK law by the DPA 2018. The LED's provisions apply as readily to LFR deployment as they do to other, less advanced methods of personal data processing. Currently, there is no gap in the provisions of the LED that relates uniquely to the police's use of LFR technology.

428
The Cambridge Law Journal [2023]

"necessary for reasons of substantial public interest";100 and (3) the authority must have an appropriate policy document.101 Processing personal data for law enforcement purposes, during the course of an investigation, when prosecuting a crime or when safeguarding public safety, falls within the requirements that processing must be permitted by law and necessary for reasons of substantial public interest. This means that the first two requirements are usually easily met. To meet the last requirement, the authority should have an appropriate policy document which must be retained, reviewed and updated regularly during the relevant period. This policy document must include, amongst other things, procedures for complying with data protection principles and a policy for the retention and erasure of personal data.102 The LED also requires the authority to conduct a DPIA when the type of processing is likely to result in a high risk to individuals' rights and freedoms.103 The controller must consider the nature, scope, context and purposes of the processing when deciding whether it will likely result in a high risk to individuals' rights and freedoms.104 The DPIA should include a general description of the processing, an assessment of the risks to data subjects' rights and freedoms, measures to address those risks and safeguards to protect personal data and demonstrate compliance.105

On this front, the Court of Appeal in Bridges (reversing the position taken by the Divisional Court) found the SWP to be in breach of the LED. The reason for this reflected the Court of Appeal's earlier conclusion that the use of the technology was not in accordance with the law. In the light of this, it concluded that "notwithstanding the attempt of the DPIA to grapple with the article 8 issues, the DPIA failed properly to assess the risks to the rights and freedoms of data subjects and failed to address the measures envisaged to address the risks arising from the deficiencies we have found, as required by section 64(3)(b) and (c) of the DPA 2018".106 In conclusion, Bridges did not expose any gaps in the data protection regime of the LED relating to the police's use of LFR technology. Whilst LFR technology, like any other technology, can be used in a way that violates data protection law, its use by law enforcement authorities does not in itself present distinct challenges that the existing data protection law cannot address.

100 Ibid., sched. 8, para. 6(1)(b). 101 Ibid., ss. 35(4)(b), 35(5)(c), 42. 102 Ibid., s. 42. 103 Ibid., s. 64(1). 104 Ibid., s. 64(4). 105 Ibid., s. 64(3). 106 R. (Bridges) v Chief Constable of South Wales [2020] EWCA Civ 1058, at [153].
Regulating Live Facial Recognition Technology 429

C. Equality Law
Although much has been said about the disproportionate impact on specific demographics of the use by law enforcement authorities of LFR, there is no strong case for major legal reform to tackle this. The UK has legal rules to tackle discrimination and bias across the public sector. The ECHR prohibits discrimination on "any ground such as sex, race, colour, language, religion, political or other opinion, national or social origin, association with a national minority, property, birth or other status".107 The Equality Act 2010 requires a public authority, when making decisions of a strategic nature about how to exercise its functions, to have "due regard to the desirability of exercising them in a way that is designed to reduce the inequalities of outcome which result from socio-economic disadvantage".108 The Act's "public sector equality duty" (PSED) provision requires public authorities to conduct an Equality Impact Assessment to "remove or minimise disadvantages suffered by persons who share a relevant protected characteristic that are connected to that characteristic".109 The PSED requires taking positive steps to ensure that decisions and policies made by public authorities do not disproportionately impact specific groups.110 The Court of Appeal in Bridges (again reversing the decision of the Divisional Court) found that the Equality Impact Assessment conducted by the SWP was insufficient, as it did not do "everything reasonable which could be done . . . to make sure that the software used does not have a racial or gender bias".111 Although the PSED could be construed strictly as conditional on the engagement of rights, such as non-discrimination, it is also broader in scope, as it holds public authorities accountable before discriminatory practices that impact individuals occur.
In line with the PSED, the MPS's Equality Impact Assessment provides a comprehensive framework for addressing potential disparities in accuracy based on race, religion, gender, age, sex, disability and other protected characteristics.112 The document outlines specific measures, including: collaborating with LFR service providers to identify the most suitable software for law enforcement purposes; implementing stringent LFR deployment criteria within the SOP to prevent the targeting of specific individuals or groups; suspending LFR deployment during operations if suboptimal performance is detected; and conducting post-deployment reviews to investigate false alerts and identify patterns or underlying causes.113 The MPS's Equality Impact Assessment likely satisfies the required standard under the PSED. However, the Equality Impact Assessment should ideally be developed in consultation with a diverse range of stakeholders, including police associations,114 institutional representatives and members of the broader public through open consultation. Doing so would promote enhanced compliance with the PSED.
Overall, based on the existing legal framework, the Court of Appeal found the SWP's efforts to comply with the PSED, although commendable, to be inadequate. This demonstrates that current equality law can effectively address the concerns about inaccuracy and bias raised by the use of LFR technology by law enforcement authorities, without the need for reform.

D. Civil Liability Laws
Civil remedies for the illegal use of LFR by law enforcement authorities are available under a number of legal regimes. This section considers, in outline, potential civil liability under privacy and data protection laws and the tort of negligence.
In the UK, claimants alleging privacy violations have two possible routes for civil claims. The first is in tort, under the misuse of private information. The second is under the ECHR (claims against public authorities).
In tort law, there is no distinctive "tort of privacy infringement". Because of this, privacy violations have been litigated under breach of confidence115 and, more recently, under misuse of private information.116 Generally, to find a breach of privacy in tort, courts ask: (a) whether the claimant has a "reasonable expectation of privacy" concerning the information disclosed; and (b) whether the claimant's interest in maintaining their right to privacy outweighs the defendant's interest in freedom of expression.117 As recognised by Nicole Moreham, "whether and, if so, when a person might have a reasonable expectation of privacy in a public place"118 is a difficult question. A reasonable expectation of privacy, being an objective test, requires taking into account several factors including, but not limited to, "the nature of the information, or activity", "the form in which the information is kept" and "the effect on the claimant" of the disclosure of the information.119 In Bridges and other similar cases, claimants would face two important challenges. First, case law in this area involves the misuse of private information through publication or broadcasting120 or, more broadly, disclosure. In Bridges, the information was neither published, as in Campbell,121 nor broadcast or disclosed to an entity with no authority to access it. Thus, it is unlikely that an action for the misuse of private information would succeed. There is currently no precedent within UK tort law that offers a civil remedy to claimants whose private information is merely collected, analysed and stored without disclosure, contrary to the claimant's reasonable expectation of privacy. As a result, if the court determines that the information has not been disclosed or published, no cause of action for privacy infringement exists (and there is no need to examine whether the threshold of a "reasonable expectation of privacy" has been reached). This also aligns with the fact that, in such cases, there may ultimately be no harm for which compensation should be awarded, and indeed that appears to be the case in Bridges. Nonetheless, it is worth examining whether, in all civil claims in tort relating to the use of LFR technology by the police, the lack of disclosure of the information acquired during surveillance should defeat a claim for privacy infringement. In other words, if the technology is used in an excessive manner, but the information obtained by the police is not disclosed or published, should a civil remedy in tort be unavailable? This is to be decided by courts in the future. It is possible that the concept of "misuse" of private information might be found to extend beyond situations of disclosure (to include collection and use) for two reasons. First, the origin of misuse of private information is the doctrine of breach of confidence, from which the former emerged as a separate tort.122 Breach of confidence clearly precludes the use of confidential information, not just its disclosure.123 Second, this may have some support in case law. In Imerman v Tchenguiz, Lord Neuberger M.R. held:

It would seem to us to follow that intentionally obtaining such information, secretly and knowing that the claimant reasonably expects it to be private, is itself a breach of confidence. The notion that looking at documents which one knows to be confidential is itself capable of constituting an actionable wrong (albeit perhaps only in equity) is also consistent with the decision of the Strasbourg court that monitoring private telephone calls can infringe the article 8 rights of the caller.

Despite involving the unique circumstances of unauthorised access to files that were subsequently disclosed to a solicitor, Imerman v Tchenguiz leaves room for extending breach of confidence to cover situations involving the mere acquisition of private information, without further disclosure. Nonetheless, assuming that this is plausible, the claimant would still need to demonstrate that the private information was acquired in violation of a reasonable expectation of privacy or an obligation of confidence. In Bridges, the Court of Appeal did not review the question of whether Mr. Bridges had a reasonable expectation of privacy in a public space, as this was not the subject of the appeal. The Court of Appeal assumed that the appellant had a reasonable expectation of privacy, as concluded by the Divisional Court.125 However, the threshold the Divisional Court adopted in assessing the claimant's reasonable expectation of privacy was not particularly high, leaving room for future decisions to deviate from Bridges, to the detriment of claimants. Thus, it can be concluded that a claim in tort based on the misuse of private information, or a variation of it, would likely face serious obstacles in cases similar to Bridges. Nevertheless, the Human Rights Act provides an avenue for a civil claim for breach of privacy.126 The Act allows "a court which has power to award damages, or to order the payment of compensation, in civil proceedings" to grant compensation for the violation of a provision of the ECHR, if the court finds such compensation to provide just satisfaction to the claimant.127 Therefore, whilst there may be challenges in pursuing a claim for breach of privacy under UK tort law, claimants in cases involving the police's use of LFR technology have an alternative legal avenue under the human rights regime. Because a violation of the right to privacy through an interference with the private life of the claimant under Article 8 of the ECHR does not require disclosure of private information, unlike in tort law, the human rights regime could present fewer challenges to successfully claiming compensation.
In addition to privacy law, data protection law plays a major role in providing a remedy to an aggrieved person in relation to surveillance technologies. The biggest challenge claimants face in data protection cases is proving harm,128 especially non-material harm (such as distress). In 2021, showing a general reluctance to award damages for non-material harm, the UK Supreme Court decided that a mere loss of control over personal data, without proof of material damage or distress, is not compensable under the General Data Protection Regulation (GDPR).129 However, despite some inconsistencies in practice among EU national courts concerning the threshold that must be met for a claimant to be entitled to compensation for non-material harm, courts do award compensation for such harm.130 In asserting violations of data protection rights arising from the use of LFR technology, individuals may encounter challenges. However, these challenges will not affect individuals harmed by the unlawful use of LFR technology in unique ways, as successfully claiming compensation for a breach of data protection law is inherently challenging across the board.
Last, in tort law, the police can be liable for harm suffered due to their wrongful conduct. Whilst, as a general rule, the police are not liable on the basis of omission for harms inflicted by a third party,131 harms caused by the police's negligent positive action relating to operational matters are compensable.132 Thus, the police who wrongfully detain an individual using LFR or conduct a distressing interrogation could be liable in tort.
Based on the preceding analysis, it can be concluded that there are no gaps in the existing laws applying to the use of LFR technology by the police that call for comprehensive legal reform. However, there are missing pieces that should be addressed through incremental change (see Section IV below).

A. Incrementalism: Theoretical Foundation and Insight from Practice
This article acknowledges the need to close some loopholes in the current legal framework relating to the use of LFR technology by law enforcement authorities, but it argues that changes must be made incrementally. Incrementalism is used as a theoretical framework in financial, environmental and technology regulation.133 It has also been applied to resolve contemporary legal challenges using old legal rules. Despite these examples, no literature currently theorises its essential components, scope and limitations.
Lawrence Cunningham and David Zaring contended that, in the context of the US Government's response to the 2008 Global Financial Crisis, incremental adjustments were a more effective and practical form of regulation than overarching reform.134 Similarly, in environmental regulation, Robert Glicksman and Sidney Shapiro analysed the practice of incremental regulation in the form of deadline extensions or waivers of environmental obligations based on assessments of actual harm.135 They argued that such an approach permits regulators to address specific problems within the context of a particular regulatory domain.136 Although these works focus on different regulatory fields, they advance the view that, in certain circumstances, regulatory adjustments that gradually tighten or loosen regulatory standards based on actual evidence of harm may be superior to sweeping regulation.
Antonio Franco et al. investigated the effectiveness of extending existing EU legislation to address the challenges presented by nanomaterials.137 They concluded that incrementalism in such cases can be effective if the necessary legislative amendments are made.138 In essence, incrementalism in this context involves extending existing legislation to address new technology.
The regulation of automated decisions in consumer credit risk assessment also provides a valuable lesson on how regulators adopt incrementalism. In the EU and the UK, automated decision-making (ADM) in consumer credit risk assessment is regulated by the GDPR,139 as implemented by the DPA 2018 in the UK. The GDPR's key features in this sphere are the prohibition of certain solely automated decisions and its transparency rules, which require the disclosure of information about the ADM in question (the so-called "right to explanation").140 The lack of tailored laws governing automated consumer credit risk assessment in the US led to a call for GDPR-inspired laws there.141 Despite such a call, automated consumer credit scoring in the US is governed by existing laws passed decades ago, laws that did not envision advanced machine learning algorithms.142 These laws include the Equal Credit Opportunity Act (ECOA),143 the Fair Credit Reporting Act (FCRA)144 and the Fair Housing Act.145 One of the concerns raised regarding these laws has been that algorithms allow financial institutions to use information, such as Zip codes, as "a proxy for race" and engage in discriminatory practices that these laws are not designed to address.146 In response, the Federal Trade Commission issued a guideline specifying that using Zip codes in consumer credit scoring could be challenged under the ECOA.147 Although the ECOA does not explicitly address discriminatory practices using Zip codes,148 the Act has been applied to cases of discrimination that excluded minority neighbourhoods under the practice of redlining,149 where financial institutions divided these communities according to their Zip codes in loan provision. In one case, the Justice Department and the Consumer Financial Protection Bureau (CFPB) made Hudson City Savings Bank pay over $27 million in settlement for excluding majority Black and Hispanic counties through redlining.150 Similarly, the FCRA has been used to address automated consumer credit reporting issues. In 2017, the CFPB fined Conduent Business Services, L.L.C. $1.1 million for inaccurate consumer credit reporting using an automated process.151 Conduent furnished automated consumer credit reports to lenders and credit reporting agencies in relation to auto loans.152 The information it provided was used to determine whether consumers qualified for loans, or on what terms, but there were errors in the files of over one million consumers, including wrong reports of involuntary vehicle repossession or account default-related information.153 The CFPB applied old laws to address this challenge.
Consumer credit scoring practices continue to evolve, due to the extensive datafication of the market and the reliance on machine learning algorithms that collect and analyse consumers' personal data to make predictions in opaque ways.154 The use of advanced data analytics techniques and opaque machine learning systems in consumer credit risk assessment remains a challenge even in the UK and the EU, where the legal frameworks governing personal data and consumer credit are relatively robust.155 Nevertheless, there does not seem to be a call for major legislative change, as proposals focus on better enforcement, including the issuance of compliance and enforcement guidance by relevant authorities.156 The preceding analysis shows that, whenever potential solutions are available within the existing laws to address new technologies, it has become common practice to apply the existing laws rather than to rush to implement new ones. If interpretation by the judiciary or administrative agencies is insufficient, the relevant regulator issues guidelines extending the existing law to the new phenomenon. A new statutory law should be enacted only where such an incremental process does not offer a legal framework that meaningfully tackles the risks posed by the new technology, provided that those risks are also realistically proven to materialise. The author's theory of incrementalism calls for an iterative, continuous and gradual adjustment of the law to address the challenges posed by LFR technology in law enforcement.

B. A Normative Framework for Incrementalism
Whilst incrementalism is addressed in the existing literature and observed in practice, there is no clear guidance on how the theory should be applied in practice. The UK Government's envisioned approach to AI regulation, outlined in the White Paper, despite its flaws, incorporates the key components of incrementalism.
Building upon the existing literature, insights from the application of incrementalism in regulating ADM in consumer credit services in the US and the key components of the UK's envisioned AI regulation, the author proposes that incrementalism should incorporate four main ingredients: sectoralism; reliance on existing legal frameworks; evidence-based regulation; and flexibility.

Sectoralism
Incrementalism favours a sectoral approach to regulation rather than an overarching regulatory framework that applies to the use of facial recognition or AI technologies across the board.Substantive rules should be tailored to a specific sector, whilst the relevant sectoral regulator should be charged with enforcement.
The UK's envisioned AI regulatory approach advocates five principles for each regulator to implement in its sector. Each regulator should determine which principles it should implement, and to what extent, depending on what is lacking in the existing legal framework. The fact that these principles may not be backed by a statutory duty to enforce them is, at least initially,157 problematic.158 Nevertheless, one of the policies behind the UK's envisioned approach is sectoralism, which creates substantive rules that are adaptable to each sector and enforced by the relevant regulator. Such an approach has several benefits, the main one being efficient and effective implementation and enforcement.
Entrusting regulatory oversight to a single regulator operating across multiple sectors could result in an inefficient implementation and enforcement system. Regulatory agencies possessing expertise in specific fields are better placed to regulate the AI systems used in their sectors.159 Centralising regulation may lead to corruption, regulatory capture or misaligned enforcement objectives, impacting multiple sectors. By contrast, a decentralised approach allows specific regulators to set enforcement policies, goals and strategies, preventing major enforcement failures and promoting accountability. In addition to allowing effective oversight, sectoral regulation could help minimise disruption and cross-sectoral regulatory conflict. It can also be formulated at a more granular level, offering better opportunities for effective implementation.
An incremental regulation of the use of LFR technology by law enforcement authorities therefore requires sectoral-level legal adjustment rather than subsuming the technology under a single, broad, multi-sectoral regulation.

Reliance on existing legal frameworks
Incrementalism also requires finding solutions in the existing legal framework, which should be interpreted to apply to new challenges. The UK's envisioned approach to AI regulation allows regulators to apply the existing regulatory frameworks in conjunction with the principles set out in the White Paper.160 This article has argued that there are currently no significant loopholes in the existing legal frameworks governing the use of LFR technology by the police that warrant substantial legal reform. If supplemented by appropriate guidance and policies, these frameworks can be interpreted to address the challenges posed by LFR technology in law enforcement.

Evidence-based regulation
Another essential component of incrementalism is its openness to evidence-based regulation. Legal adjustments should be made based on evidence of actual harm rather than conjecture.161 LFR technology used by law enforcement in the UK has potential risks. However, the actual risk of harm in some regards, such as the discriminatory impact on specific demographics, is yet to be demonstrated by evidence from actual practice. For example, the risks of inaccuracy and bias identified in gender classification AI systems162 are frequently cited to support a moratorium on using LFR in the UK until a comprehensive statute is enacted.163 Such a call is not based on solid evidence of harm. In its "Stop Facial Recognition" campaign, Big Brother Watch asserts that "Met and South Wales Police facial recognition [are] over 85% inaccurate 2016-2023".164 Big Brother Watch looked at multiple deployments and the number of false alerts generated relative to the total number of matches.165 An independent review by Peter Fussey and Daragh Murray of the MPS's six deployments between 2016 and 2019, which shows a success rate of only 19.05 per cent,166 is also widely used to highlight the potential risk posed by LFR technology. The reported inaccuracy figures for the LFR technology used by the UK police are alarming in the light of the fact that several incidents of wrongful arrest and incarceration due to misidentification by face recognition systems have been documented in the US. However, it is crucial to put the reported inaccuracy of the technology in the context of the existing safeguards and actual police practice, as well as the overall benefit of the technology.
Fussey and Murray's report shows that, out of 42 matches generated, 16 were immediately ruled to be false alerts without engaging the subjects and 22 individuals were stopped for an identity check; of those stopped, 14 were ruled false positives after an engagement and eight were true positives. This means that 63.64 per cent of the individuals stopped for identity checks (14 out of 22) were stopped incorrectly and 36.36 per cent (eight out of 22) matched individuals wanted by law enforcement.167 Most reports on the inaccuracy of LFR technology focus on false alerts in general, even though the false alerts generated are often inconsequential in real terms, because the police generally do not engage with most of the subjects falsely identified. If false alerts are to be considered indicative of potential harm, then the primary focus should be on the number of false alerts where the police engaged with subjects mistakenly identified. At least in such cases, the inaccuracy of the technology arguably starts to have an actual effect on individuals. Nonetheless, it is equally crucial to look at how the engagement takes place (e.g. respectful engagement with subjects compared to ill-treatment). With all factors considered, and at this stage of the technology, the aforementioned audit did not reveal information that might be damaging to the police's use of LFR. Identifying eight wanted criminals out of the 22 that had their identities checked is a good result in terms of enhancing public safety.
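The percentages above follow directly from the raw counts reported in the review. A short illustrative script, written for this analysis rather than drawn from the report itself, recomputes them:

```python
# Recomputing the match statistics from Fussey and Murray's review
# of the MPS's 2016-2019 LFR deployments (counts as reported above).
total_matches = 42      # alerts generated by the LFR system
dismissed = 16          # immediately ruled false, no engagement with the subject
stopped = 22            # individuals stopped for an identity check
false_after_stop = 14   # ruled false positives after engagement
true_positives = 8      # confirmed matches to wanted individuals

wrongly_stopped_rate = false_after_stop / stopped    # 14 / 22
correctly_stopped_rate = true_positives / stopped    # 8 / 22

print(f"Wrongly stopped:   {wrongly_stopped_rate:.2%}")    # 63.64%
print(f"Correctly stopped: {correctly_stopped_rate:.2%}")  # 36.36%
```

As the counts make plain, the headline "inaccuracy" rate depends entirely on the chosen denominator: measured against all 42 alerts, the eight true positives yield roughly the 19.05 per cent success rate the review reports, whereas measured against the 22 stops that actually affected individuals, the rate rises to 36.36 per cent.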
Regarding differences across demographics, the study by the National Physical Laboratory concluded that LFR systems had the poorest performance on images of Black female faces, but that the discrepancies in accuracy rates "by gender, by ethnicity, and by gender-ethnicity combined were not statistically significant".168 Without evidence of actual harm resulting from the use of LFR technology, the anticipated risk of harm based solely on a high inaccuracy rate is insufficient to justify prohibiting the technology or a major legal reform. This is all the more so given the series of legal and procedural constraints adopted by UK police, such as the Equality Impact Assessment and the SOP, which mitigate the potential consequences of mistaken identification. Incrementalism allows regulators to evaluate the risk of harm progressively and make regulatory decisions that are sensitive to the evidence and context.

Flexibility
Finally, incrementalism advocates adopting a more flexible regulatory option rather than a route that requires a lengthy legislative process. The UK Government's envisioned approach to AI regulation supports this as well. It favours a non-statutory regulatory framework in which regulators initially implement the identified principles without a statutory duty. If the principles set out in the White Paper were to be established through primary legislation, adapting them and swiftly responding to new risks would be challenging due to lengthy parliamentary procedures. The UK Government's envisioned approach is certainly not ideal if the notion of a non-statutory enforcement policy, where regulators are not under a statutory duty to enforce the selected principles, is retained for a prolonged time. Nevertheless, a middle ground exists between allowing regulators to pick principles they choose to enforce on a voluntary basis and rigid, overarching statutory rules. In summary, incrementalism would be instrumental in ensuring that regulation weighs the risks and benefits of LFR technology and strikes a fair balance based on evidence. It also keeps the door open for changes that respond to the needs of the time. Furthermore, in the context of technology that has demonstrated significant benefits, incrementalism allows for creating a regulatory framework that encourages socioeconomically useful innovation. Finally, this approach aligns with the UK Government's current proposed approach to AI regulation, which aims to incentivise innovation.169 Based on the preceding, four specific issues need to be addressed: (1) national rules for watch lists; (2) spatial, temporal and contextual limits on deployment; (3) adopting appropriate transparency or limiting the scope of overt surveillance; and (4) authorisation of covert deployment by an independent authority.

167 Ibid., at 70. 168 Mansfield, "Facial Recognition Technology", 4.

C. Incrementalism Applied to the Use of LFR by Law Enforcement Authorities

National rules for watch lists
Generally, due to the potential risks associated with using LFR in public spaces, its deployment should be limited to specific types of crimes and the protection of vulnerable individuals (as legally defined). The initial European Commission proposal for the EU AI Act allows LFR to be deployed for: (1) targeted searches "for specific potential victims of crime, including missing children"; (2) "the prevention of a specific, substantial and imminent threat to the life or physical safety of natural persons or of a terrorist attack"; and (3) "the detection, localisation, identification, or prosecution of a perpetrator or suspect" of a crime with a maximum sentence of at least three years that would allow for issuing a European Arrest Warrant.170 The first two requirements in the EU AI Act are similar to the ones that UK police generally adopt. However, the last requirement is more specific and stringent. Currently, the categories of persons on the watch lists created by the MPS or the SWP are broader, with no requirement that the use of LFR be confined to offences of a particular gravity.
The MPS and SWP have revised their SOPs following the decision in Bridges. Their revised SOPs, which are based on the LFR Authorised Professional Practice (APP) developed by the College of Policing,171 have similar criteria for including persons in a watch list. Accordingly, a watch list may comprise individuals who meet the criteria set by the relevant parts of the SOPs.172 These include those who are: (a) wanted by the courts; (b) suspected of having committed a crime, or suspected, on reasonable grounds, of being about to commit a crime or of being in the course of committing a crime; (c) "subject to bail conditions, court order or other restrictions that would be breached if they were at the location at the time of the deployment"; (d) missing and deemed to be at increased risk; (e) at risk of causing harm to themselves or others; and (f) victims of an offence, persons reasonably suspected to have information of importance and relevance to the progress of an investigation, or otherwise close associates of an individual who would themselves fall within paragraphs (a)-(f).173 The categories of persons included in a watch list are generally limited to the closed list provided in the SOP. Therefore, the current rules for the creation of a watch list remove discretion from law enforcement authorities as to the persons who can be included in the watch list. However, there is a lack of specification of the types of crimes that warrant the deployment of LFR. In this regard, the police are expected to conduct an assessment of necessity and proportionality by considering other, less intrusive methods, the importance of locating the person being sought and expectations of privacy.174

The Cambridge Law Journal [2023]

Excluding certain offences from warranting the deployment of LFR technology is necessary to maintain proportionality between the use of the technology and the crime being investigated. The technology should be reserved for severe and high-priority crimes with clear implications for the safety or economic well-being of the public. Giving police unlimited discretion in creating a watch list could lead to inefficient resource allocation, potentially shifting the focus from prioritising the quality of policing efforts to quantifying success based on the number of resolved crimes, no matter how petty the offences might be. This could divert attention from serious crimes, such as murder, rape, human trafficking, tax evasion, money laundering and other financial crimes often involving sophisticated schemes and tech-savvy criminals.
To ensure the effective and responsible use of LFR technology, there needs to be a legal rule that establishes a clear requirement of non-deployment for certain offences. This could be achieved through a national policy document or APP, similar to the one developed by the College of Policing, that outlines key principles and policies governing the watch list to ensure consistency in application. The rule should also contain an exception that allows the police to operate flexibly.

Spatial, temporal and contextual limits on deployment
It is crucial to deploy LFR in publicly accessible spaces where it does not have a disproportionate chilling effect, ideally limiting it to locations where members of the public have a low expectation of privacy. Furthermore, the deployment space must be justified based on the reasonable likelihood of the people on the watch list being in the area. Deployments should also have specific temporal limitations. Such limitations align with the proportionality requirement under Article 8(2) of the ECHR.
The SWP has introduced a revised and principled approach to determining the place of deployment 175 in response to the Court of Appeal's decision that the local policy failed adequately to address the place of deployment. In an encouraging step, besides recognising the need for a temporal limit, the SWP's "Legal Mandate" document identifies places where members of the public generally have a higher expectation of privacy and clarifies that deployment in such places requires a proportionality assessment. 176 While it is not advisable to impose a broad restriction on the deployment of LFR technology in specific places, as it could have a disproportionate impact on public safety, it is prudent to consider limitations on LFR deployment in particularly sensitive locations, such as religious institutions, schools and similar venues. These limitations should be determined through a thorough assessment of necessity and proportionality. Adopting such a context-based limitation does not require the enactment of primary legislation, although a national approach would be necessary to ensure the consistent protection of civil liberties across the country.

Appropriate transparency, overt use of LFR and the alternative
Under the Home Office's Surveillance Camera Code of Practice, people in public spaces normally have the right to be informed if they are being monitored by a surveillance camera system (so-called "overt surveillance"). This includes knowing the authority undertaking the surveillance and the purpose for which the information is to be used. 177 Whilst the Surveillance Camera Code requires transparency in overt surveillance, 178 it does not dictate the specific means by which that transparency is to be ensured.
In practice, law enforcement authorities inform the public of LFR use through appropriate channels, including through signs placed before individuals enter the LFR camera's zone of recognition. 179 Additionally, following the LFR APP, the MPS and SWP commit to notifying the public of the deployment of LFR in advance of the operation, including through websites and social media. 180 However, a notification through websites and social media is not carried out if it "would undermine the objectives or operational imperative of the deployment". 181 Law enforcement authorities use these transparency obligations to allow members of the public to exercise their right to avoid overt surveillance by not entering the zone of recognition. 182 Nonetheless, the existing legislation does not state that the goal of transparency in overt surveillance is to enable members of the public to avoid surveillance operations. The overall objective of transparency that can be gathered from a reading of the Surveillance Camera Code seems to be that of enabling members of the public to hold authorities accountable, for instance in cases of possible abuse. 183 To that end, knowing the authority that conducts the surveillance, the private information acquired, the purpose of the surveillance and the appropriate channels for complaints are crucial.
In practical terms, this means that the police should be open about using LFR technology in public spaces, so as to give members of the public the opportunity to exercise their right to hold the police accountable in relation to possible illegal surveillance. This does not necessarily require the police to enable members of the public, potentially including those suspected of committing crimes, to avoid the surveillance and thereby undermine the very objectives of conducting the surveillance.
Whilst the current level of transparency in the overt use of LFR lacks a legal basis in primary legislation and could undermine the effective apprehension of wanted criminals, overt surveillance could still prevent criminal activities, as the presence of LFR cameras could act as an effective deterrent in most instances. 184 However, the principal aim of the police's use of LFR technology seems to be to apprehend criminal suspects. 185 With growing public awareness and the police notifying the public of LFR operations before they enter the recognition zone, the overt deployment of LFR may not be achieving its intended objectives, as the procedures followed could potentially give career criminals an advantage, allowing them strategically to avoid these operations.
Evidence from the deployment of LFR in the UK seems to be consistent with the proposition that overt deployment is becoming less effective. In the most recent eight deployments by the MPS (from April to June 2023), with over 97,000 faces estimated to have been seen, only five true alerts were generated, leading to two arrests. 186 This is lower than the 26 true alerts generated in eight deployments in 2022 (see Section II(B) above). However, on a closer look, the evidence is rather inconclusive, as the difference in true alert rates could be due to factors such as crowd size, the size of the watch list, the time of deployment and other environmental factors. Reliable empirical evidence will only be generated by studies covering a more prolonged period, accounting for the varying factors impacting the effectiveness of deployments. Nonetheless, the existing literature recognises that transparency in law enforcement, including surveillance, may reduce the effectiveness of law enforcement efforts. 187 If law enforcement authorities aim to use LFR effectively in a covert manner, it is therefore imperative that they revise their SOPs, which currently impose excessive transparency requirements.
If law enforcement authorities are not willing to adopt a realistic transparency procedure due to concerns of potential legal challenge, 188 then they should resort primarily to covert use of LFR technology. Surveillance is covert if "it is carried out in a manner that is calculated to ensure that persons who are subject to the surveillance are unaware that it is or may be taking place". 189 Under the Regulation of Investigatory Powers Act 2000 (RIPA), directed surveillance is one of the recognised forms of covert surveillance. 190 It is undertaken: (a) for the purposes of a specific investigation or a specific operation; (b) in such a manner as is likely to result in the obtaining of private information about a person (whether or not one is specifically identified for the purposes of the investigation or operation); and (c) otherwise than by way of an immediate response to events or circumstances the nature of which is such that it would not be reasonably practicable for an authorisation to be sought. 191 Covert surveillance is subject to higher oversight as it poses a greater risk to privacy, requiring two-tiered authorisation. Accordingly, to be lawful, directed surveillance must be authorised by designated persons, 192 meaning "individuals holding such offices, ranks, or positions with relevant public authorities" 193 or police of a certain rank in law enforcement. 194 However, such authorisation is ineffective until the relevant judicial authority approves it. 195 In the case of covert intrusive surveillance, 196 authorisation becomes effective once approved by a Judicial Commissioner. 197
Law enforcement authorities' use of LFR technology could be classified as directed surveillance (if done covertly), as it is used for specific investigations or operations that lead to the acquisition of private information. Because "private information" is defined broadly as information relating to private or family life, 198 LFR systems deployed in public spaces would likely lead to obtaining such information, as affirmed in Bridges, where the deployment was considered to have violated Article 8(1) of the ECHR. Moreover, a typical LFR deployment is not made as an immediate response to an event, making it a clear case of directed surveillance.
The alternative solution being proposed here requires limiting the overt use of LFR technology in public places to two specific purposes (as a general rule): locating vulnerable persons; and preventing crimes in general or violence by monitoring public events to prevent threats to public safety. This limited overt use of LFR would be effective because vulnerable persons and persons going to public events with violent intent are unlikely to avoid LFR deployment areas. As such, the transparency of these types of operations does not compromise their effectiveness. For such deployments, the current system of authorisation of overt deployment by law enforcement authorities should be retained, namely authorisation should be given in these cases by the law enforcement authority with no need for additional approval.
In conclusion, the overt use of LFR to apprehend people suspected of committing crimes could prove to be ineffective unless law enforcement authorities adopt a more pragmatic approach to their transparency commitment. The current practice of transparency appears to undermine the objectives of law enforcement authorities' use of the technology. Because of this, law enforcement authorities should mainly utilise covert deployments following the two-tier authorisation procedure applicable to directed surveillance under RIPA, with the exceptions analysed above.

Prior judicial approval under RIPA
One of the critical aspects of conducting surveillance is prior authorisation. In the UK, a distinction is made between overt and covert surveillance with regard to authorisation.
As mentioned earlier, directed (covert) surveillance requires authorisation by the law enforcement authority and judicial approval. 199 For overt surveillance, the Surveillance Camera Code of Practice, issued under the Protection of Freedoms Act 2012, 200 allows the Chief Police Officer to establish a procedure of authorisation for the deployment of LFR, 201 with no need for further judicial approval. In normal circumstances, authorisation is granted by a police officer ranked Superintendent or higher. In cases of urgency, an officer below the rank of Superintendent but not below the rank of Inspector could grant the authorisation. 202 That authorising officer must, as soon as practicable, inform a Superintendent or higher of the deployment of LFR, who can decide to cease the operation. 203 On the one hand, the UK's approach to authorising the overt deployment of LFR poses a risk of abuse of police authority, since there is no independent scrutiny. On the other hand, requiring prior authorisation from an independent judicial body could significantly delay deployment, potentially jeopardising public safety. However, the approval of the use of LFR technology as an investigative tool by an independent oversight body could enhance public trust in law enforcement.
In line with the recommendation for the police to rely more on covert surveillance, it would be good practice to employ covert deployment of LFR authorised by the police and approved by the court through an expedited procedure. The procedure should also allow a post-use judicial approval of covert deployments in cases of justified urgency. 204 To be clear, this proposal does not suggest that the police abandon overt surveillance even for apprehending violent criminals, as noted earlier.
Rather, it invites law enforcement authorities to be mindful of the fact that the current level of transparency to which they adhere could unnecessarily undermine the effectiveness of law enforcement operations, and encourages them to use covert operations as a matter of good practice. This would also better enhance the effective investigation of crimes and public safety.

V. THE LIMIT OF THE INCREMENTAL APPROACH
The UK law regulating the use of LFR technology by law enforcement agencies has some gaps that require incremental reform. Local policies or codes of practice should be implemented to address these loopholes. To ensure the consistent protection of civil liberties, a national policy that regulates the creation of watch lists should be adopted, in addition to principled limits on the spatial and contextual deployment of LFR involving the assessment of proportionality and necessity. LFR technology should also be reserved for serious offences, excluding minor crimes, and SOPs should clearly set out this principle. Judicial approval of the authorisation of deployment should be mandatory for covert use, with the possibility of fast-track or post-deployment approval in cases of urgency. To enhance the effectiveness of the overt use of LFR, law enforcement authorities should adopt a transparency procedure that is aimed at promoting accountability rather than undermining law enforcement objectives. Thus, the current level of transparency should be revisited. Alternatively, law enforcement authorities should primarily use covert deployment of LFR, with overt use being limited to specific cases, such as locating non-offending subjects or preserving public safety.
Whilst an incremental approach is a valuable tool in addressing the legal lacunae relating to the deployment of LFR, it has limits in regulating the technology more widely. In the future, it would be necessary to
However, this does not necessarily remove discretion in relation to extending the use of LFR technology for relatively minor crimes.
171 College of Policing, "Live Facial Recognition: Authorised Professional Practice", available at https://www.college.police.uk/app/live-facial-recognition (last accessed 16 September 2023).
172 Metropolitan Police, "Standard Operating Procedure", [6.8]; South Wales Police, "Standard Operating Procedure", [6.8]. A single watch list could include individuals who belong to one or more, or even all, of the categories outlined in points (a) to (f), as each point pertains to different categories of individuals.
173 Ibid. It is worth noting that, in relation to close associates, the MPS's SOP cross-references paragraphs