
From ‘wild west’ to ‘responsible’ AI testing ‘in-the-wild’: lessons from live facial recognition testing by law enforcement authorities in Europe

Published online by Cambridge University Press:  19 September 2025

Karen Yeung*
Affiliation:
Interdisciplinary Professorial Fellow in Law, Ethics and Informatics, Birmingham Law School & School of Computer Science, University of Birmingham, Birmingham, UK
Wenlong Li
Affiliation:
Research Professor, Guanghua Law School, Zhejiang University, China
*
Corresponding author: Karen Yeung; Email: k.yeung@bham.ac.uk

Abstract

Although ‘in-the-wild’ technology testing provides an important opportunity to collect evidence about the performance of new technologies in real-world deployment environments, such tests may themselves cause harm and wrongfully interfere with the rights of others. This paper critically examines real-world AI testing, focusing on live facial recognition technology (FRT) trials undertaken by European law enforcement agencies (in London, Wales, Berlin, and Nice) between 2016 and 2020, which serve as a set of comparative case studies. We argue that there is an urgent need for a clear framework of principles to govern real-world AI testing, which is currently a largely ungoverned ‘wild west’ lacking adequate safeguards or oversight. We propose a principled framework to ensure that these tests are undertaken in an epistemically, ethically, and legally responsible manner, thereby helping to ensure that such tests generate sound, reliable evidence while safeguarding the human rights and other vital interests of those affected. Although the case studies of FRT testing predate the passage of the EU’s AI Act, we suggest that these three kinds of responsibility should provide the foundational anchor points informing the design and conduct of real-world testing of high-risk AI systems pursuant to Article 60 of the AI Act.

Information

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (http://creativecommons.org/licenses/by-nc-nd/4.0), which permits non-commercial re-use, distribution, and reproduction in any medium, provided that no alterations are made and the original article is properly cited. The written permission of Cambridge University Press must be obtained prior to any commercial use and/or adaptation of the article.
Copyright
© The Author(s), 2025. Published by Cambridge University Press

Figure 1. Police control van and poster situated outside the Stratford Centre, where one of the London trials was conducted.


Figure 2. Police van involved in one of the Welsh trials conducted outside Cardiff City Football Club.

Figure 3. The blue and white stickers outside Berlin Südkreuz station notifying those entering of the ongoing live FRT trials inside the station.


Figure 4. The setting of the E4 entrance to the 135th Nice Carnival, used for the live FRT trials.


Table 1. Accepted phased evaluation structure for pharmaceuticals (columns 1 and 2) and a proposed analogous structure for evaluation of algorithmic tools (columns 3 and 4)
