Kotsoglou, Kyriakos and Oswald, Marion (2020) The long arm of the algorithm? Automated Facial Recognition as evidence and trigger for police intervention. Forensic Science International: Synergy, 2. pp. 86-89. ISSN 2589-871X
Text: 1-s2.0-S2589871X20300024-main (1).pdf - Published Version, available under License Creative Commons Attribution Non-commercial No Derivatives 4.0 (237kB)
Text: 1-s2.0-S2589871X20300024-main.pdf - Accepted Version, available under License Creative Commons Attribution Non-commercial No Derivatives 4.0 (690kB)
Abstract
Criminal law's efficient and accurate administration depends to a considerable extent on the ability of decision-makers to identify unique individuals, circumstances and events as instances of abstract terms (such as events raising ‘reasonable suspicion’) laid out in the legal framework. Automated Facial Recognition (AFR) has the potential to revolutionise the identification process, facilitate crime detection, and eliminate misidentification of suspects. This paper takes as its starting point the recent decision regarding the deployment of AFR by South Wales Police in order to discuss the lack of an underpinning conceptual framework pertinent to a broader consideration of AFR in other contexts. We conclude that the judgment does not give the green light to other fact-sensitive deployments of AFR. We consider two of these: a) use of AFR as a trigger for intervention short of arrest; b) use of AFR in an evidential context in criminal proceedings. AFR may, on the face of it, appear objective and sufficient, but this appearance is belied by the probabilistic nature of the output and the building of certain values into the tool, raising questions as to the justifiability of regarding the tool's output as an ‘objective’ ground for reasonable suspicion. The means by which the identification took place must be disclosed to the defence, if the Article 6 right to a fair trial is to be upheld, together with information regarding disregarded ‘matches’ and the error rates and uncertainties of the system itself. Furthermore, AFR raises the risk that scientific or algorithmic findings could usurp the role of the legitimate decision-maker, necessitating the development of a framework to protect the position of the human with decision-making prerogative.
Item Type: Article
Uncontrolled Keywords: Automated facial recognition, Algorithms, Policing, Decision-making, Reasonableness, Evidence, Individualisation
Subjects: G700 Artificial Intelligence; M200 Law by Topic; M900 Other in Law
Department: Faculties > Business and Law > Northumbria Law School
Depositing User: Elena Carlaw
Date Deposited: 20 Jan 2020 14:47
Last Modified: 27 Aug 2021 10:06
URI: http://nrl.northumbria.ac.uk/id/eprint/41947