
Kingsley Hayes argues for live facial recognition regulation in Computer Weekly

Kingsley Hayes, Head of Data and Privacy Litigation, discusses in Computer Weekly the regulatory lacuna surrounding the use of live facial recognition in the UK.

Kingsley’s article was published in Computer Weekly on 8 June 2023 and can be found here.

Live facial recognition (LFR) is a type of facial recognition technology that involves the automatic collection of biometric data. LFR cameras point to an Orwellian future, one in which individual privacy is steadily usurped. While not necessarily unlawful – the Information Commissioner’s Office (ICO) recently said it was satisfied with the LFR used by security company Facewatch – there are genuine concerns, and the independent regulator is far from giving a green light to the blanket use of LFR technology.

In April 2023, a cross-party group of almost 50 parliamentarians wrote to Mike Ashley’s Frasers Group to condemn the use of LFR cameras in the group’s stores and to call on the group to end their deployment.

What are the issues?

There is a myriad of deeply troubling issues with the deployment of LFR cameras, many of which the parliamentarians address in their letter to Frasers Group. Particularly alarming is the risk of people being subjected to wrongful, automated decisions. Technology is not without its flaws, and the automatic collection of biometric data at speed and scale is highly likely to lead to a significant number of errors.

One flaw that is especially egregious is algorithmic bias. In 2019, researchers at the National Institute of Standards and Technology studied 189 facial recognition algorithms – representing the majority of the industry at the time – and found that most exhibited some form of bias. According to the researchers, the technologies falsely identified black and Asian faces 10 to 100 times more often than white faces. They also falsely identified women more often than men, making black women particularly vulnerable to algorithmic bias.

Similarly, in 2021, the ICO published its opinion on the use of LFR technology in public places. The opinion identified a number of key data protection issues, including, but not limited to, the potential for bias and discrimination. On that matter, the ICO commented, “[s]everal technical studies have indicated that [LFR] works with less precision for some demographic groups, including women, minority ethnic groups and potentially disabled people. Error rates in [LFR] can vary depending on demographic characteristics such as age, sex, race and ethnicity. These issues often arise from design flaws or deficiencies in training data and could lead to bias or discriminatory outcomes”.

More recently, the ICO said that it would “monitor the evolution of live facial recognition technology to ensure its use remains lawful, transparent and proportionate”.

Those who seek to deploy LFR cameras in public places will claim that they are ensuring safety and preventing crime. In a retail store, for example, LFR cameras scan the face of every shopper and check it against a database of suspected thieves. Should there be a match, the system will, at least in theory, alert staff and/or security, who will then either (i) closely monitor the person; or (ii) remove them from the store. However, there is a growing body of evidence that LFR technologies are likely to produce a large number of errors when processing such large volumes of data, and those errors could have a damaging impact on people’s ordinary lives.
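To make the mechanics concrete, the sketch below illustrates the kind of watchlist check described above. It is a minimal, hypothetical illustration: the embedding representation, the cosine-similarity measure and the 0.8 threshold are assumptions made for the example, not details of Facewatch or any other deployed system.

```python
# Hypothetical sketch of an LFR watchlist check; all names and the
# threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class WatchlistEntry:
    person_id: str
    embedding: list[float]  # biometric template from a reference image

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return dot / norm if norm else 0.0

def check_shopper(face: list[float], watchlist: list[WatchlistEntry],
                  threshold: float = 0.8) -> WatchlistEntry | None:
    """Return the best watchlist match above the threshold, or None.

    Every shopper is compared against every entry, so each comparison
    is a fresh opportunity for a false positive; at the scale of a busy
    store, those errors compound.
    """
    best = max(watchlist,
               key=lambda e: cosine_similarity(face, e.embedding),
               default=None)
    if best and cosine_similarity(face, best.embedding) >= threshold:
        return best  # would trigger an alert to staff and/or security
    return None
```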

Legal framework and regulatory direction

LFR technology involves the processing of personal data, including biometric and, in some cases, special category data. The main laws regulating its use are the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018 (DPA), legislation with which controllers must comply before deploying LFR technology. However, these laws are not, in themselves, sufficient guardrails for this nascent technology.

The UK GDPR and DPA are silent on the technical effectiveness of the LFR technologies that controllers use. For example, the ICO’s 2021 opinion recommended, amongst other things, that controllers use “precision” as a measure of successful deployment. “Precision” here is the proportion of positive identifications that are correct: if an LFR system correctly matches only 1 of the 10 faces it flags against a watchlist, its precision is 10%. High precision is imperative to ensuring that people do not suffer detriment through incorrect identification; however, the current legal framework stipulates no specific threshold. This lacuna is problematic, to say the least. In practice, controllers are not obliged to establish precision thresholds in the data protection impact assessment (DPIA), so nothing stops a controller from using LFR technology with abysmal precision rates, so long as the relevant provisions of the UK GDPR and DPA are complied with.
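As a worked illustration of the precision measure, using the 1-in-10 example above rather than figures from any real deployment:

```python
def precision(correct_matches: int, total_flagged: int) -> float:
    """Proportion of positive identifications that are correct."""
    if total_flagged == 0:
        raise ValueError("precision is undefined when nothing is flagged")
    return correct_matches / total_flagged

# A system flags 10 faces against a watchlist, but only 1 is a genuine
# match -> precision is 10%; the other 9 people were wrongly flagged.
print(f"{precision(1, 10):.0%}")  # -> 10%
```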

Clearly, regulators need to move fast in putting rules in place concerning the accuracy of these camera systems – before more firms follow the lead of Frasers Group. On 8 March 2023, the UK government introduced the Data Protection and Digital Information (No 2) Bill (the Bill), which, despite being hailed by Data Minister Julia Lopez as delivering “modern laws for a data-driven era”, omits any specific reference to LFR technology. Failure to address the pitfalls, inaccuracies and uncertainties surrounding LFR technology as soon as practicable – and certainly before it becomes ubiquitous in public places – may leave the UK in a legal quagmire unless specific legislation is enacted to regulate its use.

If LFR technology is to become the norm in public places, which by all indications it may, steps need to be taken to ensure accountable and proportionate deployment. Parliament will have to decide which regulatory steps to take, and these measures will need to balance the efficacy of the technology carefully against people’s fundamental rights to privacy and fair treatment.

The use of LFR technology in public places currently puts the onus on the data controller to conduct a DPIA. Under the Bill, however, DPIAs will be replaced with assessments of high-risk processing (AHRPs), which are much narrower in scope. AHRPs have three basic objectives: they must summarise the purpose of the processing, assess its necessity and the risks it poses to individuals, and detail how the controller intends to mitigate those risks. The ICO has already noted a lack of due diligence from controllers in its work reviewing DPIAs; work must therefore be done to ensure the same cannot be said of AHRPs. It would be worthwhile for the ICO to clarify for controllers what must be made available for further assessment when they deploy LFR technologies, and how they can ensure that their AI models will not unfairly discriminate against those most at risk.

It would also be prudent for the ICO to issue guidance setting minimum accuracy thresholds, expressed as statistical percentage figures. If controllers cannot meet the set minimum, there should be self-reporting requirements so that the ICO can investigate further. A regulation along these lines would help ensure the technical effectiveness of LFR technologies.
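For illustration, one way such a rule could operate is sketched below. The 90% floor and the reporting mechanism are assumptions; neither the Bill nor current ICO guidance specifies either.

```python
# Hypothetical regulatory floor; no statutory figure currently exists.
MINIMUM_PRECISION = 0.90

def review_deployment(correct_matches: int, total_flagged: int) -> None:
    """Check observed precision against the assumed minimum and self-report."""
    observed = correct_matches / total_flagged
    if observed < MINIMUM_PRECISION:
        # Under the proposed regime, the controller would be obliged to
        # notify the ICO so that it can investigate further.
        print(f"Self-report: precision {observed:.0%} is below the "
              f"minimum of {MINIMUM_PRECISION:.0%}")

review_deployment(correct_matches=1, total_flagged=10)
```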

Conclusion

The ICO will need to continue to investigate and advise on the myriad issues that permeate LFR technologies. Failure to take proactive steps – amongst other things, auditing LFR systems already in operation and assessing the DPIAs (soon to be AHRPs) that identify high-risk processing – could have dire, far-reaching consequences. Unless parliament seriously considers this issue and enacts specific legislation, and/or the regulator imposes higher standards, we could well bear witness to a stream of egregious stories of how LFR technology has negatively impacted individuals and their communities.
