Facial Recognition Technology and Data Protection Compliance

How important are necessity and proportionality?

Martin Brazier | Senior Consultant at URM | Published on 26 January 2024

The use of facial recognition technology can deliver great benefits, but it is controversial, and compliance with data protection legislation can be complex and nuanced.  What part can an assessment of necessity and proportionality play in demonstrating compliance?

What is Facial Recognition Technology?

Facial recognition technology (FRT) is the electronic process of using the image of a person’s face to create a unique biometric map of their facial characteristics that can be used to match against a predefined database of faces.  There are generally two types of use.  The first is retrospective FRT, where a person’s photograph is checked against a database – for example, the police could use this to determine whether a suspect in custody appears on crime systems under another name.  The second is live FRT, where everyone passing a camera in a public location is photographed and checked in real time against a database of known individuals – for example, a shop using the technology to identify known or potential shoplifters.
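
To make the matching step concrete, below is a minimal sketch of retrospective matching using the open-source face_recognition Python library.  This is an illustration under assumptions, not a production deployment: the filenames, the watchlist and the tolerance value are all invented for the example.

  # A minimal sketch of retrospective FRT matching, using the open-source
  # face_recognition library.  Filenames, watchlist and tolerance are
  # illustrative only.
  import face_recognition

  # Build biometric templates (128-dimension encodings) for a known watchlist
  watchlist_images = ["known_person_1.jpg", "known_person_2.jpg"]
  watchlist_encodings = [
      face_recognition.face_encodings(face_recognition.load_image_file(path))[0]
      for path in watchlist_images
  ]

  # Encode the probe image (e.g. a custody photograph) and compare it
  probe = face_recognition.load_image_file("probe.jpg")
  probe_encodings = face_recognition.face_encodings(probe)

  if probe_encodings:
      # tolerance is the match threshold: lower is stricter, fewer false positives
      matches = face_recognition.compare_faces(
          watchlist_encodings, probe_encodings[0], tolerance=0.6
      )
      print("Match against watchlist:", any(matches))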

The technology is developing quickly, and companies now provide FRT solutions for mobile device authentication, for access control into premises, for alerting when known individuals enter particular locations, and for law enforcement purposes.

What is the opposition to FRT?

The technology has generated privacy concerns, and there have been some high-profile cases where its use has been challenged.  

Formal challenges have been brought by Big Brother Watch over the use of FRT in 35 Co-op stores across southern England, where it was deployed to identify known shoplifters and alert staff in those and other stores.  Campaign group Liberty challenged the use of live FRT by South Wales Police, and further challenges have been made about FRT use at the Trafford Centre in Manchester and at a property development in King’s Cross, London.

In these cases, campaign groups cite an infringement of privacy and point to the poor accuracy of the technology in early deployments.

Studies by the London Policing Ethics Panel and the Ada Lovelace Institute have shown that some people are happy to trade off privacy if it assures their security, whereas others are opposed to having their biometric data captured in public spaces under any circumstances.

The use of FRT is governed by data protection legislation, the Human Rights Act, the Equality Act and the Protection of Freedoms Act.  The Equality Act is relevant here because studies, such as the US National Institute of Standards and Technology (NIST) study of the effects of race, age and sex on FRT accuracy, have found significantly higher rates of false positive matches for faces from black and ethnic minority communities than for white people.
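
One practical control suggested by such findings is to measure error rates per demographic group before (and during) deployment.  The sketch below is a minimal, hypothetical illustration of that kind of check in Python; the test records and group labels are invented for the example and do not represent real evaluation data.

  # A minimal sketch of a per-group false positive rate check, in the spirit
  # of the NIST findings above.  All records below are invented examples.
  from collections import defaultdict

  # Each record: (demographic_group, predicted_match, actual_match)
  test_results = [
      ("group_a", False, False), ("group_a", False, False),
      ("group_a", True, False),  ("group_b", True, False),
      ("group_b", True, False),  ("group_b", False, False),
  ]

  false_positives = defaultdict(int)   # non-matches wrongly flagged as matches
  actual_negatives = defaultdict(int)  # all genuine non-matches per group

  for group, predicted, actual in test_results:
      if not actual:
          actual_negatives[group] += 1
          if predicted:
              false_positives[group] += 1

  for group, total in actual_negatives.items():
      print(f"{group}: false positive rate {false_positives[group] / total:.0%}")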

What are the data protection considerations?

Most organisations need to satisfy the requirements of the UK General Data Protection Regulation (UK GDPR), supplemented by the Data Protection Act 2018 (DPA2018), although processing for law enforcement purposes falls under Part 3 of the DPA2018.

If you are a data controller, you should adopt a privacy by design and default approach, building in GDPR compliance from the start of the project.  A data protection impact assessment (DPIA), which is mandatory for high-risk processing, is likely to be required.  Whether mandatory or not, a DPIA is always recommended: it is the ideal opportunity to record the thought process, the lawful basis and the controls to be implemented, and it helps to fulfil the requirement to demonstrate accountability.

If you are using FRT for the purposes of crime prevention or detection – even outside law enforcement – images processed by FRT may be compared against databases of known offenders, which brings further obligations if it means processing criminal offence data.  The UK GDPR provides extra protection to the personal data of offenders or suspected offenders in the context of criminal activity, allegations, investigations and proceedings.  If you have official authority, you can process personal data about criminal convictions and offences, because you are processing the data in an official capacity.  If you do not, you can only process criminal offence data if you can identify a specific condition for processing in Schedule 1 of the DPA2018.  You cannot keep a comprehensive register of criminal convictions unless you do so in an official capacity.

If FRT is used to identify people and this information is used to trigger an intervention or other action, could this be considered automated decision making?  Article 22 of the UK GDPR limits the circumstances in which you can make solely automated decisions that have a legal or similarly significant effect on individuals.  You would need to decide whether your processing constitutes automated decision making, or whether you can introduce a manual verification process before acting upon the findings of FRT, as sketched below.
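
Where a manual step is introduced, the control can be made explicit in the system design: the software surfaces a candidate match, but a person takes the final decision before any intervention.  Below is a minimal, hypothetical sketch of that pattern; the FrtMatch structure, the 0.80 threshold and the reviewer callback are invented for the example.

  # A minimal sketch of human review between an FRT match and any action,
  # one way to avoid a solely automated decision under Article 22.
  from dataclasses import dataclass

  @dataclass
  class FrtMatch:
      subject_ref: str   # reference to a watchlist entry, not a name
      confidence: float  # similarity score from the matching algorithm

  def decide_intervention(match: FrtMatch, reviewer_confirms) -> bool:
      """Act only if a human reviewer confirms the automated match."""
      if match.confidence < 0.80:
          return False  # weak matches are discarded with no action at all
      # A person, shown the probe and watchlist images side by side,
      # takes the final decision; the system only recommends.
      return bool(reviewer_confirms(match))

  # Example: a stub reviewer that rejects the match, so no action follows
  print(decide_intervention(FrtMatch("watchlist-0042", 0.91), lambda m: False))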

If FRT captures the images of children, either deliberately or inadvertently because they pass a camera in a public space, this leads to further considerations.

But let’s start at the beginning – the first data protection principle states that processing must be lawful, fair and transparent.

What is lawful processing?

A lawful basis is required under Article 6 of the UK GDPR.  There are obvious difficulties with using consent as a lawful basis.  If using live FRT, how can somebody walking down the street or entering a shopping centre provide consent which is freely given, specific, informed and an unambiguous indication by a clear affirmative action?  If using FRT for access control, for example, could the imbalance of power between an employer and an employee mean that the employee is not genuinely free to withhold consent?  And, of course, consent can be withdrawn at any time.

Some public bodies may be able to rely on the public task basis, but often controllers are left with legitimate interests as their lawful basis. This means you must balance your interests with the individual’s interests, rights and freedoms.  Furthermore, the processing must be necessary, and if it is possible to achieve the same results in a less intrusive way, legitimate interests will not apply.

For law enforcement purposes, the only available lawful bases are consent, or that the processing is necessary for the performance of a law enforcement task.

Here we have the first mentions of processing being necessary.  This does not mean that processing must be absolutely essential, but it must be more than just useful or habitual and should be a reasonable, targeted and proportionate way of achieving the purpose.

However, as FRT uses algorithms to build a biometric template of an individual’s face, the processing is ‘special category’ processing and requires a further condition under Article 9 of the UK GDPR.  (For law enforcement purposes, this is termed ‘sensitive’ processing and requires a further condition under section 35 of the DPA2018.)

There are ten conditions in Article 9, and five of them require you to satisfy further conditions and safeguards set out in Schedule 1 of DPA2018.  In many cases, you will also need an appropriate policy document.

An appropriate policy document is a short document outlining the compliance measures and retention policies for sensitive personal data. It should include:

  • The Schedule 1 condition (or conditions) you are relying on
  • The procedures for complying with each of the principles
  • The retention and deletion policies
  • An indication of the retention period for the specific data.

An appropriate policy document is also required when processing ‘sensitive’ data for law enforcement purposes, and you must meet one of the conditions in Schedule 8 of the DPA2018.  Here we come across the notion of processing needing to be ‘strictly necessary’.

Strictly necessary means that the processing is essential rather than reasonably necessary and that the task cannot be performed without the processing.  It is not sufficient that the processing is convenient, more efficient or technologically achievable.

What is fair processing?

As well as being lawful, processing must also be fair.  This means that it must not be unduly detrimental, unexpected or misleading to data subjects; personal data must be handled in ways people would reasonably expect, with no unjustified adverse effects.  In other words, the processing must be proportionate to its stated purposes.

What is transparent processing?

Transparency is fundamentally linked to fairness.  It means that processing is open and honest and complies with the transparency obligations under the right to be informed.  You must ensure that you tell individuals about processing in a way that is easily accessible, easy to understand and in clear and plain language.

There are clear practical difficulties in informing data subjects who are subject to FRT processing, especially when using live FRT in public areas, and there may be circumstances where informing data subjects would alert them to avoid the area under surveillance.

Is it a matter of necessity and proportionality?

There are many data protection aspects to be considered, including whether the processing involves automated decision making, criminal offence data or children’s data.  There are processes to support compliance, such as a DPIA, a legitimate interests assessment (LIA) and the determination of the lawful basis.  Forming the foundation of all of this is a robust analysis of necessity and proportionality.

In essence, if the proposal is necessary (and not just technically possible or helpful), and the intrusion into privacy is proportionate to the objectives of the processing, then it is likely you can comply with data protection law.  In addition, there are key lessons from previous high-profile deployments of FRT which we can use to ask whether any FRT proposal is necessary and proportionate.

Is processing using FRT necessary? – Is FRT the only way the objective can be delivered, or can the aims be achieved in a less intrusive way?

Is processing using FRT proportionate? – The average member of the public may be content if the purpose of live FRT is to catch criminals who represent a danger, but perhaps not those committing parking offences.

Is the technology effective in achieving the objective? – What would be the measure of success of the processing which would justify it as proportionate?  For example, would capturing the personal data of all passers-by in a high street over a year be proportionate if it yields just one conviction for a minor offence?

Is the processing being carried out overtly and transparently or covertly? – This has an impact on whether the processing is perceived as fair, transparent and proportionate.

Is the processing ethical? – There are those who argue that if processing is lawful then it is ethical, and those who consider that, just because the law allows a controller to do something, it does not follow that they should.  Are the benefits delivered by the technology proportionate to the impact on privacy, and to any adverse publicity or reputational damage you might incur?

Is there a risk of bias or discrimination in the implementation? – Is the risk of discrimination or bias proportionate to the objectives of the processing?  Could controls be implemented to remove or limit any bias?

Is it necessary to address data subjects’ disquiet over the lack of consent? – There is a belief among the public, however inaccurate, that data subjects must consent to FRT.  How can this misconception be overcome and data subjects be fully informed?

Is it possible to add controls if the technology makes mistakes? – The Post Office Horizon scandal has shown us that technology is not infallible.  If an individual has been wrongly identified as a missing person or someone who has committed a crime, how would they feel about being confronted or arrested?  How could you prevent that from happening, and are there any checks or human intervention in place between the FRT identifying them and any resulting actions?  Is the risk of mistakes proportionate to the objectives?

Is it possible there are unintended consequences? – Will the number and complexity of complaints or data subject access requests increase if data subjects know they have been subject to FRT?

There is always a danger that a desire to exploit new and emerging technology becomes the aim, rather than an aim to provide a solution to a known problem.  However, assessing necessity and proportionality throughout the process can ensure that any solution is compliant with data protection legislation.

How URM can help

Taking advantage of new technological developments while remaining compliant with data protection (DP) legislation, such as the GDPR, can be difficult for organisations to navigate without guidance.  Leveraging the expertise gained from 17 years of helping organisations comply with DP legislation, URM can offer a wide range of GDPR consultancy services to guide your DP practices.  Our team of GDPR consultants and trainers can offer advice and support with issues surrounding DP, including the deployment of new technology such as FRT in compliance with the GDPR.

Our consultancy offerings include everything from conducting gap analyses and providing remediation support, through to helping you conduct DPIAs and produce records of processing activities (ROPAs).  If you would like ongoing support, we can offer a virtual data protection officer (DPO) service, providing you with access to a team of qualified DP practitioners, each with their own specialised area of GDPR consultancy.  Meanwhile, if your organisation has received data subject access requests (DSARs), URM can help you process these requests in full compliance with the legislation through our GDPR DSAR redaction service.

In addition to our consultancy services, we also offer a range of DP-related training courses, including practical half-day courses on specialist topics such as conducting DPIAs and data transfer impact assessments (DTIAs), as well as a 1-day ‘How to Manage DSARs’ training course, where you will receive guidance on how to complete DSARs, determine whether a request is valid, and verify the identity of the data subject.  For those looking to gain a recognised qualification in data protection, URM provides the BCS Foundation Certificate in Data Protection training course, where you can gain a sound grounding in, and practical interpretation of, all the key elements of UK data protection law, including the UK GDPR and the DPA2018.

Martin Brazier
Senior Consultant at URM
Martin is a highly experienced and knowledgeable GRC consultant at URM, specialising in data protection.  He holds BCS Certificates in Data Protection and Freedom of Information, and is a Certified Information Privacy Professional (Europe) (CIPP/E).  He also holds BCS Certificates in Information Security Management Principles, Business Continuity Management and Information Risk Management.