The use of biometric data has very much been cemented into our everyday lives, from unlocking our phones and laptops to accessing online banking and even clocking in at work. Long gone are the days when its use was reserved for unlocking vaults via a retina scan in an action movie!
‘Biometric data’ is defined in the GDPR as “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopy [that’s fingerprints, to you and me] data.”
If you have a smart phone, chances are that you unlock it using biometric data, as many now rely on fingerprint or facial recognition. Biometrics also cover iris and retinal recognition, DNA matching, voice recognition, digital signatures, and even keystroke analysis.
Despite its increased use, many organisations are still unsure as to the rules around its use, and any special considerations that must be factored in. Therefore, this blog discusses our five do’s and don’ts of processing biometric data.
Biometric data under both the UK and EU GDPR is considered ‘special category data’. Article 9(1) prohibits the processing of special category data unless one of the conditions listed under Article 9(2) is met.
In addition, you will need to ensure that your organisation has satisfied one of the lawful bases prescribed under Article 6 – about which you can find more information in our previous blog. Both will need to be documented in your Article 30 Records of Processing Activities (RoPA), as well as in a DPIA and your privacy notice (more on those later).
Note: there may be local exemptions that also need to be considered. For example, Article 29 of the UAVG, the GDPR implementation law in the Netherlands, provides exemptions where biometrics are necessary for authentication or security purposes, e.g. security within a nuclear plant.
Regardless of which lawful basis and condition for processing you rely on, you should always ensure that the collection and use of biometric data is necessary and proportionate to your aims. If there is another, less intrusive, type of data you could process without compromising your aims and objectives, this should be considered instead. For example, you should ask yourself whether using CCTV and security personnel would be as effective as live facial recognition. This is something Southern Co-op may have to justify to the Information Commissioner’s Office (ICO). In addition, where you do use biometrics, you should consider offering an alternative for data subjects who do not wish to use them; online banking apps, for example, also allow you to sign in with passwords and other authentication methods.
For many uses of biometric data (e.g. when it is processed on a large scale, used to monitor individuals, or concerns vulnerable individuals), it is a legal requirement that a Data Protection Impact Assessment (DPIA) is completed on the processing, as it is deemed likely to result in a high risk to the rights and freedoms of your data subjects. However, the UK’s Information Commissioner’s Office (ICO) states that it is best practice to always conduct a DPIA when processing biometric data.
Additionally, if your organisation is relying on Legitimate Interests as its lawful basis for the processing, you will need to do a Legitimate Interests Assessment (LIA). Furthermore, if you are relying on the Substantial Public Interest or the Employment, Social Security and Social Care conditions for processing, you will need an Appropriate Policy Document in place before you process any biometric data, which essentially outlines how you respect each of the principles of the UK GDPR whilst processing the personal data.
Finally, and most importantly, you must inform data subjects about the processing. You will need to provide them with a privacy notice clearly outlining what data you are processing, for what purposes, and the lawful basis and special category condition you are relying on, to name but a few examples. The ICO recently fined Clearview AI £7.5 million for a number of non-compliances including its failure to inform data subjects that it was processing their personal data for facial recognition purposes.
As with any personal data processing, you need to implement appropriate technical and organisational measures to protect the personal data being processed (Article 32 GDPR). As biometric data is special category data, the bar for what is appropriate is set higher. This includes things such as access controls; secure disposal; antivirus and firewall protections; encryption; and strong password management.
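By way of a loose illustration only, the ‘need-to-know’ principle behind access controls can be expressed in code as a deny-by-default permission check. The role names and permission labels below are entirely hypothetical, not drawn from any particular framework or product:

```python
# Hypothetical sketch of "need-to-know" access control for biometric records.
# Access is denied by default: a role can only see biometric data if it has
# been granted that permission explicitly.
ROLE_PERMISSIONS = {
    "security_admin": {"read_biometric", "delete_biometric"},
    "hr_officer": {"read_biometric"},
    "payroll_clerk": set(),  # no business need to access biometric data
}

def can_access(role: str, permission: str) -> bool:
    """Return True only where the role has an explicit grant."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```

The deny-by-default design matters: an unknown or misconfigured role gets no access at all, rather than falling through to some implicit allowance.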
If your organisation is engaging a third party for the processing, you will need to conduct a due diligence assessment to confirm that the company also has appropriate security measures in place to protect the personal data. You can find more about vendor due diligence here.
When processing biometric data, you will need to consider the rights that both the UK and EU GDPR bestow upon data subjects. These include the right of access, rectification, erasure, data portability, and rights related to automated decision-making and profiling. All organisations need to be aware of these rights, as well as consider how they would action them in the context of biometric data.
However, there are also wider rights your organisation will need to consider. Article 8 of the European Convention on Human Rights enshrines the ‘right to a private life’. In S and Marper v United Kingdom the court held that collecting biometric data can affect the right to a private life; and in Bridges the High Court stated that if biometrics are “captured, stored and processed, even momentarily” then Article 8 is triggered.
In order for the processing to be compliant with Article 8, organisations (especially those in, or acting on behalf of, the public sector) must ensure that any interference with this right is in accordance with the law and is necessary and proportionate in pursuit of a legitimate aim.
For more information on this, please see the Ada Lovelace report on the governance of biometrics in England and Wales.
As biometric data is classed as ‘special category data’, breaches of the law involving it are likely to be looked on less leniently than those involving non-special category data. If your organisation suffers a data breach, the impact on the rights and freedoms of your data subjects is likely to be far more serious due to the nature of the data you are processing. Breaches will need to be reported to the applicable Supervisory Authority, with the possibility of an upper tier fine (£17.5 million/€20 million or 4% of global annual turnover).
Whilst Clearview AI’s fine was not related to a data breach but rather to a host of other non-compliances with the law, the £7.5 million fine from the ICO, and a further €20 million fine from Italy’s Supervisory Authority, show how seriously regulators will take non-compliance in this arena.
Conclusion
Biometrics can serve as a useful tool for organisations, adding an extra layer of protection to sensitive accounts, as well as playing a vital role in law enforcement, security, and healthcare. However, with greater sensitivity comes greater responsibility and greater risk, meaning that complying with data protection rules and regulations becomes all the more vital.
It is also important to be on the lookout for future regulations that will inevitably come into force in both the UK and the EU, as well as global recommendations from wider bodies, which may affect the way biometric data can be processed. For example, real-time biometric systems, and AI technologies used alongside biometric processing, will fall within the remit of the EU’s draft AI Regulation.
For more advice on how to ensure your processing of biometric or other special category data remains compliant, get in touch by filling in the form below.
Fill in your details below and we’ll get back to you as soon as possible