From science fiction to company corridors: Navigating the legal maze surrounding facial recognition technology

Facial recognition technology (FRT) has swiftly moved from science fiction to a relatively common feature in our daily lives. From employee access controls to cashless transactions and theft deterrence, its commercial applications are numerous. Yet, despite its potential, there is no direct, well-adapted legal framework governing its use. The gap is partially filled by privacy and data protection laws, which companies and other organisations need to understand as these technologies become more widely used.

John Edwards, the UK Information Commissioner, underscored the unique risks associated with biometric data in 2024. He pointed out, "Biometric data is wholly unique to a person so the risks of harm in the event of inaccuracies or a security breach are much greater – you can't reset someone's face or fingerprint like you can reset a password." Robust protections are therefore essential to safeguard individuals' biometric data.

Regulatory decisions on the use of FRT have highlighted how easily organisations can get it wrong. In a case involving Serco Leisure, Serco Jersey, and seven community leisure trusts, the Information Commissioner's Office (ICO) issued enforcement notices ordering the companies to stop using facial recognition and fingerprint scanning technologies to monitor employee attendance. The ICO's investigation revealed that Serco and the trusts had unlawfully processed the biometric data of over 2,000 employees across 38 leisure facilities. Edwards commented, "Serco Leisure did not fully consider the risks before introducing biometric technology to monitor staff attendance, prioritising business interests over its employees’ privacy. There is no clear way for staff to opt out of the system, increasing the power imbalance in the workplace and putting people in a position where they feel like they have to hand over their biometric data to work there."

The need for a proper process when implementing new technologies, including substantive and timely consultation with affected groups, is clear. The power imbalance between employers and employees is significant and affects every decision about the use of employee data. Organisations that get it wrong risk legal challenges as well as damaged employment relationships.

Although the Serco decision relates to the leisure sector, other sectors are not immune. Last year, Chelmer Valley High School was reprimanded under Article 35(1) of the UK GDPR for failing to complete a Data Protection Impact Assessment (DPIA) before implementing facial recognition technology for cashless catering. The school also failed to manage consent adequately, leaving students unable to exercise their rights and freedoms properly. The ICO advised the school to follow its guidance, amend its DPIA, and update its privacy information to ensure compliance with the UK GDPR.

Overseas regulators are also turning their attention to FRT. In late 2024, Australia’s Privacy Commissioner declared a retail chain’s use of facial matching to identify individuals known to have committed violent assaults on staff to be unlawful. Again, the decision highlighted the need to conduct a proper impact assessment.

Any use of FRT must comply with privacy and data protection rights, and the collection and processing of biometric data must be lawful and transparent. The ICO's stance is clear: they “will closely scrutinise organisations and act decisively if [they] believe biometric data is being used unlawfully.”

Organisations looking to implement FRT must tread carefully. However, it is still possible to use these promising technologies. Our top three tips for organisations wishing to use FRT are:

  1. Follow the rules: Make sure you comply with all relevant laws and regulations governing facial recognition technology. This includes understanding the specific requirements of the UK GDPR, such as the need for a lawful basis to process biometric data and ensuring that the use is necessary and proportionate. If a less intrusive way of achieving the same outcome exists, the ICO is likely to expect organisations to use it.
     
  2. Be clear about how it will be used: Before deploying FRT, explain clearly how you will use the personal data of the people whose faces will be scanned: what you are doing, why you are doing it, and how their data will be handled. Be transparent and make sure people understand. Ensure that you have fully considered the risks of using the personal data and documented them accordingly, usually by way of a data protection impact assessment.

  3. Protect the data: Facial recognition data is highly sensitive, so it must be protected carefully. Use strong security measures, such as encryption and secure storage, to keep the data safe from attackers, and regularly review and update your security practices to make sure they remain effective (a minimal illustration of encryption at rest follows this list). This will help minimise any loss or damage in the event of a data breach or cyber attack.
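To make tip 3 concrete, the sketch below shows one common way to encrypt sensitive data, such as a facial recognition template, before it is stored. It is a minimal illustration only, assuming a Python environment with the widely used cryptography package installed; the biometric_template value is hypothetical, and a real deployment would keep the key in a dedicated key store (for example a key vault or hardware security module) rather than alongside the data.

```python
# Minimal sketch of encrypting a biometric template at rest using
# symmetric encryption (Fernet, from the Python "cryptography" package).
# Assumption: key management is out of scope here; in production the key
# must be stored separately from the encrypted data.

from cryptography.fernet import Fernet

# Generate a key once and store it securely (e.g. in a key vault).
key = Fernet.generate_key()
fernet = Fernet(key)

# Hypothetical biometric template captured by an FRT system.
biometric_template = b"example-face-embedding-bytes"

# Encrypt before writing to disk or a database.
encrypted = fernet.encrypt(biometric_template)

# Decrypt only at the moment the template is needed for matching.
assert fernet.decrypt(encrypted) == biometric_template
```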

By addressing consent requirements and transparency, and by conducting adequate data protection impact assessments, organisations can avoid costly pitfalls. Expert assistance may be necessary, but it is ultimately possible to realise the benefits of FRT without compromising individuals’ legal rights.

