This article is provided by BRC Associate Member, Foot Anstey.

__________________________

Facial recognition technology ("FRT") is not a new concept. We all use it every day, whether unlocking a phone or going through border control. However, retailers are now exploring how FRT can be used in a variety of ways, such as: deterring theft (the number one concern among the 300 senior retail leaders surveyed by Avery Dennison in 2024); acting as an alternative to security passes; or monitoring consumer behaviour. It's no surprise, then, that we're seeing a corresponding increase in client enquiries about the use of such technology.

FRT and other forms of biometric recognition technology are not without legal and reputational risk, and we have seen cases in the UK and overseas where retailers have been instructed to stop using FRT. These technologies involve processing personal data and often rely on AI, a form of technology that is subject to increasing scrutiny. To help you avoid falling foul of the rules, we explore below five key questions retailers must address when navigating FRT and similar technologies.

1.   What is the purpose of using the technology?

To begin with, ask yourself (and any other stakeholders within your organisation) what business need you're seeking to address with the technology. Are you adopting FRT for security purposes or to understand trends in consumer behaviour (e.g. how many different individuals visit your store)? Do you have a lawful basis for collecting personal data, including biometric data, using the technology (such as the consent of affected individuals, or the processing being necessary for a legitimate interest)? It's also important to consider whether you can achieve your objectives via less invasive means, which will almost always be the case when using FRT to understand consumer engagement with advertising. For example, sending a push notification (initiated via Bluetooth) when a consumer walks past the shop is far less invasive than using FRT to detect when an individual looks at advertising/marketing in a public space.

Bunnings – the difficulty in obtaining consent

Bunnings presents an example of the challenges retailers face in obtaining express consent for the use of FRT.

Bunnings, the Australian DIY and hardware retailer, employed FRT in 62 of its stores between November 2018 and November 2021 to safeguard its customers, suppliers and staff in the face of increased violence. The Office of the Australian Information Commissioner ("OAIC") recently held that Bunnings had interfered with the privacy of thousands of customers by collecting and holding biometric data via FRT without their express consent. Whilst Bunnings did place signs both outside and inside some stores and updated its privacy policy to notify customers about the use of FRT, the signs weren't obvious enough and failed to adequately inform customers about the collection and retention of their data. The OAIC considered the use of FRT unnecessary, disproportionate and the most intrusive option available to the retailer.

Although the OAIC's decision is not directly applicable to retailers trading in England and Wales, it does show:

(i) the difficulty retailers will face in obtaining express consent from customers to use FRT (with the signage considered insufficient by the OAIC); and

(ii) the need to always consider if, in the circumstances, there are less invasive means available.

2.   What is the technology actually doing and does it use AI?

Understanding the mechanics of FRT is critical to ensuring you can explain how the technology works to affected individuals and fulfil your transparency obligations under UK data protection law. Does the technology include hardware (e.g. CCTV cameras) and software (e.g. software that matches faces to a database or to unique hexadecimal reference numbers)? If the FRT is AI-enabled, there are heightened risks involved, including bias. While not directly applicable in the UK, the risk mitigation measures in the newly passed EU AI Act offer a useful framework for risk management and potentially indicate what a similar UK law might look like. Retailers may wish to consider these mitigations when contemplating the use of AI.

3.   What kind of data is being collected and processed?

There is a distinction between collecting biometric data (e.g. a person's face, voice or fingerprints) and special category biometric data (i.e. biometric data used for the purpose of uniquely identifying an individual). The latter is considered more sensitive and carries the risk of bias and profiling. As such, you will require a valid condition for processing it under UK data protection law, such as substantial public interest.

Serco Leisure – special category data

Serco provides an example of the difficulty retailers can face in establishing a lawful basis for processing special category personal data.

On 19 February 2024, the Information Commissioner's Office (ICO) found, following an extensive investigation dating back to 2019 when an employee first complained about Serco using FRT in the workplace, that the special category biometric data of more than 2,000 employees had been unlawfully processed. In making its determination, the ICO concluded that Serco had failed to show why it was necessary or proportionate to use biometric data, as opposed to ID cards, when employees clocked in or out of work. The ICO ordered Serco to stop using FRT.

The ICO's finding acts as an apt reminder of the need for retailers to constantly assess whether they can achieve their objectives without using FRT. The ICO has indicated that using FRT to monitor attendance is likely to be non-compliant with data protection laws.

4.   What have we done to mitigate the risks involved?

Have you prepared a data protection impact assessment? Have you undertaken due diligence on the supplier of the technology? Engaging with relevant stakeholders across your business (e.g. your DPO, IT team and HR) will simplify these processes. Ensure you've carefully reviewed the supplier agreement to understand how risk and responsibility are allocated, and undertaken an information security assessment to satisfy yourself that the technology is secure.

5.   Have you ensured that your internal governance structure is updated?

Think about what internal policies and procedures you will need to implement or update to support the introduction of the technology. Inform your employees and customers that they will be subject to FRT by updating your internal and external privacy information (e.g. your privacy policies). For a belt-and-braces approach, place clear public signage outside your premises alerting affected individuals to the use of the technology on the premises, and consider how you will respond to any objections you receive.

Whilst these are fundamental legal considerations, equally important are reputational considerations: what your customers and employees think of your organisation. Do your customers think there are less invasive measures that could be used to prevent crime? Do employees deem ID badges a sufficient way to monitor attendance at work? Asking for customer and employee feedback on FRT and/or running trials are useful ways to engage with the people most important to your business.

The rise of FRT and similar technologies raises questions and so should you. Gather the information you need to understand what the technology you are rolling out is intended to achieve, how it works, how liability is apportioned and what risk mitigation measures will be implemented. To save face, ask questions and be prepared to answer them.
