Facial recognition technology in employment: What you need to know

As in other areas of life, facial recognition technology (or “FRT” for short) has gained traction in the employment sphere over recent years, as employers explore new ways of working and managing employees.

Accelerating this general trend, the COVID-19 pandemic, with its enforced move to home working for many employers and employees, has prompted employers to consider new ways of monitoring and managing staff, including FRT.

In this article, we look at how FRT works, where we are seeing its use in the employment sphere, and some of the risks and pitfalls when implementing FRT at work.

  1. FRT – how it works

    Facial recognition is the process of identifying or verifying the identity of a person using their face. FRT captures, analyses and compares patterns based on the person's facial details. In simple terms, it works by applying algorithms in three steps (a simplified code sketch appears after the list):

    • Step 1: detecting and locating a human face or faces in images and videos.
    • Step 2: transforming the analogue information (a face) into a set of digital data based on the person's facial features, which can then be compared.
    • Step 3: verifying whether two images match, based on the digital information collected.
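
    For the technically minded, these three steps can be illustrated in code. The short Python sketch below uses the open-source face_recognition library purely as an example; commercial FRT products implement the same conceptual pipeline with their own proprietary models, and the file names here are illustrative.

      import face_recognition

      # Step 1: detect and locate faces in a captured image.
      image = face_recognition.load_image_file("office_entrance.jpg")  # example file
      locations = face_recognition.face_locations(image)

      # Step 2: transform each detected face into a set of digital data
      # (here, a 128-number encoding of the facial features).
      encodings = face_recognition.face_encodings(image, known_face_locations=locations)

      # Step 3: verify whether the captured face matches a stored reference image.
      reference = face_recognition.load_image_file("employee_on_file.jpg")  # example file
      reference_encoding = face_recognition.face_encodings(reference)[0]

      if encodings:
          match = face_recognition.compare_faces([reference_encoding], encodings[0])[0]
          print("Identity verified" if match else "No match")
      else:
          print("No face detected")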

  2. How are employers using FRT?

    FRT has become more widespread in the workplace over the past few years and is now used in multiple ways in the employment context.

    Streamlining recruitment

    FRT-based software can be applied to analyse the facial expressions, vocabulary and tone of voice of candidates and then rank each candidate based on an automatically generated ‘employability’ score. Firms such as Vodafone, Unilever and Intel, as well as the likes of Goldman Sachs and JP Morgan, are all reportedly using this type of software to improve efficiency in recruitment. Unilever is reported to have claimed that its average recruitment time has been cut by 75% as a result of such processes.

    Attendance tracking and time-recording

    Traditional attendance tracking and clock-in / time-recording systems are open to abuse, particularly at sites where large numbers of people regularly come and go. FRT was adopted early in certain sectors, such as manufacturing and construction, where it can be difficult to monitor on-site attendance accurately and where the practice of workers clocking in for absent colleagues could be an issue. It has also seen growth in other sectors in recent years. The software usually requires the worker to enter a unique PIN code and then stand in front of a camera while their identity is verified by FRT.
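
    By way of illustration only, that two-step flow might look something like the following sketch, again using the open-source face_recognition library as a stand-in for proprietary systems. The enrolment store, PIN and file names are hypothetical.

      import face_recognition

      # Hypothetical enrolment store mapping PIN -> face encoding captured at
      # enrolment (in practice, a secured database rather than a dict).
      ENROLLED = {
          "4821": face_recognition.face_encodings(
              face_recognition.load_image_file("enrolment_photo_4821.jpg")
          )[0],
      }

      def clock_in(pin, camera_frame_path):
          """Return True if the PIN is known and the live face matches enrolment."""
          known = ENROLLED.get(pin)
          if known is None:
              return False
          live = face_recognition.face_encodings(
              face_recognition.load_image_file(camera_frame_path)
          )
          if not live:
              return False  # no face visible in the camera frame
          return bool(face_recognition.compare_faces([known], live[0])[0])

      print("Clock-in accepted" if clock_in("4821", "camera_frame.jpg") else "Clock-in rejected")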

    Monitoring activity and productivity

    The use of FRT in monitoring employee productivity has accelerated in 2020. As COVID-19 imposed homeworking arrangements on many businesses, employers have looked for ways to maintain oversight of their workforce remotely.

    One increasingly common monitoring solution grants the employer access to an employee’s camera and uses FRT to monitor when the employee is present. We have seen a particular growth in this type of solution during lockdown, perhaps reflective of the common concerns employers cite regarding remote working arrangements (see our article on employee monitoring for further discussion of this). This has been particularly prevalent in the financial services sector where (as noted below) security compliance is a specific concern, but we are increasingly seeing employers in other sectors considering these types of solution.
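
    As an illustration of the mechanics (not of any particular vendor's product), presence monitoring of this kind can be as simple as periodically sampling the webcam and logging whether a face is visible. The sampling interval and log format below are assumptions.

      import time
      from datetime import datetime

      import cv2
      import face_recognition

      capture = cv2.VideoCapture(0)  # the default webcam
      try:
          for _ in range(10):  # short demonstration run
              ok, frame_bgr = capture.read()
              if not ok:
                  break
              # face_recognition expects RGB; OpenCV captures BGR.
              frame_rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
              present = bool(face_recognition.face_locations(frame_rgb))
              print(f"{datetime.now().isoformat()} present={present}")
              time.sleep(60)  # sample once a minute
      finally:
          capture.release()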

    Compliance & security

    FRT is increasingly being applied to restrict access and to check the identity of workers as they enter and exit the workplace, or certain areas of the workplace.

    Firms are also increasingly adopting FRT to assist with security and compliance obligations. Unsurprisingly, this is more prevalent in highly regulated areas, such as the financial services sector, where companies are subject to strict compliance obligations.

    For example, earlier this year, PricewaterhouseCoopers (“PwC”) developed an FRT-based tool that would allow clients to track their employees whilst they worked from home. The system reportedly uses employees’ webcams to log absences from their desks and forces them to give a written explanation for time spent away from their computer screens. PwC said in a statement that they are “developing [the] technology specifically to support the compliance environment required for traders and front office staff in financial institutions. Crucially it is designed to support those adhering to the regulations while remote working, in the least intrusive, pragmatic way.”

    More recently, we have seen companies considering pairing thermal cameras with facial recognition software and personnel directories as part of the COVID-19 health & safety protocols adopted as staff return to the workplace. This is a complicated area (explored further in our article here and our international guide here).

  3. What about processing this data?

    The use of FRT in the employment context involves the processing of an employee’s personal data and therefore the GDPR and the Data Protection Act 2018 (“DPA 2018”) apply. Deployment of FRT in the workplace must be necessary and proportionate to the aims the employer is seeking to achieve (e.g. preventing and detecting unlawful acts committed in the workplace). Where such aims can be achieved through less intrusive methods, the deployment of FRT may not be lawful. Any interference with fundamental rights (including the employees’ right to data protection and privacy) must be demonstrably necessary, rather than just convenient, with the standard for this test becoming more stringent as the interference becomes greater.

    FRT involves the processing of biometric data

    FRT involves the processing of biometric data, irrespective of whether the image yields a match or the biometric data is subsequently deleted within a short space of time. Biometric data used for the purpose of uniquely identifying an individual is a special category of personal data (i.e. sensitive data). Employers must therefore ensure (i) that they have a lawful basis for each of the purposes for which personal data is processed by the FRT; and (ii) that they also identify a special category condition.

    • The relevant lawful basis and the relevant condition may differ depending on the purposes for which the FRT is deployed. FRT could be used for preventing crime, for authentication, or for other purposes. Each of these applications of the FRT could require different lawful bases and special category conditions.

    • Processing special category data is prohibited unless one of an exhaustive list of conditions applies. The relevant conditions are limited, particularly for employers. Generally, the two conditions relevant to FRT are: (1) explicit consent; and (2) substantial public interest. The former is difficult in the employment context as consent is unlikely to be deemed “freely given”, taking into account the imbalance of power in the employment relationship. Therefore, substantial public interest would be the most viable condition on which employers would seek to rely.

    Artificial intelligence (AI)

    FRT relies on AI to function, and it is therefore important for employers to ensure that the vendor supplying the AI tool has carefully considered its obligations under data protection law. The Information Commissioner’s Office (“ICO”), the UK data protection supervisory authority, has released guidance on AI, including draft guidance on the AI auditing framework. Employers should be comfortable with how the technology works and satisfied that it will not inadvertently lead to discrimination, a risk which can be even more pronounced in the employment context, as discussed below.

    Automated decision-making

    FRT often involves the use of automated decision-making. Article 22 GDPR prohibits solely automated individual decision-making which produces legal or similarly significant effects unless one of the exceptions in Article 22(2) applies: the processing is based on performance of a contract, is authorised by EU or UK law, or is based on the individual’s explicit consent (noting that an additional condition must also apply where special category data is involved).

    To the extent that FRT could be used to take decisions with a legal or similarly significant effect on employees, employers should consider whether the processing can benefit from one of those exceptions (i.e. contractual necessity, legal authorisation or explicit consent). Examples of legal effects given in guidance issued by the European Data Protection Board (“EDPB”) include denying someone an employment opportunity or putting them at a serious disadvantage – both of which are relevant to the common uses of FRT in the employment sphere.

  4. What are the main issues with FRT?

    Aside from the complexities in safely processing the data, as discussed above, employers should also carefully consider additional risks before implementing tools incorporating FRT, including the following.

    Accuracy, bias and discrimination

    One of the key areas of concern with the use of FRT is the accuracy of FRT systems, and the risk of algorithmic race or sex discrimination or bias. The Equality and Human Rights Commission’s report to the UN on civil and political rights in the UK highlighted evidence indicating that many automatic FRT algorithms disproportionately misidentify black people and women, and therefore operate in a potentially discriminatory manner.

    As an example, a recent review of the FRT developed by Microsoft, IBM and Face++ (which combines facial recognition and artificial intelligence software) found that the software misidentified 0.8% of lighter-skinned males, compared to 34.7% of darker-skinned women. These statistics are reflected across similar products.

    Critics point to the under-representation of women and people of colour in the training and benchmark data sets used to create the algorithms as one of the main reasons for this bias. As a result, the FRT is less able to distinguish between the faces of individuals in these groups.
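
    To make the point concrete, a bias audit of an FRT system typically compares misidentification rates across demographic groups, along the following lines. The figures below are hypothetical, not the study data cited above.

      from collections import defaultdict

      # Hypothetical evaluation results: (demographic_group, correctly_identified).
      # A real audit would use a large, representative, labelled test set.
      results = [
          ("lighter_skinned_male", True), ("lighter_skinned_male", True),
          ("lighter_skinned_male", True), ("lighter_skinned_male", False),
          ("darker_skinned_female", False), ("darker_skinned_female", True),
          ("darker_skinned_female", False), ("darker_skinned_female", True),
      ]

      totals, errors = defaultdict(int), defaultdict(int)
      for group, correct in results:
          totals[group] += 1
          errors[group] += not correct

      for group in totals:
          print(f"{group}: misidentification rate {errors[group] / totals[group]:.1%}")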

    If an employer makes decisions solely or mainly based on FRT data (for example, to deny employment opportunities, or to discipline or dismiss an employee accused of disciplinary breaches, poor performance or absence identified primarily using FRT), the employer could be vulnerable to a discrimination claim. Employers should consider these risks carefully, as getting this wrong can have significant consequences: compensation awarded for successful discrimination claims is uncapped, and such cases are inevitably high-profile, attracting internal gossip and media attention.

    Separately, many FRT systems have reported difficulties in identifying the faces of individuals who are wearing masks – which is a practical problem for employers using such systems given the requirements regarding face masks during the COVID-19 pandemic.

    Privacy considerations

    It is common for employers to include the right to monitor employees as a term in their contracts of employment and/or in their IT and communications policies, but employers must appreciate that doing so does not give them carte blanche on monitoring activities. Given the nature of FRT, particular care should be taken to manage the associated risks to employee privacy rights.

    Excessive monitoring could constitute a breach of the right to respect for private and family life under Article 8 of the European Convention on Human Rights (“ECHR”). This right is engaged whenever there is a reasonable expectation of privacy, and it is well established that the concept of private life extends to the workplace. Whilst the Human Rights Act 1998 (which incorporates the ECHR into UK law) is only directly applicable to public authorities, it is still relevant to employers in the private sector as courts and tribunals are obliged to interpret all legislation consistently with ECHR rights. In the employment context, this means that an employment tribunal must consider the right to privacy where relevant (which it almost certainly will be, where FRT data is used to inform the employer’s decision making).

    Employers must be able to demonstrate that any form of monitoring they choose to implement, including FRT, is proportionate and necessary. This includes consideration of the underlying concern or purpose of the monitoring, whether this is a legitimate aim, and whether FRT is a necessary and proportionate solution to address these concerns, balancing the employees’ privacy rights against the employer’s interests. For further details on employee monitoring considerations, see our articles here and here. For international considerations, see our Country Comparisons Tool.

    Employee relations

    Monitoring and surveillance techniques used by employers are inherently unpopular with employees, and the more intrusive they are, the more employees dislike them.

    In 2018, the Trades Union Congress (“TUC”) released a report further to a study it had conducted on workplace surveillance covering c.2,000 workers. The report stated that 76% of participants believed the use of FRT to monitor employee activity to be unacceptable. More generally, the report highlighted that employees feel monitoring should be justified rather than arbitrary, and that anything too invasive or overly focused on one individual is widely seen as unacceptable.

    Failure to consult employees on the use of FRT, or disproportionate use of or reliance on FRT, could easily lead to a breakdown in trust between employer and employee, thereby damaging the relationship. This is particularly likely where FRT is used for monitoring employee activity and productivity, as many employees see the application of FRT for this purpose as, in itself, an expression of distrust.

    Employers should be mindful of the implied duty of trust and confidence, which exists in every employment relationship. The use of FRT, particularly for monitoring activity or productivity, is high-risk and could be argued to be in breach of this duty, something which could give an employee grounds to resign and claim constructive dismissal.

    Reputation and brand value

    Perceived misuse, poor employee engagement and/or failure to implement FRT in a proportionate and transparent way all have potential implications for the reputation and brand value of employers.

    PwC has been forced to explain the development and rollout of the FRT tool referred to above – its media response has emphasised that the tool was developed specifically with the financial services sector, and its strict compliance obligations, in mind.

    Barclays (which, along with The Telegraph newspaper, faced employee backlash and press scrutiny in relation to the rollout of OccupEye in 2017) was also forced to scrap a pilot of employee tracking software amid criticism from staff and unions earlier this year. The new system reportedly told staff to "avoid breaks" and recorded toilet trips as "unaccounted activity", which was not well received by employees.

    More generally, IBM has announced that it “no longer offers general purpose IBM facial recognition or analysis software”. Amazon and Microsoft have also retreated from selling this technology to the public sector. Looking internationally, Facebook is facing a class action in the US in relation to its use of FRT, Apple has been subject to repeated criticism regarding its FRT applications, and TikTok’s use of FRT has been scrutinised by regulators in both the US and Europe, all of which has attracted negative press and commentary. In Europe, the use of FRT at King’s Cross station (UK) and in schools (Sweden) has been subject to regulatory investigation and, in the latter case, a fine. All of these examples (and there are others) have been extensively reported in the press.

    In the UK, the Court of Appeal in R (Bridges) v Chief Constable of South Wales Police & Ors [2020] EWCA Civ 1058 held that the use of automated FRT by the police force breached privacy rights. Mr Bridges had challenged the use of FRT by South Wales Police (“SWP”) to scan faces in public places, automatically register those faces and then compare the images with those of people on the police’s watch list. The Court of Appeal ruled in August 2020 that this breached the right to privacy under Article 8 of the ECHR. It also held that SWP was in breach of data protection and equality legislation. This case was widely reported in the media and has potentially caused lasting reputational damage to South Wales Police.

  5. Top tips for implementing an FRT system

As the above examples demonstrate, FRT is particularly controversial, and employers should approach it with extreme caution. When considering the use of FRT in the employment context:

  • Consider (i) the relevant risk or concern and (ii) the desired outcome for which you are proposing to use FRT – is the use of FRT necessary and proportionate, and/or could you achieve the same outcome by a less intrusive method? If a less intrusive method would suffice, there may be serious risks in proceeding to deploy FRT in the workplace. Employment tribunals and national data protection authorities are likely to penalise excessive, inappropriate or unnecessary applications of FRT.

  • Make sure you notify employees clearly of (i) the existence, nature and extent of any FRT application, and (ii) the purpose in applying FRT, including the potential consequences or uses of information generated by those applications. If you intend to capture and use FRT stills or reports to performance manage or discipline employees, you need to make this explicitly clear to those employees.

  • Scrutinise third party vendors and offerings carefully, with particular regard to their data privacy compliance. You cannot avoid the risks by passing these on to external providers.

  • Consider data protection obligations carefully. Key compliance obligations include:

    • carrying out a data protection impact assessment (“DPIA”);
    • updating privacy notices and policies (including meaningful information about the logic involved if there is any automated decision-making);
    • establishing clear retention periods, access restrictions and security protocols for data retained by the FRT;
    • implementing an appropriate policy document and an extended record of processing where required;
    • implementing appropriate protocols in relation to employee data rights requests, including the rights of access, objection and erasure.

  • Expect pushback from staff and plan ahead – employee buy-in is crucial. Plan your employee communications carefully, anticipate sensitive areas and questions and think about your responses in advance.

  • If your plan is particularly controversial, notify your press advisors and be prepared to deal with negative publicity.
