Algorithmic Bias in Employment - What You Need to Know

Written By

Rob Collier-Wright

Senior Associate
UK

I am a senior associate in our Employment practice in London, specialising in both contentious and non-contentious employment law matters. I am part of the International HR Services team.

Algorithmic bias describes a situation in which systematic errors by an algorithm create unfair outcomes, often to the disadvantage of minority groups. It is not a new phenomenon. It has, however, garnered attention in the mainstream press since Ofqual’s recent school exam scandal,1 where an algorithm downgraded students from certain socioeconomic backgrounds.

Whilst algorithmic bias may arise in any sector of society, it is particularly important for employers to understand where algorithmic bias can occur in the employment sphere, why it happens, and what legal implications they may face if they do not mitigate the risks. The wealth of technological tools now available to employers for recruiting and managing staff makes this a particularly pressing issue.

Algorithmic bias in the employment sphere
Employers are increasingly implementing algorithmic decision-making technologies to streamline tasks throughout the employment lifecycle, from recruitment to dismissal. As with many aspects of commercial life, COVID-19 has accelerated this process, with 60% of organisations adopting new technology or management practices within the first six months of the pandemic.2

There is clear evidence of algorithmic bias in AI tools. For example, Buolamwini and Gebru found that three commercial facial-analysis systems correctly identified the gender of lighter-skinned men more than 99% of the time, whilst the error rate for darker-skinned women was nearly 35%.3

Algorithmic bias most commonly occurs where an algorithm (usually a deep learning algorithm which ‘learns’ from large amounts of data) has not been provided with sufficiently diverse training data. This causes the algorithm to develop blind spots and produce inaccurate or biased decisions.
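
To make the mechanism concrete, the sketch below is a purely illustrative example, not any vendor’s actual system: it trains a simple classifier on synthetic data in which one demographic group is heavily under-represented, then compares error rates between groups. The group names, sample sizes and parameters are all invented for illustration.

# Illustrative sketch only: synthetic data, invented group sizes and parameters.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_group(n, shift):
    """Generate synthetic two-feature data whose true decision
    boundary sits in a different place for each group."""
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n) > 2 * shift).astype(int)
    return X, y

# Group A dominates the training data; group B is a small minority.
X_a, y_a = make_group(5000, shift=0.0)
X_b, y_b = make_group(250, shift=1.5)

X = np.vstack([X_a, X_b])
y = np.concatenate([y_a, y_b])
group = np.array(["A"] * len(y_a) + ["B"] * len(y_b))

X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(
    X, y, group, test_size=0.3, random_state=0)

model = LogisticRegression().fit(X_tr, y_tr)

# The model fits the majority group well but develops a "blind spot"
# for the under-represented group, whose error rate is far higher.
for g in ("A", "B"):
    mask = g_te == g
    error_rate = 1 - model.score(X_te[mask], y_te[mask])
    print(f"group {g}: error rate {error_rate:.1%} (n={mask.sum()})")

Because the minority group contributes so few training examples, the model’s single decision boundary is fitted almost entirely to the majority group, and the minority group’s error rate balloons; this is the “blind spot” effect described above.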

Legal implications
UK employers have legal obligations to avoid algorithmic bias under both equality laws and data protection legislation (for example, the UK Data Protection Act 2018 (DPA 2018) and the General Data Protection Regulation (GDPR)). 

First, under the Equality Act 2010, employers are prohibited from discriminating throughout the employment relationship, including during recruitment and dismissal. A number of forms of discrimination are prohibited, including direct discrimination and indirect discrimination:

  1. Direct discrimination occurs where an employer treats an employee or job applicant less favourably because of the individual’s protected characteristic.4 The protected characteristics are age, disability, gender reassignment, marriage and civil partnership, pregnancy and maternity, race, religion or belief, sex and sexual orientation.5 Direct discrimination could occur where an algorithm relies on protected characteristics or proxy data when making a decision. This is more likely to happen where an algorithm aggregates large quantities of data. Depending on the nature of the alleged discriminatory conduct, there may be no need to prove the employer’s intention (or indeed knowledge),6 so an employer’s ignorance of algorithmic bias giving rise to discrimination will not necessarily relieve them of liability.

  2. Indirect discrimination occurs where an employer applies a provision, criterion or practice (PCP) to all employees or job applicants which puts individuals with a protected characteristic at a particular disadvantage.7 PCPs are interpreted broadly, and algorithmic decision-making systems would likely count as a PCP. Their use may therefore give rise to indirect discrimination, save to the extent the employer can show the use of algorithmic decision-making systems is objectively justified, i.e. that it is a proportionate means of achieving a legitimate aim. This may be difficult in practice.

Second, employers must comply with the data protection principles of lawfulness, fairness and transparency; the fairness principle in particular can encompass non-discrimination obligations.
 
Furthermore, the GDPR does not allow solely automated decision-making which produces legal or similarly significant effects (which the UK regulator, the ICO, says includes “e-recruiting” practices8) unless:

  1. authorised by law (which requires compliance with additional requirements set out in section 14 of the DPA 2018); 

  2. “necessary” to enter into or carry out a contract with the individual; or

  3. the individual has freely given their explicit consent.

Consent is problematic in the employment context because of the imbalance of power in the employment relationship. Even where explicit consent is legitimately obtained, the employer must still provide a system allowing the individual to appeal against an automated decision and have the decision reviewed by a person.

Employers are also obliged under the GDPR to undertake a data protection impact assessment (DPIA) where, inter alia, automated decisions produce significant legal effects or special category data is processed on a large scale.9 The classification of data as special category data loosely aligns with a number of the protected characteristics under the Equality Act 2010. Even the mere retention and/or use of special category data to train, test or trial algorithms (for example, to check for biases), whether before or after they are deployed for real use with employees or job applicants, can raise issues under the GDPR and DPA 2018.

Rights of enforcement
Enforcement of rights in relation to the discriminatory or otherwise unlawful use of algorithmic decision-making technologies has not yet been the subject of material litigation in the UK, although UK “gig economy” workers have asserted their GDPR rights in cases elsewhere in Europe.10

Employees or job applicants who consider they have been treated unlawfully as a result of algorithmic decision-making would have the usual rights of enforcement depending on the nature of the unlawful conduct, i.e.:
  1. For discrimination under the Equality Act 2010, employees or candidates may bring claims in an Employment Tribunal within three months of the alleged unlawful conduct. Compensation for financial loss arising from discrimination is potentially uncapped and employers can also be ordered to pay “injury to feelings” and personal injury awards to compensate employees for non-financial loss; and 

  2. For breaches of data protection obligations, employees can seek compensation and enforcement under the GDPR and DPA 2018 in court, or report a GDPR breach to the Information Commissioner’s Office. This could lead to heavy fines, business disruption, and reputational damage.

Practical consequences for employers
Employers have a range of obligations to meet in order to remain compliant with the Equality Act 2010 and the GDPR. Potential strategies that employers and developers could utilise include completing comprehensive equal opportunities and data protection impact assessments before embarking on the use of algorithmic tools, as well as regular review and audit of the relevant technology; a simple illustration of such an audit check appears after this paragraph. Diversifying datasets and informing users of where and how AI is being used may also help to avoid algorithmic bias and reduce the risk of claims. In the EU, new legislation has been proposed that would permit the lawful use of special category data, such as personal data relating to health or ethnicity, to check for biases in “high risk” software.11
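
By way of illustration, a very simple audit of a screening tool’s outputs might compare shortlisting rates across groups. The sketch below is an assumption-laden example: the data is invented, and the 0.8 threshold is borrowed from the US EEOC “four-fifths” rule of thumb rather than from any test under the Equality Act 2010.

# Illustrative audit check only: invented data; the 0.8 threshold is the
# US EEOC "four-fifths" rule of thumb, not an Equality Act 2010 test.
from collections import defaultdict

# (group, shortlisted?) pairs, e.g. exported from a screening tool's logs.
outcomes = [
    ("group_1", True), ("group_1", True), ("group_1", False), ("group_1", True),
    ("group_2", True), ("group_2", False), ("group_2", False), ("group_2", False),
]

totals, shortlisted = defaultdict(int), defaultdict(int)
for grp, was_shortlisted in outcomes:
    totals[grp] += 1
    shortlisted[grp] += was_shortlisted

rates = {grp: shortlisted[grp] / totals[grp] for grp in totals}
benchmark = max(rates.values())

for grp, rate in rates.items():
    ratio = rate / benchmark
    flag = "investigate" if ratio < 0.8 else "ok"
    print(f"{grp}: selection rate {rate:.0%}, ratio vs highest {ratio:.2f} [{flag}]")

A ratio flagged in this way would not of itself establish indirect discrimination, but it identifies tools whose outputs warrant closer human scrutiny.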

Organisations such as the Institute for the Future of Work also recommend ensuring that algorithmic decisions are always evaluated by a trained human decision-maker before implementation.12 Whilst algorithmic bias is a challenging issue to navigate, developers and employers can reduce the risk of breaching their legal obligations with careful preparation and planning, and effective communication.
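
In practice, that recommendation can be built into the tooling itself. The sketch below is hypothetical (all names are invented) and simply shows the shape of a human-in-the-loop workflow: the algorithm only ever recommends, a named reviewer records the final decision, and the stored record doubles as an audit trail if the decision is later challenged.

# Hypothetical human-in-the-loop decision record; all names are invented.
from dataclasses import dataclass

@dataclass
class Recommendation:
    candidate_id: str
    score: float        # the model's suitability score
    suggested: str      # e.g. "shortlist" or "reject"

def record_final_decision(rec: Recommendation, reviewer: str,
                          decision: str, reason: str) -> dict:
    """The reviewer, not the model, supplies the final decision; storing the
    model's recommendation alongside it creates an audit trail for review."""
    return {
        "candidate": rec.candidate_id,
        "model_score": rec.score,
        "model_suggestion": rec.suggested,
        "final_decision": decision,
        "reviewed_by": reviewer,
        "reason": reason,
    }

print(record_final_decision(
    Recommendation("c-042", 0.81, "shortlist"),
    reviewer="hr_reviewer_1", decision="shortlist", reason="meets the role criteria"))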

 


1 https://schoolsweek.co.uk/a-level-results-2020-poorer-pupils-see-greater-drop-in-calculated-grades/ 

2 https://cep.lse.ac.uk/pubs/download/cepcovid-19-009.pdf 

3 http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf

4 Equality Act 2010, Section 13

5 Equality Act 2010, Section 4 

6 R (E) v Governing Body of JFS [2010] 2 AC 728

7 Equality Act 2010, Section 19

8 https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/individual-rights/rights-related-to-automated-decision-making-including-profiling/#ib3

9 General Data Protection Regulation, Article 35

10 https://www.adcu.org.uk/news-posts/app-drivers-couriers-union-files-ground-breaking-legal-challenge-against-ubers-dismissal-of-drivers-by-algorithm-in-the-uk-and-portugal

11 EU Proposal for a Regulation laying down harmonised rules on artificial intelligence (COM(2021) 206 final), Article 10(5).

12 https://www.ifow.org/publications/mind-the-gap-the-final-report-of-the-equality-task-force 
