A guide to responsible AI in recruitment

The UK government has released guidance on responsible AI in recruitment – here's how it can help employers to reduce bias and discrimination risks from AI recruitment systems.

Author: Bobby Ahmed, Managing Director

Bobby is a highly experienced Employment Law Solicitor and the Managing Director at Neathouse Partners. He has a wealth of knowledge on all aspects of Employment Law & HR, with a particular specialism in TUPE and redundancy.

Date: 18 June 2024
Updated: 30 June 2024

Increasing numbers of employers are turning to artificial intelligence (AI) to automate and streamline recruitment processes.

Examples include using AI to sift through CVs and applications to select candidates, assessing speech, body language and facial expressions during video-based interviews, and employing chatbots to automate parts of the recruitment process and filter out candidates.

AI has the potential to improve the candidate experience by answering queries more quickly and being more responsive. It can also save an employer significant time and cost during the hiring process, allowing staff to focus on other tasks. However, there are associated risks that all employers should be aware of, and we discuss them below.


How is AI being used in recruitment?


Here are just some of the ways that AI is revolutionising the recruitment process for employers:

CV screening

AI algorithms can quickly sift through large volumes of CVs, identifying the candidates that best match a job description based on keywords, experience and qualifications. This reduces the time and effort required for manual screening.
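
To make the idea concrete, here is a minimal, hypothetical Python sketch of keyword-based CV scoring. The keywords, scoring rule and shortlist size are illustrative assumptions only; commercial screening tools are far more sophisticated and typically use natural language processing rather than simple keyword counts.

```python
# Hypothetical illustration only: score CVs by how many job-description
# keywords they contain, then shortlist the highest-scoring candidates.
from dataclasses import dataclass


@dataclass
class Candidate:
    name: str
    cv_text: str


def keyword_score(cv_text: str, keywords: list[str]) -> int:
    """Count how many of the job description's keywords appear in the CV."""
    text = cv_text.lower()
    return sum(1 for keyword in keywords if keyword.lower() in text)


def shortlist(candidates: list[Candidate], keywords: list[str], top_n: int = 5) -> list[Candidate]:
    """Rank candidates by keyword score and keep the strongest matches."""
    ranked = sorted(candidates, key=lambda c: keyword_score(c.cv_text, keywords), reverse=True)
    return ranked[:top_n]


if __name__ == "__main__":
    job_keywords = ["payroll", "HRIS", "employment law", "onboarding"]
    applicants = [
        Candidate("Applicant A", "Experienced in payroll, onboarding and HRIS administration."),
        Candidate("Applicant B", "Background in marketing and event management."),
    ]
    for candidate in shortlist(applicants, job_keywords, top_n=1):
        print(candidate.name)  # Applicant A
```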


Candidate sourcing

AI tools can search multiple online platforms, such as job boards and social media, to identify potential candidates who are a good fit for open positions. They can also surface passive candidates who are not actively looking for a new role but possess the desired skills and experience and might be interested in a position.


Chatbots for initial communication

AI-powered chatbots can engage with candidates in real-time, answering questions they may have about a role and company, and even conducting preliminary screening interviews. This helps keep candidates engaged and informed throughout the recruitment process.


Interview scheduling

Tools powered by AI can free up company time by automating the scheduling of interviews. The AI can coordinate the availability of candidates with interviewers, reducing the back-and-forth communication and time that is typically involved in arranging meetings.
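
As a rough illustration of the underlying logic, the sketch below finds interview slots that a candidate and all interviewers have in common. The slot labels and availability data are purely hypothetical; real scheduling tools integrate with calendars and handle time zones, durations and rescheduling.

```python
# Hypothetical illustration only: an AI scheduler's core task reduces to
# finding slots that the candidate and every interviewer have marked free.
def common_slots(candidate_slots: set[str], interviewer_slots: list[set[str]]) -> set[str]:
    """Return the interview slots available to the candidate and all interviewers."""
    free = set(candidate_slots)
    for slots in interviewer_slots:
        free &= slots
    return free


if __name__ == "__main__":
    candidate = {"Mon 10:00", "Tue 14:00", "Wed 09:00"}
    interviewers = [
        {"Tue 14:00", "Wed 09:00"},  # hiring manager
        {"Mon 10:00", "Tue 14:00"},  # HR representative
    ]
    print(sorted(common_slots(candidate, interviewers)))  # ['Tue 14:00']
```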


Video interviews

AI can be used to analyse video interviews, assessing candidates’ responses, facial expressions and body language to provide insights into their suitability for a role. These tools can also help to ensure consistency and reduce bias in the evaluation process.


Skill assessments and pre-interview tests

AI-driven platforms can administer and evaluate skill assessments and tests, providing objective data on candidates’ abilities and competencies. This helps ensure that hiring decisions are based on merit rather than personal judgments.


Predictive analytics

Historical hiring data and current market trends can be analysed in detail by an AI to predict which candidates are likely to be successful in a given role. This helps recruiters make more informed decisions and improves the quality of new hires.
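
The following simplified sketch, which assumes scikit-learn and an entirely made-up dataset, shows the general shape of this approach: a model is fitted on historical hiring outcomes and then estimates a success probability for a new candidate. As discussed later in this article, historical data can embed past bias, so any such model needs careful auditing.

```python
# Hypothetical illustration only: fit a simple model on made-up historical
# hiring outcomes, then estimate a new candidate's probability of success.
# Assumes scikit-learn is installed; real features and data would differ.
from sklearn.linear_model import LogisticRegression

# Each row: [years of experience, skills test score]; label 1 = successful hire.
X_history = [[1, 55], [3, 70], [5, 82], [7, 88], [2, 60], [6, 90]]
y_outcome = [0, 0, 1, 1, 0, 1]

model = LogisticRegression()
model.fit(X_history, y_outcome)

new_candidate = [[4, 78]]
probability = model.predict_proba(new_candidate)[0][1]
print(f"Estimated probability of success: {probability:.0%}")
```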


Diversity and inclusion

AI can help to mitigate biases in job descriptions, screening processes and candidate evaluations, promoting diversity and inclusion in the hiring process.


Onboarding new recruits

AI can also assist with onboarding by providing new hires with personalised training programs and answering common questions to ensure they have the information they need to get started in their role.


Employment law risks of using AI in recruitment



Under the Equality Act 2010, job applicants, workers and employees are all protected from discrimination based on any protected characteristic, such as disability, race, or sex.

Using AI systems in recruitment can pose some risks – the main one being unintentionally biased results and discrimination. Bias can be introduced in various ways, including through the data used to train the AI tool and the design of its algorithm.

For instance, back in 2018, Amazon's AI recruitment tool, which was trained on a decade's worth of applicant data predominantly from men, learned to prefer male candidates, allegedly discriminating against women.

AI systems also risk excluding or discriminating against applicants who may lack proficiency in or access to technology due to factors like age or disability. Employers must make reasonable adjustments to the recruitment process for applicants with disabilities to ensure they do not face substantial disadvantages compared to non-disabled applicants.

This can be challenging when AI systems are involved, especially if employers or candidates do not fully understand these systems or how they make decisions based on data patterns.


What is the new 'responsible AI in recruitment' guidance?


The government guidance is non-statutory and has been developed with feedback from the CIPD and other organisations. It reflects the government’s five AI regulatory principles: safety, security and robustness; appropriate transparency and explainability; fairness; accountability and governance; and contestability and redress.

Although the principles are not currently enshrined in law, existing regulators such as the Equality and Human Rights Commission (EHRC) and the Information Commissioner’s Office (ICO) are expected to apply and enforce them within their remits. Employers and recruiters should therefore take note of the guidance and consider “assurance mechanisms” (ways of testing AI systems) when procuring AI tools.


What should employers do?


Before using an AI tool for hiring purposes, employers should clearly outline its purpose and desired outcomes. These purposes should be made clear to the supplier of the AI tool so that it can be integrated smoothly with existing systems. Training employees in how to use the AI tool, and consulting newly onboarded hires for feedback, is also important to ensure the AI is being used appropriately. Feedback channels might include chatbots, surveys or a dedicated contact email.

Employers need to consider potential legal risks, including discrimination and data protection, when using AI in recruitment processes. To reduce these risks, they should carry out an AI impact assessment and a data protection impact assessment, and create an AI governance framework. They should also collaborate with suppliers to understand the AI tool's functionality and identify risks, and perform tests on the AI system, including a bias audit.
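
As an illustration of what one bias-audit check might involve, the sketch below compares an AI tool's selection rates across groups sharing a protected characteristic. The data and group labels are hypothetical, and the 0.8 cut-off reflects the 'four-fifths' auditing heuristic rather than any UK legal test; a real audit would be considerably broader.

```python
# Hypothetical illustration only: compare the tool's selection rates across
# groups and flag large disparities for review. The 0.8 cut-off mirrors the
# 'four-fifths' auditing heuristic, not a UK legal threshold.
from collections import defaultdict


def selection_rates(decisions: list[tuple[str, bool]]) -> dict[str, float]:
    """decisions holds (group label, whether the tool shortlisted the candidate)."""
    totals = defaultdict(int)
    selected = defaultdict(int)
    for group, was_selected in decisions:
        totals[group] += 1
        if was_selected:
            selected[group] += 1
    return {group: selected[group] / totals[group] for group in totals}


if __name__ == "__main__":
    # Made-up outcomes logged from a pilot run of the screening tool.
    decisions = (
        [("group_a", True)] * 40 + [("group_a", False)] * 60
        + [("group_b", True)] * 25 + [("group_b", False)] * 75
    )
    rates = selection_rates(decisions)
    best_rate = max(rates.values())
    for group, rate in rates.items():
        ratio = rate / best_rate
        flag = "review" if ratio < 0.8 else "ok"
        print(f"{group}: selection rate {rate:.0%}, impact ratio {ratio:.2f} ({flag})")
```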

When piloting an AI tool, employers need to support employees with training to ensure the system is used correctly. They should carefully examine the AI tool’s performance against equality outcomes, thoroughly check for bias, and plan for any necessary adjustments to the technology, such as text-to-speech software for visually impaired candidates.

When candidates apply for a role, inform them clearly that AI is being used, specifying any limitations and how these might affect individuals. This helps applicants decide whether they need to request reasonable adjustments.

Finally, continuously monitor the AI system to identify potential issues, and address any problems quickly. Keep in mind that performance testing may not identify all potential issues or unintended consequences, including discriminatory outcomes, so maintaining open feedback channels is crucial for ongoing improvement and mitigation.


We can advise on using AI responsibly for recruitment


Wondering whether using an AI tool is right for your business?

Our professionals at Neathouse Partners can advise on how to use AI effectively for hiring processes, while mitigating risks and bias. Get in touch about how we can support your business.

Speak to us by calling 0333 041 1094 today or use our contact form.
