The ICO has warned organisations of the need to ask key questions around data and fairness when procuring AI tools to assist with their staff recruitment processes.
Many recruiters may be looking to procure these tools to improve the efficiency of their hiring process, helping to source potential candidates, summarise CVs and score applicants. However, the ICO warns that, if not used lawfully, AI tools may negatively impact jobseekers who could be unfairly excluded from roles or have their privacy compromised.
A recent ICO audit of several providers and developers of AI tools for the recruitment industry uncovered considerable room for improvement, such as ensuring personal information is processed fairly and kept to a minimum, and that candidates are clearly told how the AI tool will use their information.
The regulator made almost 300 clear recommendations for providers and developers to improve their compliance with data protection law, all of which have been accepted or partially accepted. Its audit outcomes report summarises the key findings from these audits, as well as practical recommendations for recruiters wishing to use these tools.
Ian Hulme, ICO director of assurance, said: “AI can bring real benefits to the hiring process, but it also introduces new risks that may cause harm to jobseekers if it is not used lawfully and fairly. Organisations considering buying AI tools to help with their recruitment process must ask key data protection questions to providers and seek clear assurances of their compliance with the law.”