Kate Dodd tells HRNews about HR’s role in understanding AI technology used in the workplace

  • Transcript

    The Equality and Human Rights Commission has responded to the government’s White Paper on artificial intelligence regulation, warning that the proposed safeguards are inadequate and fall short of what is needed to tackle the risks to human rights and equality.

    Chair of the EHRC, Baroness Falkner, says that although AI comes with advantages, safety nets are necessary to protect people ‘from risks posed by unchecked AI advancement’ and there is a need for significantly more funding for regulators to ensure ‘AI does not worsen existing biases in society or lead to new discrimination’.

    This is not the first time the Commission has issued a warning about the risks that come with AI. Last September it published guidance on the use of artificial intelligence by public bodies in response to emerging evidence that bias built into algorithms can lead to less favourable treatment of people with protected characteristics such as race and sex. The Commission said it was particularly concerned about the use of AI in workplace applications, and specifically flagged recruitment as a high-risk area.

    The CIPD published advice for HR on this back in August in a thought-leadership piece, ‘Shaping artificial intelligence for your future business needs’, which stresses the importance of AI–human collaboration. Hetan Shah of the British Academy is quoted as saying: ‘AI may automate some jobs entirely out of existence but, in many cases, AI decision-making tools work best when collaborating with, and augmenting, human capabilities. This means that workers will need to understand the strengths and limits of any AI system they are working with. The biggest danger will come to organisations that rely on systems they do not understand without enough human oversight.’

    We agree: human oversight is essential, and there is no reason why that human should not be an HR professional, especially if the AI in question is making people-related decisions. Kate Dodd heads up the D&I consultancy Brook Graham, and the risk of bias when using AI is one she has been flagging consistently with clients. Earlier, Kate joined me by video-link to discuss it. I put it to her that HR needs to understand how the technology works before the business goes ahead and buys it:

    Kate Dodd: “Yes, absolutely, and it's really interesting because when you see how AI is marketed towards HR, the key message is efficiency: this is going to make your HR function so much more efficient, it's going to take out all the boring admin, the legwork, and we can do this for you. But the worrying thing is that we see very, very little in that advertising literature about how AI can possibly deal with the problem of bias, and how important it is, of course, to avoid bias in HR processes.”

    Joe Glavina: “So, ideally, HR needs to be involved in the procurement process?”

    Kate Dodd: “Yes, absolutely, and it's worrying, actually, to see the number of businesses where HR don't have a voice in that procurement process. This will often be led elsewhere in the business - maybe there is a sophisticated procurement team, or a working group looking at AI and how it can be used within the business - and it's very rare to have HR representation on that. So, one of the key things I would say our clients should be doing is making sure that HR have a voice in that procurement, and that it's not left until ‘here you go HR, here's some great piece of software’ which they then have to use.”

    Joe Glavina: “I know you’ve had recent experience of where things can go wrong, Kate. What happened in that case, without mentioning any names?”

    Kate Dodd: “Like many of my clients, this company was signed up to a disability framework. What they had done is they had committed that if an individual met the minimum criteria for a role, and if they had a disability that was recognised by the Equality Act, then they would be guaranteed an interview, and that's a really common thing to have in the systems of businesses that are accredited for disability purposes. Now, what the client hadn't realised is that the AI they were using to do their screening simply hadn't factored that in. So, what they were finding is that individuals who should have been guaranteed an interview, because that's what the company had publicly committed to do, were not being offered interviews and, of course, for them, the PR aspects of that were really, really worrying.”

    Joe Glavina: “So that’s a good example of bias in AI. So how do you avoid it?”

    Kate Dodd: “I think ensuring HR is involved in procurement is really important because HR is actually one of those areas where you need the human touch. The purpose of HR is to advocate for the people who work for that business, so you need to have HR people involved in the procurement processes, and you need to have HR people in there challenging, saying: look, these are the things that we've signed up to as a business around diversity and inclusion, whether it's disability or something else, these are the commitments that we've made, and can you please guarantee that the AI we're using is going to be able to take these things on board and, if not, then you need to have a workaround.”

    The Commission’s response to the government’s White Paper on artificial intelligence was published on 27 June, and there is a link to it in the press release on the Commission’s website. We’ve also put a link to it in the transcript of this programme.

    LINKS

    - Link to EHRC press release – ‘AI safeguards inadequate watchdog warns’

    - Link to EHRC guidance on the use of AI in public services

    - Link to CIPD thought-leadership article on AI
