Out-Law News

AI freedom of information ruling sends warning to HMRC

A ruling ordering HM Revenue and Customs (HMRC) to hand over information about its use of artificial intelligence (AI) when dealing with research and development (R&D) claims highlights the importance of transparency when using AI, an expert has warned.

Thomas Elsbury, who owns a software-based tax claims business, appealed against the Information Commissioner’s decision that HMRC was entitled, in response to a freedom of information (FOI) request Elsbury had made, to refuse to confirm or deny that it held information about its use of large language models (LLMs) and generative AI, such as ChatGPT, within its R&D Tax Credits Compliance Team.

The First-tier Tribunal (Information Rights) has now overturned the Information Commissioner’s decision and ordered HMRC to confirm whether it holds the information requested and, if it does, either to supply the information to Elsbury or to serve a refusal notice under section 17 of the FOI Act explaining its grounds for refusal. The tribunal also warned that HMRC’s change in the basis on which it had sought to avoid disclosure was “beyond uncomfortable”.

Jake Landman, a tax expert at Pinsent Masons, said: “HMRC is openly stating that it is using AI to enhance efficiency, for example in its Transformation Roadmap published in July – but the approach HMRC has taken to dealing with this request suggests that it is reluctant to explain how and when it is using it.”

Elsbury had made the FOI request to HMRC to seek confirmation about how it used LLMs within the R&D tax credits compliance team; how those models had been trained and selected; how data was secured; and HMRC’s policies around using AI tools.

HMRC originally said it held the information but would not disclose it, citing section 31(1)(d) of the FOI Act on the basis that disclosure could prejudice the assessment or collection of tax or duty. After Elsbury complained to the Information Commissioner, HMRC changed its position, saying it would neither confirm nor deny whether it held the information, relying on section 31(3) of the Act.

The Information Commissioner upheld HMRC’s decision. At the tribunal, Elsbury argued that HMRC’s use of AI was producing nonsensical responses to R&D claims, deterring small businesses from making such claims, and that HMRC had produced no evidence that disclosing the information would prejudice tax collection or that withholding it was in the public interest.

The tribunal agreed, stating: “The panel found compelling the Appellant's submissions that HMRC's failure either to confirm or deny it holds the requested information reinforces the belief based on indicators in HMRC correspondence dealing with R&D claims that AI is being used by HMRC officers - perhaps in an unauthorised manner - thus undermining taxpayers' trust and confidence in HMRC's treatment of claims, in turn discouraging legitimate claimants from making claims thereby hindering the policy objectives of the R&D tax relief scheme itself.”

HMRC has 35 days from the date of the ruling to confirm whether or not it holds the information and either provide it to Elsbury or serve a refusal notice and explain why.

Landman said the ruling highlighted the risks for HMRC of using AI without being transparent about how it does so.

“There have been a number of published tax cases where AI has been used by taxpayers in preparing their submissions for an appeal which has unfortunately generated hallucinated case law, leading to a waste of time and resources on all sides,” Landman added.

“This case will emphasise for HMRC that it must not fall into the same trap. Taxpayer confidentiality was heavily highlighted by Mr Elsbury – HMRC often uses taxpayer confidentiality and its statutory obligations relating to it to argue that it doesn’t have to disclose, but with the use of AI are they ensuring that it is maintained on their side? In addition, the mechanics of the tax legislation relating to HMRC decisions, assessments and related matters often refer to the actions of HMRC officers and so there is a potential tension if AI is used for HMRC decision making in a material way”.

Cerys Wyn Davies, an expert in AI and intellectual property law at Pinsent Masons, said: “Transparency is the cornerstone of public trust in the use of AI by public bodies. As artificial intelligence becomes more embedded in decision-making across all sectors, it is vital that organisations clearly explain their use of AI and demonstrate that in using AI they can be trusted with business confidential information and personal data.”

ICO guidance emphasises the requirements for accountability and transparency in AI deployment, including clarity over the use of taxpayer data, Wyn Davies added. That guidance requires public bodies to embed the principles of privacy, safety and security, transparency and explainability, accountability and fairness in their use of AI systems.

She added: “As Elsbury emphasised, this is particularly important when handling the confidential intellectual property information inevitably involved in R&D claims. Concerns that this information will be fed into LLMs, used to ‘train’ them, become accessible to others and create risks to R&D in the UK mean that taxpayers need to be reassured with appropriate transparency as to their use.

“To continue the innovation which AI can support in the public interest, it is essential to build public confidence in its responsible use. As noted by the tribunal: ‘transparency on HMRC's part is particularly important when AI's role in decision-making is a pressing concern globally’.”
