The role of standards in improving data quality
Growing reliance on data heightens the importance of data quality and of the standards to which it should be held. The Forum identified a lack of consensus on data standards in the financial services sector, including agreement on good practice, and noted that there may be challenges in applying existing data standards to AI. It was also noted that data standards developed as part of the open banking regime may prove useful when using AI in financial services.
When looking at data quality, financial services businesses must ensure that any data collected and used, whether from customers or third parties, always complies with existing legislation and regulatory principles relating to data. In particular, compliance with data protection law is a core requirement, as the nature of AI-related services and products offered in financial services – from chatbots to tools offering assistance with credit applications, providing insurance quotes, and data analysis – will commonly entail the processing of personal data.
When planning to deploy AI or other new technologies that will involve the processing of personal data, businesses should first carry out a data protection impact assessment. A review, and any necessary update, of internal audit processes is also advised and can assist with ensuring compliance with data protection principles, including those relating to anonymised and pseudonymised data where data is aggregated and used or sourced from third parties. Best practice and non-binding guidance is emerging in this area, including from the UK Information Commissioner's Office, and further guidance is likely: the recent Kalifa review of UK fintech recommended the creation of new guidance to help UK fintechs understand how financial services regulatory rules apply to AI and how AI should be used in the context of UK data protection laws.
Additionally, organisations must review internal processes and standards to ensure that AI use does not impair their ability to comply with other regulatory requirements, such as the European Banking Authority's guidelines on outsourcing where relevant in respect of data and third parties. Increased data sharing and use of data by AI systems to improve financial services must align with a financial institution's existing regulatory obligations.
Data strategy
The Forum also looked at the need to create a framework for auditing AI, noting that there is currently no best practice for doing so. Responsibility for creating a framework, the scope of audit and the subject of audit – whether the AI model, data or both – were also discussed.
In the absence of universal standards on auditing AI, financial services providers should consider how they can ensure that the AI systems they are using can be adequately evaluated and assessed, noting the need for transparency and explainability. Consideration should be given to whether AI is auditable at all stages of the AI lifecycle, from the development of models and data sets through deployment to AI outputs.
The Forum recognised that current financial audit and control frameworks could assist in developing good practice for controls and the use of data relating to AI. One suggested approach to creating a risk management and governance framework for AI-related issues was to develop AI risk principles and map them against an organisation's existing frameworks.