Out-Law Analysis | 30 May 2022
Emerging EU legislation will require financial services businesses to ensure artificial intelligence (AI) systems they use meet strict requirements for limiting risk as well as obligations on operational resilience.
The dual requirements under the proposed new EU AI Act and Digital Operational Resilience Act (DORA) can be viewed in the round by financial institutions to reduce compliance burdens where they use AI systems, such as chatbots or credit-assessment tools, to engage better with consumers.
The draft AI Act aims to create harmonised rules for a proportionate, risk-based approach to AI in Europe. The regulation, if passed in its current form, would introduce a strict regime and mandatory requirements for ‘high risk’ AI systems, such as those used to assess creditworthiness or establish credit scores; transparency requirements for specific types of AI, such as chatbots; and a ban on certain uses of AI. The proposed new legal framework would apply to all sectors, public and private, including financial services. It extends to providers and users located in the EU as well as those based in other jurisdictions. The regulation will not apply to private, non-professional use of AI.
The DORA proposals are designed to consolidate and harmonise existing requirements for information and communications technology (ICT) within the EU. In particular, DORA looks to enhance and streamline the requirements around ICT risk management, establish thorough testing of ICT systems, create new requirements for responding to cyber risks and ICT-related incidents, as well as introduce powers for financial supervisors to oversee risks arising from financial entities’ dependency on ICT third-party service providers. DORA also includes specific provisions in relation to critical third-party providers, such as cloud service providers.
The scope of DORA is very broad and applies to most entities within financial services including credit institutions, payment institutions, insurance and reinsurance undertakings, credit rating agencies, and ICT third party service providers, in relation to the use and provision of ICT services. The definition of ICT services is relevant to determining which third party arrangements will be subject to its requirements.
It is proposed that activities considered to be “ICT services” under DORA include “digital and data services provided through the ICT systems to one or more internal or external users, including provision of data, data entry, data storage, data processing and reporting services, data monitoring as well as data based business and decision support services”. Should the final text of DORA cover all digital and data services in this way, it would also capture “AI systems” that fall within the scope of the AI Act.
While the final version of DORA has been agreed by EU law makers, it has not yet been published publicly. The definition of ICT services may therefore still change; according to most reports, however, the broad coverage of ‘digital and data services’ will remain.
“AI systems”, under the AI Act, are considered to be software that “can, for a given set of human-defined objectives, generate outputs such as content, predictions, recommendations, or decisions influencing the environments they interact with”. The services provided using AI “software” in a financial services context will include a number of the types of services considered by DORA as ICT services, such as data monitoring, data processing and data based business and decision support services. Third party service providers that provide cloud services, software and data analytics services would also be required to comply with the requirements under DORA.
ICT governance, management of risk and oversight of third party providers of ICT are core elements of DORA. They also play an important role in establishing trustworthy, ethical, and lawful use of AI.
Under the DORA proposals, financial services businesses would be required to implement an ICT risk management framework “which enables them to address ICT risk quickly, efficiently and comprehensively”. Any risk management framework would need to be documented and reviewed at least once a year as well as following any major ICT related incidents and after testing or audit processes are complete.
Financial services businesses would also be required to identify all sources of ICT risk and assess cyber threats and ICT vulnerabilities in respect of their ICT related business functions and information assets.
Where AI systems are being used, it is expected that financial institutions will amend existing governance processes or create specific AI processes to address risks which may arise from AI decision making. This is particularly important where “high risk” AI systems are being used and/or where personal data is processed by AI.
It is therefore important that financial services businesses undertake work to understand the parallels between the approaches taken in each regulation. Streamlining approaches towards implementing each regulation will likely lead to reduced cost, and improved risk management and resilience.
The establishment of a single, well-documented framework which clearly sets out the risks which arise specifically from the use of AI systems, including any cyber risks, and measures to address those risks, can help ensure compliance with DORA and the AI Act, as well as other applicable regulation such as the General Data Protection Regulation. Consideration of the risks arising from processing personal data under data protection impact assessments will also be useful in forming a compliant risk management framework under DORA and the AI Act.
Financial services businesses must adequately protect ICT systems and minimise the impact of risk by putting in place appropriate ICT security tools, policies, and procedures. DORA requires financial services businesses to maintain “high standards of security, confidentiality and integrity of data, whether at rest, in use or in transit”. In doing so, policies and procedures should look to reduce the risk of unauthorised access to, and corruption or loss of, data, and “limit the physical and virtual access to ICT system resources and data to what is required only for legitimate and approved functions and activities”.
With AI systems relying heavily on data to learn and produce outputs, security of systems and data – whether training data, input data or output data – is important. Security policies and procedures implemented in accordance with the requirements of DORA should look to also address AI-specific concerns and risks, alongside security requirements for the broader ICT network across the financial services business.
It is proposed under the AI Act that high risk AI systems be “resilient” in relation to “attempts by unauthorised third parties to alter their use or performance by exploiting the system vulnerabilities”. The draft legislation also recognises that specific “technical solutions” will be required to ensure that the security of high risk systems is appropriate to “the relevant circumstances and the risks”.
According to the draft AI Act, processes to address AI specific vulnerabilities should include, where appropriate, “measures to prevent and control for attacks trying to manipulate the training dataset (‘data poisoning’), inputs designed to cause the model to make a mistake (‘adversarial examples’), or model flaws”.
To assist with assessing risks, preparing for ICT related incidents, and identifying issues in the digital operational resilience of a business’s ICT systems, financial services businesses are likely to be required under DORA to establish, maintain, and review a digital operational resilience programme. This should form part of the financial institution’s ICT risk management framework and should include a range of tests including vulnerability assessments and scans, open source analyses, physical security reviews, questionnaires and scanning software solutions, source code reviews where feasible, scenario-based tests, compatibility testing, performance testing, and end-to-end testing or penetration testing.
The proposed requirements for testing of systems under DORA should prove to be useful in respect of managing and governing the use of AI. In particular, regular and extensive testing of systems should assist with complying with requirements under the AI Act around risk and quality management, and accuracy, cybersecurity, and robustness of systems. Processes for testing under DORA may also help with meeting data and data governance principles under the AI Act.
The draft DORA highlights the importance of third party service providers putting in place and testing business continuity plans to “guarantee a secure provision of services” by a financial institution. As part of business continuity processes, financial institutions should also put in place disaster recovery plans which are subject to independent audits. They should also develop back-up policies that note the scope of data that will be backed up, the frequency of the backup, and the options for recovery of data.
These proposals broadly align with requirements specific to AI systems under the draft AI Act, which states that “the robustness of high-risk AI systems may be achieved through technical redundancy solutions, which may include backup or fail-safe plans”. Placing requirements on service providers to implement clear and robust business continuity measures and back-up plans can also provide financial services businesses with comfort, and can help service providers ensure that data used by AI systems is managed effectively and that the risks of loss or corruption of that data are reduced.
Managing risk of third party providers of ICT services is at the heart of the DORA proposals. Its requirements extend to services which are sub-contracted by a service provider. The draft regulation emphasises that financial services businesses “should thoroughly assess contractual arrangements to identify the likelihood for such risk to emerge, including by means of in-depth analyses of sub-outsourcing arrangements, notably when concluded with ICT third-party service providers established in a third country”. An assessment of the benefits and risks arising from a potential sub-contracting arrangement should be carried out.
Financial services businesses will also be required under DORA to ensure that contracts with third party service providers of “critical” or “important” functions include a clear description of all functions and services to be provided by the service provider, those that will be sub-contracted, and the terms which apply to the sub-contracting arrangement. Details of the locations where the services and sub-contracted services will be provided, and where data is to be processed and stored, should also be included.
Under the plans, third party service providers would be contractually required to notify the financial institution if any of those locations change. Principles on the “sound management of ICT third party risk” are also likely to apply to sub-contractors, who fall within the definition of “third party risk” under DORA.
It is common for financial services businesses to outsource the provision of AI-related services to a third party technology provider. In doing so, financial services businesses should be aware that AI providers may sub-contract elements of the provision of AI-related services, such as in relation to training of systems, developing of algorithms, and collection or management of data sets. Where AI providers do so, requirements under DORA will be relevant and should assist with managing the overall risk associated with using AI systems to provide services.
The range of issues set out in both DORA and the AI Act will be familiar to all those accountable for managing and responding to sourcing and third party risk. Ensuring that regulatory remediation activities are not undertaken in isolation will help financial services businesses put in place protections that reflect the underlying objective of both regulatory developments: to reduce the risk of technology leading to unwarranted customer harm and business disruption.
Co-written by Priya Jhakra of Pinsent Masons.