Out-Law Analysis

The future of chatbots and online chat in financial services


The significant increase in the use of chatbot technology and online live agents on chat platforms by financial services firms in recent years reflects the rapid growth of more advanced artificial intelligence (AI) tools that help firms engage better with their customers.

Provided legal and regulatory requirements are properly taken into account, online chat solutions offer financial services firms a more efficient and effective means of dealing with customer queries.

Chatbots in current use

In financial services, chatbots and online live agents are predominantly used as part of customer service functions, but they can also be deployed elsewhere, for example as IT helpdesks. They have proven to be a cost-efficient way for financial services firms to deal with customer queries and complaints. By using online chat platforms, firms can learn from a customer’s past choices and customise their experience by building a profile based on their interactions.
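
To illustrate that kind of profiling in the abstract, the short Python sketch below accumulates a simple profile from chat interactions. It is a minimal sketch only: the field names, such as past_topics and preferred_channel, are hypothetical and do not describe any particular vendor’s product.

from collections import Counter
from dataclasses import dataclass, field

@dataclass
class CustomerProfile:
    # Hypothetical profile fields; a real system would hold far richer data
    # and would itself be subject to the GDPR considerations discussed below.
    customer_id: str
    past_topics: Counter = field(default_factory=Counter)
    preferred_channel: str = "chatbot"

    def record_interaction(self, topic: str, escalated_to_agent: bool) -> None:
        # Count what the customer asks about so later sessions can be tailored.
        self.past_topics[topic] += 1
        if escalated_to_agent:
            # A customer who keeps escalating may prefer a live agent next time.
            self.preferred_channel = "live_agent"

    def likely_topic(self) -> str | None:
        # Suggest the customer's most frequent topic as a shortcut next session.
        return self.past_topics.most_common(1)[0][0] if self.past_topics else None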

We know of one bank that recently deployed a chatbot to deal with customer queries and expects it to handle 10 million chats a month by 2024. Another bank is operating a new platform that gives customers the option to speak with a live agent if they do not want to engage with the chatbot. These live agents are available around the clock via the in-app chat support. That provider has reported that its chatbot can answer up to 26% of customer queries without them needing to be passed to a live agent.

Regulators have recognised the potential of chatbots in financial services too. The UK’s Financial Conduct Authority, for instance, highlighted in its guidance on vulnerable customers how an automatically generated chatbot can help vulnerable customers disclose their needs and get relevant assistance along their digital journeys.

Legal and regulatory considerations

Chatbots are likely to be classified as AI systems for the purposes of future regulation. In some jurisdictions, this will require the implementation of robust risk assessment processes.

Consideration will need to be given to the level of transparency consumers and regulators will expect regarding the development and deployment of the chatbot. Under most current regulatory frameworks, financial services providers are expected to disclose whether a person is engaging with a chatbot or a human adviser.

If the chatbot is making decisions which could have an impact on the customer’s financial interests, the customer should be made aware of its underlying features. The extent to which some of those features have limitations may also need to be disclosed.

In many cases, firms will need to rely on external providers to develop, supply, integrate and update their chatbot solutions and online or in-app chat platforms. Financial services firms will need to scrutinise the terms on which the service provider is engaged.

Performance indicators and testing regimes will need to be carefully devised. It is unlikely to be sufficient to assess performance solely by the number of customers whose requests were answered successfully without a human agent needing to intervene. Consideration will also need to be given to the data quality standards and processes relevant to the system’s development and use, and to the protections put in place against unfair bias and software maintenance issues. Such issues can arise if the bot relies on different data sources to feed machine learning models, has hidden dependencies on third-party labels, or incorporates unstable code resulting from quick iteration and experimentation. An illustrative set of indicators is sketched below.
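
By way of illustration only, the Python sketch below computes three indicators a firm might track alongside each other. The metric names and the session log fields (resolved_by_bot, escalated, fallback_triggered) are assumptions made for the example, not regulatory requirements.

def chatbot_kpis(sessions: list[dict]) -> dict:
    # Each session record is assumed, for this sketch, to carry boolean flags
    # 'resolved_by_bot', 'escalated' and 'fallback_triggered'.
    total = len(sessions)
    if total == 0:
        return {}
    return {
        # Share of queries answered with no human intervention.
        "containment_rate": sum(s["resolved_by_bot"] for s in sessions) / total,
        # Share of sessions handed over to a live agent.
        "escalation_rate": sum(s["escalated"] for s in sessions) / total,
        # Share of sessions where the bot failed to understand the query -
        # a rough proxy for data quality and model drift problems.
        "fallback_rate": sum(s["fallback_triggered"] for s in sessions) / total,
    }

Tracking the fallback rate alongside the containment rate matters because a bot can appear to ‘contain’ queries while actually misunderstanding them; reviewing failed sessions is one way to surface the data quality and bias issues described above.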

Chatbot technology is evolving rapidly, so firms will want to maintain the flexibility to change providers if another provider develops a product that better suits their requirements. Where online live agents are used to respond to customer queries, firms will need to consider the appropriate arrangements to put in place – especially if the live agents are provided through an outsourced service provider.

When a financial services firm deploys chatbots or online live agents, it is highly probable that personal data will be processed. Financial services firms will need to consider GDPR compliance issues and the lawful grounds for processing. If relying on customer consent to process personal data, firms will need to consider how that consent is acquired. Under the GDPR, consent is not validly obtained unless it is “freely given, specific, informed and unambiguous”. One way to obtain customer consent is to pair the consent mechanism with a privacy notice, as sketched below.
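
A minimal sketch of how such a consent record might be captured is set out below. The field names, such as privacy_notice_version, are assumptions made for illustration; any real implementation would need legal review.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentRecord:
    # Recording who consented, to what purpose, when, and against which
    # version of the privacy notice helps evidence that the consent was
    # freely given, specific, informed and unambiguous.
    customer_id: str
    purpose: str                 # e.g. "chatbot personal data processing"
    privacy_notice_version: str  # the notice shown when consent was given
    given_at: datetime

def record_consent(customer_id: str, purpose: str, notice_version: str) -> ConsentRecord:
    # Consent must be an affirmative act: this should only be called after an
    # explicit opt-in, never from a pre-ticked box or mere inactivity.
    return ConsentRecord(customer_id, purpose, notice_version, datetime.now(timezone.utc))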

Meeting the requirements for consent can impact the customer experience. Financial services firms should consider whether it would be less disruptive to rely on ‘legitimate interests’ as a lawful ground for processing. Under the GDPR, controllers do not need to obtain consent from users where they have a legitimate interest in processing the data and can show that the processing is necessary to achieve it. If firms rely on legitimate interests as a ground for processing, they should consider preparing a legitimate interests assessment. This will help them to determine that ‘legitimate interests’ is a proper lawful basis for the data processing envisaged and can help demonstrate compliance with the GDPR’s accountability principle.

With respect to lawful, fair and transparent processing, steps should be taken to inform the customer about the relevant processing, what personal data is collected via the chatbot, and how that data is used by the firm and the chatbot. This can be achieved by including the information in the privacy notice and implementing measures to bring it to customers’ attention before they engage with the chatbot.
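
One possible sequencing, sketched below with placeholder message text and a placeholder URL, is to gate the chat session behind a notice that discloses the bot and links to the privacy notice before any personal data is collected.

PRE_CHAT_NOTICE = (
    "You are chatting with an automated assistant, not a human adviser. "
    "We process the personal data you share here as described in our "
    "privacy notice: https://example.com/privacy"  # placeholder URL
)

def start_chat_session(customer_acknowledged: bool) -> str:
    # Show the notice and require acknowledgement before collecting anything.
    if not customer_acknowledged:
        return PRE_CHAT_NOTICE
    return "How can we help you today?"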

Financial services firms also need to consider intellectual property risk when using chatbots. In most situations, the firm will engage a provider to deploy the solution, so the ownership and development of any intellectual property should be considered at the outset of any arrangements with providers. Firms should consider whether they should own the solution that third parties provide, so that it can be developed further on the firm’s behalf in the future, or whether a change in provider would instead require a new solution to be installed.

Issues have arisen with chatbots that use machine learning tools. It is important to establish who is responsible if your chatbot responds to a customer in a way that might cause harm. If a chatbot uses abusive language or provides offensive responses to the customer, there is potential for the customer to bring a defamation claim against the firm or the service provider. To avoid these situations, it is important that the solution is rigorously tested and its outputs moderated before launch. Firms should also ensure that there are robust software-as-a-service (SaaS) agreements or licence agreements in place with any service providers.
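
As a crude illustration of that pre-launch testing, the sketch below runs a bot’s reply function against a set of probe prompts and flags any response containing blocked terms. The blocklist and function names are hypothetical; production moderation would rely on far more sophisticated classifiers.

from typing import Callable

BLOCKED_TERMS = {"stupid", "idiot"}  # placeholder blocklist for the sketch

def response_is_safe(response: str) -> bool:
    # Reject any reply containing a blocked term, case-insensitively.
    lowered = response.lower()
    return not any(term in lowered for term in BLOCKED_TERMS)

def pre_launch_test(bot_reply: Callable[[str], str], probe_prompts: list[str]) -> list[str]:
    # Return the prompts whose replies failed moderation, for human review.
    return [p for p in probe_prompts if not response_is_safe(bot_reply(p))]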

Chatbots can be cost savers

Seamless integration of chatbots is likely to become a major part of firms’ client engagement strategies. Many large financial firms are already moving towards responding to the majority of customer queries through online chat platforms and chatbots. This will significantly cut firms’ costs and the time spent dealing with customer calls.

Written by Carrie McMeel of Pinsent Masons.
