Out-Law Analysis

Consumer fairness to feed into FCA's review of role of big data in insurance, says expert


FOCUS: In its upcoming study of insurers' use of 'big data', the Financial Conduct Authority (FCA) is likely to take more of an interest in how customer data is used than in how it is collected, so firms should look at the work of other regulators to decide how best to prepare.

The FCA announced in its business plan for 2015/16 that it would begin a market study to "investigate how insurance firms use big data" later this year. We don't yet have many details about what this study will consider, but previous FCA statements on big data imply that its focus will be the interaction between the regulator's conduct risk rules - such as treating customers fairly and acting in the best interests of clients - and the wider application of data protection principles.

According to its business plan the FCA intends to look at insurers' use of web analytics and behavioural data tools, including social media, as part of its study, together with the use of "other unconventional data sources" by insurers.

Firms should be thinking now about whether they have the processes and controls in place to demonstrate the fair treatment of customers and potential customers whenever data and analytics tools are used to evaluate them, or to make decisions or predictions about them, for product and premium pricing, marketing and other purposes.

Big data: the FCA's current position

The FCA intends to "identify potential risks and benefits for consumers, including whether the use of big data creates barriers to access products and services" as part of its market study. It will also "examine the regulatory regime to ensure that it does not unduly constrain beneficial innovation in this area".

To give this some context, it is worth going back to the FCA's annual Risk Outlook publication from 2013. That paper was the first in which the regulator discussed the growing use of big data in detail, both in terms of its potential benefits and its development as an area of regulatory concern.

On the positive side, the FCA pointed out the potential of big data to promote further innovation and customisation and to act as an enabler for better 'risk profiling' and more accurately and fairly priced products. But it also highlighted gender discrimination as a potential area of concern, indicating that it was already conscious of the risks of segmenting audiences on the basis of specific characteristics. The dangers of restricting access to core products on the basis of consumer characteristics, as well as the potential for misjudgements as a result of data corruption or a failure to validate information, were also highlighted.

At that time the FCA also took the view that not enough firms had developed suitable controls around the use of big data technologies. It said that firms would need to ensure that the technologies used were "suitable" and "able to process information into meaningful intelligence (e.g. through contextualisation)". It also said that firms needed to have controls in place that would identify distortions as and when they arose.

In its 2014 Risk Outlook the FCA again warned firms about the consequences of failing to put suitable controls in place. It said that firms that did not develop suitable controls around their use of big data technologies faced not only the prospect of regulatory sanctions but also the reputational damage that can flow from a data breach. Growing consumer sensitivity to the use of their personal information was also identified as an issue that needed to be considered, given the increased media attention paid to stories about data protection and the misuse of personal data.

In that year's Risk Outlook report the regulator also drew a direct connection between fairness, acting in consumers' best interests and the use of big data to price products and determine their suitability for clients. It said that careful consideration was needed before the output of data analytics tools was used in the context of "existing long term products that customers may hold, particularly where early repayment charges or fees could apply".

In those circumstances, firms needed to consider whether taking account of factors revealed by the output of a big data analytics process that had not been considered in an initial decision was consistent with the requirement to act in customers' best interests. The FCA said there was a risk that a firm would not be acting in its customer's best interests if it used new information to re-price a product or "alter the terms and conditions post-sale to offset changes in the consumer’s risk profile".

Fairness

The need for fairness and transparency in the use of big data is an important part of data protection law, which requires data controllers to process personal data "fairly and lawfully". It will continue to be an important compliance requirement even after the proposed General Data Protection Regulation is finalised at EU level. In its July 2014 paper on big data and data protection, the Information Commissioner's Office (ICO) emphasised the need for 'due diligence' to ensure fair data use, particularly where firms obtain personal data from external sources for big data analytics purposes.

In further explaining its approach to fairness in the big data context, the ICO highlighted that there is a "danger" of algorithms being used "in a way that perpetuates stereotyping or even bias" and that the quality of the data used needs to be monitored. It also said that consent to use data for analytics purposes will be necessary where use of data "is not intrinsic to the delivery of the service" for which the customer provided it.

Also in relation to consent, the ICO suggested that it is important to consider consent not just at the point at which a consumer enters into an initial relationship with an organisation, but throughout the whole life cycle of the customer relationship. Businesses should also focus on the 'value exchange' – highlighting how customers benefit in return for handing over their data – rather than simply thinking in terms of a strict compliance strategy.

An international view

Of course, it is not just within the UK that regulators are looking carefully at the issues generated by the use of big data. The Article 29 Working Party, a group made up of data protection authorities from across Europe, has already published a number of opinions touching on aspects of data protection law that relate to big data.

Its opinions on purpose limitation – the requirement that the purposes for which personal data are collected be identified clearly and specifically – and on anonymisation and legitimate interests are all worth considering in the lead-up to the FCA's review.

Preparing for the FCA's review

While the direction the FCA will take in its review, and the specific issues it will consider in depth, remain to be seen, previous announcements by both the FCA and privacy regulators offer some practical pointers to the steps that proactive insurers can take ahead of the publication of its findings.

Luke Scanlon is a technology and financial services law expert at Pinsent Masons, the law firm behind Out-Law.com
