Out-Law News | 04 Sep 2014 | 11:48 am | 6 min. read
The Pinsent Masons financial services sector team bring you insight and analysis on what really matters in the world of financial services.
If 2012 was the year of cloud, when regulators issued opinions on how to handle the new technology trend, then 2014 is the year of big data. Each of the main regulatory bodies has had its say.
The European Commission has unveiled its strategy. It says that this will, amongst other things, provide "the right framework conditions for a single market for big data."
The UK's Information Commissioner's Office (ICO) has published a paper on big data and data protection (51-page / 470KB PDF), and while the overarching EU data protection body, the Article 29 Working Party, considered big data last year, this summer it contributed further to the subject as part of a broader opinion.
Below we look at these developments and comment on the impact of each for the financial services sector.
The Commission's view: 'towards a thriving data-driven economy'
The Commission's figures indicate that the global market for big data technology and services will grow about seven times faster than the overall worldwide market for information and communications technology (ICT). On that projection, the market would reach $16.9 billion in 2015, a compound annual growth rate of 40%. The expectation is that larger businesses will need to increase staff directly linked to big data activities by 240% over the next five years.
These projections are subject to progress being made in establishing an effective legal framework to govern access to and use of data in a big data context, in the Commission's view. The Commission has suggested that to enable such progress the EU should focus a portion of its public research and development spending on alleviating "legal bottlenecks". It intends to "make sure that the relevant legal framework and the policies, such as on interoperability, data protection, security and IPR are data-friendly, leading to more regulatory certainty for business and creating consumer trust in data technologies."
To achieve these goals the Commission has identified two factors that need to be addressed: complexity in the current legal environment, and access to large datasets and enabling infrastructure. A failure to address these two factors has, it says, contributed to stifling innovation up to now. The Commission has not, however, proposed legal solutions that directly address these issues.
To promote legal certainty the Commission advocates rapid implementation of its current data protection and network and information security reform packages. But while implementation of these reforms may improve certainty, they do not seek to simplify the regulatory process and are therefore unlikely to address the concern that the regulatory framework is too complicated.
In relation to enabling greater access to large datasets and related infrastructure the Commission has identified 'open standards' and 'data interoperability' as priorities and said that "it will support the mapping of existing relevant standards for a number of big data areas" including for financial services. The benefit of firms investing time and resources into a project of this nature however, depends on whether greater flexibility can be built into the data protection reform packages currently being negotiated and finalised at EU level by the Council of Ministers.
If the reform packages, for example, were to enable greater use of pseudonymised data, where identifying elements are replaced with pseudonyms, there may be good reason for firms to participate in this process. On the other hand, if they have the effect of further restricting the circumstances in which data can be used where remote privacy risks exist, the benefits of frameworks for open standards and data interoperability reduce.
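Neither the Commission nor the reform packages prescribe a particular pseudonymisation technique, but a minimal sketch may help illustrate the concept. The example below, which assumes a keyed hash (HMAC) approach with a hypothetical secret key and record layout of our own devising, replaces a direct identifier with a stable pseudonym so that records can still be linked for analytics without exposing the identifier itself:

```python
import hashlib
import hmac

# Hypothetical secret key, held under separate controls from the dataset.
# Because the key allows pseudonyms to be linked back to individuals,
# pseudonymised data is generally still treated as personal data under EU rules.
SECRET_KEY = b"example-key-kept-under-separate-controls"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable pseudonym via a keyed hash."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Illustrative customer record (all values invented).
record = {"name": "Jane Doe", "account": "GB29NWBK60161331926819", "balance": 1042.50}

# Strip direct identifiers, keeping a pseudonym so the record can still be
# joined with other datasets for analytics.
pseudonymised_record = {
    "customer_ref": pseudonymise(record["account"]),
    "balance": record["balance"],
}
```

The same input always yields the same pseudonym, which preserves linkability across datasets; that linkability is precisely why regulators distinguish pseudonymisation from full anonymisation.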
The Commission also says that it will work with EU countries and other stakeholders in preparing guidance on "issues such as data anonymisation and pseudonymisation, data minimisation, personal data risk analysis, and tools and initiatives enhancing consumer awareness." Businesses operating in the financial services sector should keep track of these developments and participate in the formation of any formal guidance documents whatever the outcome of data protection reform.
The Article 29 Working Party's view
Last year the Working Party said that businesses should focus on the purpose for which they have obtained data and consider whether any further 'new' purposes for which they seek to use data are compatible with that original purpose, which is known as the compatibility test. In an opinion the Working Party said that in respect of big data analytics initiatives "even more so than elsewhere, there is a need for a rigorous but balanced and flexible application of the compatibility test to ensure it can be applied in our modern, networked society."
Factors to be considered in applying this compatibility test include a consideration of the context in which data was collected, the expectations of the person to whom the data relates, the nature of the data, that is, whether highly sensitive or not, and the impact use of the data could have on the person to whom it relates, for example, whether its use would lead to negative conclusions being drawn, affecting them financially or personally.
If data is to be used specifically for the purpose of drawing an inference about a person to whom it relates, the Working Party's view is that "free, specific, informed and unambiguous 'opt-in' consent would almost always be required, otherwise further use cannot be considered compatible." Interestingly, the Working Party does not distinguish between positive and negative inferences, presumably meaning that consent would be needed regardless of the final outcome – therefore, even where used to decrease an insurance premium or reimburse a customer.
However, in a further opinion published this year, the Working Party acknowledges that there may be some circumstances where consent is not required. Initiatives which "let individuals 'share the wealth' created by big data and incentivise developers to offer additional features and applications to their users" could "help tip the balance" between protecting a person's privacy and data security concerns and a business' legitimate interests in analysing data. Under data protection laws, businesses do not need to first obtain consent to use personal data if they have a legitimate interest in using that data and that legitimate interest does not prejudice a person's privacy or other fundamental rights.
The ICO's view
Having had the opportunity to consider the Commission's views and input into the Working Party's opinion, the ICO has now released its own big data guidance. Like the Working Party's opinion, the ICO addresses the issue of balancing a business' legitimate interests against the privacy rights of individuals.
Differing in emphasis from the Working Party opinion, however, it states that "The benefits cannot simply be traded with privacy rights." It also indicates that whether or not data can be re-purposed for analytics depends on organisations asking at least three questions: does the organisation intend to use personal data, or has it effectively anonymised all personal data; if it intends to use personal data, how was that data obtained; and is the intended processing 'fair'?
In terms of anonymisation techniques the ICO says: "The issue is not about eliminating the risk of re-identification altogether, but whether it can be mitigated so it is no longer significant." It advises that "Organisations should focus on mitigating the risks to the point that the chance of re-identification is extremely remote." The inclusion of 'extremely' suggests that the ICO is moving closer to the views of its regulatory counterparts in other EU member states which hold a narrow interpretation of the circumstances in which data can be re-used without consent.
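The ICO's guidance does not prescribe how to assess re-identification risk, but one common way of reasoning about whether a dataset can be singled out is k-anonymity: the smallest number of records sharing the same combination of 'quasi-identifiers' (attributes such as age band or postcode area that, in combination, can identify someone). The sketch below is our own illustration, not an ICO-endorsed test, using invented field names and data:

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the smallest group size sharing the same quasi-identifier values.

    A higher k means any individual is harder to single out; k = 1 means at
    least one record is unique on those attributes and readily re-identifiable.
    """
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Illustrative dataset with identifiers already generalised into bands/areas.
records = [
    {"age_band": "30-39", "postcode_area": "EH1", "spend": 120},
    {"age_band": "30-39", "postcode_area": "EH1", "spend": 310},
    {"age_band": "40-49", "postcode_area": "G1", "spend": 95},
    {"age_band": "40-49", "postcode_area": "G1", "spend": 220},
]

k = k_anonymity(records, ["age_band", "postcode_area"])  # k = 2 here
```

A firm might set a minimum acceptable k and generalise or suppress attributes until it is met; where on that scale re-identification risk becomes "extremely remote" in the ICO's sense remains a judgment call.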
In terms of the second question the ICO says: "Long term uses must be articulated or justifiable, even if all the detail of the future use is not known." If a decision is made to re-use data for analytics purposes and consent has not been obtained it is therefore important justifications are documented. Those justifications could include details of the balance struck between the purpose of the use of the data (the business' legitimate interest) and the prejudice such use may cause to an individual to whom that data relates.
The key for the ICO, however, seems to be the outcome of assessments as to whether a use of data for analytics purposes is 'fair'. Determining whether an intended use of data is fair, according to the ICO, depends on whether that use would have been "within customer expectations." The guidance sets out the well-known example of US retailer Target correlating the dates of purchases of certain baby-related products with pregnancy due dates. Although a father had complained that his teenage daughter was inappropriately receiving targeted advertising relating to pregnancy products, it transpired that Target's predictive model was correct – the daughter was pregnant.
The ICO says that this is a good example to highlight "issues of fairness and customer expectations that can arise in the context of big data analytics." The guidance however falls short of saying whether this is an example of circumstances that should have been within customer expectations or whether Target (had it been subject to EU laws) would have been in breach of data protection fairness requirements.
Given that organisations worldwide are beginning to rely on big data analytics to predict and model consumer behaviour, answers are needed as to whether it is within customer expectations that, when products are purchased or services are procured, related products and services may be marketed to that customer. More clarity is therefore needed on this central question.