Out-Law News | 11 Apr 2017 | 10:37 am | 3 min. read
The Information Commissioner's Office (ICO) has identified a discrepancy between the definition of profiling in the GDPR and the way profiling is described elsewhere in the Regulation. The ICO has published its findings in a new discussion paper (28-page / 390KB PDF).
Under the GDPR, profiling is defined as "any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements".
Article 22 of the Regulation provides people with a qualified right "not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her".
The ICO said, though, that because the definition of profiling "makes no mention of 'solely' automated processing" and article 22 does, it is "debatable … whether 'automated processing' means purely automated, or whether human involvement at any stage takes the processing out of the definition".
"The interpretation of the word 'solely' … requires further consideration," the ICO said. "However, we think it is intended to cover those automated decision-making processes where a human exercises no real influence on the outcome of the decision, for example where the result of the profiling or process is not assessed by a person before being formalised as a decision."
In its paper, the ICO also said that to avoid having to rely on "the subjective view" of businesses and consumers over what effects profiling has, "it may be useful to establish an external recognised standard to measure such effects".
The ICO also warned businesses engaging in profiling that such activity "creates new data that needs to be GDPR compliant in its own right". It said that profiling is inherently less transparent than other data processing activities, which raises issues about how businesses "give effective and timely fair processing" and account for the reasonable expectations of data subjects.
"Organisations may find it difficult to decide when and how to give fair processing about profiling to individuals, both from a practical and technological perspective," the ICO said. "Profiling can be a continuous, evolving process, with new correlations discovered all the time."
The ICO pointed businesses towards its privacy notices code of practice for guidance on how best to present information about profiling to people so as to ensure that the processing of their personal data is fair.
The ICO also warned businesses engaged in profiling that they must ensure that the personal data they are working with is accurate.
"Decisions may be made based on outdated or inaccurate data or on the basis of the incorrect interpretation of external or third party data," the ICO said. "Errors or bias in collected or shared data can increase the risk of an organisation making inaccurate classifications or incorrect decisions."
"Organisations should have robust procedures in place to protect the quality and accuracy of the personal data they process. They should have ways of testing their systems and the algorithms they use to demonstrate that the data is accurate and free from bias," it said.
The GDPR sets out when businesses can engage in profiling. Decisions based on automated processing will be permitted if such decisions are necessary for entering into, or performance of, a contract between the data subject and a data controller, or where the person being profiled has given their explicit consent to that activity.
In addition, profiling will be permitted under other laws applicable within the EU as long as those laws contain "suitable measures to safeguard the data subject's rights and freedoms and legitimate interests". The GDPR specifically refers to laws that allow for profiling for "fraud and tax-evasion monitoring and prevention purposes" as examples.
Where the data being processed is particularly sensitive and qualifies as a 'special category' of personal data, tighter rules will apply to profiling. In such cases, profiling will be permitted only if the data subject has given explicit consent to the processing, or if it is necessary for reasons of substantial public interest, and "suitable measures to safeguard the data subject's rights and freedoms and legitimate interests are in place".
A recital in the Regulation states that individuals subject to decision-making solely by automated processing should, among other things, have "the right to obtain human intervention" and be provided with "an explanation of the decision reached after such assessment".
In its discussion paper, the ICO said businesses engaged in profiling "should consider clarifying: the categories of data used to create a profile; the source of the data; and why this data is considered relevant", rather than "providing a detailed technical description about how an algorithm or machine learning works".
In another discussion paper on big data and the GDPR, published earlier this year, the ICO said that, in order to comply with the new laws, businesses would have to understand and explain the rationale behind the decisions taken by the machine learning algorithms they use.