Profiling under EU data protection reform: what the General Data Protection Regulation could change
Out-Law Analysis | 25 Feb 2015 | 4:58 pm | 6 min. read
From retail to publishing, public services to health, we are leaving digital fingerprints and trails of activity wherever we interact with organisations. Those organisations are keen to use this information to provide us with a better service, or to offer us a service that is more profitable for them.
They do this by creating a detailed picture of users, a practice called 'profiling'. It can benefit consumers, as it allows businesses to deliver personalised products and services, from an individualised shopping experience to cheaper car insurance. Or it can harm their interests by, for example, allowing an underwriting decision for home contents insurance to take account of additional information such as an individual's credit rating.
So there are privacy implications that are governed by EU law, and regulators are paying attention to the rules governing profiling as they amend and update Europe's privacy laws.
This isn't the first regulation of profiling. Existing data protection rules set out in 1995 address profiling in general terms. So what do those rules say and how might the legal environment and business responsibilities change under the new General Data Protection Regulation?
Profiling and the Data Protection Directive 1995
Current EU data protection laws stem from the Data Protection Directive of 1995. The Directive was implemented when the world was just getting a handle on the opportunities of the internet but before many of the technologies in use today were invented.
The approach taken to profiling in that Directive is general enough that it applies to computerised decision making about individuals.
Article 15 of the Directive on 'automated individual decisions' says individuals have a general right "not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him".
This means that individuals can take issue with automated processing used to evaluate their performance at work, their creditworthiness, reliability and conduct, among other things. In the UK, that means that an individual can request that a data controller revisit the decision with human intervention.
However, automated processing of the kind that produces legal effects concerning an individual, or which significantly affects them, is justifiable in certain cases under the Directive. This includes if processing is done "in the course of the entering into or performance of a contract", so long as the individuals' legitimate interests are recognised in the contract process, or otherwise by law.
The General Data Protection Regulation
The Data Protection Directive is being replaced amidst widespread agreement among EU policy makers, the legal community and beyond that the Directive is not fit for the digital age.
Although the Directive contains rules on 'automated processing', there is not a single mention of 'profiling' in the text. This is set to change, in light of the emergence of 'big data' technologies, under the proposed new General Data Protection Regulation.
Yet, in many respects, what is likely to follow does not differ greatly from the current rules on 'automated processing' under the Directive.
In the original draft of the Regulation (119-page / 436KB PDF), published in 2012, the European Commission said individuals should have a general right "not to be subject to a measure which produces legal effects concerning this natural person or significantly affects this natural person, and which is based solely on automated processing intended to evaluate certain personal aspects relating to this natural person or to analyse or predict in particular the natural person's performance at work, economic situation, location, health, personal preferences, reliability or behaviour".
Therefore, the Regulation expands on the list of examples and seems to suggest that "personal preferences" could cover highly targeted direct marketing. However, "significantly affects" seems a high threshold and it is difficult to envisage a marketing decision that would "significantly affect" an individual.
Conversely, the list could potentially extend to underwriting decisions. However, whether "significantly affects" would be limited to an accept/reject decision, as opposed to premium setting, remains unclear. Therefore, even though the Regulation extends the list of examples where the right not to be profiled applies, the "significantly affects" test means it is difficult to envisage a situation where the trigger for the right applying would materially differ from the current trigger under the Directive.
Under the Commission's plans, profiling would be permitted with individuals' "freely given specific, informed and explicit" consent, or without consent where the profiling is "carried out in the course of the entering into, or performance of, a contract", subject to certain conditions, or where other EU or national laws permit the activity providing "suitable measures to safeguard the data subject's legitimate interests" are in place.
The main change to the current regime proposed by the Commission was therefore that businesses would need individuals' explicit as opposed to unambiguous consent to process personal data in an automated manner for profiling purposes in many circumstances. This is consistent with the Regulation's general shift from unambiguous consent to explicit consent to legitimise processing, even where not fully automated.
Will MEPs have the final word?
The Commission's draft General Data Protection Regulation has been the subject of intense debate since it was published. The wording of the new rules must be agreed by both the European Parliament and the Council of Ministers (the Council).
Major changes to the Commission's draft have been proposed by MEPs and EU justice ministers within the Council, but a final text is likely to be many months away.
On profiling, the Parliament's proposals represent a tightening of the rules compared to the Commission's plans.
Like the Commission, MEPs want businesses to face a general ban from engaging in individual profiling without consent if it "leads to measures producing legal effects concerning the data subject or does similarly significantly affect the interests, rights or freedoms of the concerned data subject".
Such profiling would only be permitted without consent if the activity is otherwise "expressly authorised by law" or "carried out in the course of entering or performance of a contract".
Whatever the legal basis for profiling, however, businesses would be banned from engaging in profiling enabled solely by automated processing of personal data where it has a legal effect or significantly affects individuals, under the Parliament's plans. A human would need to be involved in assessing such measures stemming from profiling activities and explain decisions reached after that assessment.
In addition, under the Parliament's proposals, profiling that "has the effect of discriminating against individuals on the basis of race or ethnic origin, political opinions, religion or beliefs, trade union membership, sexual orientation or gender identity, or that results in measures which have such effect" would be banned.
Businesses would be required to "implement effective protection against possible discrimination resulting from profiling" and could also not engage in profiling based solely on the use of 'special categories' of personal data - particularly sensitive personal information.
However, the Parliament's proposals would leave it open to businesses to engage in profiling, and to rely on legal grounds other than consent to do so, where the profiling is "based solely on the processing of pseudonymous data".
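The Regulation does not prescribe any particular pseudonymisation technique, but a minimal sketch of one common approach (keyed hashing of a direct identifier, using an illustrative, assumed key) shows why pseudonymous data is treated as lower-risk: profiles can be built against a stable token, while the link back to the named individual exists only for whoever holds the key.

```python
# Illustrative sketch only, not a legal standard: replace a direct
# identifier (e.g. an email address) with a keyed hash. The result is
# pseudonymous rather than anonymous, because the holder of the secret
# key can still re-link the token to the individual.
import hmac
import hashlib

SECRET_KEY = b"replace-with-a-securely-stored-key"  # hypothetical key


def pseudonymise(identifier: str) -> str:
    """Return a stable token for `identifier` that cannot feasibly be
    reversed without the secret key."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()


# The same input always yields the same token, so behaviour can still be
# analysed per-token without storing the raw identifier alongside it.
token = pseudonymise("alice@example.com")
assert token == pseudonymise("alice@example.com")
assert token != pseudonymise("bob@example.com")
```

Whether data processed this way would count as "pseudonymous data" for the purposes of the Parliament's text would, of course, be a legal question rather than a purely technical one.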
While the Parliament's proposals are a concrete reflection of what MEPs will push for in final negotiations on the data protection reforms, the plans outlined by the Council are less set. However, they closely resemble the plans put forward by the Commission.
The Council's last recorded position (232-page / 700KB PDF) is that businesses making decisions about individuals' "personal aspects" through "automated processing" of personal data should generally need individuals' explicit consent where the decisions taken, including in a profiling context, have "legal effects" concerning individuals or "significantly affects" them.
The view of the regulators
In 2013 data protection authorities across the EU, represented through the Article 29 Working Party, gave their backing to the two-tiered data protection regime for profiling activities outlined by the European Commission and subsequently broadly supported by the Parliament and Council.
In an advice paper on how profiling should be addressed (4-page / 163KB PDF) under the new Regulation, the watchdog gave its support for individuals' explicit consent to be necessary where businesses wish to process their personal data for profiling activities that have a "significant effect" on their rights.
In these cases, the Working Party said businesses should be open with individuals about the use of their data for profiling and explain "the purposes for which the profiling is carried out and the logic involved in the automatic processing". It also backed further privacy safeguards, including measures to reduce the amount of data processed by businesses, incentives for the anonymisation or pseudonymisation of data, and "human intervention" in some cases.
When profiling does not "significantly affect" the individual's rights, the specific rules and safeguards should not apply, it said. Instead, the processing would be deemed legitimate providing general data protection rules are observed. In practice this would allow businesses to engage in less controversial profiling activities without individuals' consent if it is in their 'legitimate interests' to do so and where their interests are not trumped by the individuals' privacy rights.
Given the importance of this trigger, it will accordingly be imperative that regulators issue guidance clarifying when the "significantly affects" threshold will be met.
Kathryn Wynn is a data protection law expert at Pinsent Masons, the law firm behind Out-Law.com. This article is part of a series examining EU data protection reform. You can also read our views on what the reforms will mean for obtaining 'customer consent', data protection impact assessments and the duty to appoint data protection officers.