To help organisations understand whether they can rely on ‘legitimate interests’ processing, the government has proposed a list of recognised legitimate interests.
One example on the proposed new list is where the processing is necessary for detecting, investigating or preventing crime. This clarification may be particularly helpful for financial services firms in relation to their anti-money laundering and fraud prevention activities.
However, it is unclear whether organisations will still need to carry out some form of legitimate interests assessment to determine the necessity of processing for the desired purpose. What exactly is expected from a compliance perspective may be unclear until further guidance is provided.
On its list of recognised legitimate interests, the government has chosen to include only purposes relating to the public interest and not general commercial purposes. Including some business-as-usual purposes had the potential to be helpful for businesses, but would have been difficult to draft in a way that actually reduced burdens on them: organisations might have needed to undertake lengthy assessments of whether their activities fell within a listed purpose, which would have been counterproductive.
Subject access requests
UK data protection law provides individuals with a right to request a copy of the data organisations hold about them. Complying with data subject access requests (DSARs), however, can be a complex and burdensome process, where the costs and resources required to respond can escalate quickly. The Bill is designed to ease some of the challenges organisations face.
The Bill contains wider grounds on which organisations would be able to refuse to respond to requests in their entirety, or charge a fee, where they have determined the requests are “vexatious or excessive”. This replaces the current exemption that allows DSARs to be refused where they are “manifestly unfounded”.
It would be for organisations to determine if the “vexatious” threshold is satisfied. However, to assist organisations, the Bill lists a number of relevant circumstances to be considered, similar to those already listed in the ICO’s DSARs guidance.
Helpfully, the Bill also provides concrete examples of “vexatious” requests. This includes those intended to cause distress, which are not made in good faith, or which constitute an “abuse of process”.
Organisations will likely welcome these examples, as UK case law to date has confirmed that DSARs are to be treated as “purpose blind”. In contrast, these examples suggest the wider context in which the DSAR is made, such as ongoing litigation proceedings, could potentially be taken into account.
Further guidance on which requests are “vexatious or excessive” would be welcome. For example, it would be valuable for financial services firms to have clear guidance on whether they can consider bulk requests by claims management companies to be an “abuse of process”. Such bulk requests require organisations to respond to multiple requests within the same timeframe that applies to a single request, and may not always be directed at obtaining information that is helpful to the data subject.
It is to be hoped that the ICO’s promise to offer more guidance and resources to help businesses comply with UK data protection law will extend to addressing such issues as they arise in the context of DSARs.
Automated decision-making
One of the areas in which the government hopes data-driven innovation can flourish is in the context of the use of artificial intelligence (AI) systems.
In its response to its consultation on data protection reform earlier this summer, the government confirmed that it was considering amending existing rules on automated decision-making and profiling under UK data protection law, but wanted to “align proposals” with measures expected to be set out in an upcoming white paper on AI governance. The AI white paper is not expected until later this year, though the government did publish a paper outlining its proposed approach to the regulation of AI use in the UK on the same day as it published the Bill.
At the heart of the issue is the extent to which there should be human oversight of decisions taken by AI systems that affect individuals based on the processing of their personal data by those systems.
In the new Bill, the government has proposed to reframe existing provisions regarding automated decision-making in the terms of a positive right to human intervention. Individuals understandably are not always clear on what the existing right "not to be subject to a decision based solely on automated processing" really means. However, the newly cast right to human intervention would only apply to "significant" decisions, rather than decisions that produce legal effects or similarly significant effects.
There are concerns that restricting the right in this way could threaten the UK’s adequacy status, though it could equally be argued that the new wording would create a broader right to human intervention than exists currently. Which outcome materialises could depend on how the secretary of state exercises powers under the Bill to set new rules clarifying which decisions are ‘significant’ for the purposes of the right.
Access to business data
The Bill would give the secretary of state and the Treasury the power to issue regulations requiring “data holders” to make available “customer data” and “business data” to customers or third parties, as well as regulations requiring certain processing, such as collection and retention, of such data.