
Out-Law Analysis

UK Data Protection and Digital Information Bill: in detail

New data protection laws proposed by the UK government are intended to promote data-driven innovation and reduce some of the burdens organisations have come to associate with the General Data Protection Regulation (GDPR).

The Data Protection and Digital Information Bill, introduced into parliament on Monday, is the product of a major consultation the government held last year on reforming the UK GDPR and other privacy legislation that has its origins in EU law.

Jonathan Kirsop

Partner, Head of Technology, Media, and Telecoms

Significant changes are proposed as the government seeks to demonstrate the advantages of Brexit, but an open question remains over whether the UK reforms will diverge too far from the standards applicable under the EU GDPR and put at risk the continued free flow of personal data – and the trade it supports – between the EU and UK.


The swift introduction of the Bill in parliament’s final week before the summer recess shows how politically important it is for this government. Its future will naturally depend on the next prime minister’s priorities, but so far the government’s plans for data protection reform have not attracted debate between candidates to the same extent as the Online Safety Bill – another Bill that would have a major impact on the regulation of the digital world, if implemented. One of the candidates seeking to lead the governing Conservative party and become prime minister is former chancellor Rishi Sunak, who has said that data protection reform will be one of his four top priorities.

At this point, the Bill looks set to make its way through parliament in September. The Bill’s impact assessment reiterates that “the government’s view is that reform of UK legislation on personal data is compatible with the EU maintaining free flow of personal data from Europe,” but the UK’s adequacy status will depend on the balance struck in the final text.

Amended definition of “personal data”

The Bill seeks to clarify, and possibly limit, the information to which data protection law applies – ‘personal data’. Businesses that are not also subject to the EU GDPR will see this as a welcome development since there are currently conflicting theories as to how the existing term should be interpreted and often difficulties in applying it in practice.

The Bill proposes a new section that would limit the scope of personal data to:

  • where the information is identifiable by the controller or processor by reasonable means at the time of the processing; or
  • where the controller or processor ought to know that another person will likely obtain the information as a result of the processing and the individual will likely be identifiable by that person by reasonable means at the time of the processing.
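Purely as an illustration, and not a statement of how the test would operate in law, the two limbs above can be modelled as a simple decision function. The type and field names below are our own shorthand, not terms from the Bill:

```python
from dataclasses import dataclass

@dataclass
class ProcessingContext:
    """Hypothetical inputs to the Bill's proposed two-limb identifiability test."""
    identifiable_by_processor: bool  # limb 1: identifiable by the controller or processor
                                     # by reasonable means at the time of the processing
    likely_recipient: bool           # limb 2a: another person will likely obtain the information
    identifiable_by_recipient: bool  # limb 2b: that person could identify the individual
                                     # by reasonable means at the time of the processing

def is_personal_data(ctx: ProcessingContext) -> bool:
    """Information is 'personal data' if either limb of the proposed test is met."""
    limb_one = ctx.identifiable_by_processor
    limb_two = ctx.likely_recipient and ctx.identifiable_by_recipient
    return limb_one or limb_two
```

On this model, information that neither the controller or processor nor any likely recipient can identify by reasonable means at the time of the processing falls outside the definition – which is why the change could make anonymisation easier to achieve.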

This proposed change would limit the assessment of identifiability to the controller or processor, and to persons who are likely to receive the information, rather than to anyone in the world. This is an often-debated point, although the addition seems only to set out clearly the position that the CJEU appeared to adopt in its 2016 ruling in the case of Patrick Breyer v Bundesrepublik Deutschland.

The addition could make it easier for businesses to achieve anonymisation in the eyes of the law. This is because they would no longer need to concern themselves with potential future identifiability, with the focus instead being on identifiability “at the time of the processing”. On the other hand, the changes proposed do not eliminate the risk of identification through “singling out”, which makes anonymisation so elusive.

Reform of the accountability framework

A substantial driver of reform is the government’s view that the GDPR’s existing accountability framework is too prescriptive and burdensome for businesses to comply with. The proposed new streamlined accountability regime is built around the concept of ‘privacy management’ programmes that organisations would implement to manage data risk.

DCMS has said that “organisations currently compliant with the GDPR would not need to significantly change their approach”. However, it is unclear from the text whether organisations will need to make some changes to meet the new requirements for their accountability frameworks. 

For example, we had understood that organisations would be able to continue to use a data protection officer (DPO) appointed in line with the previous requirements if they wished. However, the new requirement to designate a “senior responsible individual” who is “part of the organisation’s senior management” seems at odds with the current GDPR requirement to avoid conflicts of interest for the DPO and might also imply that the “senior responsible individual” cannot be an external consultant. Guidance confirming that organisations whose accountability frameworks meet the current requirements will not need to make changes would be most welcome.

International transfers

The amendments in Schedule 5 pave the way for organisations to take a risk-based approach to assessing the impact of transferring personal data internationally using mechanisms such as standard contractual clauses (SCCs). The Bill also sets out the framework for the Department for Digital, Culture, Media and Sport (DCMS) to make new adequacy decisions for transfers from the UK using the same risk-based framework. DCMS has already announced that the US is a priority jurisdiction for concluding an adequacy decision.

This risk-based approach may be at odds with the approach in the EU, where some data protection authorities have said that the GDPR’s provisions on transfers of personal data to third countries do not allow for a risk-based approach. 

Organisations are likely to be keen for a simplification of their processes around international transfers. However, if concerns around the UK's approach to onward transfers lead to the UK losing its EU adequacy status, the government has admitted that the costs of this change would outweigh the benefits.

In the impact assessment issued alongside the Bill, the government estimates the annual benefit to trade brought about by these amendments would be between £80m and £160m. The estimated impact of adequacy with the EU being discontinued "on top of these measures" is between £190m and £460m in one-off SCC costs, with an annual cost of between £210m and £410m in lost export revenue.


Personal data breaches

The Bill does not contain any material change to required responses to personal data breaches. The need to report and respond to security incidents will remain unchanged.

For now, it seems that the legislative focus for cybersecurity reform in the UK is directed towards managed service providers and other suppliers of services of major infrastructure operations, under the proposed reforms to the NIS Regulations.

However, under the Bill, the information commissioner would not be obliged to encourage public bodies to produce codes of conduct, which may expressly include provisions on the notification of personal data breaches to the commissioner and the communication of personal data breaches to data subjects.

The Information Commissioner’s Office (ICO) recently issued its own ransomware guide, including a checklist for businesses, and it is clear that such practical guidance is of assistance for businesses planning their cyber readiness.

Legitimate interests

It is a common misconception that individuals’ consent is needed in every instance to enable the processing of their personal data. Other lawful bases for processing personal data exist in data protection law.

The so-called ‘legitimate interests’ basis for processing is the most flexible of the lawful bases provided for under the GDPR, but organisations must carry out a balancing exercise and carefully consider whether there is in fact a legitimate interest behind the processing; whether the processing is in fact necessary for that legitimate purpose; and whether the legitimate interest is overridden by the individual’s rights, interests and freedoms.

However, the government intends to scrap the existing balancing test for some activities under the new Bill.

To help organisations understand whether they can rely on ‘legitimate interests’ processing, the government has proposed a list of recognised legitimate interests.

One example on the proposed new list is where the processing is necessary for detecting, investigating or preventing crime. This clarification may be particularly helpful for financial services firms in relation to their anti-money laundering and fraud prevention activities.

However, it is unclear whether organisations will still need to carry out some form of legitimate interests assessment to determine the necessity of processing for the desired purpose. What exactly is expected from a compliance perspective may be unclear until further guidance is provided.

On its list of recognised legitimate interests, the government has chosen to include only purposes relating to the public interest, not general commercial purposes. Including some business-as-usual purposes had the potential to be helpful for businesses, but would have been difficult to draft in a way that actually reduced burdens on them: lengthy assessments might have been needed to determine whether activities fell under a listed purpose, which would have been counterproductive.

Subject access requests

UK data protection law provides individuals with a right to request a copy of the data organisations hold about them. Complying with data subject access requests (DSARs), however, can be a complex and burdensome process, where the costs and resources required to respond can escalate quickly. The Bill is designed to ease some of the challenges organisations face.

The Bill contains wider grounds on which organisations would be able to refuse to respond to requests in their entirety, or charge a fee, where they have determined the requests are “vexatious or excessive”. This replaces the current exemption that allows DSARs to be refused where they are “manifestly unfounded”.

It would be for organisations to determine if the “vexatious” threshold is satisfied. However, to assist organisations, the Bill lists a number of relevant circumstances to be considered, similar to those already listed in the ICO’s DSARs guidance.

Helpfully, the Bill also provides concrete examples of “vexatious” requests. These include requests intended to cause distress, requests not made in good faith, and requests which constitute an “abuse of process”.

Organisations will likely welcome these examples, as UK case law to date has confirmed that DSARs are to be treated as “purpose blind”. In contrast, these examples suggest the wider context in which the DSAR is made, such as ongoing litigation proceedings, could potentially be taken into account.

Further guidance on which requests are “vexatious or excessive” would be welcome. For example, it would be valuable for financial services firms to have clear guidance on whether they can consider bulk requests by claims management companies to be an “abuse of process”. Bulk requests oblige organisations to respond to many requests within the timeframe allowed for a single request, and may not always be directed at gaining information that is helpful for the data subject.

It is to be hoped that the ICO’s promise to offer more guidance and resources to help businesses comply with UK data protection law will extend to such issues as they arise in the context of DSARs.

Automated decision-making

One of the areas in which the government hopes data-driven innovation can flourish is in the context of the use of artificial intelligence (AI) systems.

In its response to its consultation on data protection reform earlier this summer, the government confirmed that it was considering amending existing rules in relation to automated decision-making and profiling under UK data protection law but wanted to “align proposals” with measures expected to be set out in an upcoming white paper on AI governance. The AI white paper is not expected until later this year, though the government did publish a paper outlining its proposed approach to the regulation of AI use in the UK on the same day as it published the Bill.

At the heart of the issue is the extent to which there should be human oversight of decisions taken by AI systems that affect individuals based on the processing of their personal data by those systems.

In the new Bill, the government has proposed to reframe the existing provisions regarding automated decision-making in terms of a positive right to human intervention. Individuals understandably are not always clear on what the existing right "not to be subject to a decision based solely on automated processing" really means. However, the newly cast right to human intervention would only apply to "significant" decisions, rather than decisions that produce legal effects or similarly significant effects.

There are concerns that restricting the right could threaten the UK's adequacy status, but it could be argued that this wording would create a broader right to human intervention than exists currently. Whether this happens could depend on how the secretary of state exercises powers under the Bill to set new rules clarifying which decisions are ‘significant’ for the purpose of the right.

Access to business data

The Bill would give the secretary of state and the Treasury the power to issue regulations requiring “data holders” to make available “customer data” and “business data” to customers or third parties, as well as regulations requiring certain processing, such as collection and retention, of such data.


These proposed sections would provide the government with a basis to issue data sharing obligations similar to those being imposed on “gatekeepers” under the EU’s Digital Markets Act, which will provide business users of major online platforms with rights to obtain certain data held by gatekeepers when the legislation takes effect.

The sections could also be a precursor to a law similar to the EU’s proposed Data Act, although the draft Data Act deals with data that is generated by the use of a product, and it is not clear whether the term “business data” under the UK Bill – which includes “information about goods, services and digital content supplied” and “information relating to the supply or provision of goods, services and digital content” – would cover this type of data.

Cookies and other tracking technologies

The Bill contains draft provisions which would, if implemented, alter existing UK law as it applies to ‘cookies’ and other tracking technologies.

Currently, UK law prohibits the storing and accessing of information on users' computers unless those users have given their consent on the basis that they have had access to clear and comprehensive information about the purposes of the processing. An exception to the consent requirements exists where the cookie is "strictly necessary" for the provision of a service explicitly requested by the user.

The Bill seeks to widen the situations in which tracking technologies may be used without the end-user’s consent. Consent would not be needed, for example, where the technologies are used to collect information for statistical purposes with a view to making improvements to the service provided, for enabling enhancements to the appearance or functionality of the service on the user’s device, or for enabling software updates. However, the user would still need to be given an opportunity to object, or opt out, in each of these cases.
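A rough sketch of how the widened exceptions would operate, assuming an opt-out model for the new categories as described above. The category labels are our own shorthand, not terms from the Bill:

```python
# Purposes the Bill would exempt from prior consent (the user must still be able to object).
OPT_OUT_PURPOSES = {
    "statistics_for_service_improvement",
    "appearance_or_functionality",
    "software_update",
}

def may_use_tracking(purpose: str, has_consent: bool, user_objected: bool) -> bool:
    """Sketch of the proposed consent rules for cookies and similar technologies."""
    if purpose == "strictly_necessary":
        return True               # existing exception, unchanged by the Bill
    if purpose in OPT_OUT_PURPOSES:
        return not user_objected  # no prior consent needed, but an opt-out must be honoured
    return has_consent            # all other purposes still require prior consent
```

The design point is the shift from an opt-in to an opt-out model for the new categories: the burden moves from obtaining consent up front to honouring an objection.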

The proposed amendments would also clarify that:

  • The rules around tracking technologies apply to technologies which do not directly access information stored on the user’s device but which identify a device by analysing information that the device automatically transmits. This would capture technologies such as device and browser fingerprinting, which are not clearly caught by the current rules – though authorities such as the ICO and the European Data Protection Board view them as in-scope;
  • Storage of or access to information for the purpose of ensuring security, protecting against fraud, automatic user authentication and detecting technical faults, is considered as being necessary for the provision of the service requested by the user. The security and fraud examples are in line with current interpretation, but this provision is nevertheless a welcome clarification; and
  • The rules do not prevent storage of or access to information which is required for automatic emergency calls – similar to “ecalls” by connected vehicles.
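The first of these clarifications targets techniques such as browser fingerprinting, which can recognise a device without storing anything on it, purely from information the device transmits automatically. A minimal sketch (the header values are illustrative):

```python
import hashlib

def browser_fingerprint(headers: dict) -> str:
    """Derive a stable identifier from headers a browser sends automatically.

    Nothing is stored on, or read from, the user's device - yet the device
    can be recognised on repeat visits, which is why the Bill would bring
    such techniques expressly within the tracking rules.
    """
    signals = "|".join(headers.get(h, "") for h in ("User-Agent", "Accept-Language", "Accept-Encoding"))
    return hashlib.sha256(signals.encode()).hexdigest()[:16]

# The same browser configuration yields the same identifier on every request:
visit_1 = browser_fingerprint({"User-Agent": "ExampleBrowser/1.0", "Accept-Language": "en-GB"})
visit_2 = browser_fingerprint({"User-Agent": "ExampleBrowser/1.0", "Accept-Language": "en-GB"})
```

Real fingerprinting combines many more signals, such as screen size, installed fonts and canvas rendering, but the principle is the same.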

The UK proposals broadly align with the EU’s plans for a new ePrivacy Regulation and may prove useful to businesses looking to harmonise the way they request consent from users across Europe. Divergence in this area could be particularly difficult to manage from a technical perspective.

The Bill also re-introduces a proposal that was made by the European Commission in 2017 but which proved highly controversial and was quickly discarded – it would give the secretary of state the power to issue regulations requiring providers of services, such as web browsers, to give users the ability to give or withhold consent to tracking across all websites they visit.

While this option was not pursued by EU policymakers, there is merit in re-considering this. It could help users to rid themselves of the countless consent requests they receive while browsing the internet. Implementing such an approach would not be straightforward, however, as it would be hard to argue that any general consent given by users satisfies the EU GDPR’s requirements. Businesses providing browsers or publishing websites across Europe would need to grapple with two very different regimes.

Digital verification services

The Bill contains draft provisions which would oblige the secretary of state to establish and maintain a register of digital verification service providers and prepare and publish a document setting out rules concerning the provision of digital verification services. The secretary of state would also have the power to designate a trust mark for providers of digital verification, if the Bill is implemented as currently drafted.

These provisions are important for the "information gateway" envisaged for the sharing of information between public authorities. However, it is also likely that they will play a part in compliance with the Online Safety Bill, which is set to introduce an obligation for certain service providers to offer users the option to verify their identity and cites age verification as a way to comply with various obligations.

Our view

The detail of the Bill largely confirms DCMS’s stated intention for there to be evolution rather than revolution in UK data reforms. The overall framework remains very much based on the GDPR.

There is, though, a risk that the proposed law fails to satisfy anyone – on the one hand, those seeking a substantial streamlining of requirements and the removal of obstacles to innovation and business, whether perceived or real, may feel the Bill does not go far enough; on the other hand, the proposals could be viewed as diverging sufficiently from the EU GDPR to threaten the UK’s adequacy status, something which could plunge global companies back into expensive and cumbersome remediation programmes less than five years after they conducted extensive work to comply with the GDPR before it took effect.

Co-written by Lauro Fava, Rosie Nance and Stephanie Lees of Pinsent Masons.
