Out-Law Analysis 8 min. read
16 Jun 2025, 10:54 am
The Data (Use and Access) Bill (DUAB) has been approved by UK lawmakers following protracted parliamentary proceedings that focused on AI and copyright issues and risked diverting attention from meaningful reforms that will impact how businesses across sectors can realise the value of data.
As we explore below, the DUAB provides for a potential smart data and AI revolution in the UK, enhances the rights of researchers, means construction works should result in less disruption in future, makes digital verification easier, and raises the stakes for businesses’ compliance with rules governing use of ‘cookies’ and around direct marketing.
The idea that data held about a customer by one business should be made available to other businesses to use, so they can offer new products and services to those customers, to promote competition and innovation in markets, is not new. It is an idea that has been put into practice in UK banking.
The UK’s open banking regime requires major banks that hold certain customer accounts to enable third party access to that data, at customers’ request, and has spurred the emergence of a range of innovative fintech and payment solutions.
The DUAB provides for the concept of open banking to develop into one of ‘open finance’, where holders of customer data on pensions, insurance and investments, for example, enable third party providers’ access to that data under strict conditions.
Regulation-making powers further provide the government with scope to extend the open banking model to other sectors, such as energy and telecoms. It is envisaged that those new data access frameworks would operate for customer and business data in a similar way to that provided in relation to data derived from connected products under the EU Data Act, meaning businesses in the specified sectors would be obliged to comply with specific standards and make use of intermediaries to facilitate data portability rights belonging to individuals, rather than undertake the work to do so themselves.
Currently, UK data protection law provides people with a general right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them. There are limited carve outs to those rights, including where the processing is based on the individual’s explicit consent.
The DUAB effectively permits automated decision-making (ADM) in many circumstances, as long as the organisation using the relevant AI or other technology implements safeguards, allowing individuals affected by those decisions to make representations, obtain meaningful human intervention and to challenge decisions made by solely automated means.
Restrictive provisions, similar to those currently in place, would continue to apply where an automated decision is “significant” and it is based on the processing of “special category” data. A decision is deemed ‘significant’ if it produces a legal effect for, or has a similarly significant effect for, the data subject. Those restrictions would also apply where decisions are based on information falling within UK GDPR Article 9, which includes genetic data or biometric data, such as that collected for facial recognition, where it is used for the purpose of uniquely identifying an individual, as well as information revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, health, sex life or sexual orientation.
Earlier this month, the UK’s data protection authority, the Information Commissioner’s Office (ICO), said it plans to issue a new AI and ADM code of practice within the next year.
Currently, a guiding principle of UK data protection law is that the purpose of personal data processing is explained to data subjects at the outset and that the lawful basis for processing that data attaches to the purpose described. Those rules are designed to safeguard against the risk of people’s data being used for purposes they have not agreed to once they have given an organisation access to that information. However, the purpose limitation principle has had a stifling impact on scientific research. This is because research projects typically evolve, meaning researchers often have to go back to data subjects and explain the new purposes for which they wish to use their data in order to have a continued lawful basis to process it.
The DUAB aims to address this by creating the concept of flexible consent, so that researchers are free to engage in scientific research projects that can shift in scope over time, without the need to revert to data subjects for fresh consent.
These sensible reforms were the subject of scrutiny as they passed through parliament due to provisions that broaden the scope of activities deemed to constitute processing for ‘scientific research purposes’. The concern expressed revolved around whether the changes would give AI developers too much unchecked scope to use data for AI training purposes.
The wording in the DUAB clarifies that processing for ‘scientific research purposes’, among other things, can include processing for the purposes of technological development or demonstration – but only if those activities “can reasonably be described as scientific”.
UK data protection law provides individuals with a right to request a copy of the data organisations hold about them. For organisations, complying with data subject access requests (DSARs) can be a complex and burdensome process, where the costs and resources required to respond – and to do so in the prescribed period – can escalate quickly.
A welcome change that the DUAB brings is the introduction of new ‘stop the clock’ provisions, meaning organisations can ask individuals who seek access to their data to verify that they are who they say they are before they engage in a search for the data, without that process, which can entail a delay, eating into the time they have to respond to such requests.
Another change, which will take effect immediately when the DUAB receives Royal Assent, clarifies the obligations on organisations to undertake a ‘reasonable and proportionate’ search of documents to meet their obligations in relation to DSARs.
The change is to be treated as having come into force on 1 January 2024. This reflects ICO guidance in place at the time as well as case law established in an information rights case involving businessman Mike Ashley. Ashley successfully argued that a search HMRC ran internally in response to his DSAR was too narrow – HMRC argued that while data held by its investigative unit was in scope of Ashley’s request, it was disproportionate to expect it to run a search for further data on files held by a separate unit within the organisation. A judge disagreed.
The DUAB provides a new framework for the operation of so-called digital verification services. These services are often provided online by third parties to help businesses confirm the identity of people they are dealing with or claims they have made about themselves – such as in the context of recruitment, to confirm how long a candidate spent in previous employment.
Some uncertainty has arisen around the legal basis on which providers of digital verification services have accessed data. The development of a new ‘trust framework’, via regulations provided for under the DUAB, should help to address this.
The DUAB makes provision for the creation of a new ‘national underground asset register’ (NUAR) for England, Northern Ireland and Wales that aims to enhance understanding of where energy, water and telecoms infrastructure is installed underground, to enable planning of roads and street works in a way that minimises disruption and addresses the risk of damage being done to existing infrastructure assets when new works are undertaken.
The detail around exactly how businesses will be required to input data to the new register, what information they will be expected to share and in what format, and around access to the register, will be set out in further regulations, but work has been ongoing behind the scenes in relation to the NUAR project.
Some concerns have been expressed that the creation of the new register must not serve as a signpost to bad actors and foreign adversaries that wish to target UK critical infrastructure.
While businesses face fines of up to £17.5 million, or 4% of their global annual turnover – whichever is higher – for breaching UK data protection laws, only much smaller penalties can be levied for non-compliance with other privacy rules under the Privacy and Electronic Communications Regulations (PECR).
Among other things, PECR sets out rules around the use of ‘cookies’ by website operators and online advertisers, as well as around electronic direct marketing. The ICO currently can only impose penalties of up to £500,000 on businesses that breach those requirements.
Under the DUAB changes, the ICO’s PECR enforcement powers will be brought into line with its data protection powers. This is significant in the context of existing ICO activity – it recently broadened out its scrutiny of so-called ‘cookie banner’ compliance to many more websites, and it routinely issues fines for breaches of the PECR direct marketing rules.
On cookies, the ICO has previously made it clear that it considers the PECR regime to be applicable to alternative tracking technologies to cookies, such as beacons and device fingerprinting.
DUAB also provides a long-awaited benefit to charities with the extension of the “soft opt-in” provisions relating to direct marketing. Until now, soft opt-in has been available only to commercial organisations, allowing them to send direct marketing to existing customers and to those who have expressed an interest in the organisation’s products or services, as long as the organisation meets associated requirements in PECR. Crucially, the organisation must have provided an easy way to refuse or to opt-out when the individual’s contact details were first obtained. Direct marketing must relate to the organisation’s own “similar” products or services, and the individual must be given an opportunity to unsubscribe or opt-out in each subsequent communication.
DUAB also confirms, for the purposes of the UK GDPR, that direct marketing can be a “legitimate interest” for the purposes of identifying a lawful basis for processing personal data.
While increasing penalties for non-compliance, DUAB does usefully confirm that storing information or gaining access to information stored on a device is not prohibited where it is necessary for purposes relating to the provision of an information society service, such as security or automatic authentication. DUAB also provides an exception where information is collected solely for statistical analysis with a view to making improvements to the service, to tailor the appearance of a website or app, or to obtain emergency assistance – for example, in the context of connected vehicles.
Finally, the PECR amendments effected by the DUAB include an extension of the time allowed to electronic communication service providers or network operators to notify the ICO in the event of a personal data breach. Rather than the 24-hour period provided under current law, the DUAB aligns the obligation in relation to electronic communications providers with the 72-hour period stipulated by the UK GDPR.
The DUAB is shortly expected to receive Royal Assent, at which point the legislation will be called the Data (Use and Access) Act.
Most of the provisions in the legislation require a commencement order before they can take effect. The next “common commencement date” by which the government is expected to issue a commencement order is 1 October 2025, though it is possible that some provisions could be brought into effect ahead of that date.
The data protection reforms contained within the DUAB are expected to be scrutinised by EU policymakers as part of their assessment of whether the UK should continue to benefit from so-called ‘adequacy decisions’ – decisions the European Commission can issue, to designate a country or territory as providing for data protection standards essentially equivalent to those applicable in the EU. Adequacy decisions provide an automatic legal basis for organisations in the EU wishing to transfer personal data to the designated jurisdictions.
The EU’s existing UK adequacy decisions are due to expire later this year.