Out-Law News

Intermediaries the focus of EU Digital Services Act


Online intermediaries in the EU face a complex set of new obligations relating to the supply of goods, services and digital content under the proposed new EU Digital Services Act (DSA).

The draft legislation published by the European Commission is wide-ranging and was issued alongside separate proposals for a new Digital Markets Act.

One of the DSA's core purposes is to establish a revised framework for the removal of illegal content and to harmonise EU rules on the liability of providers of intermediary services – building on the existing framework set out in the E-Commerce Directive.

New "due diligence" obligations on 'providers of intermediary services' are also proposed. Baseline requirements would apply to all providers under the Commission's four-stage plans, with further obligations added at each step for hosting providers, online platforms and very large online platforms respectively.

Stage one – the proposed baseline requirements

Providers of intermediary services are defined as those that provide a 'mere conduit' service, a 'caching' service, or a 'hosting' service.

All three terms feature in the existing E-Commerce Directive and the current definitions are carried over into the draft DSA.

In essence, a mere conduit is a service that passes information between a sender and recipient over a communication network, or facilitates access to that network.

A caching service is one that involves the automatic, intermediate and temporary storage of information, carried out for the sole purpose of making its onward transmission to other recipients more efficient.

A hosting service consists of the storage of information provided by, and at the request of, recipients.

Under the draft DSA, all such providers would be required to designate and publish a single point of contact, and to set out in their terms and conditions any restrictions they impose on the use of their services. A new duty to report on content moderation activities is also envisaged.

Stage two – proposals impacting hosting providers and online platforms

The second tranche of obligations proposed would impact hosting providers, including online platforms.

An 'online platform' is a hosting service that goes further by disseminating information to the public at the request of recipients, subject to listed exceptions.

Hosting providers and online platforms would need to establish "easy to access, user-friendly" mechanisms that enable users to notify them of illegal content on their services, and to provide an explanation to users in cases where they have disabled access to or removed information.
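
Purely by way of illustration – the draft prescribes outcomes, not implementations, and every name and field below is a hypothetical assumption – such a notice-and-action mechanism might be modelled along these lines:

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class IllegalContentNotice:
        # Hypothetical fields; the draft DSA asks only that notices be
        # sufficiently precise and adequately substantiated.
        content_url: str    # where the allegedly illegal content sits
        explanation: str    # why the notifier considers it illegal
        notifier_name: str
        notifier_email: str
        submitted_at: datetime

    @dataclass
    class StatementOfReasons:
        # The explanation owed to the user whose content is affected.
        notice: IllegalContentNotice
        action_taken: str       # e.g. "removed" or "access disabled"
        basis: str              # the provision relied on
        redress_options: str    # how the user can contest the decision

    def handle_notice(notice: IllegalContentNotice) -> StatementOfReasons:
        # Placeholder decision logic; a real provider would review the
        # notice on its merits before acting.
        return StatementOfReasons(
            notice=notice,
            action_taken="access disabled",
            basis="terms of service clause on illegal content",
            redress_options="internal complaint; out-of-court dispute settlement",
        )

The point the sketch illustrates is the pairing: every notice that leads to action generates an explanation back to the affected user.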

Stage three – proposals impacting online platforms

Among the most significant additional requirements applicable to just online platforms under the Commission's proposals is a duty to notify law enforcement agencies where they become aware of "any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place".

Online platforms would also be required to suspend users that "frequently" publish illegal content, and to respond to "trusted flaggers" – bodies designated as specialists in detecting, identifying and notifying illegal content – by promptly reviewing the content those bodies flag and deciding whether to remove or disable access to it.

Under the proposals, online platforms that facilitate trade between businesses and consumers would also be obliged to obtain a number of details from traders so that those traders can be traced. The platforms would also be obliged to make reasonable efforts to assess whether the information provided is reliable, ask traders to update it where it is not, and ultimately suspend traders that fail to provide complete information.
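
As a sketch only, a marketplace might hold these traceability details in a simple record and chase gaps before resorting to suspension; the field names here are our own illustrative assumptions, not the draft's list:

    from dataclasses import dataclass, fields
    from typing import Optional

    @dataclass
    class TraderRecord:
        # Illustrative traceability details; the draft DSA sets out its
        # own categories of information for traders on marketplaces.
        name: Optional[str] = None
        address: Optional[str] = None
        email: Optional[str] = None
        phone: Optional[str] = None
        bank_account: Optional[str] = None
        trade_register_number: Optional[str] = None

    def missing_fields(record: TraderRecord) -> list[str]:
        # Gaps would first trigger a request to the trader to update the
        # information, and ultimately suspension if never completed.
        return [f.name for f in fields(record) if getattr(record, f.name) is None]

    trader = TraderRecord(name="Example GmbH", email="sales@example.com")
    print("Request updates for:", ", ".join(missing_fields(trader)))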

Additional transparency reporting is also envisaged. Online platforms would need to report, among other things, their average number of monthly users, the number of suspensions they have imposed on users, and the number of out-of-court settlements reached in relation to content removal.

New obligations around digital advertising on online platforms are also proposed. Online platforms would need to ensure adverts are signposted to users "in a clear and unambiguous manner and in real time". This would include informing users of the identity of the advertiser and providing "meaningful information about the main parameters used to determine" how the advert came to be displayed to them.
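
As a rough illustration – the structure and field names below are assumptions, not anything the draft prescribes – the disclosure accompanying each advert might carry three pieces of information:

    from dataclasses import dataclass

    @dataclass
    class AdDisclosure:
        # Hypothetical record of what a platform might surface alongside
        # each advert; the draft does not mandate any particular format.
        is_advert: bool             # the clear, real-time "this is an ad" signal
        advertiser_identity: str    # on whose behalf the advert is displayed
        main_parameters: list[str]  # the main parameters used to target it

    disclosure = AdDisclosure(
        is_advert=True,
        advertiser_identity="Example Retail Ltd",
        main_parameters=["age range 25-34", "interest: running"],
    )
    print(disclosure)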

Data protection law expert Michele Voznick of Pinsent Masons, the law firm behind Out-Law, said: "The proposals for transparency, and meaningful explanations about the parameters for online advertising being displayed to specific individuals, particularly when based on profiling, would give new information to users and complement the GDPR. This additional information about online advertising should allow individuals to better understand how their personal data is used and why they are receiving certain advertising."

Stage four – proposals impacting very large online platforms

'Very large online platforms' are defined as those whose services reach 45 million or more average monthly active recipients in the EU – roughly 10% of the 450 million consumers within the EU market. The most extensive requirements proposed by the Commission under the draft DSA would apply to those businesses.
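
The threshold arithmetic is simple, and a platform's status under the definition turns on a single comparison; a minimal sketch, with the function and variable names our own:

    # 45 million is roughly 10% of the 450 million consumers in the EU
    # market: 45_000_000 / 450_000_000 = 0.10.
    VLOP_THRESHOLD = 45_000_000

    def is_very_large_online_platform(avg_monthly_active_eu_recipients: int) -> bool:
        # "Equal to or higher than" 45 million under the draft definition.
        return avg_monthly_active_eu_recipients >= VLOP_THRESHOLD

    print(is_very_large_online_platform(44_999_999))  # False
    print(is_very_large_online_platform(45_000_000))  # True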

Under the Commission's plans, very large online platforms would face a new duty to identify, analyse and assess "significant systemic risks stemming from the functioning and use made of their services" in the EU. Those risks include the dissemination of illegal content through their services; any negative effects on the exercise of a range of fundamental rights, including privacy, freedom of expression, the prohibition of discrimination and children's rights; and the intentional manipulation of their services. "Reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified" would then need to be implemented by the platforms.

Very large online platforms would also be expected to appoint compliance officers to monitor their compliance with the DSA, open themselves up to external audit, share data with regulators upon request, and be transparent with users about the recommendations they display.

The draft DSA also makes provision for the establishment of new codes of conduct on issues such as addressing illegal content and systemic risks, and on meeting the proposed new obligations around online advertising.

Scope is also provided in the proposals for the Commission to work with industry and other stakeholders on new "crisis protocols" to help disseminate public security or public health information.

Supervision and enforcement

Each EU member state would be required to appoint a Digital Services Coordinator (DSC) – a regulator responsible for enforcement of the DSA. Technology law expert Dr. Nils Rauer of Pinsent Masons said it is proposed that these DSCs would be given far-reaching investigation and monitoring powers, and that a stiff penalties regime is also envisaged.

Rauer said: "Member states would be responsible for establishing 'effective, proportionate and dissuasive' penalties for non-compliance, but the Regulation leaves it open for fines of up to 6% of a provider’s total turnover in the preceding financial year to be imposed if core provisions of the legislation are not adhered to. Smaller, more technical, infringements could trigger fines up to 1% of the total turnover. In addition, periodic penalty payments not exceeding 5% of the average daily turnover in the preceding financial year per day are outlined as an option."
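
Taking Rauer's figures at face value, the proposed ceilings reduce to simple percentages of turnover. A rough sketch, with all names our own and the 365-day averaging our simplification:

    def max_penalties(annual_turnover_eur: float) -> dict[str, float]:
        # Ceilings as quoted: 6% of total turnover in the preceding
        # financial year for core infringements, 1% for smaller, more
        # technical ones, and periodic penalty payments of up to 5% of
        # average daily turnover per day of continued non-compliance.
        # Dividing by 365 to get a daily figure is our simplification.
        avg_daily_turnover = annual_turnover_eur / 365
        return {
            "core_infringement_cap": 0.06 * annual_turnover_eur,
            "technical_infringement_cap": 0.01 * annual_turnover_eur,
            "periodic_penalty_per_day_cap": 0.05 * avg_daily_turnover,
        }

    # Example: a provider with EUR 10bn annual turnover.
    for label, value in max_penalties(10_000_000_000).items():
        print(f"{label}: EUR {value:,.0f}")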

The position in the UK

There is a separate, but similar, journey towards greater regulation of 'big tech' underway in the UK.

Last year, the government consulted on proposals for how online intermediaries might address what it has termed "online harms". In February this year, the government published an initial response to the feedback it had received, and earlier this week it published its full consultation response.

The government has committed to introducing a new Online Safety Bill in 2021. Like the DSA, this legislation, it said, will set out what intermediaries will be required to do to address both illegal content and other harmful content, and will draw a distinction between the two. One significant change proposed is the introduction of a new duty of care for some online service providers.

Both the draft DSA and the Online Safety Bill, when it is published, will be subject to change as the legislation is scrutinised by law makers.
