The DSA builds on existing provisions of EU law under the E-Commerce Directive, which governs what online intermediaries must currently do when they become aware of illegal activity on their services.
The DSA will introduce a new ‘notice and action’ procedure for reporting content and for its potential removal or the blocking of access to it.
The regulation also includes provisions aimed at curbing targeted advertising based on the use of individuals’ sensitive personal data. Targeting advertising at children using any of their personal data will be prohibited too, and restrictions will apply to the way platforms can influence user behaviour through the design or operation of their interfaces.
Very large online platforms and very large online search engines will be obliged to carry out an annual assessment to identify systemic risks associated with their services, with a view to addressing those risks. The dissemination of illegal content is one specific risk listed in the DSA that those businesses will be required to assess against. Other listed risks include actual or foreseeable negative effects on civic discourse, electoral processes, and public security.
The European Commission will be responsible for supervising the activities of very large online platforms and very large online search engines under a new regulatory framework, while national regulators will take responsibility for overseeing the compliance of smaller businesses subject to the DSA.
Fines of up to 6% of annual global turnover can be levied on online platforms and search engines that fail to comply with the new legislation.
Luxembourg-based Aurélie Caillard of Pinsent Masons has highlighted how the DSA could also influence the way online intermediaries manage disinformation risk.