Out-Law News

Commission begins proceedings against X under Digital Services Act


The European Commission’s decision to begin formal proceedings against social media platform X under the Digital Services Act (DSA) shows it “can’t wait” to start enforcing its content moderation provisions, an expert has said. 

The proceedings are the first the Commission has launched to enforce the EU-wide framework governing online platforms’ responsibilities. Cyber, data and technology expert Wouter Seinen at Pinsent Masons said the decision reflects the Commission’s attitude towards DSA enforcement: “This announcement shows that the European Commission can’t wait to start enforcing the DSA. It is no surprise that a social media platform is amongst the first to be put on the spot, as some of the new rules on content moderation were specifically designed for the likes of X.”

Gijs van Mansfeld, technology law specialist at Pinsent Masons, called the decision to begin proceedings at this point in the DSA’s lifespan “remarkable”: “Essentially, it’s very remarkable that X is already targeted at this stage in the DSA’s existence.” The DSA only becomes generally applicable on 17 February 2024. However, because X was designated a ‘very large online platform’ (VLOP) on 25 April 2023, most of the rules began to apply to it shortly after that designation, meaning the DSA has been in effect for X since 25 August 2023.

The decision to open formal proceedings against X, formerly known as Twitter, follows a preliminary investigation in which the Commission analysed a risk assessment report submitted by X in September, its transparency report published in November, and X’s replies to a formal request for information. That request concerned the dissemination of illegal content in the context of Hamas’ terrorist attacks against Israel.

The Commission’s proceedings will focus on X’s compliance with DSA obligations to counter the dissemination of illegal content in the EU. The DSA imposes risk management and mitigation requirements on digital platforms, as well as the need to operate appropriate notice and action mechanisms to counter such content. Seinen said this focus may bring clarity on the standard of risk assessment required under the DSA, giving guidance to other online platforms: “Some have criticised the DSA for being too vague and not having specified what good looks like – in particular where controls like risk assessments are concerned. If this investigation results in actual enforcement this might be the beginning of the first CJEU decision on how robust risk assessments need to be.”

The proceedings will also focus on the effectiveness of measures taken to combat information manipulation on X, as well as measures it has taken to increase transparency. Attention will also be given to suspected “deceptive design” of the user interface, notably the blue ‘checkmarks’ linked to subscription products, which can present some users as more ‘official’ than others.

The opening of proceedings empowers the Commission to take further enforcement steps, such as the adoption of interim measures. Now that formal proceedings have begun, the Commission will continue to gather evidence through additional requests for information, interviews and inspections.

There is no deadline for bringing formal proceedings to an end; the duration of each investigation depends on several factors, including the complexity of the case, the extent of the company’s cooperation and the exercise of the rights of defence.
