
The Digital Services Act: what do you need to know?

March 17, 2023

On October 4, 2022, the Digital Services Act (DSA) was approved by the Council of the European Union, and it was published in the Official Journal of the European Union on October 27. The main objectives of this measure are, among others, to fight the proliferation of illegal content and products on the Internet, to protect users' rights (especially those of the youngest) and to offer better sources of information. The regulation builds on the pillars of the E-Commerce Directive. It introduces innovations in content moderation, in the publication of information about the withdrawal or blocking of content and, in the case of very large search engines, obligations related to the systemic risks derived from their use.

Before analysing the obligations and measures that the DSA introduces, it is worth clarifying what digital platforms providing intermediary services are. They are providers of telematic means that bring together service providers and the users and consumers of those services, where the services are aimed at a public residing in the EU, regardless of the intermediary provider's place of establishment. The DSA classifies these providers according to the services they offer: intermediation services that provide network infrastructure, data hosting (cloud services), caching services, online platforms that bring sellers and consumers together, online search engines, and very large platforms and search engines.

In short, the regulation will apply, with different degrees of obligations, to data hosting services, online search engines, social networks and marketplaces, leaving interpersonal communication services, which include private and electronic messaging, outside its scope of application.

The set of measures the regulation brings is broad, including elements such as KYBC (Know Your Business Customer) traceability and the creation of new figures such as digital services coordinators and trusted flaggers. The idea is not only to offer better and more regulated content, but also to provide greater traceability of online companies, to guarantee users effective protection against abuse (quality services at a low price) and to prohibit certain forms of targeted advertising.

This regulation will not be fully applicable until February 2024, although some provisions entered into force on November 16, 2022. The most recent obligation to take effect requires service providers to report the average number of European users they serve. This report is essential for the subsequent designation of a platform as "very large" which, in turn, entails a series of stricter obligations that only this category of platform must assume. In a couple of months, providers with a very large user reach (more than 45 million users) will begin to pay the supervisory fee. In addition, they will have to undergo independent audits before July 2023.

Another deadline to consider is February 16, 2024, by which time the Member States must have designated the competent national authority that will oversee the application of the regulation in line with guidelines from the European Commission. Meanwhile, the Commission is creating a European Centre for Algorithmic Transparency (ECAT) to support its new supervisory role with internal and external multidisciplinary expertise.

The what and how of the Digital Services Act

This new regulation establishes a regulatory framework applicable to intermediary services that preserves the liability exemption regime these service providers enjoy under Directive 2000/31/EC, while imposing transparency obligations. No general obligation is imposed on platforms to monitor the content uploaded by users, so they remain exempt from verifying the legality of that content. If providers become aware of illegal content, they will not be liable for it as long as they act diligently to have it removed. In turn, users whose content is moderated may act against the platform: they have tools to challenge moderation decisions and can submit claims to an out-of-court dispute resolution body or seek redress before the courts.

If you have any questions about the obligations derived from this new law, contact one of our experts by email at ponti@ponit.pro or by calling 934 87 49 36, or follow us on our social networks to find more articles like this.

Do you want us to help you?

Contact us and we will put a team of experts at your disposal.