European Parliament approved the Digital Services Act in its plenary session
On 5 July, the European Parliament approved the Digital Services Act in its plenary session. The Act establishes a new set of rules on the obligations and accountability of intermediaries in the single market.
These new rules strive to guarantee better consumer protection and to harmonise the legislative framework concerning the transparency and accountability of online platforms. The legislation also aims to better control systemic platforms and mitigate the risks they pose, such as manipulation or disinformation.
The regulation applies to providers of intermediary services irrespective of their place of establishment or residence, insofar as they provide services in the Union, so as to ensure a level playing field within the internal market.
The various intermediary services have specific obligations depending on their type, size and impact on the digital world. In this context, different categories of intermediary services are defined, namely hosting services, online platforms and very large online platforms (those with 45 million European users or more).
The Digital Services Act establishes new measures to counter illegal goods, services and content, including a mechanism that allows users to flag such content. Once notified, service providers must react in a timely manner, while always respecting fundamental rights such as freedom of expression and data protection.
There are also new obligations on the traceability of business users in online marketplaces, to help identify sellers of illegal goods more efficiently, including reasonable efforts to randomly check whether illegal products or services have reappeared.
Additionally, platforms are required to be more transparent on a variety of issues, such as content moderation and the algorithms used for recommendations. Users must also be able to challenge platforms' content moderation decisions.
The Act bans certain types of targeted advertising on online platforms, for instance when ads target children or rely on special categories of personal data, such as ethnicity, political views or sexual orientation.
Very large online platforms and search engines must prevent the misuse of their systems by adopting risk-based measures and by undergoing independent audits. Furthermore, these platforms must allow users to opt out of recommendations based on profiling and are obliged to grant competent authorities and researchers access to key data, so that the evolution of online risks can be understood.
The creation of an oversight structure to address the complexity of the online space is also envisaged. EU Member States will have the primary role, assisted by a new European Board for Digital Services.
After being formally adopted by the Council of the EU, the Digital Services Act will be published in the Official Journal of the European Union and will enter into force 20 days after its publication.