What is the Digital Services Act?

Written by Jeremy Werner

Jeremy is an experienced journalist, skilled communicator, and constant learner with a passion for storytelling and a track record of crafting compelling narratives. He has a diverse background in broadcast journalism, AI, public relations, data science, and social media management.
Posted on 09/22/2023
In Blog


As the European Union puts the final touches on its AI regulation, the Harmonised Rules on Artificial Intelligence (the EU AI Act), we look at one regulation that has service providers scrambling to comply before next year. The Digital Services Act (DSA) was submitted to the European Parliament in December 2020. After a year and a half of discussion, the European Council approved the DSA on October 4, 2022, and it will be directly applicable across the EU on February 17, 2024.


Simply put, the DSA regulates digital services, marketplaces and online platforms operating within the EU, with the aim of creating a safer and more open digital landscape, protecting the fundamental rights of users, and establishing clear responsibilities and accountability for online platforms. As long as an outlet offers a service in the EU, regardless of its place of establishment, it is covered by the DSA. This means that companies providing digital services such as cloud services, data centers, content delivery, search engines, social media and app stores will be affected. That includes platforms like Google, Meta, Amazon, Apple, TikTok and more. So, while this is a European Union law, it will resound globally.


The DSA's core obligations require platforms to assess and mitigate the risks posed by their systems. It also requires platforms to remove illegal content, protect children, suspend users offering illegal services, ensure the traceability of online traders and empower consumers through various transparency measures. Platforms must also publicly report on how they use automated content moderation tools and disclose all instances of illegal content flagged by content moderators or by automated moderation systems.


The DSA sets additional requirements for large platforms, referred to as very large online platforms (VLOPs). A VLOP faces extra obligations in risk management, external and independent auditing, transparency reporting, access to data and algorithms, advertising transparency, and user choice in recommendation algorithms. The threshold for a VLOP is 45 million or more monthly active EU users, which will most likely capture the platforms mentioned above as well as several large EU firms. To catch other potential VLOPs, the DSA also aims these obligations at fast-growing start-ups approaching the scale and risk profiles of existing VLOPs.


Unlike other services, VLOPs designated under the DSA have only four months from designation, ahead of next February, to comply with obligations like risk assessment, transparency reporting and data access. In other words, compliance timelines are staggered by platform size before the final date, when the European Commission and national Digital Services Coordinators will oversee enforcement. The DSA establishes oversight and enforcement cooperation between the European Commission and EU countries. As for penalties, non-compliance can bring fines of up to 6% of global turnover, meaning some VLOPs could face hundreds of millions in fines if they are found to be non-compliant.

If you have questions about how to stay compliant with the Digital Services Act, reach out to BABL AI.

Subscribe to our Newsletter

Keep up with the latest on BABL AI, AI auditing and
AI governance news by subscribing to our newsletter.