
EU Digital Services Act (DSA)

Author: teresa_myers | 17 Sep 2025

1. Overview

The EU Digital Services Act (DSA), which entered into force in November 2022, provides a general framework for online platforms to act on systemic risks, such as the large-scale spread of deepfakes. The DSA's central concern is harmful content, for example AI-generated manipulations that deceive or injure users.

2. Scope of the Law

The DSA covers online intermediary services offered to users in the European Union, though very large online platforms (VLOPs) bear additional obligations. VLOPs must enforce policies on detecting, removing, and reducing the dissemination of illegal content, including deepfakes, and must collaborate with the authorities on risk-mitigation activities. The DSA does not prescribe specific procedures or methods for detecting deepfakes, leaving them to be handled under broader content moderation obligations.

3. Key Provisions

The DSA contains several provisions that bear directly on deepfakes:

  • Content Moderation: Platforms must maintain clear, accessible, and transparent content moderation policies to handle illegal content, including deepfakes that are unlawful under existing law (e.g., defamation, fraud).
  • Risk Mitigation: Very large online platforms must assess and mitigate the systemic risks inherent in their services. One such risk is the spread of harmful content, such as deepfakes, during pivotal events like elections.

4. Cooperation with Authorities

To comply with the DSA, platforms must cooperate with national Digital Services Coordinators and other relevant EU authorities so that harms stemming from the dissemination of deepfakes and other malicious content are minimized.

Non-compliance with the DSA can be fined: up to 1% of annual worldwide turnover for less severe infringements, such as supplying incorrect or misleading information, and up to 6% of annual worldwide turnover for serious infringements. Enforcement is shared between national Digital Services Coordinators and the European Commission, which directly supervises very large online platforms.

5. Major Cases or Precedents

The European Commission has opened inquiries into how platforms such as YouTube and X (formerly Twitter) handle deepfakes and other AI-powered disinformation. These inquiries aim to determine whether the platforms have done enough to curb the spread of harmful deepfakes, particularly around elections, and to enforce the DSA's measures for addressing systemic risks.

6. Practical Implications

Under the DSA, platforms must remove illegal deepfakes expeditiously once they become aware of their presence. The creators or distributors of such content may be penalized under a platform's terms of service and, where applicable, under the relevant national law. Users who come across deepfakes can report them to platform administrators, who are obliged to act on such notices under the DSA's content moderation regime and their own platform policies.

7. Future Outlook

The European Commission is also developing further guidance on identifying and addressing deepfakes following the DSA's introduction. As AI technology continues to evolve, the DSA may need to be amended to address concerns raised by new forms of deepfakes and other synthetic media.