UPDATE – FEBRUARY 2026:
The European Commission has expanded its Digital Services Act (DSA) election integrity framework with new operational tools, enforcement actions, and strategic initiatives. In February 2025, the Commission released a dedicated DSA Elections Toolkit to help national Digital Services Coordinators implement election-related risk mitigation measures. The toolkit provides guidance on monitoring AI-generated content, responding to disinformation campaigns, strengthening media literacy, and coordinating with election authorities and civil society.
Enforcement activity has also intensified. In December 2025, the Commission imposed a major fine on X (formerly Twitter) for violations related to transparency, deceptive platform design, and insufficient researcher access, issues directly tied to DSA obligations governing election integrity and platform accountability. Investigations into Very Large Online Platforms and Search Engines remain ongoing as regulators monitor compliance with transparency, risk assessment, and content moderation requirements.
In November 2025, the European Commission also announced the European Democracy Shield, an initiative designed to counter foreign interference, disinformation campaigns, and election manipulation. The initiative builds on existing DSA enforcement mechanisms and strengthens safeguards around online political discourse and AI-generated content.
These developments confirm that the original election integrity guidelines remain a central pillar of EU digital governance. However, the addition of operational toolkits, stricter enforcement actions, and new strategic initiatives signals a shift from preparation to sustained oversight of online platforms, a shift that matters all the more as generative AI and synthetic media introduce new risks to democratic processes.
ORIGINAL NEWS STORY:
European Commission Unveils Guidelines to Safeguard Election Integrity Online
To strengthen electoral transparency and trust in the digital age, the European Commission has unveiled comprehensive guidelines targeting Very Large Online Platforms and Search Engines. These platforms, defined as those reaching more than 45 million users in the European Union, must take concrete steps to reduce risks that could threaten fair elections.
The guidelines are part of broader enforcement efforts under the Digital Services Act (DSA) and were initially focused on safeguarding the 2024 European Parliament elections.
Election-Specific Risk Mitigation
The Commission’s strategy for protecting elections involves multiple safeguards. Platforms are encouraged to:
- Establish internal teams dedicated to election risk monitoring
- Use local context analysis to adapt platform responses
- Promote official voting information from trusted sources
- Adjust recommender systems to limit the spread of disinformation
Media literacy efforts and clear labeling of AI-generated content—including deepfakes—are also emphasized. Platforms must update their terms of service and strengthen content moderation processes to reduce manipulation and misinformation.
Collaboration and Response Planning
The guidelines emphasize cooperation with EU institutions, national authorities, civil society organizations, and external experts. This collaboration is intended to improve monitoring of disinformation campaigns, foreign influence operations, and cyber threats.
Platforms are also encouraged to implement incident response mechanisms during elections to respond quickly if harmful or misleading content threatens voter participation or election outcomes.
After each election, platforms should conduct post-election reviews assessing the effectiveness of their risk mitigation efforts, and portions of these assessments should be made public to support transparency and continuous improvement.
Enforcement and Flexibility
Although the guidelines establish clear expectations, platforms retain some flexibility in how they implement risk mitigation strategies. If alternative approaches are used, platforms must demonstrate that those measures are equally effective.
The European Commission retains authority to request additional documentation or initiate formal proceedings under the DSA if compliance concerns arise.
Stress Test Ahead of Elections
To assess readiness, the Commission conducts “stress test” exercises that simulate high-risk scenarios such as coordinated disinformation campaigns. These exercises evaluate how well platforms and regulators respond under pressure and help refine response mechanisms ahead of major elections.
A Pillar of the DSA
The election integrity guidelines form part of a broader European framework that includes the Code of Practice on Disinformation, the Political Ads Transparency Regulation, and the Recommendation on Electoral Processes. Together, these initiatives aim to maintain a transparent, rights-respecting digital environment during elections across the European Union.
Need Help?
If you’re wondering how these guidelines, or any other AI regulations and laws worldwide, could impact you and your business, don’t hesitate to reach out to BABL AI. Their Audit Experts can address your concerns and questions while offering valuable insights.

