EU Unpacked

EU Code of Conduct on Disinformation

An In-Depth Analysis for Non-Experts

Introduction and Policy Rationale
The EU Code of Conduct on Disinformation is the most advanced effort so far to address disinformation through shared responsibility rather than legislation alone. First introduced in 2018 and significantly strengthened in 2022 and 2024, it reflects how EU thinking has evolved: disinformation is no longer seen as a minor online nuisance but as a serious threat to democracy, fair markets, and social cohesion.
The Code is unique in bringing together a diverse range of stakeholders, from global tech giants like Google and Meta to specialised fact-checkers and civil society organisations, all of whom commit to a shared set of rules. The initiative is strongly based on European values, seeking to carefully balance the fight against harmful manipulation with the need to protect fundamental rights like freedom of expression and privacy. Its ultimate goal is to ensure an open, safe, and reliable internet where well-informed citizens can participate in public debate. The Code is explicitly designed to complement binding EU legislation, in particular:
  • the Digital Services Act (DSA),
  • the Artificial Intelligence Act (AI Act),
  • the Regulation on Transparency and Targeting of Political Advertising,
  • the European Media Freedom Act (EMFA).
The main idea of the Code is to put these legal frameworks into action by establishing clear practices, measures, and collaboration methods for the private and civil society actors that shape the information landscape.
Conceptual Framework
The Code moves away from earlier "false content" models by adopting a systemic risk approach. It views disinformation as the result of interacting structural factors rather than solely individual acts of deception, and it is structured around several key pillars of action:
  1. Demonetisation: Cutting advertising revenue for those who spread false information.
  2. Transparency in Political Advertising: Ensuring users can clearly identify and understand political ads.
  3. Integrity of Services: Putting measures in place to prevent manipulative tactics like bot amplification and operations such as hacking and leaking.
  4. User Empowerment: Giving people the tools to spot, report, and respond to disinformation.
  5. Supporting Researchers and Fact-Checkers: Making platform data more accessible and incorporating fact-checking into digital services.
Let's have a closer look at each pillar.
Pillar One: Demonetisation – Cutting the Profit from Disinformation
The Code recognises that disinformation is often a business: it persists not only because it persuades, but because it pays. The Code therefore shifts responsibility upstream to advertisers, platforms, and intermediaries. Signatories commit to significantly improving the scrutiny of ad placements, including through industry-wide cooperation among advertisers. Specific actions foreseen include:
  • Preventing the placement of ads next to disinformation content;
  • Excluding repeat disinformation actors from monetisation programmes;
  • Increasing transparency for advertisers regarding ad placement;
  • Integrating brand-safety tools and third-party source assessments;
  • Enabling independent auditing of monetisation practices.
Platforms and advertisers will also refine their safety measures, using specialised tools to avoid unintentionally funding harmful pages or accounts. Signatories will additionally collaborate with third-party source-rating services and fact-checkers, integrating external data to better identify and block those spreading disinformation.
Pillar Two: Transparency in Political Advertising
This pillar directly addresses covert influence operations, foreign interference, opacity in micro-targeting, and the erosion of electoral integrity. It aligns political advertising governance with democratic norms of traceability and contestability. To ensure that digital spaces do not become "black boxes" for political influence, the Code sets high standards for political ads, in particular:
  • Common definitions of political and issue advertising;
  • Clear labelling of paid political content;
  • Verification of advertiser identity;
  • Public, searchable ad repositories;
  • Disclosure of targeting criteria and ad spend;
  • User-facing explanations of “why am I seeing this ad?”
Additionally, there is a focus on ongoing monitoring, especially during election cycles, to oversee blackout periods and ensure independent scrutiny, helping to prevent last-minute manipulation.
Pillar Three: Guarding the Integrity of Services
This pillar focuses on the technical tactics used to spread disinformation, in particular:
  • Combatting Manipulation – a commitment to policies that address inauthentic behaviour, such as the creation of fake accounts, impersonation, and the purchase of fake engagement;
  • AI Transparency – obligations for the transparency of AI systems and the detection of "malicious deep fakes";
  • Cross-Platform Cooperation – support for sharing information on "tactical migration," where known bad actors move from one platform to another to escape moderation.
Pillar Four: Empowering Citizens, Supporting Researchers, and Fact-Checkers
The Code shifts from a top-down approach to empowering the wider ecosystem. Citizens are equipped with tools to flag harmful content and to access reliable sources during crises. Media literacy initiatives are another significant part of the document, aimed at teaching critical thinking and digital skills that help users navigate complex information landscapes. A new framework is also introduced to give researchers robust access to platform data, enabling them to study the spread of disinformation. Additionally, platforms pledge to incorporate fact-checkers' work into their services and to provide fair financial support to sustain these efforts.
Monitoring and Accountability: The Transparency Centre
Unlike many voluntary agreements, the Code has a robust framework to ensure signatories keep their promises.
  • Transparency Centre: A publicly accessible website serves as a central hub where implementation reports, qualitative data, and quantitative metrics are published for all to see.
  • Permanent Task-Force: Chaired by the European Commission, this group meets regularly to evolve and adapt the Code to new technological and societal threats.
  • Service Level Indicators (SLIs): Signatories report on specific metrics—such as the number of appeals received or the reach of fact-checking labels—to provide a data-driven view of progress.
Conclusion
The EU Code of Conduct on Disinformation represents a significant shift in how democratic societies manage the digital information landscape. Instead of viewing disinformation as isolated false claims, it recognises it as a systemic problem driven by economic motives, platform structures, and coordinated efforts. Its key contribution is integrating democratic protections into the core functions of advertising, political messaging, platform design, and artificial intelligence. Whether this approach succeeds will depend on consistent enforcement under the DSA, credible oversight, and the EU institutions' ability to adapt their governance methods to a world where automation and adversarial tactics are increasingly common.

January 2026

George Robakidze

George is a diplomat and expert in international politics, security and European integration. During his career in the Georgian public service (2004–2023), he held senior positions focused on political affairs, European and Euro-Atlantic integration and regional security. Beyond diplomacy, he has contributed extensively as an author and researcher, specialising in the rise of radical and populist movements in Eastern Europe. He currently serves as the executive director of the EU Awareness Centre, a Brussels-based NGO promoting democratic reforms, good governance, and EU values. He continues his work as an independent researcher on political and international issues.