By EUvsDisinfo

Tanks and missiles are no longer the main weapons in attacks on democratic societies. Foreign Information Manipulation and Interference (FIMI) has become one of the defining security and foreign policy challenges of the 2020s, along with other hybrid threats. The term may sound technical, but the threat is anything but abstract: it refers to coordinated efforts by authoritarian regimes to distort public debate, influence elections, and sow distrust in democratic institutions.

This article explains what FIMI is, how it differs from disinformation, how malign actors use it to influence and manipulate Western societies, and why understanding these operations is vital for democratic resilience.

What Is FIMI?

The EU defines FIMI as coordinated, deceptive behaviour by foreign actors designed to manipulate the information environment of another state. In simple terms:

FIMI refers to foreign, hostile, and covert campaigns that try to shape what people think, believe, or fear, without their knowledge.

The key elements are:

  1. Foreign origin
    The operation is directed by actors outside the target society, often a state or state-affiliated organisation. Russia’s military intelligence, China’s state media networks, or Iranian cyber units all fall into this category.
  2. Manipulation of the information space
    FIMI campaigns target media, social platforms, messaging apps, search engines, and online communities. They mix real content with fabricated stories, misleading narratives, selective leaks, and emotional triggers.
  3. Deceptive methods
    These campaigns rely on fake accounts, bots, front organisations, covert funding, proxy media outlets, inauthentic amplification, and now increasingly AI-generated content.
  4. Intent to interfere
    The goal is not merely to spread falsehoods; it is to interfere in elections, policymaking, public opinion, and societal cohesion. FIMI is a tool of geopolitical competition and hybrid warfare.

Countering FIMI does not target free speech – it protects it

Importantly, efforts to counter FIMI are not attempts to censor individuals or their personal views. In liberal democracies, everyone is entitled to an opinion, even if it is factually incorrect or politically controversial. The concern arises only when there is coordinated, deceptive, and manipulative action of foreign origin that seeks to distort public debate or undermine democratic processes. Addressing FIMI is about addressing the how – the manipulative tactics and infrastructure deployed by hostile states – not the what: people’s speech.

How FIMI differs from disinformation

Although often used interchangeably in public debate, ‘disinformation’ and FIMI are not the same. For years, the term ‘disinformation’ has dominated the public conversation. The problem is that it is both too broad – covering everything from misleading health-product advertising aimed at boosting sales to state-level manipulation – and too narrow, focusing only on manipulative content.

Information manipulation goes well beyond false content: it can include, for example, the amplification of genuine information or outright censorship. That is why governments, researchers, and security agencies increasingly rely on the concept of FIMI to better capture external, state-level threats in the information space. These campaigns – directed by hostile foreign actors – blend covert tactics such as emotional manipulation, AI-generated content, cross-platform coordination, and information suppression. Methodologically, FIMI analysis starts by observing behaviour, rather than just content or narratives.

Disinformation – Describes false or misleading content created or shared knowingly. Anyone can produce it, whether foreign or domestic, a state, an organisation, or an individual.

FIMI – Describes coordinated manipulative behaviour by foreign actors using deceptive methods to interfere in another society’s information environment.

Here is the key distinction:

  • Disinformation focuses on the content.
  • FIMI focuses on the actor and their behaviour.

This means that FIMI campaigns can include true facts presented in misleading ways, narratives that amplify existing grievances, or content that is not technically false but blown out of proportion through emotionally charged language. FIMI relies on manipulation.

This conceptual shift matters because democratic governments cannot counter these threats solely through fact-checking. They must understand the networks, methods, financing, and strategic intent behind the campaigns.

Threat Actors

At the centre of global concern lies Russia, whose information operations have targeted Europe and North America for more than a decade. Russia also pioneered many of the online manipulation techniques that define today’s FIMI landscape, from troll farms and botnets to sophisticated narrative warfare.

Crucially, Russia has poured, and continues to pour, billions of euros into these influence operations, prioritising propaganda and geopolitical disruption even as its own domestic infrastructure deteriorates and large segments of its population face persistent poverty and declining living standards.

Russia’s FIMI aims to erode trust in democratic institutions abroad while strengthening the Kremlin’s strategic position by shaping perceptions in its favour. At its core, it seeks to fracture Western unity, amplify societal tensions, and create an information environment where Moscow’s narratives appear credible and actions inevitable.

China is also a complex player in the field of FIMI. It uses a wide mix of tactics: from pushing conspiracy narratives to intimidating and silencing critical voices abroad through transnational information suppression. These methods often work in tandem and can link up with other forms of interference, such as economic pressure, legal intimidation, or cyberattacks.

China and Russia’s FIMI efforts, as well as their cooperation and synergies in FIMI, represent a challenge to democratic societies.

How Russia uses FIMI against Western societies

Russia is the most prolific user of FIMI against European and North American targets. Its modern information warfare doctrine sees the information space as a battlefield, where public opinion can be shaped long before any physical conflict begins.

Russian FIMI campaigns typically pursue four strategic goals:

  1. Undermining trust in democratic institutions
    The Kremlin promotes narratives that try to discredit democratic elections, governments, media, rule-of-law institutions and even scientific experts. The aim is to confuse citizens and convince them that democratic processes are incompetent or corrupt.
  2. Polarising societies
    Russia identifies existing fractures, such as immigration, LGBTQ+ rights, public health, climate measures, or inequality, and amplifies the most extreme voices to pull societies apart from within, promoting a culture of conflict rather than debate.
  3. Weakening support for Ukraine and sanctions
    Russia spreads narratives that question the value of military aid, exaggerate the cost of sanctions, or portray Ukraine as corrupt or failing.
  4. Eroding trust in the EU and NATO
    The Kremlin consistently frames these institutions as weak, aggressive, dysfunctional, or subservient to shadowy elites. The objective is to weaken transatlantic unity and reduce resistance to Russia’s geopolitical and imperial ambitions.

Tools and tactics of modern FIMI

Contemporary Russian FIMI campaigns use a broad ecosystem of assets:

Official state channels

Russian government institutions, ministries, and diplomatic missions actively disseminate Kremlin narratives through official statements, press briefings, and social media accounts. These messages provide the initial framing of events and lend an appearance of legitimacy that can later be amplified by state media, proxy outlets, and coordinated online networks.

State-controlled media networks

Channels like RT and Sputnik, along with their countless mirror sites, disseminate pro-Kremlin narratives in dozens of languages. Even where they are banned, their content circulates through sympathetic influencers and alternative platforms.

Proxy websites and ‘independent’ outlets

Russia maintains networks of pseudo-local news websites, online magazines, and think tanks designed to look Western, but in fact made to recycle Kremlin messaging. Their purpose is to give propaganda a veneer of authenticity.

Social media manipulation at scale

Botnets, troll farms, and coordinated inauthentic accounts create artificial trends, flood hashtags, impersonate individuals, and push divisive narratives. These networks are often used to make certain phenomena appear far larger on social media than they actually are. They do so by artificially boosting engagement through mass liking, sharing, and commenting.

AI-generated content and deepfakes

Synthetic media is now a central part of Russia’s playbook. AI-produced videos impersonate politicians, create fake news clips, and fabricate “evidence” that is difficult for ordinary users to verify. This accelerates the speed, volume, and plausibility of influence operations.

Hybrid coordination with other influence tools

FIMI is often paired with:

  • Cyberattacks
  • Leaks timed for political effect
  • Covert political financing
  • Use of diaspora or religious networks
  • Support for extremist or anti-establishment groups
  • Hybrid attacks, which often combine digital sabotage, intimidation, or pressure tactics with coordinated FIMI campaigns in the information space
  • Acts of intimidation or physical violence

This hybrid design makes the campaigns more durable, more impactful, and harder to detect.

Real-world examples

Analysts and governments have documented numerous high-profile Russian FIMI campaigns:

  • European elections, including operations during the 2024 European Parliament vote.
  • Efforts in Moldova, where Russia has used political parties, religious networks, social media manipulation, and disinformation to pull the country away from the EU.
  • Global campaigns to weaken support for Ukraine, using covert media outlets, fake AI-generated websites, Kremlin-aligned social media influencers, bot networks, and deepfake videos.

Each case shows the same pattern: coordinated, deceptive, foreign-directed behaviour designed to alter political outcomes.

How democracies can respond

A comprehensive response to FIMI typically includes:

  1. Understanding the threat – detection and exposure
    Monitoring units like EUvsDisinfo, national cyber centres, and civil society researchers play a crucial role in identifying FIMI incidents and making them public.
  2. Making ourselves as well-protected as possible – resilience-building
    Raising awareness of the threat, providing training for journalists and civil society organisations and educational tools for citizens, supporting independent media, and integrating media literacy into school curricula all create a society that is harder to manipulate.
  3. Making sure there is a level playing field – regulation
    Rules on political advertising, requirements for labelling state-controlled media, AI content transparency, and platform accountability help reduce the space for covert interference.
  4. Making it as difficult and costly for the perpetrators as possible – diplomatic and security measures
    Public exposure, political naming and shaming, sanctions, prosecution of illegal activities, and counter-hybrid responses impose costs on foreign actors conducting FIMI operations. Partnering with like-minded countries, both directly and multilaterally via fora such as the G7 Rapid Response Mechanism, is key to developing and delivering coordinated, collective responses to the asymmetric threats posed by FIMI.

Conclusion

FIMI represents the evolution of modern propaganda, a coordinated and technologically enhanced form of foreign interference that targets the very foundations of democratic societies. While ‘disinformation’ focuses on false information itself, the concept of FIMI captures the broader strategy: the networks, intentions, and behaviours that drive hostile influence campaigns.