By EUvsDisinfo

The emergence of a global, large-scale disinformation industry has privatised influence operations, granting states strategic reach with plausible deniability.

A quiet revolution has taken place in the world of propaganda. Operations that used to be run by authoritarian governments and intelligence agencies are now outsourced to private firms that sell disinformation and deception as a service. From fake social-media armies to AI-driven smear campaigns, disinformation and Foreign Information Manipulation and Interference (FIMI) have become a global business, giving authoritarian regimes new ways to influence others – and to deny everything.

From state propaganda to disinformation for hire

For decades, information operations were tightly controlled by states. The Soviet Union perfected the craft of dezinformatsiya; later, Russia institutionalised it for the digital age through outfits such as the Internet Research Agency (IRA).

But over the past decade, this model has been commercialised. Disinformation and deception have become a for-profit service offered by companies with intelligence, military, or marketing backgrounds. These firms, operating around the world, sell complete FIMI campaign packages combining networks of fake social-media accounts, hacking, data leaks, and ‘narrative management’, all aimed at spreading false and manipulated content in democratic countries.

Outsourcing as a shield

This outsourcing provides both efficiency and deniability. Authoritarian states are now actively trying to externalise information operations to private intermediaries, while shielding themselves from diplomatic and legal consequences.

Through this model, malign actors can also experiment with risky tactics such as AI-generated content, hacking, or deepfakes – operations that would be politically or diplomatically explosive if carried out directly by state institutions. In doing so, they can target foreign populations through tailored influence campaigns while maintaining plausible deniability by claiming no connection to the private entities running them.

Outsourcing also enables information laundering – hiding the true origin of disinformation by passing it through private firms, fake accounts, and proxy media. As these actors repeat and amplify the message, it begins to look organic and locally produced. This lets malign actors spread targeted narratives while denying any involvement.

All this is the informational equivalent of using mercenaries: the client enjoys the results without bearing the blame.

Team Jorge and the commercialisation of deception

The 2023 Forbidden Stories investigation into an entity called ‘Team Jorge’ exposed the inner workings of this new influence-for-hire ecosystem. The firm claimed to have interfered in 33 presidential elections, winning 27 of them. Its clients included political parties, corporations, and, allegedly, state-linked actors.

At the heart of Team Jorge’s system was Advanced Impact Media Solutions (AIMS), software capable of creating and coordinating thousands of fake social-media accounts, complete with synthetic photos, biographies, and backstories. These avatars could be mobilised to flood debates, spread narratives, or harass opponents.

Undercover journalists recorded the firm demonstrating hacking techniques, media infiltration, and the planting of fabricated news stories. The scale of these operations and their accessibility to paying clients revealed how disinformation has become a global commodity.

Russia continues to be a major player in this outsourced ecosystem. Privately owned companies such as the Social Design Agency (SDA) and Structura now run large-scale influence operations that mirror, and in many ways replace, the functions of the old St. Petersburg troll factories. These firms manage covert online assets, push state-aligned narratives, and provide the Kremlin with an additional layer of deniability.

Hybrid operations: where online meets offline

Modern influence campaigns no longer live solely online but operate in the hybrid space between digital and physical realities.

The IRA demonstrated this during the 2016 US election, when Russian operatives posing as American activists organised real-world rallies, paid participants, and coordinated online amplification around them. What began as meme warfare ended as physical mobilisation.

Today’s hybrid operations blend hacking, covertly funded local influencers, and covert media fronts. Campaign operators build credible-seeming news sites and influencer personas to insert tailored narratives into the public sphere. Once in circulation, these narratives mix with authentic content and spread across both digital and traditional media, making manipulation difficult to detect.

Automation and AI: the new force multiplier

The original troll-farm model – hundreds of young workers posting manually in shifts – is being replaced by AI-driven automation.

Systems like Team Jorge’s AIMS, or newer tools powered by large language models, can now manage thousands of fake accounts and generate multilingual content tailored to target audiences in real time. AI allows campaigns that once required hundreds of people to be run by a handful of operators or even a single individual. What once took a troll farm and a whole building in St. Petersburg now takes a laptop.

Asymmetrical information warfare

The emergence of these influence-for-hire firms has created a new strategic imbalance – asymmetrical information warfare.

In this asymmetry, autocracies enjoy maximum reach with minimal risk. At home, they are protected by censorship, control, and deniability. Democracies, however, are more exposed. Bound by transparency and law, they face maximum vulnerability with limited defences.

This imbalance is not just political, but structural. Authoritarian regimes can use disinformation and AI tools to shape global narratives, influence elections abroad, and undermine trust while trying to avoid direct accountability. Democracies, meanwhile, must play defence on open networks designed for free expression.

The stakes for democracy and the road ahead

These operations are already reshaping political realities. Influence-for-hire firms have targeted elections in Africa, Europe, and Latin America. Disinformation campaigns amplify polarisation, delegitimise media institutions, and exploit social divisions to weaken democratic cohesion.

The marketisation of disinformation risks creating a global grey zone where truth is optional and accountability elusive. As AI tools become cheaper and more capable, these operations will only grow in scale and sophistication.

Recognising this asymmetry and responding with resilience and regulation is the only way to prevent truth itself from becoming a commodity.
