by Sopo Gelava for DFRLab

More than one hundred Facebook assets promoted links to external websites sharing pro-Kremlin propaganda in Bulgaria.

Cover photo: Journalists record Italian Air Force Eurofighter Typhoon fighter jet during Dragon 24 military exercises in Poland on March 14, 2024. (Source: Dominika Zarzycka / SOPA Images via Reuters Connect)

The DFRLab identified a cluster of Facebook assets amplifying websites that target Bulgarian audiences with misleading and sensational content that often aligns with Kremlin propaganda. The cluster contains at least forty-four Facebook pages, thirty groups, and twenty-eight accounts. The cluster’s activity consists primarily of amplifying the external websites, which the Bulgarian organization Foundation of Humanities and Social Studies (HSSF) described as “mushroom websites” in 2023. The term highlights the phenomenon in which certain domains appear, become inactive over time, and are then rapidly replaced by new ones.

According to HSSF, up to 400 anonymous websites with similar designs monetize Kremlin propaganda. The website network is centered around four primary domains, each of which had hundreds of subdomains operating as independent websites and publishing identical content. In 2023, the network of mushroom websites published more than 350,000 news articles, which appeared to be generated through automated means. According to HSSF, “mushroom websites have been the most powerful media tool of online dissemination in Bulgaria, particularly of propaganda materials.” HSSF reported that the network used the Share4pay platform to advertise content. In 2023, the Center for the Study of Democracy (CSD) additionally reported a link between the network and a digital advertising platform in Bulgaria run by Bozhidar Kostov.

In 2023, a local fact-checking platform reported that the network of mushroom websites disseminated Kremlin disinformation claiming that children of Ukrainian refugees in European Union countries were taken from their parents. Additionally, in 2022, the Ukrainian fact-checking organization StopFake reported that Bulgarian websites, including those within this network, spread disinformation alleging that Ukraine had committed genocide against the people of Donbas.

The ‘mushroom’ websites

The DFRLab analyzed the creation dates and registration information for forty-six websites in the network that are connected by advertising trackers and registrant emails. Half of the websites (23) were registered using .eu domain extensions. Most were created between 2018 and 2020, with some created in batches on a single day by the same registrant, indicating a high likelihood of centralized coordination. By checking registration data in DomainTools Iris, we found that emails linked to the AdRain advertising platform were used to register at least ten websites in the network between 2013 and 2018.

The left table shows websites created on the same day by the same registrant. The right shows websites created using emails associated with the AdRain advertising platform. Emails partially redacted. (Source: DFRLab via Domain Tools)

The DFRLab analyzed the content of these websites and found that they targeted Bulgarian audiences with topics ranging from foreign and domestic political developments to sports, entertainment, lifestyle, and more. The content on the websites was not always inaccurate; however, political content was often presented sensationally and misleadingly, using clickbait headlines that echoed the Kremlin’s agenda and perpetuated its propaganda.

One of the most recent narratives we identified centered on the Dragon 24 military exercises held in Poland on March 4-5, 2024. Dragon 24 is an operational and tactical military exercise that is part of NATO’s Steadfast Defender 24 program. At least twenty-five Bulgarian websites published identical content about Dragon 24 under the misleading headline, “Dragon 24: NATO prepares for war with Russia, soldiers cross the Polish Vistula River and….” The headline implies that the river-crossing exercise was preparation for a war with Russia, and it deliberately trails off as a clickbait tactic to encourage users to read further.

A comparison of copy-pasta articles pushing the narrative that NATO is preparing for war with Russia by undertaking exercises in Poland. Many websites shared identical designs. (Source: left to right, top to bottom)

Amplification on Facebook

By tracking the amplification of the Dragon 24 articles on Facebook, the DFRLab uncovered a cluster of 102 Facebook assets used to amplify the mushroom websites, comprising forty-four pages, thirty groups, and twenty-eight accounts. Because some pages and groups in the cluster were dormant or inactive, the accounts also disseminated links to the mushroom websites in Facebook groups beyond the cluster, capitalizing on existing Bulgarian groups with thousands to tens of thousands of members to promote the websites. Additionally, these accounts shared links in Facebook groups supporting the Kremlin and pro-Kremlin political parties in Bulgaria.

Facebook pages

Nineteen pages within the cluster incorporated the domains of the associated websites into their names. However, most of these pages had close to zero activity.

The list of Facebook pages promoting the Bulgarian “mushroom” websites. Nineteen pages used the website domains in their names. (Source: DFRLab via CrowdTangle)

Twenty-one pages in this cluster were registered on Facebook as “media/news companies,” “news & media websites,” or “magazines.” Other page categories varied but included “food & beverage,” “athletes,” and “just for fun.”

Using historical data from the social media monitoring tool CrowdTangle, the DFRLab analyzed the posting activity of the Facebook pages between August 1, 2022, and March 25, 2024, the period for which data was fully available. In total, the pages published 95,122 posts containing links to the mushroom websites. The highest number of links was posted in January 2024, with 8,767 posts, followed by October 2022, with 7,202 posts. On average, the pages posted 4,700 links per month.

Chart shows the frequency with which the Facebook pages posted links to the external websites. The posting activity peaked in January 2024, followed by October 2022. (Source: Sopo Gelava via CrowdTangle and Flourish)

In some cases, the pages posted links leading to identical articles within a short period, typically spanning one to ten minutes, which is another indicator of the assets possibly being centrally coordinated. Only five pages shared the misleading article on the NATO Dragon 24 exercises. Notably, one of these pages, ‘The Mainline,’ was verified with Meta’s blue badge on Facebook.

A snapshot of CrowdTangle historical data shows that pages posted identical content within the span of minutes. Timestamps are in EDT. (Source: DFRLab via CrowdTangle)

Facebook pages within the cluster amplified misleading content related to the Dragon 24 exercises. Among these pages, “The Mainline” has a blue verification badge. (Source: Left to right, top to bottom: “Новина” (“News”)/archive; “Новините днес” (“News today”)/archive; “The Mainline”/archive; “”/archive; “ОБИЧАМ ТЕ !!! Опп грешка-ОБИЧАХ ТЕ” (“I LOVE YOU !!! Oops error-I LOVED YOU”)/archive)

While most pages generated interactions through shared links, the two pages with the highest number of followers, “Засмей се” (“Laugh”) and “Картички и пожелания за рожден ден” (“Birthday cards and wishes”), with 330,000 and 123,000 followers respectively, predominantly posted photos. Three other pages, “,” “,” and “The Mainline,” had more than 100,000 followers each. The follower counts for nine of the pages ranged from 10,000 to 87,000, while twenty pages within the network had fewer than 100 followers.

A bubble chart shows the follower counts of the cluster’s Facebook pages, with each bubble’s size corresponding to the number of followers. (Source: Sopo Gelava via CrowdTangle and Flourish)

The DFRLab also discovered that between 2020 and 2023, at least three websites — “,” “” and “” — were amplified to Bulgarian audiences via Facebook ads. Four ads ran without the disclaimer required when posting ads about social issues, elections, and politics. Two ads were subsequently removed. The remaining ads provided only the page’s name in the beneficiary and payer fields.

Screencaps of ads that ran between 2020 and 2023 promoting domains within the mushroom websites network. (Source: DFRLab via Meta ads library)


Facebook accounts

At least seventeen of the twenty-eight identified accounts displayed signs of possible inauthenticity, including discrepancies between profile names and URLs, generic photos sourced from elsewhere on the internet, and multiple accounts with similar names. These indicators, combined with the coordinated promotion of identical websites and the management of groups within the network, suggest that the identified accounts could be inauthentic. Three accounts in the cluster used the same profile or cover photo as the groups they managed.

An example of a likely inauthentic account that used a generic photo taken from elsewhere on the internet as its profile image. The account managed a group dedicated to President Rumen Radev. (Source: “Journal Journal”/archive, top left; “ЗА РУМЕН РАДЕВ И БЪЛГАРИЯ С ЛЮБОВ” (“FOR RUMEN RADEV AND BULGARIA WITH LOVE”)/archive, top right; Google Lens, bottom)

The DFRLab also discovered three accounts sharing almost identical names with slight variations. These accounts managed groups within this network that openly supported the Kremlin, and they promoted statements from Putin or other leaders supporting the Kremlin’s agenda. Additionally, the cover photo of one of the accounts mocked European Union values, while another cover photo expressed opposition to “participating in medical experiments,” likely referring to COVID-19 vaccines.

Screencaps of accounts in the cluster with identical names. (Source: “Елка Кунчева” (“Elka Kuncheva”)/archive, top; “Кунчева Елка” (“Kuncheva Elka”)/archive, center; “Елка Кунчева” (“Elka Kuncheva”)/archive, bottom)

Accounts played a significant role in promoting the mushroom websites on Facebook. While most pages and groups within the network had low engagement rates and few followers, the accounts promoted the links beyond the cluster by sharing them in popular Bulgarian groups. For example, a suspicious account named “Марвин В.” posted the misleading article about the Dragon 24 exercises in at least three groups outside the cluster within seconds.

Screencaps of suspicious accounts promoting copy-pasta disinformation content on the NATO Dragon 24 exercises in external groups within seconds. (Source: “ПОЛИТИКА” (“POLITICS”)/archive, left; “ПРОКУРАТУРА – СЪД – СПРАВЕДЛИВОСТ” (“PROSECUTION – COURT – JUSTICE”)/archive, center; “Новини БГ” (“News BG”)/archive, right)

Facebook groups

A single page, “,” managed twenty-four of the thirty groups in the cluster. Using CrowdTangle data, the DFRLab discovered that the groups appeared to have been created primarily to promote links to the external websites. An analysis of posting behavior between December 22, 2023, and March 22, 2024, demonstrated that up to 60 percent of group posts contained links to the external websites. Photos, which comprised 36 percent of the total posts, were predominantly shared in two groups, “Картички и пожелания за рожден ден” (“Birthday cards and wishes”) and “Бабини илачи” (“Grandmother’s cures”), the two largest groups in the cluster, with 241,000 and 184,000 followers respectively. The next largest groups, “ЗА РУМЕН РАДЕВ И БЪЛГАРИЯ С ЛЮБОВ” (“FOR RUMEN RADEV AND BULGARIA WITH LOVE”) and “Вкусни рецепти, билки и полезни съвети” (“Delicious recipes, herbs and useful tips”), each had more than 10,000 followers. The remaining twenty-six groups had followings ranging from 3,000 to 5,000 accounts.

A graph illustrating that interactions within Facebook groups were primarily generated through links posted by accounts in the cluster. (Source: DFRLab via CrowdTangle)

The names of the groups varied: some were dedicated to promoting Bulgaria’s pro-Russian President Rumen Radev and Russian President Vladimir Putin or to establishing closer ties between Russia and Bulgaria, while others covered unrelated topics such as the mafia and Pablo Escobar, recipes, sports, and more. This diversity in group names is a common tactic to attract audiences across a broad spectrum of interests. The accounts within this cluster managed the identified groups and shared links to the websites in them. Much of the linked content consisted of direct translations of statements made by Russian leaders.

Screencaps of accounts promoting Vladimir Putin in Facebook groups. (Source: Left to right, top to bottom: “ВОЕННА ЗОНА – ВОЕННИ, ОРЪЖИЕ, ГЕНЕРАЛИ” (“MILITARY ZONE – MILITARY, WEAPONS, GENERALS”)/archive; “ПУТИН, ТРЪМП, ОРБАН И ОСТАНАЛИТЕ” (“PUTIN, TRUMP, ORBAN AND THE REST”)/archive; “ПОЛИТИКА И ХУМОР” (“POLITICS AND HUMOR”)/archive; “БЪЛГАРИЯ И РУСИЯ- СЪРЦЕ И ДУША” (“BULGARIA AND RUSSIA – HEART AND SOUL”)/archive)


Sopo Gelava, “Suspicious Facebook assets amplify pro-Kremlin Bulgarian ‘mushroom’ websites,” Digital Forensic Research Lab (DFRLab), March 26, 2024.