Topics of the Week

The murder of Zelimkhan Khanghoshvili in Berlin was orchestrated by the FSB, Bellingcat reveals.

US military tracking disinformation around the coronavirus outbreak.

Foreign Affairs: How technology can strengthen autocratic rulers.

Good Old Soviet Joke

During the communist era, an old woman is trying to get into a speeding tram. In the end, she succeeds and sighs: “Thank God!”

A man turns to her and says: “But Grandma, don’t you know that nowadays we have to thank Stalin, not God!”

“Right”, grandma says and thinks about it for a bit. “But what if Stalin dies, who are we going to thank then?”

“Then we will thank God!” the man replies.

Policy & Research News

The murder of Khanghoshvili in Berlin orchestrated by the FSB

Investigators from Bellingcat, The Insider and Der Spiegel have revealed that the murder of Zelimkhan Khanghoshvili, which took place in Berlin in August 2019, was organized by the Russian intelligence agency FSB. The FSB provided the assassin with the necessary training and false identity documents. According to Bellingcat’s report, the FSB, together with the Russian police, also attempted to hide the identity of the assassin, who had been identified in previous investigations as Vadim Krasikov.

Fake coronavirus scare: U.S. soldier infected in Lithuania

Brought to you by the Vilnius Institute for Policy Analysis

The Russian disinformation machine is once again trying to set off panic about NATO forces in Lithuania. The recent outbreak of a novel coronavirus was used as a tool to spread nationwide fear and distrust through Lithuanian media.

According to LRT and DELFI, three news portals – kauno.diena.lt, diena.lt and klaipeda.diena.lt – were compromised with the sole goal of discrediting NATO Enhanced Forward Presence (EFP) troops. The attack resulted in the publication of a series of fake articles. All of them contained false information stating that a U.S. soldier stationed in Lithuania had contracted the new coronavirus, which originated in Wuhan, China. Lithuania’s Ministry of National Defence was indicated as the official point of contact and the Baltic News Service (BNS) as the original author of the articles. All articles were taken down 10 minutes later and are now unavailable to the wider public.

However, shortly after the incident, an additional fraudulent website was created and has been circulating on different social media platforms since last week. The website includes a series of staged articles in both Lithuanian and English, as well as two articles warning about the so-called “Lieutenant Mo”, who was allegedly hospitalized in Lithuania with symptoms of the coronavirus. It ends with a fear-mongering warning that “in January the Lieutenant MO visited public places and participated in city events with child and youth participation”.

Laimis Bratikas, a spokesperson for the Lithuanian Armed Forces Strategic Communications Department, confirmed to BNS that both hacking incidents are known to officials and are currently being examined in more detail. He also reassured the public that “each and every U.S. soldier deployed in Lithuania is safe and feeling well”, and that no cases of the coronavirus have been registered among the EFP troops.

Top disinformation investigator targeted by trolls

In August 2017, disinformation investigator Ben Nimmo was declared dead by 13,000 Russian bots on Twitter. The message was immediately shared thousands of times by the network of automated accounts. As the New York Times reports, notes began pouring in from worried friends, even though Mr Nimmo was very much alive. It didn’t take long for Mr Nimmo, who helped pioneer investigations into online disinformation, to figure out what was going on: he had been targeted by a shadowy group after reporting, along with others, that American far-right groups had adopted pro-Kremlin messages on social media about Ukraine. His fake death notice was a sinister attempt at disinformation. For the last five years, Mr Nimmo, a founder of the Atlantic Council’s Digital Forensic Research Lab, has been a leader of a growing community of online researchers, serving as an informal internet police force that combats malicious attempts to use disinformation to sway public opinion, sow political discord and foment distrust in traditional institutions.

US Developments

US military tracking disinformation around the coronavirus outbreak

The outbreak of the coronavirus in China has become the most popular topic in the Russian social media environment, and the US military is actively tracking possible disinformation operations linked to it. According to a US Army North study, the hashtag #coronavirus is currently by far the most popular topic on Russian social media, ahead of topics like Brexit and Trump. The two issues that have provoked the most reactions are Russia’s closure of its border with China and the first two confirmed coronavirus cases inside Russia. However, RT and Sputnik are also spreading a story about a Trump cabinet member who called the coronavirus a good thing for the US economy and the “deadliest day for China.” The US is worried about possible disinformation around the outbreak because, in the past, Russia has used global epidemics such as AIDS as the basis for large-scale disinformation operations against the US.

The World Health Organization has said it is concerned about the “infodemic” around the coronavirus, which has spread myths and rumours across social media. In response, Facebook, Google and Twitter have started a campaign to take down posts and close accounts that spread misinformation about supposed cures and unproven theories concerning the coronavirus.

FBI chief gives congressional testimony on Russian disinformation

FBI Director Christopher Wray testified at a US congressional hearing last week, where he confirmed that the US is still facing an ongoing foreign influence campaign by Russia. In addition to Russia, China is also active in malign foreign influence efforts inside the US. However, China’s main goal has been to shift US policy and public opinion in a more pro-China direction, whereas Russia’s objective has been a larger-scale effort to divide the nation and undermine US democracy. Russia’s use of disinformation campaigns peaked around the 2016 elections, and the scale of operations has decreased since then. The US has not seen any attacks against election infrastructure so far, including during the Democratic Iowa caucuses last week. Director Wray also warned that other adversary nations have begun to adopt successful disinformation tactics from Russia. Chief among these is Iran, which, for example, spread disinformation about the crash of a US military plane in Afghanistan two weeks ago.

Kremlin Watch Reading Suggestion

The Digital Dictators – How Technology Strengthens Autocracy

By Andrea Kendall-Taylor, Erica Frantz, and Joseph Wright for Foreign Affairs

This week’s long read highlights how technology can strengthen autocratic rulers in their quest to preserve power. Autocracies perceive anti-government protests as their most significant threat, and to curb them, these regimes are harnessing a new arsenal of digital tools. These tools are less intrusive and more efficient than traditional surveillance, rely less on human labour and consume fewer resources. They also induce citizens to alter their behaviour without explicit physical repression. China’s nascent social credit system is an example: it punishes dissent and rewards loyalty, thus shaping behaviour.

The digital age has changed the context in which authoritarian regimes operate. New communication tools have made it easier for citizens to mobilise and question the government. Between 2000 and 2017, 60 per cent of all dictatorships faced at least one anti-government protest of 50 participants or more; these protests unseated 10 authoritarian regimes, and another 19 lost power via elections that followed the protests. However, digitally savvy authoritarian regimes are using technological innovations to push back against popular mobilisation. Autocracies that use digital repression face a lower risk of protests and a lower likelihood that the protests that do occur will evolve into large, sustained mobilisation efforts.

Led by China, digital autocracies have grown far more durable than their pre-tech predecessors. Between 1946 and 2000, dictatorships lasted for around 10 years on average; now the average is 25. Countries such as Iran, Russia and numerous autocracies across Africa are following in China’s footsteps, incorporating AI-powered surveillance to monitor citizens and identify dissidents in a timely and sometimes even pre-emptive manner. Russia also excels in the use of bots to amplify influence campaigns and shape public perception of the regime and its legitimacy. Maturing technologies such as microtargeting and deepfakes are likely to further expand the capacity of authoritarian regimes to manipulate perceptions and discredit the opposition. The risk that technology will usher in a wave of authoritarianism is all the more concerning given the authors’ finding that digital tools are associated with an increased risk of democratic backsliding.

New technologies are dual-use: they enhance government efficiency and give states the capacity to address challenges such as crime and terrorism, but they also provide the tools to oppress, persecute and restrict opponents and citizens. Pushing back against digital authoritarianism requires addressing the detrimental effects of these new technologies on governance. The United States should expand legislation to ensure that domestic entities are not enabling human rights abuses, for example by exporting hardware that incorporates AI-enabled biometric identification to rogue countries or by investing in companies that build AI tools for repression. Sanctions should also be applied against foreign individuals involved in AI-powered human rights abuses. Finally, the United States should ensure it leads in AI and thereby helps shape global norms for its use in ways consistent with democracy and human rights.

Kremlin Watch is a strategic program of the European Values Center for Security Policy, which aims to expose and confront instruments of Russian influence and disinformation operations focused against the liberal-democratic system.