Biology is an interesting avenue for making sense of disinformation, as deception is an important element of life. Many life forms depend for their survival on their ability to mislead others. Think of mimicry and camouflage, which enable animals to appear to be other than they are: a snake feigning death, a butterfly’s wings resembling the rocks in the background, or a pro-Kremlin outlet claiming Navalny was poisoned by the British government. The natural perspective can be insightful.
Another biological concept relevant for disinformation is the meme.
A meme is not necessarily a picture combined with a witty text in a white font.
Rather, a meme is an idea, behaviour, or style. It becomes a trend because it spreads from person to person. Memes are carriers for cultural ideas, symbols, or practices. They have many forms, as they are transferred from one mind to another through writing, speech, gestures, rituals, or other imitable phenomena.
The biologist Richard Dawkins coined the term “meme”, a neologism originating from his 1976 book The Selfish Gene.
Social media make spreading memes easier and speedier than ever before.
Nick Flann, a professor of computer science at Utah State University, said: “So how quickly can I receive a meme and have it penetrate my mind and then change my behaviour and press the button that says ‘Share’? Now that’s seconds.”
Social media also make it possible to spread memes to a big audience when shared by popular accounts, like celebrities and politicians.
Given these conditions, it becomes possible to create a “rumour bomb”. This term, coined by professor Jayson Harsin, refers to the widespread phenomenon of rumoresque communication in current relations between media and politics. An example we covered recently is a NATO report that exposed an information-laundering network distributing rumoresque disinformation in the Baltics.
A meme is also a good instrument to hide a certain message under another: think of Pepe the Frog.
Most interesting about memes, however, is the notion that the spread of ideas works in principally the same manner as other processes of natural selection.
This implies that disinformation narratives can be fit for survival, or not. Every meme has a certain replication rate (fitness). It is important to realise, however, that fitness is a relative term; it is not an essential quality of the narrative. Rather, fitness depends on the narrative’s power to interact successfully with its surrounding landscape.
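The idea that fitness is relative to the landscape, not intrinsic to the narrative, can be sketched as a toy simulation. Everything below is an illustrative assumption (the narrative names, the fitness numbers, the simple replicator-dynamics model), not a model of any real campaign:

```python
def replicate(shares, fitness, steps=50):
    """Discrete replicator dynamics: each narrative's share of attention
    grows in proportion to its fitness relative to the population average."""
    shares = dict(shares)
    for _ in range(steps):
        avg = sum(shares[n] * fitness[n] for n in shares)
        shares = {n: shares[n] * fitness[n] / avg for n in shares}
    return shares

# Two hypothetical narratives starting with equal reach.
start = {"rumour": 0.5, "correction": 0.5}

# Landscape A: a credulous audience, where the rumour replicates faster.
credulous = replicate(start, {"rumour": 1.2, "correction": 1.0})

# Landscape B: a sceptical audience, where the correction replicates faster.
sceptical = replicate(start, {"rumour": 1.0, "correction": 1.2})
```

The same narrative, with the same content, ends up dominating one landscape and nearly extinct in the other; only the environment changed. That is the sense in which modifying the landscape, rather than the narrative, can alter which memes survive.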
This has consequences for how spreaders of disinformation behave. Being successful requires an experimental, adaptable approach. One example of such an approach is “hahaganda”, a term suggested by the Latvian scholar Solvita Denisa-Liepniece. In a 2017 report, NATO’s StratCom Centre of Excellence explained how Russian and pro-Kremlin disinformation outlets use humour to discredit Western political leaders. This particular brand of disinformation concentrates on ridiculing institutions and politicians. The goal of hahaganda is not to convince audiences of the truth of a particular joke, but rather to undermine the credibility and trustworthiness of a given target through constant ridicule and humiliation.
Being too flexible with the truth can also harm the spreader. Last week, research by Michael Weiss showed that the Kremlin pushed so many of its institutions toward distorting reality that a cynical, defensive view of reality became institutionalized. If you gaze long into the abyss, the abyss also gazes into you.
For example, Weiss describes the transformation of GLAVPUR after the collapse of the Soviet Union. This organization was established shortly after the October Revolution. Its responsibility: to maintain control over the Red Army. Its method: psychological operations. After 1991, it was transferred to the GRU and given a new name, but the structure, methods and personnel remained. Gradually, it elided the distinction between war and peace. The “waiting-preparatory mode” gave way to permanent “active hostilities”.
An evolutionary approach to disinformation also has consequences for those fighting it. First, the narratives and the networks have to be exposed. Our job, basically. Second, the evolutionary landscape needs to be modified, so that disinformation narratives become less fit.