By EU vs Disinfo

Since the US election in 2016, Facebook has worked to improve its knowledge of, and resilience to, information operations that use the platform to spread disinformation and sow discord.

On 31 July Facebook announced that it had removed 32 pages and accounts from its platforms (Facebook and Instagram) for “coordinated and inauthentic behaviour.” Facebook concludes so far that the actors behind the accounts (which it calls ‘bad actors’) were more careful to cover their tracks than previous bad actors, in part because of the measures Facebook has taken over the past year to prevent abuse of the platform. However, using lessons learned from its earlier investigation into the operations run by the St Petersburg troll factory, the Internet Research Agency (IRA), Facebook was able to identify the now disabled inauthentic accounts. The deleted accounts and pages had created about 30 events since May 2017; the largest had approximately 4,700 accounts interested in attending and 1,400 users saying they would attend. One of the accounts had created a Facebook event for a protest on 10 to 12 August that enlisted support from real people, indicating an active influence operation.

To help with the analysis, Facebook has shared the information with the Digital Forensic Research Lab (@DFRLab), which will conduct a thorough analysis of the accounts. The DFRLab’s initial analysis shows that, although attribution is hard to make at this early stage, the behavioural and language patterns were similar to those of the operations run by the Internet Research Agency from 2014 to 2017, including non-native language use, plagiarism and divisive content.

Follow the ongoing analysis here.