It is no secret that the StopFake Polska team uses various tools in its work, including publicly available ones. One of them is a Facebook group where a number of people send us interesting content found on the web. Some of it consists of links to fake news published in the depths of the internet. We verify all of them and describe the most dangerous cases to warn internet users.

To my honest surprise, Facebook itself marked one of the links published in the private group as “fake”. It was a link to the website kraina.biz.ua carrying “information” about the creation of a Hungarian autonomous district in Zakarpattia by the Ukrainian government. “No Ukrainian government, not even Yanukovych’s, made such concessions, bearing in mind all the threats to the stability of the region and to national security. Zelensky’s government [Zelensky is the president of Ukraine – ed.] has chosen a dangerous path,” we read in the article.

Obviously, Facebook was right to mark the link as fake. The claim is untrue, nothing of the kind happened, and the site kraina.biz.ua is on our list of misinformation hotbeds. Something else, though, caught my attention: the fact that Facebook uses external fact-checkers to intervene in private groups. I figured that the reach and hermetic nature of such groups must indeed be a problem if they, too, are being scrutinised. And I was not mistaken.

As we learn from an analysis on the website sprawnymarketing.pl, Facebook groups are the result of a change in Facebook’s policy, which considers this form of communication the future. Mark Zuckerberg and his people recognised that direct relations between group members, relations with those closest to us and a focus on specific issues can help unify the community. The head of Facebook argued that “people are more careful then” and that “they prefer the intimacy of communicating one-on-one or with just a few friends”. Indeed, it is easy to find groups devoted to local issues, history enthusiasts or animal lovers – they work well, like circles of shared interests. But this is only one side of the coin.

First of all, groups are an easy target for marketing agencies, which are handed a profiled group of customers (most recent ad campaigns on Facebook are focused on groups). Secondly – and most interestingly for us – closed communities centred on a particular topic are vulnerable to all kinds of misinformation, fake news and conspiracy theories. Especially if it turns out that this was the group’s objective from the very beginning, or that the group was hijacked (e.g. by new members who posed as “normal” users in order to obtain moderator status after some time). The leaders and communities of such groups can impose narratives, and a lone user, deprived of “external” verification of the content and of the chance to be warned (e.g. by friends or family who can see his activity on Facebook outside the group), is especially vulnerable to such narratives. This leads to a paradox: in a group of like-minded people we are extremely alone and, therefore, susceptible to others’ influence.

In the US the issue is taken very seriously. As wired.com reports in the article “Facebook Groups Are Destroying America”, the coronavirus pandemic has made “thousands of Americans take conspiracy theories about microchips in vaccines for truth and wonder about the healing properties of hairdryers” [one of the supposed ways to “cure” coronavirus was to blow hot air from a hairdryer into one’s throat – ed.].

The Polish part of Facebook is not free of such “revelations” either. The group “I don’t believe in coronavirus – support group / YOU’RE NOT ALONE” has almost 92 thousand members (including 9 of my friends). Among its content we can find posts calling Tom Hanks, who urged people to wear masks, “another bought creature, a pawn of Soros and the rest of the Zionist gang”. This is merely a sample of the content that regularly reaches tens of thousands of people, in doses of a few to over a dozen posts per day, along with their “discussions”, the results of which are easy to imagine.

Wired refers to a Wall Street Journal analysis which suggests that one solution could be to require groups with a large number of users (say, 5,000 people) to be neither private nor secret (unlike, for example, groups for family, a sports team or school friends). That way they could be scrutinised by journalists or scientists. Also, the algorithms should not suggest any groups – users should have to find them on their own, which would make their choices more conscious.

Perhaps these are reasonable ideas. According to data from late 2019, there are over 10 million groups on Facebook, with over 1.5 billion users. The platform itself states that a user can join up to 6,000 groups; after reaching the limit, the user has to leave some groups in order to join others, as we can read in Facebook’s terms and conditions. This is surprising, as it is hard to imagine one person participating in so many communities. At the same time, it creates a perfect opportunity for anyone who would like to spread fake and malicious content.

Limiting each user’s access to only a few dozen groups would effectively make it harder for automated troll farms to penetrate them. One more thing about groups is certain: what was initially meant to make people less vulnerable to external influence has, to a large degree, become its own contradiction.

Wojciech Mucha

Photo: pixabay.com

Follow StopFake PL on Facebook, Twitter, Instagram and Telegram.