
Scientific research on cognitive biases and radicalisation


In this section we present the main scientific outcomes that contribute to a better understanding of how and why some extremist narratives are effective.
Our research is based on a multidisciplinary approach: it combines discourse analysis of extremist narratives on social media with the study of how such narratives are received, drawing on social psychology and communication research on cognitive biases.

Below you can find two reports analysing the impact of the campaign aimed at young people that was conducted during the project, together with the lessons we learned during its development. Our experiences and findings can be a useful starting point for other projects and campaigns tackling similar topics and audiences.


Discourse Patterns Used by Extremist Salafists on Facebook: Identifying Potential Triggers to Cognitive Biases in Radicalized Content

Understanding how extremist Salafists communicate, and not only what, is key to gaining insights into the ways they construct their social order and use psychological forces to radicalize potential sympathizers on social media. With a view to contributing to the existing body of research which mainly focuses on terrorist organizations, we analyzed accounts that advocate violent jihad without supporting (at least publicly) any terrorist group and hence might be able to reach a large and not yet radicalized audience. We constructed a critical multimodal and multidisciplinary framework of discourse patterns that may work as potential triggers to a selection of key cognitive biases and we applied it to a corpus of Facebook posts published by seven extremist Salafists. Results reveal how these posts are either based on an intense crisis construct (through negative outgroup nomination, intensification and emotion) or on simplistic solutions composed of taken-for-granted statements. Devoid of any grey zone, these posts do not seek to convince the reader; polarization is framed as a presuppositional established reality. These observations reveal that extremist Salafist communication is constructed in a way that may trigger specific cognitive biases, which are discussed in the paper.

(Restricted) Online access: https://www.tandfonline.com/eprint/YPTR9Y3S87GATHAHHUTD/full?target=10.1080/17405904.2021.1879185

Facebook’s policies against extremism: Ten years of struggle for more transparency

For years, social media, including Facebook, have been criticized for lacking transparency in their community standards, especially in terms of extremist content. Yet, moderation is not an easy task, especially when extreme-right actors use content strategies that shift the Overton window (i.e., the range of ideas acceptable in public discourse) rightward. In a self-proclaimed search for more transparency, Facebook created its Transparency Center in May 2021. It has also regularly updated its community standards, and the Facebook Oversight Board has reviewed these standards based on concrete cases, published since January 2021. In this paper, we highlight how some longstanding issues regarding Facebook’s lack of transparency remain unaddressed in Facebook’s 2021 community standards, mainly in terms of the visual ‘representation’ of and endorsement from dangerous organizations and individuals. Furthermore, we reveal how the Board’s lack of access to Facebook’s in-house rules exemplifies how the longstanding discrepancy between the public and the confidential levels of Facebook policies remains a current issue that might turn the Board’s work into a mere PR effort. In seeming to take as many steps toward shielding some information as it has toward exposing others to the sunshine, Facebook’s efforts might turn out to be transparency theater.

Open access: https://firstmonday.org/ojs/index.php/fm/article/view/11705

How Jihadi Salafists Sometimes Breach, But Mostly Circumvent, Facebook’s Community Standards in Crisis, Identity and Solution Frames

We analyzed posts written by Facebook profiles that advocate violent jihad without supporting any terrorist group. They share extremist content in the middle of regular posts, thanks to which they are likely to reach a large audience. We identified to what extent their ingroup-outgroup opposition is constructed in crisis, identity, and solution frames, and how they use these frames in posts that sometimes breach Facebook’s community standards but mostly circumvent them through various strategies of doublespeak. Among these strategies, myth, in the sense of Barthes, and eudaimonic content appeared particularly powerful in naturalizing and spreading jihadi ideology on social media.

(Restricted) Online access: https://www.tandfonline.com/doi/abs/10.1080/1057610X.2021.1963092?journalCode=uter20
