While preventing the dehumanisation of outgroups is a widely accepted goal in the field of countering violent extremism, current algorithms used by social media platforms focus on detecting individual samples of explicit language. This study tests whether explicit dehumanising language directed at Muslims is detected by the tools of Facebook and Twitter; and further, whether the presence of explicit dehumanising terms is necessary to successfully dehumanise ‘the other’ – in this case, Muslims. Answering both questions in the negative, the analysis extracts universally useful analytical tools that could be used together to consistently and competently assess actors using dehumanisation as a measure, even where that dehumanisation is cumulative and grounded in discourse rather than explicit language. The output of one prolific actor identified by researchers as an anti-Muslim hate organisation, and of four other anti-Muslim actors, is discursively analysed, and its impact considered through the comments it elicits. While this study focuses on material gathered with respect to anti-Muslim discourses, the findings are relevant to a range of contexts where groups are dehumanised on the basis of race or another protected attribute. The study suggests it is possible to predict aggregate harm by specific actors from a range of samples of borderline content, each of which might be difficult to discern as harmful individually.
Keywords: dehumanisation, out-groups, dangerous organisations, digital platform policy, right-wing extremism, content moderation.
Professor Mohamad Abdalla, Director, Institute for Islamic Thought and Education, UNISA and Adjunct Professor, School of Education, Victoria University of Wellington Mohamad.Abdalla@unisa.edu.au
Dr Mustafa Ally, Senior Lecturer in Information Systems, School of Management and Enterprise, University of Southern Queensland, Mustafa.Ally@usq.edu.au
Rita Jabri-Markwell, Advisor and researcher with the Australian Muslim Advocacy Network (AMAN), email@example.com