A new study by the London-based counter-extremism organisation the Institute for Strategic Dialogue (ISD) examines the connections between white supremacist and pro-terrorist channels on Telegram. In this CARR insights piece, CARR Practitioner & Policy Fellow and ISD Senior Research Manager Jacob Davey highlights the key findings of the research and what this means for better enforcement against the promotion of terrorism on the platform.
Introduction
Through the ISD’s work monitoring the extreme right online, we consistently seek to identify and explore internet platforms that provide a comfortable home for extremist groups to share material, organise and establish communities. Over the past year, it has become increasingly apparent that the encrypted messaging platform Telegram has become one such place, providing a safe space for a global cohort of white supremacists.
Qualitative analysis by a range of civil society organisations, including Hope Not Hate and the Community Security Trust (CST), has highlighted how the platform plays host to a particularly violent cohort of white supremacist groups who often advocate the use of terrorist tactics to advance their agenda. Given the recent global surge in terrorism from the extreme right and the global advancement of extreme right accelerationism, which prioritises the use of violence to expedite the collapse of democratic systems, these trends should not be taken lightly.
Findings: Violent White Supremacist Networks and Content on Telegram
To better understand how violent white supremacist communities function on Telegram, we engaged in a research project designed to track both the network dynamics underpinning these channels and the nature of the conversations taking place within these communities.
As a platform, Telegram is well suited to network-based analysis. One prominent feature of Telegram is the ability for users to forward content from other channels. This allows users to swiftly identify new communities relevant to their interests. It also enables researchers to quickly ‘snowball’ out from a small set of channels to identify a network of related communities. Starting with a set of 18 Telegram channels associated with violent white supremacist groups, including Atomwaffen Division, we followed links to connected channels, ultimately identifying a group of 208 channels which promoted white supremacist ideology.[1] We then manually coded these channels, identifying 49 channels which were primarily set up to promote politically-motivated violence. Additionally, we found that 7 channels were designed to share tactical material, including advice on how to prepare for an insurrection and survive off-grid; 74 were general discussion groups; and 78 were content banks serving as resource hubs for material such as meme dumps and PDFs of white supremacist literature.
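The snowball step described above can be sketched as a breadth-first expansion over forwarding links. This is a minimal illustration, not the study's actual pipeline: the channel names and the `forwards_from` mapping below are hypothetical placeholders standing in for scraped channel metadata.

```python
from collections import deque

# Hypothetical data: each channel mapped to the channels it forwards
# content from. In practice this would be built from channel metadata.
forwards_from = {
    "seed_a": ["channel_1", "channel_2"],
    "seed_b": ["channel_2", "channel_3"],
    "channel_1": ["channel_4"],
    "channel_2": [],
    "channel_3": ["channel_1"],
    "channel_4": [],
}

def snowball(seeds, forwards_from, max_hops=3):
    """Breadth-first expansion from a set of seed channels,
    following forwarded-content links up to `max_hops` steps out."""
    discovered = set(seeds)
    frontier = deque((s, 0) for s in seeds)
    while frontier:
        channel, depth = frontier.popleft()
        if depth >= max_hops:
            continue
        for linked in forwards_from.get(channel, []):
            if linked not in discovered:
                discovered.add(linked)
                frontier.append((linked, depth + 1))
    return discovered

network = snowball(["seed_a", "seed_b"], forwards_from)
print(sorted(network))
# ['channel_1', 'channel_2', 'channel_3', 'channel_4', 'seed_a', 'seed_b']
```

Each newly discovered channel is itself inspected for forwards, which is how a seed list of 18 channels can grow into a network of 208.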
We then visualised this network using Gephi, representing individual channels as nodes, and colouring pro-terrorist channels green and other white supremacist channels orange. In the map below, the nodes closer to the centre are channels which have been forwarded to a greater number of other channels in the network, and the larger nodes are channels which forward a greater amount of content from other channels in the network.
Crucially, this network visualisation reveals that pro-terrorist channels do not exist in isolation on Telegram – they are well connected to a broader cohort of white supremacist channels which often link and are linked to them. This suggests that users of Telegram with a general interest in white supremacist ideology could easily find themselves directed to channels advancing a violent, oftentimes accelerationist agenda.
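The two visual encodings described above, centrality for channels whose content is widely forwarded and size for channels that forward heavily, correspond to in-degree and out-degree on a directed forwarding graph. A minimal sketch with hypothetical channel names (an edge `(A, B)` meaning channel A forwarded a post from channel B):

```python
from collections import Counter

# Hypothetical forwarding events: (forwarding_channel, source_channel).
forward_events = [
    ("discussion_1", "terror_1"),
    ("discussion_2", "terror_1"),
    ("content_bank", "terror_1"),
    ("discussion_1", "content_bank"),
    ("terror_1", "discussion_2"),
]

# In-degree: how often other channels forward from each channel
# (high in-degree channels sit nearer the centre of the map).
in_degree = Counter(src for _fwd, src in forward_events)

# Out-degree: how much content each channel forwards from others
# (high out-degree channels are drawn as larger nodes).
out_degree = Counter(fwd for fwd, _src in forward_events)

print(in_degree.most_common(1))   # [('terror_1', 3)]
print(out_degree.most_common(1))  # [('discussion_1', 2)]
```

In the illustration, the pro-terrorist channel has the highest in-degree, mirroring the pattern in the map where pro-terrorist channels are well connected to the wider network.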

Following on from this network mapping exercise, we sought to quantify the nature of the content shared in these channels in a more systematic manner. To achieve this, we extracted a sample of over 1,000,000 messages shared in this network and performed a search for keywords related to particular topics.
One such topic was mentions of violence-endorsing groups and the names of individuals who have committed terrorist attacks against minority groups. This search found mentions of these terms in 125 of the channels studied, highlighting that even in channels not primarily set up to promote terrorism, there exists a culture which glamorises violent groups and individuals.
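The keyword search can be sketched as a per-channel scan against a watchlist. The channel names, messages, and keywords below are hypothetical placeholders; the real study used the actual names of violence-endorsing groups and attackers.

```python
import re

# Hypothetical watchlist of terms; case-insensitive matching.
keywords = ["example_group", "example_attacker"]
pattern = re.compile("|".join(re.escape(k) for k in keywords), re.IGNORECASE)

# Hypothetical sample of messages, grouped by channel.
messages_by_channel = {
    "channel_a": ["unrelated chatter", "praise for Example_Group tactics"],
    "channel_b": ["meme dump", "pdf archive"],
    "channel_c": ["example_attacker was right"],
}

# Channels containing at least one mention of a watchlist term.
channels_with_mentions = {
    channel
    for channel, messages in messages_by_channel.items()
    if any(pattern.search(m) for m in messages)
}
print(sorted(channels_with_mentions))  # ['channel_a', 'channel_c']
```

Counting channels rather than individual messages, as above, is what yields a figure like the 125 channels reported in the study.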
Conclusion & Recommendations
This study does not represent a comprehensive audit of white supremacist activity on Telegram, and our network visualisation exercise revealed a wider pool of hundreds of additional channels connected to those identified. What it does help to demonstrate, however, is that there is a well-established network of violence-endorsing communities currently operating unimpeded on the Telegram platform. Telegram’s terms of service are scant, but they do, on paper, prohibit the promotion of violence on publicly viewable channels. This analysis suggests that its enforcement of these terms of service is not effective. As policy makers focus on the role of social media in advancing violent white supremacy and its accelerationist ilk, it is essential that they set their sights on holding Telegram to account.
Mr Jacob Davey is a Policy and Practitioner Fellow at CARR and a Senior Research Manager at the Institute for Strategic Dialogue (ISD). His profile can be found here.
© Jacob Davey. Views expressed on this website are those of individual contributors and do not necessarily reflect those of the Centre for Analysis of the Radical Right (CARR). We are pleased to share previously unpublished materials with the community under Creative Commons licence 4.0 (Attribution-NoDerivatives).
[1] For the purpose of this research, we define white supremacy as: “The belief in the superiority of whites over non-whites, and that white people should be politically and socially dominant over non-white people. This can extend to a belief in the need for violence against, or even the genocide of, non-white people.”