In the third part of a series looking at the radical right’s use of alt-tech, Professor Megan Squire takes a look at Telegram – and the shifting sands of technical features and content regulation on the platform.
Telegram is an enormously popular social network and encrypted messaging service with more than 300 million monthly users worldwide. It is, however, still relatively unknown in North America and some European countries. In recent years, Telegram has gained notoriety as the encrypted messaging service of choice for ISIS and – increasingly – for US-based white supremacist terror groups such as Atomwaffen and The Base. What are the technical features that make Telegram so useful to terrorists and hate groups?
Creating the perfect hybrid tool for hate
Telegram was launched in 2013 as an encrypted messenger by two brothers, Pavel and Nikolai Durov, who previously founded Vkontakte (VK), sometimes called “the Russian Facebook” for its similarity to Facebook in aesthetics and features. Unlike other social platforms, Telegram charges no fees, carries no advertisements, and has no apparent business model beyond donations from its founders.
The purpose of encrypted messaging apps such as Signal, WhatsApp, Wire and others is to help users maintain confidentiality of electronic communication. Such encrypted messaging has always been Telegram’s core feature, but over the years the platform has also added additional privacy- and secrecy-oriented features including “delete everywhere” (the so-called “nuclear option”) for chat messages, “secret chats” which are end-to-end encrypted, and automatic account self-destruction after periods of inactivity.
In 2015 Telegram added two important social features that supplement the encrypted messaging. First, Telegram now offers public and private discussion groups that can host up to 200,000 users. It also instituted broadcast “channels” designed for one-way communication between channel administrators and an unlimited number of people. For clandestine radical right extremist groups, these public Telegram channels are indispensable for spreading their memes, forwarding content from similar channels, and attracting new recruits.
This two-pronged approach provided by Telegram is a key technical asset for hate and terror groups. Why? Because the platform offers both public channels that can be used for recruitment and propaganda, and the encrypted messaging needed for planning clandestine activity. Couple this with lax enforcement of its own policies about prohibited speech, and Telegram becomes a very effective tool for hate.
Adding even more irresistible features
Recently, Telegram has added even more features that are leveraged by hate and terror groups. For example, Telegram’s “file storage” feature allows hate groups to store e-books, podcasts, instruction manuals, and videos in easy-to-use propaganda libraries. This allows the channels to quickly disseminate classic texts alongside newer manifestos and first-person point-of-view terror videos. Researchers note that the “files” feature was also popular with ISIS terrorists, who used it to establish clearinghouses for jihadi material.
In the post-recruitment stage, public and private discussion groups (“chats”) become valuable for indoctrinating new members, building camaraderie, making plans, settling disputes, and so on. Users in chats primarily communicate with text, audio, or video. However, Telegram users can also create and install sets of images called “stickers”, which resemble large emojis. Hate and terror groups have created hundreds of sticker sets to convey different aspects of their ideologies, and they encourage users to collect the sets and use them in other channels.
Telegram does not have the “you might like” channel recommendation algorithms familiar to users of YouTube, Facebook, and Twitter. So how do new users find channels to join? Public Telegram channels are broadly accessible via a Google search, but finding specific content within the platform can be trickier. Telegram’s rudimentary in-app Search feature only returns results matching words in channel titles, or content from channels the user has already joined. To remedy this, many established hate channels regularly post “follow lists” of other recommended channels, neatly divided into categories. Some Telegram hate purveyors formerly active on the 8chan website have even gone as far as to produce a bi-weekly, color-coded spreadsheet that serves as a leaderboard of channel growth and decline. These reports have become popular enough that channels regularly petition for inclusion on the leaderboard and complain when they are slighted by being left off the list.
Finally, to manage all this communication, Telegram offers a “bot” development infrastructure so that Telegram accounts can be operated by software. For example, all sticker sets are created by uploading images to bots, and bots facilitate financial payments on Telegram. Some of the terror groups on Telegram have also instituted simple “vetting” bots to accomplish tedious tasks such as ensuring that group members have the required account information before joining a group (e.g. a profile photo and username), or that new users introduce themselves in an appropriately offensive way (using a slur, for example). One Christian terrorist channel has implemented a bot that will generate a Bible verse on command.
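The gatekeeping such a vetting bot performs boils down to a simple predicate over a new member’s account details. The Python sketch below is a hypothetical illustration, not any group’s actual bot: the `Profile` structure and `meets_requirements` function are invented for this example, though Telegram’s Bot API does expose the underlying data (for instance, the documented `getUserProfilePhotos` method returns a user’s photo count).

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Profile:
    """Hypothetical snapshot of a new member's account details.

    A real bot would assemble this from Bot API data: the username
    arrives in the chat-member update payload, and the photo count
    can be fetched with getUserProfilePhotos. This sketch stays
    offline and makes no network calls.
    """
    username: Optional[str]
    photo_count: int

def meets_requirements(profile: Profile) -> bool:
    # Admit only accounts that have set both a username and a profile photo
    return profile.username is not None and profile.photo_count > 0

# An account with no username fails the check; a complete profile passes
print(meets_requirements(Profile(username=None, photo_count=1)))
print(meets_requirements(Profile(username="new_member", photo_count=1)))
```

In practice, a bot applying such a rule would restrict or remove members who fail the check and prompt them to complete their profile before rejoining.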
Telegram’s approach to content removal
Telegram has been banned to varying degrees by Iran, Russia, China, and Indonesia, for reasons ranging from government censorship to concerns about extremist content. In the Indonesian case, government officials cited the prevalence of terrorist content on the platform as a key reason for the ban. In response, Telegram promised to crack down on channels associated with the Islamic State. The official Telegram ISISWatch channel now publishes daily and monthly counts of channel takedowns. For December 2019, ISISWatch reported that over 55,000 ISIS channels were removed, an average of nearly 1,800 per day. In addition to removing channels, Telegram also removed accounts belonging to users suspected of spreading pro-ISIS content.
Telegram’s response to white supremacist hate and terror content from groups in North America and Europe has been more subdued. Rather than “blocking” channels outright, as it did with ISIS, Telegram relies on “restricting” channels from view on certain platforms. Telegram can flag a channel as violating the Terms of Service of the company that operates the app store through which the Telegram client was installed. This restriction flag makes it impossible to view a flagged channel’s content in, for example, the Telegram client (or “app”) downloaded from Apple’s App Store. However, restricted channels remain visible from a web browser, including Apple’s own browser, Safari.
Telegram occasionally places channels in this restricted status for a variety of reasons, including for promoting pornographic content or violence. Recently, in June 2019 and again in January 2020, Telegram placed a few dozen white supremacist channels into this restricted state. It is unclear what criteria were used to choose which channels to restrict, or whether they were asked to do this by a platform provider or some other entity.
Nonetheless, each time this happens, channel operators and users are quick to produce workarounds. In January, users affected by the hate group restrictions created infographics explaining how to view restricted Telegram channels in the browser, and circulated tutorials explaining how to use bots to aggregate content from restricted channels into new, unrestricted channels. The game of cat-and-mouse continues.
In addition to blocking and restricting entire channels, Telegram also claims to rely on users to report individual messages. Unlike its counterparts on other social media platforms, Telegram’s in-app “report message” button offers no hate or terror category, requiring users to describe the problem themselves. Once the report is submitted, an acknowledgement (“Thank you. Your report will be reviewed very soon.”) is shown briefly on the screen before disappearing. I have observed no follow-up messages or other communication about the status of any report, and there is no indication of where to look for a response from the company.
Without more transparency from Telegram about its channel restriction policies, and a more responsive reporting process, white supremacist hate content is likely to continue to grow on the platform, just as content from other terror ideologies did before it. A more watertight restriction policy, with better transparency around the rationale behind restrictions, is therefore needed to prevent extremist content from spreading. Further, Telegram must build users’ confidence in reporting hateful content by taking swift action and communicating the results of their reports. Telegram must demonstrate that it is serious about tackling hate on its platform, or the results could be devastating.
Professor Megan Squire is a Senior Fellow at CARR and Professor of Computer Science at Elon University. See her profile here.
© Megan Squire. Views expressed on this website are those of the individual contributors and do not necessarily reflect those of the Centre for Analysis of the Radical Right (CARR). We are pleased to share previously unpublished materials with the community under Creative Commons license 4.0 (Attribution-NoDerivatives).