On 12 August, Donald Trump endorsed a congressional candidate from Georgia, the victor in a Republican primary runoff held the previous day. Like an alarming and growing number of congressional hopefuls and sitting politicians, Marjorie Taylor Greene embraces the QAnon conspiracy theory, and her primary victory signals the evolution of QAnon from fringe conspiracy to political platform.
Some argue, in fact, that it is no longer a conspiracy theory but a full-blown cult, and given its growing membership and the acts of violence committed in its name, it would be hard to dismiss this claim as overblown. Attempts to ban QAnon groups, hashtags, and accounts have failed to control the conspiracy theory’s spread, and the movement has recently evolved its tactics: not only does it try to evade censorship, it also attempts to co-opt other movements like Save The Children, both to sidestep these bans and to widen its recruitment net. Conspiracy theories have long been the far right’s (racist, misogynistic, antisemitic, and Islamophobic) bread and butter, but QAnon has succeeded where others have not: it has become mainstream.
The evolution of QAnon has been nothing short of remarkable. Originating on the infamous image board 4chan, it spread to online communities like Reddit, whose /r/PizzaGate community was banned in 2016 for violating Reddit’s rules on doxing and harassment. The conspiracy even drove believers, armed with weapons, to try to “liberate” the children from the pizza restaurant at its center, Comet Ping Pong (hence “PizzaGate”).
Jumping from the fringes of the Internet onto political platforms, the movement continues to evolve, building on its original story in order to sustain itself. Conspiracy theories have long relied on evolving narratives, and the most successful ones are not only reiterated and spread, but also built upon by their participants. They stitch together aspects of other conspiracies, mutate the existing story, and—not unlike the way people create fan fiction—engage in world building that allows the conspiracy theory to take on a life of its own. Put simply, a successful and long-lasting conspiracy theory provides enough information to make sense of events while leaving enough room for adherents to fill in the blanks themselves, through their own “research” and through engaging—and building—with like-minded others.
Conspiracy theories, particularly online, are dynamic and ever-changing. Beyond that dynamism, what keeps many people engaged in these communities—where they are at risk of being further indoctrinated—is the sense of belonging that builds up around them. Knowledge here is not so much an individual enterprise as a group one: it is created and sustained in groups not just to lay claim to power, but also to maintain it.
Rather than analyzing the spread of the conspiracy theory across online platforms, attempting to predict how it may change based on past patterns may offer more insight into the appeal of the conspiracy itself. In essence, the QAnon conspiracy succeeds because of factors that scholars of persuasion know well:
- It provides an easy-to-process story to explain a horrible event or facet of society (child trafficking);
- Its persuasiveness is steeped in emotional/affective appeals;
- Belief provides a sense of community, group belonging, and a feeling of agency.
QAnon isn’t successful just because it provides information to make sense of the world, but also because it allows its believers to shape the narrative beyond the initial story.
This means that in a new political era, on new platforms, the conspiracy has evolved from a focus on Hillary Clinton to attempts to implicate singers like Justin Bieber. It has grown beyond claims of a Democrat-led pedophile ring run out of a pizza restaurant to absorb long-held fears about Satanic cults (popular in the 1980s and 1990s), cannibalism, and even organ harvesting. In some ways, the QAnon conspiracy is appealing not because it is a coherent narrative but because it is a grab-bag of moral panics squashed together into a rat-king conspiracy that contains multitudes. Believers can then pick and choose which parts of the conspiracy they most strongly align with, which pieces they spread, and, more importantly, what they build upon. Conspiracy theory communities have always bled into one another to some extent, but the Internet has taken this to a new level, giving these beliefs room to spread while allowing people to engage in a co-construction and co-evolution of the narrative that is difficult to keep up with.
Conspiracy theories, and especially ones like QAnon, reflect not just alternative epistemologies but also a form of collective consciousness and intelligence. This is where something like QAnon stops being merely a conspiracy theory and may more accurately be said to resemble a cult or new religious movement. Unpacking what makes it so wildly popular points us to the basic sociological concept of functionalism: for believers, it gives meaning and purpose to life; reinforces social stability; acts as an agent of social control and behavior; and motivates them to work for social change. For their part, QAnon adherents believe that they are fighting for a better world by eliminating child trafficking and exposing pedophiles. Using children as political pawns is a long-standing tactic for advancing political goals, and the strategy works well for QAnon believers: any attempt to argue against them can be turned back on the speaker with the claim that they must therefore support child trafficking.
Trying to de-radicalize a QAnon believer inevitably involves confronting these circular arguments, and on some level the media attention to QAnon may result in “deviance amplification” because of society’s negative view of the movement (i.e., that its members are “crazy”). But the concern about QAnon’s reach is not unwarranted: the FBI has identified the movement as a potential domestic terrorism threat, and its believers have carried out real acts of violence. The role of platforms in spreading the cult of QAnon cannot be overlooked either. Facebook’s algorithm recommended QAnon groups to users, with millions of group members continually building and spreading the conspiracy. YouTube, Instagram, Twitter, and now TikTok are hotbeds of QAnon content despite the platforms’ attempts to quell its spread.
The cult has spread globally during the COVID-19 pandemic and has elbowed its way into mainstream politics by aligning itself with other groups, such as anti-government paramilitaries and white supremacists. In some ways, attempts to ban and censor the group have only given it more power: the /r/PizzaGate subreddit was banned in 2016, and the movement has only grown since. If anything, perhaps we should have seen this coming: QAnon may be the only logical outcome of the growth of right-wing populism, of affective networks of paranoia, and of major platforms’ laissez-faire approach to content moderation and lack of accountability for the spread of extremism. QAnon doesn’t just demonstrate the power of conspiracy theories; it also highlights the limits of deplatforming, censorship, and bans, and is helping to usher in a new era of the far right.
Dr Julia DeCook is a Senior Fellow at CARR and Assistant Professor at the School of Communication, Loyola University Chicago. See full profile here.
© Julia DeCook. Views expressed on this website are those of individual contributors and do not necessarily reflect those of the Centre for Analysis of the Radical Right (CARR). We are pleased to share previously unpublished materials with the community under Creative Commons license 4.0 (Attribution-NoDerivatives).
This article was originally published at CARR’s media partner, Open Democracy. See the original article here.