
“Surely, surely you can’t be serious; that’s absurd—borderline conspiracy theorist!”

Yet, sure enough, he was being serious. And as we got into a heated conversation, I soon regretted not taking him seriously.

In this age of rapid information and opinion sharing, especially on social media, sources of extremism, conspiracy theories, propaganda, trolling, etc. have never had such a platform for spreading and discussing their messages, as well as for generally destabilizing discourse. Overall, this kind of activity seems to be growing, especially in terms of public visibility and well-known organizations. Yet it was not until this past year, when I came to college, that I actually found myself face to face with extremism and conspiracy politics in the flesh (as opposed to on a seemingly irrelevant or ignorable screen). Furthermore, as I have done research for projects on Russian foreign policy and the Russian military, I have needed to wade through waves of propaganda and other fringe politics. After a couple of heated conversations with actual people, though, I realized that despite all of my debate experience, I was not well prepared to engage with some of these ideologies. It wasn’t that I couldn’t refute the actual substance of what was being said; the problem was that I was caught off guard by the rhetorical tactics being used. Eventually, I began to discern the tactics and was able to counter them more effectively. Nonetheless, I can see that despite the trend towards extremism (and arguably a resurgence of propaganda), even some debaters appear to be unaware of how some people argue for these positions. It is increasingly crucial that we develop and defend our views and, when necessary, combat destructive discourse and misinformation. Thus, this article will provide a brief introduction to dealing with this form of rhetoric.

What I’m Talking About

As much as I wish I could just say “here’s a list of propaganda/extremist ideas,” I don’t want to go too deep into these (occasionally mature) topics or present authoritative views on them—especially since some might misunderstand what specifically I am identifying as propaganda or a conspiracy theory. Thus, without going into too much detail, I’ll say that some of the general topics I’ve come across have been:

  • Where ISIS has gotten some of their equipment
  • Israeli involvement in Palestinian groups (and Muslim groups in general)
  • The Sovereign Citizen movement
  • Russian involvement in Ukraine
  • George Soros
  • Certain Alt-Right activities/organization
  • Antifa (and the “alt-left”)
  • Pizzagate

The list could go on, but the point should be clear: there are lots of topics out there, and they aren’t all decades old (e.g. “the moon landing was fake,” “the CIA killed Kennedy”).

The Starting Point: Having the Right Mindset

The first recommendation I have is that if you are going to get involved in a quarrel over one of these kinds of topics, you need to have the proper mindset and/or expectations.

  1. Don’t universally underestimate what you could be dealing with. No doubt, there are plenty of moments of over-the-top “crazy,” and these are what generate the images some people have of “blatantly obvious propaganda” or “nonsensical conspiracy theorists.” And indeed, I would say that it’s more common to encounter these forms of “obviously crazy” extremists/etc. However, it’s important that you don’t assume that all propaganda is obvious or that all extremists are incomprehensible or inarticulate; there are sources of subtle or “sophisticated” propaganda and extremism.
  2. Especially when these sources are “sophisticated,” they and their believers tend to obsess over a topic. As a result, they often are far more prepared to discuss the topic than you are, both in terms of content and rhetoric. Thus, even when reality is in your favor, you can find yourself outmatched in preparation. 
  3. Closely related to the previous point, the kinds of claims and rhetorical tactics these speakers employ are typically very unorthodox, which can make them disruptive or confusing.
  4. The audience very well could be against you (and, as one could likely guess, the source/speaker will be very committed to their beliefs). 
  5. Lastly, related to the first point, don’t just flatly reject everything these sources say. There is almost always some thread of truth, no matter how slim, and (as will be discussed in the next section) one tactic some sources/speakers use is to dramatize a simple point so as to incense people into instinctively (i.e. hastily) disagreeing with it in its entirety.

Ultimately, all of this is to say: be considerate about what you are getting into. I’m not saying that sophisticated extremism/etc. is the norm, but it is definitely out there.

Practices and Tactics to Watch Out For

As mentioned before, the kinds of rhetorical tactics that some extremists/propaganda/conspiracy theorists/etc. employ are not weak or ineffective. Yet, many people may not be able to recognize the practices. The following is a list of some of the most common tactics I have encountered:

1. Whataboutery

“You shouldn’t invade countries and violate their national sovereignty. What Russia is doing in Ukraine is wrong.”

“Oh yeah? But what about LibyaIraqIranKoreaPhilippinesVietnamMexicoPuertoRicoNicaraguaChile…” etc.

Basically, whataboutery is a form of the tu quoque fallacy: it attempts to direct attention away from the action in question by saying, in effect, “Whatever; you did something similar!”, especially so as to create a moral equivalence between that action and things America (or the West in general) has done in the past. Another common aspect of this tactic is a hasty, rapid-fire judgement on the “whatabouts” (for example, not discussing the Cold War context of some of the US interventions).

Overall, the problem with this fallacy is that although America has done bad things (e.g. slavery, abuse of Native Americans), when discussing a specific action’s morality—especially an action that is ongoing (e.g. Russian involvement in Ukraine)—the fact that other countries have done bad things does not justify doing yet another bad thing.

2. Bait and switch

Like the previous tactic, bait and switch is often given through rapid-fire delivery, with the goal of changing the topic. I’ve had to deal with this in person multiple times; the following is a simplified summary of one such encounter:

“Overall, Bashar al-Assad is a great ruler, and has been good for the Syrian people… ”

“What?! You cannot be serious! He’s used chemical weapons on his people, engaged in ethnic cleansing, and—”

“But just look what happened now that the West tried to undermine his regime: the country has gone to chaos; attempted regime change created ISIS. All around the world, regime change has destroyed stability…” etc.

What began as a discussion about supporting Bashar al-Assad’s rule quickly changed into a discussion about the merits of regime change. Thankfully, I saw what was happening and didn’t follow along, but the instinctive response is to anchor to your impression from the beginning of the conversation (in that case, outrage) and not update it, especially when the topic shifts only subtly. This also allows speakers to redirect the topic if they see that it may not go well for them.

3. Stringing along and denial-baiting

This is similar to the previous tactic: essentially, the speaker/source tries to pull you along into constantly denying what they say, until they get you to deny something obvious (or at least debatable). One way they do this (and what makes it technically different from the previous tactic) is that the original claim might be similar to the claim made at the end of the discussion, except the initial claim is usually exaggerated (or it exhibits the next tactic, ambiguous meaning). For example, someone might say, “Politics are almost entirely dominated by manipulative, self-interested elites like George Soros.” The generalizations here are inaccurate and harmful: specifically singling out George Soros is often a basis for anti-Semitism and/or anti-Leftism. However, in the end it is not unreasonable to say that “politics are to some extent influenced by wealthy elites (on both sides of the aisle).” Trying to disagree throughout the conversation will often leave you appearing to lack credibility, which is a major goal of these speakers/sources.

4. Dog whistles and double meaning

A dog whistle is when someone says something that is understood in different ways by different groups of people. More specifically, a dog whistle is often a statement which is interpreted as “acceptable” or “normal” by an opposing or neutral audience, but the source’s supporting audience hears something different—typically, something more radical. Not only does this give plausible deniability to the speaker for promoting extremism, but it can also allow them to reach opposing audiences at the same time. For example, if someone who secretly supports white nationalism wants to credibly pose a question about the changes in racial demographics while also stoking fears and rallying support among other white nationalists, they would not explicitly state their views; they might just “explore questions” while constantly emphasizing/romanticizing terms such as nation (instead of country) or heritage (instead of history). To be clear, though, just because someone uses phrases that could be dog whistles does not make them dog whistles. It’s just that when discussing or analyzing extremism/propaganda/etc. this is something to be aware of, and which should generally be avoided.

Another tactic in this category is to make claims which have different meanings when taken literally, as opposed to taken in context or “generally.” This can be problematic because when one is trying to pin someone down on a claim, the speaker/source can just shift their stance or meaning.

5. Distortive language; omissive framing; false dichotomies/dilemmas

Distortive language is often used as part of the previous tactic (double meaning); it is simply when a situation is phrased in a way that does not accurately represent the situation or the actions being taken. Omissive framing is where the situation is misrepresented not necessarily by distorting what is said, but by leaving out important details. As mentioned earlier, one example would be passing judgement on America’s CIA-backed invasions and coups without considering the Cold War context. The last type in this category, false dilemmas, may be a somewhat well-known fallacy, and is really just a mixture of omission and distortion, but it is such a hallmark of extremism and propaganda that I should mention it. Essentially, this is where a situation is presented as “You can either choose option A or option B” when there are actually other alternatives. For example, the Bolsheviks and a few other socialist parties in Russia during WWI were vehemently anti-imperialist, and thus presented the situation as (paraphrased), “You are either for the World Proletariat or for (capitalist) Imperialism.”

6. Pigeonholing or otherwise mischaracterizing the opposition

This practice is closely related to the strawman fallacy, in that it seeks to falsely characterize one’s stance. However, pigeonholing is distinct in that it tries to define you (or “your party”) as the one with unreliable beliefs. For example, some alt-right voices incessantly act as if almost everyone who speaks out about police brutality is necessarily in support of the violence committed by cop killers. The point is to discredit the opposition’s stances. While that is a rather clear example, it can be far more subtle, and might actually be surprisingly effective when combined with rapid-fire delivery and some of the other tactics mentioned already (e.g. denial-baiting, bait and switch, false dilemmas).

7. Outright falsehoods or exaggerations; unreliable or vague sources

This may be one of the most iconic tactics of conspiracy theorists and propagandists. Yet, much as with perceptions of extremists overall, people may not always realize how subtle or sophisticated it can be. One example is the use by extremists and trolls (and even political organizations like TPUSA) of the fake, photoshopped “Antifa attacks police officer” photo to discredit the movement. Additionally, as per the name of the tactic, sources/speakers may throw around “research” from highly questionable (but perhaps credible-sounding) organizations like “Global Research,” or they may just say “journalists reported that [insert extreme anti-America/etc. story]” or “scientists found that this [e.g. gives you cancer].” Ultimately, there are many different ways this tactic can be used, but this should give the general idea. Just remember that it isn’t always overt, and may be delivered unexpectedly quickly or in tandem with other tactics.

General Recommendations

  • As covered by a previous section, it’s important to go into the encounter with the proper mindset and expectations.
  • Be the better communicator. You want to be a model for proper, constructive discussion rather than hollow, caustic extremism (or misleading propaganda). Avoiding stooping to their level improves your credibility. Perhaps more importantly, though, insult-trading and name-calling are where these speakers/sources often excel; engaging in them means fighting on their home turf, and that typically does not end well for someone who cares more about being right than about simply insulting their opposition.
  • Don’t get dragged into supporting unfamiliar positions; don’t get pigeonholed or baited into denying everything. In particular, it may help to start your disagreement with your own words and position, rather than just saying the opposite of their stance. These kinds of speakers’ strongest advantage comes from pulling you into unfamiliar territory, throughout which they may have planned argumentative traps.
  • Explicitly identify any tactics they are employing. Not only does it disrupt sources/speakers to show that you are at least loosely aware of what they are trying to do, but it is also important if there is an audience, so that they can recognize, “Yeah, he did just change the topic when he was getting cornered,” or, “Wait, that is based on a false dilemma.”
  • When possible, don’t let them dominate the conversation. It may not be easy to steer the debate, but you can at least try to play offense and “proactive defense” by preventing them from jumping around on topics and forcing them to commit to a solid stance.

In Conclusion

Ultimately, this is not meant to be an exhaustive guide to combating extremism/etc.; that could not be done even in an entire series of articles. Rather, it is meant to be an introduction to the most commonly used, uncommonly recognized, and disruptive rhetorical tactics that sophisticated extremists, propagandists, conspiracy theorists, etc. employ. (For examples of fallacies more common to debate in general, you can see our article series on fallacies.) This is certainly something I could have benefited from knowing, so as to avoid a good deal of frustration. With the apparent rise of this kind of rhetoric, it is crucial that we as debaters are prepared to credibly engage with and persuasively refute it (although to be clear, I am not saying “now go: actively seek out extremism/propaganda to refute”; I know from personal experience that it is a very annoying and, when taken seriously, time-consuming task). It is also true that you may just encounter wacky, incoherent people rambling on about how “they’re putting chemicals into the water supply to make the children like rap music and goth clothing.” However, I can definitely say that there is more out there than this; it is important that people do not underestimate the rhetoric that exists, and are at least minimally prepared to deal with it if necessary.
