
I’ll be honest: sometimes, I’m just as guilty as the next guy. Once, it was because I had a looming deadline and no inspiration for a topic. Once, it was because I realized that taking one side would make it easier to crank out a paper. Once, I forgot that I had a paper due the next day. And though I criticize the habit, I recognize that it is very tempting to quickly form a conclusion based on cursory analysis, then doggedly support that conclusion while dismissing or ignoring contradictory reasoning.

I feel that this can be even more tempting for debaters, who are so accustomed to being assigned a conclusion and then working backwards to find support for it. For a long time, I reserved this style for highly subjective topics like literary analysis, to the point that I dubbed it “English-paper analysis” (I also call it “conclusion-based reasoning,” as opposed to “reasoning-based conclusions”). To be clear, there are some occasions, especially in debate and writing classes, where the stated goal truly is just to improve one’s ability to argue. However, when I began reviewing some of my writing in other classes, including history, area studies, policymaking, and even intelligence analysis, I found that this style had crept into topics where the primary goal was seeking and communicating the truth. This is problematic because the real world does not conform to English-paper analysis.

In this article, I want to emphasize that, in the real world, debaters have a responsibility to seek and communicate the truth, even though conclusion-based reasoning is often the easier path. This responsibility especially applies to policy, science (e.g. psychology, medicine), geopolitical analysis (e.g. conflict trends), justice, history, and any other field where finding the truth matters.

What conclusion-based reasoning looks like

I would expect that most people have engaged in such reasoning, even if they didn’t realize it. Even if you’ve never debated, if you have ever had a writing assignment that called for a well-supported answer to a prompt, then you’ve very likely relied on conclusion-based reasoning to some extent. For example, a writer asked to analyze a recurring theme might think back to a few key parts of the book, quickly construct a conclusion/thesis, and then spend the rest of the time trying to support it. The conclusion-based reasoning happens when the writer hastily forms a conclusion, uses selective reasoning and facts to support it, and refuses to seriously adjust the conclusion even when faced with contrary evidence. In other words, this approach is characterized by a communicator spending a significantly disproportionate amount of time asking “how can I persuade my audience?” as opposed to “what’s the right answer?”

Anyone who has competitively debated has almost certainly been in a round where they were assigned a position they didn’t really believe. How to ethically argue for a conclusion you don’t believe is a complex question that Wyly Walker once touched on, but I will add one point to his analysis: I have long recommended using reasoning and facts that one does believe, letting the conclusions make themselves. Yet this is still (partially) conclusion-based reasoning: it is still selectively using facts and reasoning to support a preconceived conclusion. No doubt, it is understandable, perhaps even ethical, to seek reasons to convince an audience of your answer. However, aside from some exceptions (including debate and writing classes), it is often unethical to dogmatically push for a conclusion that is contrary to the available evidence and reasoning, or to confidently push a conclusion without even exploring opposing viewpoints.

Why it is problematic

The reasons why conclusion-based reasoning is problematic are mostly straightforward and may even be intuitive (e.g. people end up believing and acting on false information), but I still want to emphasize just how significant it is. I recently finished an excellent book, Superforecasting, about estimative analysis (i.e. forecasting), such as that conducted by the Good Judgment Project. To skip a lot of details, one of the project’s key findings was that decorated and/or self-proclaimed “experts” often did only slightly better at forecasting events than average people. This was startling because these experts are the (types of) people informing policymakers and business leaders as they make crucial decisions. Too often, these “experts” were dogmatic about their beliefs and subscribed to ideological lenses through which they explained or forecasted events. For many of them, that meant starting with a general preconception of what would happen and then spending most of their time trying to convince others (or themselves) that they were right, while making little to no change to their estimates in response to others’ analysis. The result is that crucial decision-makers are being persuaded by “experts,” some of whom became “experts” primarily based on how well they persuade rather than how right they actually are. (The titular “superforecasters,” on the other hand, used practices such as Bayesian reasoning, which emphasizes repeatedly updating one’s conclusions to fit the evidence, rather than trying to contort evidence to fit a conclusion.)

There are other reasons to avoid this style, including some that parallel the reasons to avoid witch-doctor theory (e.g. weak foundations, inability to persuade some people), but convincing people of wrong conclusions is the primary impact, because it leads to bad decisions and a devaluing of truth.
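To make the idea of Bayesian updating concrete, here is a minimal sketch in Python. The scenario, the numbers, and the bayes_update helper are all invented for illustration (none of this comes from the book): a forecaster starts with a prior probability for some event and revises it as each piece of evidence arrives.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the updated probability of a hypothesis after one piece
    of evidence, via Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

# Hypothetical question: "Will Country X sign the treaty this year?"
belief = 0.30  # starting prior (an invented number)

# Evidence 1: a summit is announced. Assume this is twice as likely
# if a deal is coming (0.8) as if one is not (0.4).
belief = bayes_update(belief, 0.8, 0.4)
print(f"after summit announcement: {belief:.2f}")  # ~0.46

# Evidence 2: negotiations stall. Assume this is more likely when no
# deal is coming (0.7) than when one is (0.3).
belief = bayes_update(belief, 0.3, 0.7)
print(f"after stalled talks:       {belief:.2f}")  # ~0.27
```

The shape of the process is the point: the estimate rises after supportive evidence and falls after contrary evidence, rather than the evidence being filtered to protect the starting belief.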

Don’t just avoid it; don’t accept it either

Many people rightfully get angry when politicians mislead them, so they blame the politicians. In some cases, though, the politicians’ actions are just symptoms of the public’s general apathy towards truth, such as when audiences are too content with accepting easy explanations. Sometimes the explanations come from people the audience likes (e.g. “my party”); sometimes people accept them because they like the conclusion (“my party was right”); sometimes people accept them because they don’t care enough to examine the reasoning. There are many possible explanations for why audiences accept this type of reasoning, but the general result tends to be the same: people come to believe the wrong conclusion on the issue, which leads to bad decisions and allows communicators who care more about persuasion to profit at the expense of communicators committed to seeking and telling the complicated and/or uncomfortable truth. When you see this kind of thing happening, don’t just acknowledge it; call it out.

How to avoid these kinds of mistakes

There are several important steps you can take to avoid these problems.

  • Get in the habit of forming reasoning-based conclusions, even when “it’s just school.” Again, there are some exceptions (e.g. competitive debate and other classes where the explicit purpose is to practice persuasion/communication).
  • Catch yourself if you find that you are taking a position (convoluted, contrarian, popular, simplistic, etc.) just to be interesting, contrarian, or popular, or just to save yourself time.
  • Take time to honestly debate an issue with yourself or others: look at it from all sides, and genuinely consider new information, especially when it challenges your conclusions.
  • Be deliberate about how you convey your judgments. You want to avoid ambiguous “weasel words” as well as overconfidence; careful estimative language helps guard against both (see the sketch after this list).
  • Admit when you are wrong. Ideally, this improves your credibility (if/when people care about credibility), but if nothing else it helps convince you that you need to change or do better. As Superforecasting discusses in detail, it may not feel natural or pleasant to acknowledge that you were wrong, but it is better than putting up a facade and giving others bad analysis. Acknowledging your mistakes can help you see where your reasoning went wrong, and thus improve in the future.
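As promised above, here is a small sketch of what estimative language can look like in practice, mapping verbal expressions of likelihood to rough probability ranges. The bands shown follow the scheme published in the U.S. Intelligence Community’s ICD 203 analytic standards; other organizations draw the bands differently, so treat the exact numbers as one convention rather than a universal standard, and the describe helper as my own illustrative invention.

```python
# Words of estimative probability, following ICD 203's bands.
ESTIMATIVE_LANGUAGE = [
    ("almost no chance",    0.01, 0.05),
    ("very unlikely",       0.05, 0.20),
    ("unlikely",            0.20, 0.45),
    ("roughly even chance", 0.45, 0.55),
    ("likely",              0.55, 0.80),
    ("very likely",         0.80, 0.95),
    ("almost certain",      0.95, 0.99),
]

def describe(probability):
    """Map a numeric estimate to a calibrated verbal expression."""
    for phrase, low, high in ESTIMATIVE_LANGUAGE:
        if low <= probability <= high:
            return phrase
    # ICD 203 avoids absolute claims; flag estimates outside the bands.
    return "no chance" if probability < 0.01 else "nearly certain"

print(describe(0.27))  # -> "unlikely"
```

Saying “unlikely” with a band like this in mind is more honest than a weasel word (“it’s possible…”) and more defensible than false precision (“there’s exactly a 27% chance”).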

Conclusion

As a debater, it’s easy to get into the habit of backwards reasoning (like I did), where you quickly craft a conclusion/thesis and then spend a disproportionate amount of time thinking about how to convince your audience that you are correct. In the real world, however, this approach leads to faulty analysis and, subsequently, inaccurate conclusions. Thus, debaters (and other forensics students) have a particular responsibility to recognize and shun such approaches. Instead, debaters ought to use their skills to seek and communicate the truth. It may be uncomfortable, inconvenient, or unfamiliar, but convenience and comfort should not be our guides to truth: reasoning and facts should be.
