In This Episode
- The goals of critical thinking, and how critical thinking differs from rational argumentation.
- Why the core of any critical thinking training program must include the study of rational argumentation and the study of cognitive biases and the psychology of persuasion.
- The problem for which the traditional martial arts were a solution.
- The “martial environment” of critical thinking today.
- Why rational persuasion should be viewed as a martial art.
- The distinction between rational argumentation and rational persuasion.
- An example: contradiction and inconsistency, versus cognitive dissonance.
Quote:
“The warrior knows things that no one else knows, valuable things that can inform our understanding of what it means to live a good life, even for those of us lucky enough to never personally experience war or violence.”
References and Links
- My critical thinking education site, Critical Thinker Academy.
- Book: Mistakes Were Made (but not by me), Carol Tavris and Elliot Aronson. (Link goes to my Critical Thinking Library page on the Academy site).
Subscribe to the Podcast
This is the Argument Ninja podcast, episode 2!
Hi everyone and welcome to the Argument Ninja podcast. My name is Kevin deLaplante, and in this episode I talk about the relationship between critical thinking and rational persuasion, and why, even though I identify as a critical thinking educator, and I have a website called The Critical Thinker Academy, the focus of this podcast is rational persuasion, rather than critical thinking more broadly.
I also push the martial arts theme a little further, and give some reasons to think of rational persuasion as a martial art.
And I give an example that illustrates the distinction I want to draw between rational argumentation and rational persuasion.
…
So, just to recap, in the first episode I mentioned six different components, or skill sets, that are relevant to critical thinking, but I said that if I was forced to identify the two most important skill sets that should be included in any instructional program on critical thinking, the first would be skill in rational argumentation, and the second would be an understanding of cognitive biases and debiasing strategies.
We need skill in rational argumentation because we need to know what it means to have good reasons to believe anything, and what it would mean to have good reasons to believe any particular claim at issue. These are the questions that theories of rational argumentation are trying to answer — what it means to have a good argument for a particular conclusion.
We need to understand cognitive biases and debiasing because much of our thinking and behavior is the result of automatic, unconscious cognitive processing that makes us predictably irrational in various ways, and we need to understand how this works, and what we can do to avoid or minimize the negative effects of the cognitive biases that we’re prone to.
So, this gives us part of the story of how I distinguish critical thinking from rational argumentation.
Critical thinking is an umbrella for a collection of skills and dispositions that are relevant to achieving certain goals. Rational argumentation is just one of those skills that fits under the umbrella.
But there’s a more important difference that I want to emphasize.
If I say that I want to be a better critical thinker, or I want to improve my critical thinking skills, that tells you something about my goals, but by itself it doesn’t say much about the methods I’m using to achieve those goals.
The methods are like tools in a toolkit. Rational argumentation is a tool in the critical thinker’s toolkit. But how you use these tools, and for what ends, is determined by higher level goals.
…
So what are the primary goals, or aims, of critical thinking?
I think there are two distinct sets of goals.
The first has to do with the quality of our thinking. One of the aims of critical thinking is to improve the quality of our beliefs, judgments and decisions.
What does this mean? Well, it can mean different things, depending on which of these we’re talking about.
When we’re talking about beliefs, the most obvious measure of quality is how likely they are to be true. All other things being equal, I want my beliefs to be true, not false.
When we’re talking about judgments, in this context I’m using the term to refer to the process by which we arrive at a belief or a decision. I want my judgments — the process by which I arrive at a belief or a decision — to be reasonable, justifiable, reliable, and so on.
When we’re talking about decisions, or choices, that’s a different category again. Decisions are actions of some kind; they can’t be true or false. But they can be rational or irrational, justified or unjustified, effective or ineffective, and so on.
These are all different ways that the quality of our thinking can be improved, and this is one of the goals of critical thinking — to improve the quality of our thinking.
…
So, I said that critical thinking has two goals; what’s the second goal?
The second goal is one captured by phrases like, to learn to think for yourself; I want to teach my kid how to think for himself or herself; I want to be an independent thinker.
What do these phrases mean?
First and foremost, they express values, things we care about.
They express the value of freedom of thought.
They express the value of autonomy, the ability to make decisions for ourselves and pursue our own goals.
They express the values of agency and responsibility, the notion that as individuals we want to claim authorship and ownership of our own beliefs and values. We don’t want to think of ourselves as mindlessly parroting what we’ve been told to believe by governments, corporations, the media, religion, our peers, and so on.
These values are often associated with the aims of critical thinking, and they should be.
I summarize them like this: another important aim of critical thinking is to learn to think for ourselves, to be able to claim ownership and responsibility for our beliefs, judgments and decisions.
So, to sum up, when we say that we want to be critical thinkers, we’re saying at least two things:
One, that we want to improve the quality of our beliefs, judgments and decisions.
And two, that we want to be able to think for ourselves, to be able to claim ownership and responsibility for these beliefs, judgments and decisions.
…
There are still lots of questions about these sets of goals and how they relate to each other, and what exactly the terms “ownership” and “responsibility” mean. But that’s okay. We can recognize the goals even if it’s a struggle to articulate them. And that’s all we really need to begin the process of becoming better critical thinkers.
I think the analogy with martial arts training is helpful here.
In any traditional school of martial arts, if you ask what the ultimate purpose of the training in that art is, you’ll most likely get an abstract, philosophical answer. No one will say that the ultimate purpose is to win fights.
The Japanese martial tradition of Budo, for example, was influenced by the three great philosophical traditions of Shinto, Confucianism and Zen Buddhism.
It might be hard to articulate or understand what the ultimate goals of the martial arts influenced by this tradition are, and different masters might emphasize different elements — self-mastery, enlightenment, non-dual awareness, inner peace, virtue.
But none of this prevents a student from starting their training, learning their stances, learning strikes and kicks, learning their first forms.
Similarly, it might be difficult to articulate or understand the ultimate goals of critical thinking, but that doesn’t prevent a student from beginning to train and cultivate the skills and attitudes that are essential to critical thinking.
…
Now, let’s get back to our original question, which was about the relationship between critical thinking and rational persuasion, and why the primary focus of this podcast is going to be rational persuasion.
Yes, there are interesting questions one can ask about the ultimate aims and goals of critical thinking.
But let me appeal to the martial arts analogy again.
For me, this is like asking questions about the ultimate purpose of studying a martial art. It’s important to have some idea of how to answer this, but you won’t spend most of your day at the dojo debating the meaning of non-dual awareness and self-mastery.
You’ll spend most of your day learning and practicing the various elements of the martial art. Conditioning, stretching, stances, forms, strikes, grappling, sparring, and so on.
These are the various skill components that constitute the martial art. It can take years to learn how to perform a movement correctly, with speed and precision and power.
It will take years to internalize the techniques so that your body responds automatically in a sparring or a combat situation.
It may take years to understand how learning these skills contributes to the higher goals and values of the martial art.
But there’s no shortcut, if the path you’ve chosen to wisdom or virtue or enlightenment is a martial path.
“Martial” means “inclined or disposed to war”. “Of, suitable for, or associated with war or the armed forces”. “Characteristic of or befitting a warrior”.
The martial arts were born out of a practical reality, the reality that in this world there are people, or groups of people, who may threaten us, with coercion and violence; and to maintain our autonomy and our way of life, we may need to defend ourselves, and resist the aggressor, with our bodies.
In this world there are occasions where we need to be able to resist violence, and inflict violence, when necessary.
The question is, how can we do this effectively, without falling into mindlessness and brutality? How can we practice violence without abandoning our higher ideals and values?
That’s the problem that the martial arts were designed to solve.
And for the traditional martial arts, the solution went beyond mere self-defense. The solution became a way of accessing a kind of insight and wisdom that can only be accessed by those who have traveled the martial path.
The warrior knows things that no one else knows, valuable things that can inform our understanding of what it means to live a good life, even for those of us lucky enough to never personally experience war or violence.
…
So what does this have to do with critical thinking and rational persuasion?
The goals of critical thinking define the higher ideals and values that we want to realize.
These goals may be abstract — truth, rationality, understanding, self-awareness, ownership, responsibility, autonomy, agency — but they are the stars that we aim for, that we strive to navigate towards.
But we live in a world that isn’t always supportive of these ideals. The world throws up obstacles that obscure the stars, that distract us, that sometimes actively work against these ideals.
What sorts of obstacles? There are many. Our cognitive limitations can be an obstacle. Our social position and the opportunities, or lack of opportunities, that come with it can be an obstacle. People can be an obstacle, institutions can be an obstacle, governments and the media can be an obstacle. Anyone or anything that has an interest in influencing what we believe, what we value, and how we behave, can be an obstacle.
The reality of this world, the real world we live in, is that we are bombarded on a daily basis with competing messages from many different sources, seeking to influence how we think, what we believe and what we care about.
A few of these sources may have our best interests at heart — our parents’ influence, for example — but for the vast majority, we are nothing but nameless targets of influence campaigns hatched by companies or institutions, designed to serve the goals of those companies and institutions, not our goals.
Each of us lies at the intersection of these multiple, diverse lines of external influence, constantly pushing and pulling us in different directions.
It’s important to know that these persuasion campaigns are most successful when we’re unaware that we’re being persuaded, when their influence is hidden, or unconscious, and we’re convinced that we’re responsible for our own beliefs and decisions, even when we’re not.
And for most of us, most of the time, this is exactly how we experience things, this is how persuasion works. We don’t see the lines of force, we aren’t consciously aware of their influence.
This is the challenge of critical thinking in the real world.
I’m not exaggerating when I say that we face a situation not that different from the one faced by those who were inspired to develop the traditional martial arts.
The challenge we face is a martial challenge. Battles are being waged for our minds. We are vulnerable to real harm, sometimes even physical harm, when bad ideas and bad ideologies turn into motives for violence.
And so we need to ask these questions:
How do we train ourselves to see these influence campaigns for what they are, to make the lines of force visible?
How do we defend ourselves against these forces that threaten our autonomy, our agency, and obscure our pursuit of the truth?
How do we actively combat the forces that are the most pernicious, the most damaging?
Well, I think the whole array of tools in the critical thinker’s toolbox is important here.
But at the center of this effort must be a two-pronged program: one for developing our awareness of what rational belief and rational decision-making actually look like, and one for developing our awareness of the psychological mechanisms that govern how people actually think and act.
When this awareness is present, the lines of force can be made visible. We can learn to see the manipulation and persuasion tactics for what they are.
And more importantly, we’ll have tools for reclaiming our autonomy and agency, as independent thinkers. We can ask ourselves, “do I really have good reasons to believe this?”, and we’ll know what it means to answer that question.
This skill set, this two-pronged program of training and instruction, is what I’m calling “rational persuasion”.
This is the martial art that lies at the center of our commitment to the ideals of critical thinking. And like a traditional martial art, it is also a means by which we can pursue and fulfill these ideals.
…
So, this is part of the story for why I want to pursue this conception of critical thinking and rational persuasion as a kind of martial art.
It’s not the whole story, there’s still lots more to come, but I hope you’re starting to see the outlines of what I’m getting at.
Now, before I wrap up this episode, I want to draw your attention to a distinction that you might have missed.
I’ve been talking about rational persuasion as a martial art, but earlier, when I was talking about a two-pronged approach to critical thinking, I used the term rational argumentation, and I distinguished that from the study of cognitive biases and the psychology of persuasion.
I want to clarify this distinction between rational argumentation and rational persuasion.
When I talk about rational argumentation, I’m referring to the study of good and bad arguments as this is normally presented in logic and critical thinking courses taught in philosophy departments.
So, in a class like this, you’ll talk about premises and conclusions, you’ll talk about the difference between the truth properties of individual claims and the logical properties of whole arguments, you’ll learn the distinctions between valid and invalid arguments, strong and weak arguments, deductive and inductive arguments, sound and cogent arguments, and so on. You’ll learn about formal and informal fallacies, and if there’s time you’ll start talking about the distinctive features of different types of arguments, like scientific arguments, or moral arguments, and then time is up.
There are dozens of textbooks that are routinely used to teach this material, in critical thinking classes all around the world.
Now, apart from the fact that a course like this really only gets you started on this topic — it’s like, just enough to get you your yellow belt in argumentation — the main problem with this approach to critical thinking is that too often it ignores the psychological realities of how real people use arguments in different social contexts for different purposes, and how real people respond to, and resist, efforts to persuade them.
So consequently, this approach doesn’t adequately prepare students for how to reason and argue in the real world, with real people, about real issues that matter to them.
What I’m calling “rational persuasion” is an approach to argumentation that takes this important logic-based material, and integrates it with our psychological and social knowledge of human behavior and human reasoning.
The goal is to bring both these bodies of knowledge to bear on a given situation, so that our argumentative moves are informed by, and are responsive to, a much wider range of factors.
…
Let me give you an example to illustrate this argumentation/persuasion distinction.
In a logic class you learn about consistency and contradiction. A consistent set of claims is one in which all the claims can be true at the same time, without generating any contradictions.
A contradiction is a special kind of claim. It’s any claim that can be reduced to a situation where you’re asserting one thing, P, and simultaneously asserting its contradictory, not-P. So a contradiction is a claim where you’re simultaneously saying that P is true and that P is false.
In classical logic, we treat all contradictions as false, because it’s assumed that they describe an impossible state of affairs.
So, we define an inconsistent set of claims as one that entails a logical contradiction of some kind. And this means, as a matter of sheer logic, that not all the claims in the set can be true – at least one of them must be false.
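To make these definitions concrete, here is a minimal worked example (added for illustration only; P and Q are placeholder claims, not anything from the episode): the set consisting of P, “if P then Q”, and not-Q is inconsistent, because the first two claims entail Q, which contradicts the third.

```latex
% Illustrative only: a minimal inconsistent set of claims.
% From P and P -> Q, modus ponens gives Q; together with the third claim, not-Q,
% the set entails the contradiction "Q and not-Q", so at least one member must be false.
\[
  \{\, P,\ P \rightarrow Q,\ \neg Q \,\} \;\vdash\; Q \wedge \neg Q
\]
```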
This is a powerful concept for argumentation, because one of the ways you can refute an argument is to show that the conclusion of the argument, or one of its premises, is inconsistent with other claims that our audience thinks are true.
For example, if we’re presented with an argument against gay marriage that is based on the assumption that, say, legal recognition of marriage should be based on the capacity for a couple to bear and raise biological children, then we can point out that this assumption would logically entail that we should also deny legal marriage to infertile couples, or senior couples who can no longer have children. But no one wants this conclusion, so as a matter of logic it forces you to go back and reconsider that assumption.
Now, this is a powerful logical tool, showing people how their beliefs in one area may be inconsistent with their beliefs in some other area. In a class I would drill students on different arguments and get them to come up with refutations that exploit this principle.
But in reality, the principle is only effective to the extent that people are willing and able to see the contradiction for what it is.
We’re moving now into the realm of psychology and persuasion.
What you actually find, when you go out into the real world and try to persuade people by bringing to light the inconsistencies in their beliefs, is that people have a very natural tendency to resist this move.
We resist it in a number of ways, but one way is by rationalizing away the inconsistency, so that in our minds we don’t experience it as a genuine inconsistency.
Now, a psychological mechanism that has been studied extensively, and that explains a lot of this behavior, is called “cognitive dissonance”.
The idea is that this state of holding two contradictory beliefs in our heads at once, or endorsing two contradictory values, or behaving in a way that contradicts some expressed belief we have, is a state that causes mental stress and discomfort. Our brains are naturally drawn to states that relieve this discomfort. So we automatically look for ways of restoring consistency between our thoughts and expectations and reality, without having to openly and consciously acknowledge the inconsistency.
The classic example of cognitive dissonance is how doomsday prophets and their followers respond when their predicted date for the end of the world comes and goes and everyone is still here. The dissonance in this case may be very acute; the idea that they could be fundamentally wrong about their religious worldview is very painful to contemplate. So, in the face of what would appear to be a blatant contradiction, most followers don’t accept it. They rationalize the events in a way that preserves consistency between the facts and their basic worldview. Maybe their faith is precisely what averted the disaster.
If you want a whole pile of examples of cognitive dissonance in action, I strongly recommend the book Mistakes Were Made (but not by me), by Carol Tavris and Elliot Aronson. It’s an eye-opening book, it should be on every critical thinker’s bookshelf.
But in all my years of studying logic and argumentation in a philosophy program, no one ever mentioned cognitive dissonance. That’s not logic, that’s psychology. They do that in a different building on the other side of campus.
Yet this psychological mechanism is clearly relevant to the success or failure of argumentation in the real world.
And this is my point. If we don’t learn how to integrate our knowledge of human behavior into our argumentation strategies, they’re not going to be as effective as they could be.
If I’m employing rational persuasion, then I’m not just thinking about the inconsistency; I’m also thinking about how the inconsistency will be received. I’m looking for ways of reducing the cognitive dissonance in my audience, so that when a contradiction is presented to them, they’re more likely to acknowledge it and respond in the way I want them to.
How I would do this is another question, but this is an example of what I call an “argument ninja move”.
…
Well that’s all I’ve got for you today. If you’re enjoying this show it would really help to visit iTunes to leave a rating and a review. Please visit argumentninja.com for show notes and to leave comments on this episode. And please visit criticalthinkeracademy.com and sign up for one of my free video courses there.
Thanks for listening, and I hope you’ll join me next episode.