I want to believe
I feel like the word “belief” gets a bad rap nowadays. It seems folks just aren’t into believing anymore. In an age of science and information, believing sounds too much like guessing on one hand, or gullibility on the other. We prefer to know, to be certain.
To define the operative term, a belief is an idea you consider to be a fact. To believe something is to think it’s true. The difference we feel between things we say we know and those we merely believe is the degree of certainty. Something we know is just something we believe with complete certainty. Knowing is believing without doubt. So when I say that people don’t like to believe, what I mean is that people don’t like to be uncertain. People will believe, but only if they can believe wholeheartedly.
As a species, humans really don’t like ambiguity. In fact, research suggests we simply won’t tolerate it. We like to know things. How things work, what things mean. We like to believe without doubt. We like it so much, our brains evolved a set of techniques for convincing ourselves we know what’s happening, even when we don’t.
Psychologist Keith Payne refers to this series of adaptive brain functions as our psychological “immune system.” Some beliefs are crucial for our psychological well-being. So even if those beliefs don’t entirely coincide with reality, we usually figure out a path to believing them anyway.
Good and reasonable people
One example is each person’s belief that they are a good person. To get along in society, it’s critical to believe you are, as Payne puts it, a “good and reasonable” person. Even repeatedly doing bad things doesn’t usually convince a person they’re bad. Generally speaking, we attribute our own poor behavior to something outside of us. I’m not a bad person, I was having a bad day. My true self—the real me—is a good person.
And not just me. My group too. A good and reasonable person like myself would only identify with a group of other good and reasonable people. This bias doesn’t apply equally to every group a person belongs to. People statistically belong to many groups they don’t actually identify with. Brown-haired people form a group and, technically, I belong to it. But that isn’t a part of my identity. If you ask me what makes me who I am, I’m not likely to talk about being brown-haired.
Instead I might tell you about the place I grew up and the family I came from and the things I like to do. These are the groups that represent my identity. The types of groups someone identifies with are not the same for everybody. Like I said, brown-haired people like myself don’t identify with their hair color. I don’t see a bunch of Facebook groups for brown-haired people. Blondes and redheads, however, seem to be much more of an identity (I just looked. The Facebook group “Blonde Hair Community” has 109 thousand members. The “Redhead Connection” has 15 thousand members. The “Brown Hair Society” has 2. Not 2 thousand, 2 members).
Our identity is important to us. So we value being part of the groups that provide it, and we align ourselves with those groups’ ideals. We like who or what the group likes and we don’t like who or what the group dislikes. I imagine a number of you are, right now, saying to yourselves, “Not me. I’m not like that. I don’t let the group tell me what to think. I think for myself.” Perhaps that’s true. I’m just telling you what the typical person is most likely to do according to scientific data. But, I wouldn’t argue with you about it. There would be no point. At some level you have to believe that, whether it’s true or not.
You did that on purpose!
That’s because another belief we all tend to have, both about ourselves and others, is that a person’s actions reflect their personal will. When we see a person do something, we generally assume they did what they did because they wanted to. The same is true of ourselves. We believe there are good reasons for the things we do. Without that belief, we’d be admitting to wandering around aimlessly, with none of our own thoughts, accomplishing nothing except by accident. We are agents of our own free will. At least that’s what we believe.
But once again, research disagrees. It seems the reasons we give for our actions tend to be made up afterwards as justifications. We employ what is called “motivated reasoning,” adjusting our story post hoc to fit reality. We then consider these newly invented reasons and agree with ourselves that it was these that motivated our behavior all along. So are we all just big liars? No. Not really.
We’ve all, at one time or another, seen or perhaps had the misfortune to be involved in an embarrassing situation. When a person is embarrassed, a normal or at least understandable response is to try to reframe the situation into one less humiliating; to “save face,” so to speak. A person may say, “I meant to do that,” hoping to make themself seem less clumsy. However, this type of excuse making is aimed at convincing others. The person who slipped isn’t convinced by their own excuse. That is what makes our “immune system” different from just lying to ourselves. For your “immune system” to work—to really be effective—you have to honestly believe what you tell yourself. And you do.
I knew it all along!
In a dynamic world where things change very quickly, it’s often difficult to hold on to any specific truth. New facts and evidence are constantly coming to the surface, asking us to reassess our previous positions on countless issues. This, however, does not make us flexible thinkers. Humans are very good at holding on to what they believe is true, especially about themselves and their group, despite any amount of evidence to the contrary. We simply dismiss and ignore evidence that contradicts what we already think, and only pay attention to evidence that tells us we’re right. It’s called confirmation bias, and you’ve probably heard of it. In effect, it ensures that if you believe something, you keep right on believing it.
A concept known as the cognitive response principle suggests people don’t change their minds simply based on information they receive. They change their minds based on how new information makes them feel. Say, for example, a politician gives a speech and promises to “cut spending,” a promise often made but rarely delivered on. One voter, who believes it’s very important to reduce spending, will have a positive feeling hearing the speech. Another voter, who cynically believes this is just another empty campaign promise, may feel insulted by such a thin lie, and have a negative feeling.
As a result of the speech, the first voter will have a more positive opinion of the candidate and the other will have a more negative opinion. It doesn’t matter that the single issue may have been a small part of the speech, or that the promise isn’t realistic, or that it’s been made and broken more times than can be counted. What matters is the first person had their feelings affirmed while the second had theirs hurt. Payne boils down the cognitive response principle, saying “all persuasion is self-persuasion.” And we are really good at assuring ourselves that we’ve been right all along. This goes for just about everybody. All groups of people, regardless of political affiliation or any other identifier, employ their “immune system” to the same degree. It’s not a Democrat or Republican thing. It’s a human thing.
Good guys and bad guys
All of these examples are psychological work-arounds for times when reality becomes too unclear, overly complex, or contrary to our intuition. We don’t like ambiguity, so we pretend incredibly complicated issues are cut-and-dried. We see some people as good (us, our group, and those we patronize) and other people as bad (everybody else). There is, we like to believe, always a simple explanation for everything and a clear choice between right and wrong. Coincidentally, guess who always makes the right choice. That’s right, us. Now, do you want to guess who always makes the wrong choice?
It’s easy to see how when the world is cast into such stark terms, ambiguity stops being a viable option. When the consequences of every decision decide whether you are good or evil, there is no room for being unsure. It’s not enough to believe you’re right, you have to know it. But how do you know you know something? What evidence convinces you that your beliefs are true beyond doubt? We rely on things we call logic, reason, intuition, and experience. And this is where we go wrong.
Anything but the facts
What people believe, it turns out, usually has little to do with evidence or any consideration of facts. The best predictors of what a person will believe are the things they already believe (confirmation bias) and those things that their group believes (in-group/out-group bias). We are much more susceptible to biased thinking than most would like to believe. Cognitive biases aren’t, however, defects. They are, as has been said, psychological features that evolved with our brains, and they are excellent for human survival. They just aren’t particularly (or at all) well-suited to ascertaining the truth.
If that seems strange, consider that during hundreds of millions of years of animal, mammalian, primate, and eventually human evolution, there was little evolutionary pressure on people to ponder the complexities of the universe. Human reason evolved to perform a different and much more practical function than the logical one we often ascribe to it. The human brain didn’t evolve to execute and evaluate logic, it evolved to decide who’s in and who’s out.
It’s not logical
It may not surprise you to hear that humans, on average, are not very good at logic. That’s a big clue that our brains evolved for some other purpose, because evolution doesn’t work that way. Traits are evolutionarily successful because they are very good at the function they perform. Every organ in your body is highly efficient and well suited to its specific function. Yet we’ve all seen examples of people arriving at illogical conclusions. Eventually, researchers proposed that something else, not logic, was the true function of human reason.
Cognitive scientists Dan Sperber and Hugo Mercier, in their work The Enigma of Reason, demonstrate very convincingly that the true power of human reason lies not in logic, but in maintaining the integrity of the group. Most people perform poorly on tests of pure logic. There are, however, two situations where humans are, on average, exceptionally good at applying reason.
The first is when making a case for ourselves. When it comes to giving reasons to explain our actions, humans, from a very young age, are naturals. You’ve no doubt seen the impressive abilities of small children to come up with clever excuses for their misbehavior. You may have even thought, “Nice try, kid.” You know they aren’t telling the truth, but you understand what’s going on. They are angling for the best outcome for themselves.
Humans also have great success reasoning in the reverse scenario, and that’s why you can tell the child—or generally, any person—isn’t telling the truth. We’re very good at judging whether the reasons a person gives for their behavior are good reasons or not so good reasons. This is important. To maintain the harmony of the group, a sense of fairness has to be satisfied. People have very strong beliefs about what’s fair and what’s not, and very little will upset a group like a widespread feeling that things aren’t fair.
The welfare of any group depends largely on its ability to recognize behavior that threatens or damages the group as a whole. So if the group has questions about an individual’s actions or motivation, that individual had better have answers. Good ones. We’re very skilled at deciding whether or not a person’s reasons provide an acceptable explanation of their behavior. This way, the group is maintained intact, trouble-makers are managed, and problems are generally avoided or quickly remedied.
When life gives you lemons…
Of course, no life is without problems. But being good and reasonable people, belonging to a good and reasonable group, it follows that any problems you do face can’t be of your own doing. Also—and this detail is critical—the problem must be beyond your power to solve. If a problem is within your control, people expect you to fix it. Being reasonable, you know this. So it’s helpful to believe our problems are beyond our control.
However, we can’t go too far. People can’t be expected to worry about situations that can’t be changed, and a problem that’s nobody’s fault offers nothing to fight. So the problem, though enormous and life-threatening, must not be an act of nature. Somebody must be behind this great evil. But why would anybody do something so terrible? The same answer always seems to satisfy us. The person is simply evil, and evil people don’t need a reason for doing evil things. So our world becomes filled with monsters.
A real monster!
We’ve been telling this story to ourselves and each other since the beginning of time. “Overcoming the Monster” is an ancient archetypal story, and still one of our favorites. A brutally powerful force of evil enacts a sinister plot against the good people (us), visiting fear and despair on an otherwise peaceful realm. Hope wanes until finally a hero is identified. This hero, not driven by their own motives but unable to deny the pleas of the people, becomes their champion and faces down the monster.
At first, the overwhelming power of the evil force is made apparent and the hero’s efforts are in vain. However, just as the hero is on the verge of succumbing, they notice the monster’s single and, until now, undiscovered vulnerability. Through admirable cunning, the hero exploits this weakness, the monster is vanquished, and peace is restored to the land.
We love this story and we tell it over and over again. From Gilgamesh and Grendel, to Star Wars and Lord of the Rings, the story has been told and retold as long as humans have been telling stories. In fact, Carl Jung believed that every story has this struggle between good and evil at its kernel. Our attraction to this story may be obvious. When we tell it or hear it, we imagine the monster, we imagine the terrorized villagers, and we imagine the hero. Coincidentally, the villagers always look a lot like us. Sometimes we see ourselves as the hero. We always see ourselves in the story, we just never see ourselves as the monster.
What’s the point?
To all this, I think there are two points; or maybe one with two parts. First, it doesn’t matter if you’re right. At least not as much as you probably think. Yes, it feels good to be right, but it usually doesn’t really matter to anybody but you. You should always want to be right because it’s in your best interest to be right rather than wrong. But nobody cares. Your group will agree with you. The others won’t. People don’t identify with a person because they believe what that person says is right. They believe what a person says is right because they identify with that person. The quality of your arguments doesn’t really matter.
Second, knowing that people don’t believe people they don’t like, it’s obvious that a person is more likely to consider your opinion if they like you than if they don’t. But I’d like to believe an adult already knows that. We disagree with people in our groups all the time—our families, our friends, our teammates—and since we want to get along, in the end, we do.
The tragic division in our society right now has nothing to do with ideology. (I know, I know, you believe with all your heart that it does. But it doesn’t. The data is clear.) It comes down to a relatively simple fact. You see yourself as the hero and the other guy as the monster. And the other guy sees you the same way you see them. We’ve been telling this story for a long, long time, and it’s not likely we’ll soon tire of it. It still grips us, stirs us, and compels us. But it’s a story. And grown-ups really shouldn’t believe in monsters.
References and additional reading:
- Payne, Keith. Good and Reasonable People: The Psychology Behind America’s Dangerous Divide. 2024.
- Brotherton, Rob. Suspicious Minds: Why We Believe Conspiracy Theories. 2023.
- Sperber, Dan & Mercier, Hugo. The Enigma of Reason. 2017.
- Bowie, David. The Man Who Sold the World. 1970.