“If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration.”[1] —Raymond S. Nickerson, Tufts University Research Professor of Psychology
From birth on, we gradually and subtly develop different unconscious biases—tendencies to favor or reject certain things, groups, or people over others—and these biases affect our behavior. They’re inevitable, as people have the innate tendency to classify and categorize people, experiences, and information. These biases are kind of like shortcuts our brains develop to make fast decisions about what we’re reading, seeing, or experiencing.
Yet the buildup of certain cognitive biases can eventually cause great harm to your decision-making skills, and ultimately your integrity, as they often conflict with your conscious decisions about the kind of person you want to be. Many good people suffer from various types of cognitive biases that distort their thinking and bring them to inaccurate conclusions.
Biases gradually grow like a buildup of plaque on our thinking. They inhibit our ability to gather and assess information objectively. Yet, you can develop different skills and use various tools to minimize their effects. These cognitive biases must be intentionally fought in order to make rational, balanced decisions and assessments. This chapter will give you some ideas to fight that bias buildup and the limits it puts on your decision-making skills.
Roy-ism: Confirmation bias is integrity’s kryptonite.
Understanding Confirmation Bias
One of the most important biases to study is confirmation bias—which leads us to favor information that confirms our existing beliefs while ignoring information that does not. It’s a very natural and unconscious way for people to process information. Why? It’s a habit that helps us quickly sift through boatloads of information, it protects our self-esteem by telling us we are right, and it satisfies our need to feel intelligent and accurate.
When it comes to confirmation bias, the real rodeo starts with information analysis. Here’s the problem: two people can look at the exact same information and come to completely different conclusions, even if they try desperately not to let bias distort their analyses. We can’t make good decisions if we emphasize only the information that supports the outcome we want and leave out relevant information that doesn’t. Good decisions are made when we look at all the relevant information.
Integrity Dictionary
confirmation bias /kɑːn.fɚˌmeɪ.ʃən ˈbaɪ.əs/
(noun)
The tendency to search for or interpret information in a way that is consistent with one’s existing beliefs or theories. It is a biased approach to decision-making, mostly unintentional, that often results in inconsistent information being ignored. People tend to process information in a way that supports their beliefs, especially on highly important or self-relevant issues.
Overcoming Confirmation Bias
Psychologist Tom Stafford believes one theory of why people have confirmation bias is that their thinking is shaped by factors that motivate them—such as their jobs, backgrounds, or circles of friends. He calls this the motivational theory. We dismiss others’ opinions with phrases like: “You just believe what you want to believe,” or “He would say that, wouldn’t he?”[2] If that is the case, separating those motivating factors from our thinking would help us eliminate our biases. Another theory of why confirmation bias exists is that we fail to ask the correct questions about the information we gather or about our own belief systems. Stafford calls this the cognition theory. If this is the case, the way to correct the bias would be to find ways to adjust our thinking. Stafford adds, “We assume people are already motivated to find out the truth, they just need a better method.”[3]
Researchers have studied how confirmation bias works, too. One landmark 1979 study conducted by psychologist Charles G. Lord and his colleagues, “Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence,” used a persuasion experiment that tested the motivational and cognitive theories of confirmation bias.[4] Lord showed two sets of evidence to a group of people who supported the death penalty and a group of people opposed to it. One set of evidence showed that the death penalty deterred murder and should continue, while the other showed that it did not deter murder and should be abolished. Each group accepted the set of evidence that confirmed its prior beliefs. In fact, the opposing evidence seemed only to strengthen those beliefs. Essentially, the study participants saw what they wanted to see.
Then Lord’s team repeated the study, but this time tested two sets of instructions along with evidence about the death penalty’s effectiveness in deterring murder. One set of instructions was motivational: it asked participants to look at the evidence as if they were judges or jurors and to consider it as impartially as possible. The other set was cognitive: it asked participants to ask themselves “at each step whether you would have made the same high or low evaluations had exactly the same study produced results on the other side of the issue.”[5] In practice, this meant that participants had to look at the results of a piece of research, imagine the opposite results had been found, and then analyze the study’s methodology. The researchers called this the “consider the opposite” approach. It was kind of like instructing the participants to play devil’s advocate.
The results were astonishing. With the motivational instructions, the researchers found the same results as in the original study: participants weighed the evidence in a biased way. Those who were pro-death penalty thought the evidence supported the death penalty; participants opposed to the death penalty thought the evidence showed it should be abolished. Wanting to make unbiased decisions didn’t change the outcome of their decisions. But the cognitive set of instructions produced an entirely different result. Participants overcame their biases when evaluating the evidence and did not rate evidence that supported their prior beliefs as better than evidence that opposed them. The study showed that, when given a thought strategy, people can overcome their confirmation bias. Bias is like a wall: it can stop us if we try to go straight through it, but with the right thought strategy, we can climb right over it.
Thoughts on Integrity: Bernhard Günther
In a 2017 interview, Bernhard Günther, former CFO of the German electric utility RWE, spoke about why conquering cognitive biases is so important in the decision-making dynamics of a business. After analyzing its decision-making processes, RWE began to see the cognitive biases affecting it:
What became obvious is that we had fallen victim to a number of cognitive biases in combination. We could see that status quo and confirmation biases had led us to assume the world would always be what it used to be . . . We also saw champion and sunflower biases, which are about hierarchical patterns and vertical power distance. Depending on the way you organize decision processes, when the boss speaks up first, the likelihood that anybody who’s not the boss will speak up with a dissenting opinion is much lower than if you, for example, have a conscious rule that the bigwigs in the hierarchy are the ones to speak up last, and you listen to all the other evidence before their opinion is offered.[6]
What was their solution to the biases affecting their collective decisions? They now encourage managers and employees to be mindful of their cognitive patterns and require a list of the debiasing techniques used to evaluate any major proposal put before the board. They also encourage an atmosphere where some level of conflict is acceptable, so employees feel more comfortable dissenting from an idea or decision. And Günther added, “when making big decisions, we now appoint a devil’s advocate—someone who has no personal stake in the decision and is senior enough in the hierarchy to be as independent as possible, usually a level below the executive board. And nobody blames the devil’s advocate for making the negative case because it’s not necessary for them to be personally convinced; it’s about making the strongest case possible. People see that constructive tension brings us further than universal consent.”[7]