We tend to think that, as civilized and well-educated people, our view of the world is objective and fair. We pride ourselves on our ability to dissect information and to make judgements based on scientific evidence, and so we hold the belief that we are rational human beings. My mission today is to shake those beliefs. Our brains are hardwired with blind spots, and perhaps the biggest one is the comforting delusion that we, personally, don’t have any.
The brain is very clever at allowing us to justify our own perceptions and beliefs as accurate, realistic, and unbiased. This is called naive realism. We assume that other reasonable people see things the same way we do, and that if they disagree with us they obviously aren’t seeing things clearly. This rests on two presuppositions: one, people who are open-minded and fair should agree with a reasonable opinion; and two, any opinion I hold must be reasonable, because if it weren’t, I wouldn’t hold it. Therefore I can argue my opinion, tell the other side how things really are, and if they don’t agree with me it must be because they are biased.
The root of the problem is that each of us lives in a different reality. Our perceptions of the world are not the same. One very clear example is a person who believes in God. When that belief is an integrated part of a person, it is easy to see God everywhere: in the miracle of life, in the elegance of the structures that make life possible, in daily events and occurrences explained as the “will of God”. A non-believer, on the other hand, might feel similar awe towards life, but seeks other explanations for it. This is a very clear-cut case of how beliefs shape our views of reality, but there are many less obvious differences in each and every one of us.
A subtler example was discovered in the midst of the conflict between Israelis and Palestinians. Even though both sides recognized that the other side perceives issues differently, each still held the opinion that the other side is biased while they themselves see things objectively. Both sides had a strong belief that their own perception should be the basis for settlement.
In one experiment, the Palestinian proposal was labeled as coming from the Israelis, and the Israeli proposal as coming from the Palestinians. Israeli citizens were then asked to judge the two proposals. I think this is a staggering example of how fleeting our objectivity is, and how unaware we are of our beliefs shaping reality: the Israelis liked the Palestinian proposal attributed to Israel more than they liked the Israeli proposal attributed to the Palestinians. So, can there be any chance for peace if your own proposal is unattractive to you when it appears to come from the other side? What chance is there, then, that the other side’s real proposal will be attractive?
In the same way, Democrats will endorse proposals coming from Republicans if they think the proposals originated with the Democratic Party, and vice versa. The really disturbing fact is that neither the Israelis nor the Democrats were aware of these blind spots. They all claimed that their beliefs followed logically from their own careful study and reasoning.
Our beliefs cause us to accept only information that supports those beliefs, and to ignore anything that might disprove them. To study which factors influence decision-making, researchers put people into the role of jurors and had them listen to an audio re-enactment of an actual murder trial. They were then asked how they would have voted and why. Instead of considering and weighing all the evidence carefully as it was presented, most people immediately constructed a story, or an image, of what had happened and then accepted only evidence that supported that story. The sooner people jumped to a conclusion, the more confident they were in their decision, and the more likely they were to vote for an extreme verdict to justify it.
The stronger a belief is, the more prone you are to accept only supporting information. Strong beliefs are not just ideas in your head, under consideration and evaluation; they are an integrated part of your personality. And in order to protect that personality, the brain will ignore disconfirming evidence. Here is a lesson for students of scientific objectivity: because of what the mind thinks it knows, it filters information, shutting out anything that doesn’t fit the model it has created. Seeing is not believing. Believing is seeing.
This is not just an individual phenomenon; it is also a cultural one. Cultural intellectual patterns are built on past “facts” which are extremely selective. When a new fact that does not fit the pattern comes in, we don’t throw out the pattern. We throw out the fact. Galileo was persecuted for defending the fact that the Earth revolves around the sun. For these kinds of contradictory facts, it may take centuries before enough people start to see them and the cultural intellectual patterns change.
A similar “battle” can be seen right now in the area of nutrition: the established pattern of a low-fat, high-complex-carbohydrate diet being healthy is under attack from disconfirming evidence of the health benefits of low-carbohydrate diets. Some of that evidence originates as far back as the 19th century. Yet only now are we starting to see slow changes in these broader cultural intellectual patterns.
Perhaps the most obvious way beliefs influence our perception, one everyone is familiar with, is prejudice. It stems from the way the human mind perceives and processes information into categories, or stereotypes. Having stereotypes saves us a considerable amount of mental energy, since it means we don’t need to approach every single person or situation without preconceptions. However, it also means that we see the world through a lens, or filter, consisting of those stereotypes, and as a result we draw erroneous conclusions.
This is also why first impressions are so important. Humans have an uncanny ability to size up a person’s character rather accurately in a matter of seconds. This trait developed because, in prehistoric times, it was necessary to judge quickly whether someone we encountered meant us harm. After that first impression, however, all our later experiences with the person are filtered through the image we formed during the first encounter, and changing that image takes far more effort.
This sort of labeling was very well documented in a study where sane people were sent to psychiatric wards pretending to have a mental illness. Even though these pseudopatients stopped reporting any symptoms of their “condition” immediately after being admitted, everything they did and said was interpreted by the staff through the lens of the person being a schizophrenic.
In other words, the diagnosis was in no way affected by the perception of the circumstances; rather, the perception of the circumstances was shaped entirely by the diagnosis. And as it turned out, this label, or image, was impossible to get rid of. Once you were labeled a schizophrenic, the best status you could achieve was being a schizophrenic “in remission”. From the institution’s, or the doctors’, point of view you were not sane, never had been, and never would be.
I wrote this article hoping that you will become a little more aware of these blind spots that all of us have, including you and me. More importantly, though, I want to emphasize how important it really is to try to understand each other, and to relate to what the other person is thinking and feeling, instead of judging them as right or wrong. Each of us has our own perception of reality, and my reality is in no way better or more “right” than yours. It is simply the lens through which I see the world.
Note: To explore these concepts further, I recommend reading Mistakes Were Made (But Not by Me) by Carol Tavris and Elliot Aronson.