Emil Lazar

Belief bias -- a hidden obstacle to making the right decisions

The relatively recent evolution of higher cognitive functions (supported by structures such as the prefrontal cortex) can easily make us forget that before our brain adapted to process massive amounts of complex data and make coherent decisions based on it, it first and foremost evolved to conquer its environment in the wild.


Despite our education teaching us to tackle problems methodically, rationally and scientifically, the reality is that this process consumes a lot of energy. While accounting for only 2 percent of the average adult's body mass, the brain consumes an estimated 20 percent of the body's energy supply. Running it is metabolically expensive, which is one of the reasons it constantly looks for ways to make decision making more energy efficient, relying on unconscious mental shortcuts known as heuristics. When they work, the result can be a beautiful thing, and we may call it intuition, instinct or even a hunch. When they don't, we're prone to irrational judgements known as cognitive biases.


One of the most common biases affecting us humans is the belief bias. Belief bias occurs when one's value system, beliefs, or prior knowledge distorts the reasoning process, leading us to accept illogical arguments or invalid data. In other words, an observer assumes ahead of time that they know what the results of an experiment will be, relying on a heuristic shaped by their own experience and beliefs rather than on the evidence itself.


Consider, for example, a researcher studying the effect of coffee or wine on general health. A completely open-minded researcher will gather data and draw a conclusion based purely on that data. A coffee fan, however, may interpret the data as "coffee prevents cancer", while a coffee hater might conclude that "coffee causes cancer".


A strict vegetarian might claim eating meat is never OK, not even in a life-threatening situation, while an "omnivore" may state that any kind of nutrient is acceptable when severely starving. They may both be inclined to reject the other party's reasoning, because of their own beliefs.


Some judgments may appear as logically correct at first, but that may be only because of our pre-existing knowledge of the world. Take the following example:


1. All birds have feathers

2. Peacocks have feathers

3. Therefore, peacocks are birds


Here, the argument's technical validity and its truth (correspondence to reality) are two different aspects, which need to be assessed through logical reasoning rather than by falling back on existing beliefs. The argument is logically invalid, since having feathers does not make something a bird (a feather pillow has feathers too), even though its conclusion happens to be true. In assessing its validity, we are tempted to skip the logic, appeal to our existing knowledge of the world, and trick ourselves into thinking a logical deduction took place.
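To see why the argument form fails, we can search for a counterexample: a situation where both premises hold but the conclusion does not. The following minimal sketch (the model encoding and names are illustrative, not from the post) checks the form "All birds have feathers; x has feathers; therefore x is a bird" against every possible assignment:

```python
# An argument form is valid only if NO model makes all premises true
# while the conclusion is false.
def valid(premises, conclusion, models):
    return not any(
        all(p(m) for p in premises) and not conclusion(m)
        for m in models
    )

# Each model describes one individual x: is it a bird? does it have feathers?
models = [
    {"is_bird": b, "has_feathers": f}
    for b in (True, False)
    for f in (True, False)
]

premises = [
    lambda m: (not m["is_bird"]) or m["has_feathers"],  # All birds have feathers
    lambda m: m["has_feathers"],                        # x has feathers
]
conclusion = lambda m: m["is_bird"]                     # therefore x is a bird

print(valid(premises, conclusion, models))  # False
```

The check fails on the model of a feathered non-bird: both premises are satisfied, yet the conclusion is false, so the argument is invalid regardless of how believable its conclusion feels.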

How can we counteract belief bias?


First of all, as brain owners, none of us are immune to biases. Simply acknowledging their existence makes us better equipped to reduce their incidence. Being more aware of where our personal beliefs show up in our own deductive reasoning can help us identify the areas where we are prone to belief bias. Everyone has their own preferred blind spot, be it politics, religion or social issues. Let's make a habit of observing ourselves during a debate, or whenever topics that matter to us are confronted with contradictory data. If an important decision is at hand, let's give ourselves some time, slow down our reasoning process and distance ourselves from the emotional urge to protect our beliefs.


Secondly, as the peacock example above shows, it's important to distinguish between validity and truth. Overestimating a conclusion's truth can lead us to overestimate the argument's validity. Remember: plausible, aesthetically pleasing or wonderfully formulated hypotheses appeal strongly to our personal collection of beliefs and experiences, making us prone to bias. Combining believability and validity yields four categories of arguments:


  1. Believable and valid

  2. Unbelievable and valid

  3. Believable and invalid

  4. Unbelievable and invalid


Conversely, these categories give rise to two types of belief bias:


  1. Positive belief bias: we accept conclusions that align with our beliefs, even when the arguments behind them are invalid.

  2. Negative belief bias: we reject conclusions that follow from valid arguments, because they contradict our beliefs.


Lastly, let's try to bring our brain into reasoning mode by breaking down the argument into smaller ones and connecting them in a logical path. If A determines B and B determines C, does A determine C? If these smaller arguments connect and lead to an accurate conclusion, our process has a fair chance of being a valid one.
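The step-by-step approach above can be sketched mechanically: verify each small link on its own, then check whether the verified links actually connect the starting premise to the conclusion. The propositions "A", "B" and "C" below are hypothetical placeholders:

```python
# Individually verified reasoning steps: A implies B, and B implies C.
links = {("A", "B"), ("B", "C")}

def chain_holds(start, end, links):
    """Follow verified links from `start`; True if `end` is reachable."""
    reachable, frontier = {start}, [start]
    while frontier:
        node = frontier.pop()
        for a, b in links:
            if a == node and b not in reachable:
                reachable.add(b)
                frontier.append(b)
    return end in reachable

print(chain_holds("A", "C", links))  # True: A -> B -> C connects
```

If any single link fails verification, the chain breaks and the overall conclusion no longer follows, which is exactly the point of decomposing an argument before trusting it.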
