Biases
The Monty Hall Problem – a great example of a bias!
The Monty Hall problem is a brain teaser, in the form of a probability puzzle, loosely based on the American television game show Let's Make a Deal and named after its original host, Monty Hall.
Suppose you're on a game show, and you're given the choice of three doors: Behind one door is a car; behind the other doors are goats. You pick a door, say No. 1, and the host, who knows what's behind each door, opens another door, say No. 3, which has a goat. He then says to you, "Do you want to change your selection?"
Our natural response is not to change. We don’t think that anything has changed as a result of knowing what is behind one door. We made our selection and we want to stick with it. However, that is not the right response. By switching, you will win two out of three times.
When the player first makes their choice, there is a 2/3 chance that the car is behind one of the two doors not chosen. This probability does not change when the host reveals a goat behind one of the unchosen doors. The host's reveal tells us nothing new about the chosen door, but it does tell us which of the two unchosen doors cannot hide the car. The 2/3 chance of the car being behind one of the unchosen doors therefore now rests entirely on the unchosen, unrevealed door, compared with the 1/3 chance of it being behind the door the contestant chose initially. Therefore, they are better off changing their choice.
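If the reasoning still feels unconvincing, it is easy to check by simulation. The sketch below is a minimal illustration in plain Python (names such as play and trials are ours, chosen for readability): it plays the game many times with and without switching. Sticking should win roughly one third of the time and switching roughly two thirds.

```python
import random

def play(switch: bool) -> bool:
    """Play one round of Monty Hall; return True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)     # door hiding the car
    choice = random.choice(doors)  # player's initial pick
    # Host opens a door that is neither the player's pick nor the car.
    opened = random.choice([d for d in doors if d != choice and d != car])
    if switch:
        # Switch to the one remaining unopened, unchosen door.
        choice = next(d for d in doors if d != choice and d != opened)
    return choice == car

trials = 100_000
stick_rate = sum(play(switch=False) for _ in range(trials)) / trials
switch_rate = sum(play(switch=True) for _ in range(trials)) / trials
print(f"stick win rate:  {stick_rate:.3f}  (expect ~0.333)")
print(f"switch win rate: {switch_rate:.3f}  (expect ~0.667)")
```

Over 100,000 trials the two win rates settle very close to 1/3 and 2/3, which is often the quickest way to convince a sceptic.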
Information overload
We are overloaded with information being sent to us all of the time. Noises, sights and smells are constantly changing around us, and we have to find a way to deal with it all. So, we filter it. We can easily sit watching our favourite TV programme and not hear our partner calling out to us. We can walk down the street and not see everything that is going on around us. If you ask three people to describe the same scene, they will each report different things.
What we see, hear and believe is a result of our cognitive biases. Cognitive biases are mental shortcuts (known as heuristics) and exist for a good reason: they are designed to help us survive! Our brains have evolved over thousands of years but still operate in much the same way. The world around us is extremely complex and we cannot process all of the information it presents, so we create mental shortcuts to make decisions quickly and effectively.
So, if we want to change the way that people think, we need to consider which biases are at work and find ways to work around them.
Cognitive Biases
These include:
Anchoring bias – judging by the first information received
Availability bias – being informed by news rather than facts
Bandwagon effect – following what everyone else does
Confirmation bias – noticing information that confirms what we already believe
Ostrich bias – ignoring negative information
Outcome bias – judging a decision by how it turns out
Survivorship bias – judging something based only on surviving information
Anchoring bias. We judge something based on the first piece of information that we receive. For example, if we see the price of a product as £30 one day and £28 the next, we think we are getting a good deal. However, if we see the price as £26 one day and £28 the next, we do not think we are getting such a good deal. Our perception of value is anchored by the first price we saw, even though the final price of £28 was the same in both cases.
Availability bias. Are sharks more dangerous than chairs? When we think of a shark, we think of lots of sharp teeth and an environment where they are at home and we are not. We therefore think that they are more dangerous. However, more people die falling off a chair than die from being attacked by a shark. We are biased by the information that we know, not by all of the information that is available.
Bandwagon effect. People are more likely to vote for the person that they think is going to win because they can then feel part of the glory of winning. If a share price is going up, more people are likely to buy the shares. People want to be part of a successful group or project and so will join it. They are much less likely to join a smaller group, even if they agree with that group more.
Confirmation bias. If your partner is pregnant, you are likely to notice more people who are pregnant than if your partner were not. If you decide to buy a specific new car, you suddenly start noticing that car all around you. There are no more pregnant people or cars than before; you are just noticing them more because they are relevant to you or support your beliefs.
Ostrich bias. Ostriches are famous for burying their heads in the sand to avoid things that they do not like. Ostrich bias occurs when we ignore results or data that do not conform to our model or beliefs. We dismiss the outliers on a graph as unrepresentative. However, that data exists for a reason and we should try to understand it.
Outcome bias. Here, you judge whether an action was right or wrong based on its outcome. Imagine that you are driving too fast and a child steps out in front of you. If you manage to avoid them, you might conclude that everything was OK because the outcome was OK. However, you only have to be wrong once in these circumstances for the consequences to be devastating, so the potential outcome is a better test of right and wrong than the actual one.
Survivorship bias. We judge things by what survived rather than by considering all of the evidence. For example, during World War II, researchers at the Center for Naval Analyses needed to reduce the number of bombers being shot down. After each mission, they reviewed the bullet holes and damage on each returning bomber and found that most damage was to the wings and body of the plane, so they proposed to reinforce the armour in those areas. However, that was the wrong solution! They had only looked at the bombers that returned, not those that had been shot down.
In reality, the planes could survive being shot in those places. It was the rest of the plane that was most vulnerable and needed reinforcing. Reinforcing those areas instead made the planes more resilient and resulted in fewer losses.
Truthiness
Whilst not exactly a bias, the concept of truthiness is an interesting one and worth commenting on. Experiments show that we are more likely to believe a statement if it is accompanied by a relevant picture than if it is presented on its own. For example, if we saw the statement “The metal inside a thermometer is magnesium” by itself, we may or may not believe it. However, presented next to a picture of a thermometer, we are more likely to believe it.
Interestingly, we are also more likely to believe a statement if it is accompanied by any picture at all, even one that is not relevant.