In this video I introduce several potential obstacles to problem-solving including overconfidence, illusory superiority, belief bias, and belief perseverance. Then I describe a “consider the opposite” strategy for potentially reducing the influence of bias when interpreting evidence that supports or contradicts our pre-existing beliefs.
Don’t forget to subscribe to the channel to see future videos! Have questions or topics you’d like to see covered in a future video? Let me know by commenting or sending me an email!
Check out my psychology guide: Master Introductory Psychology, a low-priced alternative to a traditional textbook: http://amzn.to/2eTqm5s
Video Transcript:
Hi, I’m Michael Corayer and this is Psych Exam Review. In this video I want to look at some potential obstacles to problem solving, and the first of these is overconfidence. This refers to the idea that we can be overly confident that we already have the correct solution, that we know the answer. As a result we don’t consider possible alternative answers.
Now one way of demonstrating this is to ask people to spell difficult-to-spell words and then ask them how confident they are in their spellings. When we do this we see that people are more confident than they are correct.
When people say “I’m a hundred percent confident, I know I’ve spelled that word correctly”, they’re only correct about 80 percent of the time. So we could think about this as: when people are misspelling a word, the reason they’re misspelling it is that they’re not considering that they might be spelling it wrong, and so they’re not checking. They’re not looking into other possible spellings that might be correct. Of course this applies to more than just spelling.
If we think we already have the solution, if we think we know the answer, then we don’t investigate further. We don’t consider possible alternatives and as a result we continue making the error. So this can prevent us from getting to the actual correct solution to a problem because we think we already have the correct solution and therefore we’re done working on the problem.
Now it’s easy when you hear about overconfidence to dismiss it as a problem that other people have. This is a mistake that other people make, but not me; I’m better than that, I don’t make this particular type of error. That actually leads us to the next potential obstacle to our problem-solving, which is this tendency to think that we’re better than other people, that we don’t make the same mistakes that other people make when they’re solving problems.
This is known as illusory superiority. Alright, so we tend to think of ourselves as being better than other people and not prone to the same errors that other people are prone to and we can see this if we ask people to assess their attractiveness or their driving ability.
It turns out most people say that they’re above average. Of course this is statistically impossible; most people cannot be above average, but this is how people will answer, and this is often called the Lake Wobegon effect. This is after the fictional town of Lake Wobegon from Garrison Keillor’s “A Prairie Home Companion” where “all the women are strong, all the men are good-looking, and all of the children are above average”.
Now we might think of this as an error and it can be an error but there may be some positive purposes for this illusory superiority. It might be the case that by thinking that we’re capable and that we’re competent in a number of things this actually helps motivate us and it helps us to make decisions. It prevents us from being plagued by doubt. If you always think that you’re doing things wrong and that you’re not capable then it’s gonna be a lot harder to actually take action and get things done.
One supporting piece of evidence for this is that one group of people who don’t show this illusory superiority are people who are suffering from depression. So this indicates that while illusory superiority can mislead us and cause us to make errors, it may also be serving some positive functions in sort of maintaining our psychological well-being. It’s sometimes referred to as our psychological immune system; it sort of wards off negative thoughts by making us think that we are more capable than we actually are.
Ok, so another problem is that our beliefs can influence our ability to think about problems, and this is called belief bias. This is the idea that our pre-existing beliefs can disrupt our ability to think logically about a particular problem. What makes this especially difficult is the fact that it’s hard to change our beliefs even when we have contradictory evidence.
This is referred to as belief perseverance, the idea that we have a belief, we’re shown evidence that that belief is incorrect, and yet we maintain the belief. We still cling to it. This was demonstrated in a study by Lee Ross, Mark Lepper, and Michael Hubbard, in which participants completed some tasks and then were given feedback on their performance.
Some people were told they did well, some people were told they did poorly, and then the researchers said, “I’m so sorry, I gave you the wrong feedback. This feedback has nothing to do with your performance. It’s totally unrelated, this is, you know, some random other piece of feedback. It’s not yours.” Then they asked, “How well do you think you did? Could you estimate your performance?” and the estimates that people gave matched the false feedback they had been given, even though they knew that the feedback was false. Even though they were told the feedback was not theirs and had nothing to do with their performance, people still made estimates that followed that feedback.
So the people who were told they did well still thought they did pretty well when they gave an estimate, and the people who were told they did poorly, and then told that the feedback wasn’t theirs, still thought, you know, “I don’t think I did very well, I think I did pretty poorly”. This shows how we cling to our beliefs once they’re formed. It’s very hard to change them, so how can we go about reducing the influence of this belief bias?
One way comes from research by Charles Lord and colleagues and what they did was they gave people a controversial topic where people probably already had some existing beliefs and then they gave them evidence on both sides of the issue and they wanted to see how this changed people’s beliefs or whether it changed their beliefs.
One thing that tends to happen is that when you do this it actually makes people’s beliefs stronger. So the topic they gave them evidence on was either supporting or refuting the death penalty as a punishment strategy. So you have people who support the death penalty and people who are against it, and then you give them evidence on both sides of the issue, you know, a sort of balanced view of the topic, and you might think that this would bring people together. The people at the extremes would say “you know, now that I see that there’s some evidence contradicting my view, I should moderate my opinion a bit”. That’s not what happens.
What happens is it tends to polarize people even more, because the people who are on, let’s say, the side that supports the death penalty, they look at this balanced evidence and they only pay attention to the evidence that supports their opinion, right? They say “these are the well-conducted studies, this is the right research”. Then they dismiss the stuff they don’t agree with. They say “well, you know what, this had some methodological flaws to it” and “there’s a number of reasons why we can ignore this information”.
As a result they end up even more strengthened in their view, because now they say “okay, let’s say we looked at six pieces of evidence here. Now I have three things supporting me and, yeah, there were three things that disagreed with my opinion, but they were done poorly and so they don’t count”, and so they have all the more reason to believe their view. The people at the other end do the exact same thing. Let’s say they’re against the death penalty; they look and they pull out those studies that are also against it and say “look at all this great evidence supporting my opinion” and, you know, “I can dismiss the other studies because they were done poorly”.
So what Charles Lord and colleagues did was they tried to come up with a way to prevent this from occurring. One way we might think would work is to just ask people: don’t be biased, don’t let your personal opinion influence your evaluation of this evidence. It turns out that doesn’t work very well, but what does work a little better is what’s called a “consider the opposite” approach. What this asks people to do is, when they look at the evidence, consider the opposite. What do I mean by that? It means: would you evaluate this study the same way if the results had shown the opposite effect?
So in other words, when you’re looking at a study that, say, supports the death penalty, and imagine that you also support it, you say “okay, yeah, this seems really well done, look how great this evidence is”, and then ask: well, would you think it was so great if the result had been the opposite? This helps people to separate their pre-existing belief from the evidence and to evaluate the study without the belief influencing that evaluation, because now they’re evaluating it from the opposite perspective. It turns out this consider-the-opposite approach worked fairly well in preventing people from becoming more polarized when they viewed this evidence.
So this is something that we can keep in mind whenever we find ourselves with a particular belief or we think we have a particular solution to a problem. We can ask “would I consider this evidence in the same way if the results were the opposite” and hopefully that can allow us to set aside our bias and evaluate it a little bit more carefully. Ok, I hope you found this helpful, if so, please like the video and subscribe to the channel for more.
Thanks for watching!