Could You Be Wrong?

Photo Credit: b-tal

We all have beliefs – there’s nothing wrong with that. We also all have prejudices, biases, etc. Not ideal, but that’s the reality. With some appropriate critical thinking, you can keep those at bay. Now, is it possible that your beliefs are wrong? Any self-respecting scientist would say, “Absolutely!!!”…and that is true of EVERYTHING that you believe. You could always be wrong.

So how accurate are your beliefs? Have you been a slave to confirmation bias and your prejudices, or have you been honestly critical? How will the future view your beliefs? Will you find out later that you were wrong, really wrong? As a medical provider, will you find out that you were advising patients with grossly inaccurate information? Well, psychology research has looked at this, and it turns out you can judge the quality of your own beliefs…

Philip Tetlock

Back in 2005, Philip E. Tetlock, a professor of psychology at the University of Pennsylvania, published a study that took him twenty years to complete. He was trying to assess the predictive accuracy of experts. That question applies to the practice of physical therapy. We are experts and we make predictions all day – prognoses, treatment efficacy, etc. Many of us also predict future findings in the research – “I believe that the research will show that this treatment is effective via this mechanism…” But how accurate are experts?

Dr. Tetlock looked at experts in political science and economics. These are social sciences whose experts are constantly asked for predictions. Who will win the next election? Will that political party maintain power over the next 10 years? Is Ukraine politically stable, or will it descend into war? Will the economy grow, or is there a recession coming? It makes sense, then, that he chose these groups as his test subjects.

He recruited 284 experts and collected 28,000 predictions about future events in their respective fields. The experts also had to rate their confidence in each prediction. He then followed events for 20 years to see who was right and who was wrong. The simple result of the study: their accuracy was dreadful. To quote Dr. Tetlock:

“Experts think that they know more than they do, and were systematically overconfident.”

Not only wrong, but overconfident. That is not a good combination. Similar outcomes have been found in other predictive fields, such as sports (“Who will win the Super Bowl?”) and market forecasting. As experts ourselves, this should make us cringe. If this were all he found, it would be quite depressing. But he found something else in his data. It turns out that some of the experts had pretty good accuracy across their predictions. Was that just chance, or was there something unique about this group?
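To make “systematically overconfident” a bit more concrete, here is a minimal sketch in Python (my own illustration with made-up numbers, not Tetlock’s actual data or scoring method) of one way to compare an expert’s stated confidence with how often their predictions actually come true. A positive gap means the expert claims more certainty than the outcomes justify.

```python
# Minimal sketch: comparing stated confidence to actual hit rate.
# Hypothetical data only - not from Tetlock's study.

def calibration_gap(predictions):
    """predictions: list of (stated_confidence, came_true) pairs, where
    stated_confidence is a probability in [0, 1] and came_true is True/False.
    Returns (average confidence, hit rate, overconfidence gap)."""
    avg_confidence = sum(conf for conf, _ in predictions) / len(predictions)
    hit_rate = sum(1 for _, came_true in predictions if came_true) / len(predictions)
    return avg_confidence, hit_rate, avg_confidence - hit_rate

# A hypothetical expert who says "90% sure" but is right only 3 times out of 5.
sample = [(0.9, True), (0.9, False), (0.9, True), (0.9, False), (0.9, True)]
avg_conf, hits, gap = calibration_gap(sample)
print(f"stated confidence {avg_conf:.0%}, actual hit rate {hits:.0%}, gap {gap:+.0%}")
# -> stated confidence 90%, actual hit rate 60%, gap +30%
```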

Expert Political Judgment: How Good Is It? How Can We Know? (book cover)

When looking back at the data, Dr. Tetlock found that certain traits were consistent with good predictors while others were consistent with poor predictors. The two groups could be identified by the characteristics of their belief systems and their cognitive styles – in other words, their approach to understanding the world. To sum up the difference: poor predictors showed traits consistent with dogmatism and a lack of self-criticism, while good predictors demonstrated a strong capacity for self-criticism.

What do we mean by “dogmatism”? In this sense, dogmatism is an unwillingness to change one’s mind in a reasonably timely way in response to new evidence. It is also the tendency, when asked to explain a position, to generate only reasons that favor that position and none that oppose it. In contrast, good predictors could critically challenge their own positions.

Now wait a minute. Shouldn’t you be looking for reasons to support your preferred position? NO! THAT IS NOT SCIENCE! By definition, that is confirmation bias. You should be looking for evidence that refutes your position. Our old friend falsifiability.

How do you know that you have a good capacity for self-criticism? Dr. Tetlock makes it real simple with this one quote:

“The sign that you’re capable of constructive self-criticism is that you are not dumbfounded by the question, ‘What would it take to convince you that you are wrong?'”

That is critical thinking in one simple question of self-reflection. Stop and think about one of your beliefs. As we agreed at the beginning, you could be wrong, right? Always possible. Since you could be wrong, what evidence would you need to see to convince you? If you don’t know…uh oh!

Go through all of your beliefs this way. Imagine the study that would prove you wrong. What would be the controls? What would be the outcomes? Now look through the literature to see if that study already exists. If it does and it shows that you are right, congratulations! If it exists and it shows that you are wrong, well, change your belief! (Especially if the study has been repeated in many different ways with the same conclusion.) If it doesn’t exist, maybe perform the study yourself (or give the idea to a researcher).

What if you simply can’t imagine such a study to prove your belief wrong? In other words, there is no way to design a study that would falsify it? Then you believe in pseudoscience!

The next time you get into a discussion with someone, ask yourself that magic question: “What would it take to convince me that I am wrong?” Answer it openly, then flip the question to your discussion partner. How do they react?

P.S. I first heard about this study several years ago on a Freakonomics Radio episode featuring Dr. Tetlock and have used the example in the science lecture I give for my weekend courses and APTA chapters.