An expert on human blind spots gives advice on how to think
How to fight the Dunning-Kruger effect, explained by psychologist David Dunning.
By Brian Resnick
David Dunning, a psychology professor at the University of Michigan, has devoted much of his career to studying the flaws in human thinking. It has kept him busy.
You might recognize Dunning’s name as half of a psychological phenomenon that feels highly relevant to the current political zeitgeist: the Dunning-Kruger effect. That’s where people of low ability — let’s say, those who fail to answer logic puzzles correctly — tend to unduly overestimate their abilities.
Here are the classic findings from the original paper on the effect in graph form. The worst performers — those in the bottom and second quartile — grossly overestimated their ability (also note how the best performers underestimated it).
The explanation for the effect is that when we’re not good at a task, we don’t know enough to accurately assess our ability. So inexperience casts the illusion of expertise.
An obvious example people have been using lately to describe the Dunning-Kruger effect is President Donald Trump, whose confidence and bluster never waver, despite his weak interest in and understanding of policy matters. But you don’t need to look to Trump to find an example of the Dunning-Kruger effect. You don’t even need to look at cable news. Dunning implores us to look for examples of the effect in ourselves.
“The first rule of the Dunning-Kruger club is you don’t know you’re a member of the Dunning-Kruger club,” he told me in an interview last year. “People miss that.”
Last year, I called Dunning to talk about the virtue of intellectual humility, or the ability to recognize that the things we believe in might be wrong. It’s an essential trait, but a rare one.
Why? Because our brains hide our blind spots from us. And the Dunning-Kruger effect is one example of how: We often feel more confident about a skill or topic than we really should. But at the same time, we’re often unaware of our overconfidence.
So the basic question I had for Dunning is, “How should we think about our thinking and make it more accurate?”
His answers, I think, contain good advice for navigating a world where lies and misinformation spread rampantly, and where inconvenient truths are easy to ignore.
This interview has been edited for length and clarity.
Brian Resnick
How do you describe your work?
David Dunning
I study the psychology underlying human misbelief. Why do people believe things that aren’t true, or can’t possibly be true? So in general, I study “how can people possibly believe that?”
What gets me to questions like the Dunning-Kruger effect ... is that we really don’t know our ignorance. Our ignorance is invisible to us.
Brian Resnick
What do you wish more people knew about the limitations of the human mind?
David Dunning
If there is a psychological principle that I think people should know more about, it’s the principle of naive realism. [It means that] even though your belief about the way the world is just seems so compelling or so self-evident, it doesn’t mean that it really is [true].
Whenever we reach a conclusion, it just seems like it’s the right one. In fact, a lot of what we see and conclude about the world is authored by our brains. Once you keep that in mind, hopefully, it does give you pause, to think about how you might be wrong, or to think about how another person might have a case. And you might want to hear them out.
Your brain is doing a lot of creative artistry all the time. There have been a couple of teachable moments in the past couple of years [on naive realism].
The first teachable moment was that blue-black/gold-white dress. You look at that dress and damn it, it looks white and gold to me. And I can’t make it look the other color. So it looks like that’s just the way it is. But really, our brain is making a few assumptions and then coming up with an answer. That’s us. It’s not the world.
Brian Resnick
Something that I think is both funny and instructive about your work is that people often get the Dunning-Kruger effect wrong, and take away the wrong conclusions from it. Do you see that a lot?
David Dunning
Yes. The answer is yes.
The work is about [how] when people don’t get it, they don’t realize they don’t get it. And so the fact that people don’t get the work in major ways is a delicious irony, but also terrific confirmation.
But there are a couple things that people get wrong that are major.
The first is they think it’s about them [i.e., others]. That is, there are those people out there who are stupid and don’t realize they are stupid.
Now, those people may exist, but the work isn’t about that. It’s about the fact that this is a phenomenon that visits all of us sooner or later. Some of us are a little more flamboyant about it. Some of us aren’t. But not knowing the scope of your own ignorance is part of the human condition. The problem with it is we see it in other people, and we don’t see it in ourselves.
The first rule of the Dunning-Kruger club is you don’t know you’re a member of the Dunning-Kruger club. People miss that.
Number two is, over the years, the understanding of the effect out there in popular culture has morphed from “poor performers are way overconfident,” to “beginners are way overconfident.” We just published something within the last year where we showed that beginners don’t start out falling prey to the Dunning-Kruger effect, but they get there real quick. So they quickly come to believe they know how to handle a task when they really don’t have it yet.
Brian Resnick
The fact that people often misinterpret your conclusions: Does that teach us something about the limits of the human mind?
David Dunning
Well, it teaches us both about the limits and the genius of human understanding. Which is, we can take some idea and spin a complete and compelling story around it that is coherent, is plausible, makes a lot of sense, is interesting — and it doesn’t necessarily mean that it’s right. So it shows you how good we are at spinning stories.
Brian Resnick
Are there any solutions or tools that we can use?
David Dunning
There are some clues, I think, that come from the work of [University of Pennsylvania psychologist] Philip Tetlock and his “superforecasters” — which is that people who think not in terms of certainties but in terms of probabilities tend to do much better in forecasting and anticipating what is going to happen in the world than people who think in certainties.
But I think that’s only a start.
What you need to do is take home the lessons of this, and be a little bit more careful about what pops out of your head or what pops out of your mouth.
You don’t have to do it all the time, but if the situation is important, or the situation is fractious, [take a] time out.
Brian Resnick
What lessons in your work can help us think through the past few years in American media — this age of “fake news,” “alternative facts,” partisan divides, and so on?
David Dunning
One of the things that really concerns me is that people really don’t make the distinction between facts and opinion. So if you survey Democrats and Republicans right now, of course they differ in terms of their priorities for the country and their theories of where we should take the country.
But they also differ in what they think the country is. They really differ in terms of “is the economy doing well?” “What’s the record of the Obama administration?” “Did the stock market go up or did it go down?”
These are factual questions. What’s impressed me in the past few years is how much people not only author their opinions but author their factual beliefs about the world.
I ask people a lot of questions in political surveys where I think [the answer they ought to choose] is, “I don’t know.” And that answer is there for people to give, and they go right past it.
Brian Resnick
Are Americans averse to saying “I don’t know” to a factual question? Is this in a new study?
David Dunning
This is a recent project we have going where what we’ve done is we’ve asked, for example, factual questions about the United States, like, “Is teenage pregnancy at an all-time high?” Or, “What’s the financial shape of Social Security?”
We know what the facts are, and we ask people about the facts. Not only that, we put incentives into the survey that are designed to make people honest, borrowing some techniques from economics.
And basically, what we get is Democrats and Republicans differ wildly in terms of what they think is factually true about the world.
What I’m trying to figure out is ... can we actually diagnose whether these beliefs are authentic or not?
We did try to see if we could figure out whether “birther” beliefs [views that Barack Obama was not born in the United States] were authentic. That is, when a person says, “Barack Obama was born in Kenya,” does that look and act like a real belief? And the answer appears to be yes.
Brian Resnick
Is there any insight into how you can get people more comfortable saying, “I don’t know”?
David Dunning
That’s an interesting question, because people seem to be uncomfortable about saying, “I don’t know.” That’s one thing we’ve never been able to get people to do.
I have to admit that over 30 years of research, I often think the correct answer to the question I’m asking you [in a survey] is, “I don’t know.” And people give me an answer [other than “I don’t know”].
How do you get people to say, “I don’t know”? I don’t know.
Brian Resnick
Is there a personal consequence to being more intellectually humble? Some of the best journalists I know — and this is completely anecdotal — tend to be a little neurotic. But they get things right. This can’t be healthy for everyone, to be uncertain all the time.
David Dunning
To get something really right, you’ve got to be overly obsessive and compulsive about it.
Here’s the key: The consequential decisions tend to be the ones that we don’t come across all that often. Like, what houses do we buy? What people do we marry? What kids do we have? And so consequential decisions tend to be the ones we don’t have experience with. They’re exactly where there’s stuff we don’t know, and that’s exactly those types of situations where we should be seeking outside counsel.
Brian Resnick
For what it’s worth, I tend to really trust anxious people.
David Dunning
I agree. I’ve found neurotic people are so wise in the area in which they’re neurotic, which has always surprised me.
The areas in which I take the most care [are] really motivated by the fact that I believe doom is just around the corner, with every single decision. So let me figure out: What are the ways in which I’m doomed? That may not be the most healthy way to approach life.
Brian Resnick
Is there a healthy way to be skeptical, humble, and aware of these cognitive blind spots we have?
David Dunning
Ask yourself where you could be wrong if the decision is an important one. Or how can your plans end up in disaster?
Think that through — it matters. Think about what you don’t know. That is, check your assumptions.
On a more general level, a lot of the issues or problems we get into, we get into because we’re doing it all by ourselves. We’re relying on ourselves. We’re making decisions as our own island, if you will. And if we consult, chat, schmooze with other people, often we learn things or get different perspectives that can be quite helpful.
An active social life, active social bonds, in many different ways tends to be something that’s healthy for people. Social bonds can also be informationally healthy as well. So that’s more on an abstract level, if you will. That is, don’t try to do it yourself. Doing it yourself is when you get into trouble.