bunk risk and the problem of debunking

On the twitter this evening, I stumbled across a fascinating paper (pdf) from a couple of years ago by Brendan Nyhan and Jason Reifler documenting problems that can ensue when journalists try to debunk bunk that has an ideological component:

Results indicate that corrections frequently fail to reduce misperceptions among the targeted ideological group. We also document several instances of a “backfire effect” in which corrections actually increase misperceptions among the group in question.

Given the amount of bunk loose in the wild, this sometimes feels to me like an intractable problem. When I tweeted the link, Nyhan helpfully pointed me to this (also pdf), which offers some science-based suggestions journalists might use to combat the problem. Among them, charts and graphs!

4 Comments

  1. Interesting. Thanks.
    I am working on duplicating the results of a related paper, possibly from these same authors. That paper said that people select facts that support their world view, and that presenting them with facts that contradict it often makes them defensive and more set in that view.
    According to the paper, the only way to effect small changes in a person’s world view is to get them to explain the particular belief to you and walk you through the reasoning behind it. Sometimes, with this approach, the holes in the reasoning become apparent to both people.
    I am trying out this way of conversing. I will let you know if it is effective.

  2. The linked papers are great. For better or worse, modern neuroscience predicts what the authors found. You can think of it in terms of costs. The cost of changing a belief may be very high because you have believed it since you were a child, your honored parents believe it, and your friends and colleagues believe it. So there are real costs to you in changing your belief. The costs of keeping the belief, even though it is wrong, are low: nothing bad will happen to you if you keep it. So you keep it. Your decision is rational. The rationale, however, extends beyond whether the belief is factually true. For example, changing your belief on a certain topic may carry a strong but unconscious negative emotion: the sense that changing it would dishonor your family and community.

  3. Eli,
    My small experiments suggest that your statements may not be true. Some fence sitters remain fence sitters while some true believers change their view. Whether the view changes appears to depend on many details of why the person holds it in the first place. Some fence sitters are on the fence for broad and substantive reasons, while some ‘true believers’ believe only because they have never said their rationales out loud and examined them. There may be broad conclusions across a lot of people, but, person by person, the details of how each person’s world view evolved are proving important. Thoughts?
