Debunking myths on genetics and DNA

Wednesday, March 23, 2016

We Agree to Disagree: The Science of Why Your Political Posts Won’t Make Anyone Change Their Mind


In today's heated political climate, where everyone has a soapbox thanks to outlets like Facebook, Twitter, Instagram, and countless personal blogs, I've tried my best not to share my political views publicly. And I've failed miserably. I use my Facebook page and profile to talk about science, books, and photography, but I can't resist browsing other people's posts. Most of my friends are not as shy as I am about making their political views heard, and that's when I fall into the trap: I comment. And then someone replies. And I comment back. And on and on it goes until one of us drops out of the conversation because clearly we're not getting anywhere.

Science has taught me to be humble and rational. And yet I'm human, and every time I make a mistake in my line of work I feel something inside my brain stir and protest: "How's that possible? Surely they sent me the wrong data, or they didn't give me the correct information, or the world collapsed and my computer exploded, but there's no way I could've made that stupid mistake."

Apparently, I'm not unique. We all go through this kind of mental distress whenever we encounter an inconsistency between reality and our expectations, or between other people's opinions or choices and our own. It's called "cognitive dissonance." According to Wikipedia, social psychologist Leon Festinger described four ways our brain deals with it:
In an example case where a person has adopted the attitude that they will no longer eat high-fat food, but eats a high-fat doughnut, the four methods of reduction are:

  1. Change behavior or cognition ("I will not eat any more of this doughnut")
  2. Justify behavior or cognition by changing the conflicting cognition ("I'm allowed to cheat every once in a while")
  3. Justify behavior or cognition by adding new cognitions ("I'll spend 30 extra minutes at the gym to work this off")
  4. Ignore or deny any information that conflicts with existing beliefs ("This doughnut is not high in fat")
What determines which choice we make?

In my case, I end up going back to my computer program. I typically find the bug (which I unknowingly introduced as I was coding), correct it, and rerun the analyses. Admitting my mistake costs me some emotional distress, in addition to that nagging doubt in the back of my mind -- will my boss still like me even though I made a stupid mistake? -- but in the long run it would cost me a lot more not to correct the error and hand the wrong analyses to our collaborators.

So why can't we do the same when we are heatedly debating politics or religion? Why do some of us even resort to insults rather than admitting that our own logic is faulty?

One possible reason is that there are no consequences to being disrespectful or even offensive when debating online. After all, even when we use our real names, we are still hiding behind a shield of impersonality when typing our thoughts on an electronic device. On the other hand, if I hand over wrong results and my collaborators publish them, there will be huge consequences for me. And frankly, trial and error is part of the scientific process: we all make mistakes, we correct them, and we repeat the process over and over again until we have clean and sensible results. Only then do we publish a paper.

But in a political or religious debate, the consequences of suddenly admitting that we may have been wrong all along can be far more costly. Changing our mind affects our self-esteem and may lead to self-blame, possibly disrupting the relationships around us. That's why our brain has a tendency to choose the easier path, which often means reinforcing existing beliefs rather than shifting to new ones. As Nyhan and Reifler note in a 2010 paper [1], there's a difference between being uninformed and being misinformed, and the latter is much harder to correct. In the paper, the authors argue that "humans are goal-directed information processors who tend to evaluate information with a directional bias toward reinforcing their pre-existing views," and conclude: "Indeed, in several cases, we find that corrections actually strengthened misperceptions among the most strongly committed subjects."

This tendency to favor evidence that conforms to what we already believe -- and, in its most stubborn form, to dig in even deeper the more contrasting evidence is presented -- is called "confirmation bias." Patterson et al. [2] define this bias as the tendency to favor certain explanations that conform to our own beliefs and/or emotional responses, and classify it as "cognitive" or "emotional" depending on whether it reflects the former or the latter. It's a very familiar bias; we've all seen it everywhere around us, whether in defending a favorite presidential candidate or in debating climate change. It's a little harder to pin down when we are the ones engaging in this behavior -- but rest assured, we all do it at some point, albeit to different extents.

"Because of this mechanism," explains Robin S. Cohen, a Los Angeles based psychoanalyst, "not only are we biased to favor perceptions that are in line with our beliefs, but we are also very likely to organize our world in order to only experience things that conform to our own ideas. This makes it less likely to be confronted with alternative opinions. Our own beliefs are so thoroughly reinforced through this process that new perceptions gain very little traction."

Interestingly, as Leonid Perlovsky describes in a 2013 review [3], experiments have shown that music helps abate the stressful consequences of cognitive dissonance. So, maybe I could try playing a little music in the background next time I'm trying to convince a Trump supporter to find a better presidential candidate. What do you think? Mozart or Metallica?

[1] Nyhan, B., & Reifler, J. (2010). When Corrections Fail: The Persistence of Political Misperceptions Political Behavior, 32 (2), 303-330 DOI: 10.1007/s11109-010-9112-2

[2] Patterson, R., Operskalski, J., & Barbey, A. (2015). Motivated explanation Frontiers in Human Neuroscience, 9 DOI: 10.3389/fnhum.2015.00559

[3] Perlovsky, L. (2013). A challenge to human evolution—cognitive dissonance Frontiers in Psychology, 4 DOI: 10.3389/fpsyg.2013.00179

6 comments:

  1. Metallica!
    People also just don't like to be wrong. They will cling to a belief even when they know it's wrong, just because.

  2. American society seems to have lost the art of civil discourse. Perhaps our natural default is to defend our own beliefs against logic and reason, but I have to believe education in both liberal arts and applied sciences enables intelligent people to participate in objective discussions concerning subjects about which they disagree. The current stumbling block appears to be that the general population doesn't seem to support the need for higher education for all. Not insurmountable, but becoming more challenging with every passing election.

  3. Interesting post. Like Alex says, people don't like to admit they were wrong because then they'll lose face and look weaker.

  4. I do my best not to talk about politics or religion online. People get very defensive. One problem, I think, is that we forget there is another person on the other side of the screen, and it's too easy for us to be mean as a result. =(

  5. Great post. This is a very frustrating behavior. The sad thing is that it's not just people behind screens; it's also people right in your face who think they are always right, no matter how logical the argument.

    Juneta @ Writer's Gambit

