Political Biases 101

Jonathan Haidt is a moral psychologist whose book The Righteous Mind touches on what I’m getting at. Haidt says that politics is an expression of our underlying moral psychology, and here’s how moral psychology works: people feel something, and then they come up with reasons to justify why they feel that way after the fact. Haidt argues that reasoning is “post-hoc and justificatory”: people don’t want to change deep-rooted beliefs, so they seek out only the information that aligns with their point of view and become blind to evidence and arguments that challenge their assumptions. Haidt is describing the psychological concept known as confirmation bias, the tendency of people to seek out information that confirms their existing beliefs. Here’s an example: I watch a presidential debate between Barack Obama and Mitt Romney. If I’m an Obama supporter, I will look for moments in the debate where Romney blunders or makes a mistake. I will zero in on these moments and use them to confirm what I already believe: Obama is the better candidate.

How does this relate to social media? Sites like Facebook and Twitter make it extremely easy for us to enact this bias. If I want to (and I would argue that most people do), I can choose to “like” or follow only those people, brands, and news sources that align with my beliefs. If I am a liberal, I can like The Huffington Post on Facebook and follow it on Twitter, and I will be presented with an array of news stories that slant left and convince me that my political beliefs are the right ones. So social media isn’t really influencing me politically; it’s simply validating my preexisting beliefs.

There’s another psychological phenomenon related to confirmation bias: the backfire effect, a term coined by researchers Brendan Nyhan and Jason Reifler. Nyhan and Reifler conducted a study in which they took people who were misinformed about a particular topic (the Iraq War) and presented them with the correct facts. Confirmation bias predicts that people will be reluctant to change their established beliefs even when confronted with evidence that proves them wrong. Nyhan and Reifler found this to be true, but they also found something else: presenting someone with the correct facts can backfire. It doesn’t cause people to change their minds; in several cases, it actually made them more entrenched in their beliefs. It strengthened their misperceptions.

Like confirmation bias, the backfire effect complicates the idea of social media influence. For the most ideologically grounded people (i.e., the die-hard conservatives and liberals), the chances that social media will inspire them to do something they had not already planned on doing are slim to none. Say I have a misconception about Obama’s health care legislation. I will stay misinformed even if I encounter evidence to the contrary. In fact, my misconception will grow even stronger if people try to correct my views. And social media makes it very easy to stay misinformed. News sources don’t always report the correct facts, and many of them lean either left or right. Social media lets us filter accordingly: we can follow the sources that align with our beliefs and tune out the ones that don’t. Conservatives can follow Fox, and liberals can follow The Huffington Post; their vantage points will not change.
Yes, tweets sent out by Fox might inspire conservatives to vote for Romney, and Facebook posts from The Huffington Post might inspire liberals to vote for Obama, but wouldn’t they have done that anyway?
Posted on: Fri, 20 Sep 2013 15:47:13 +0000