"All Things Considered" is a news and commentary program of National Public Radio. Some time ago, Dr. Drew Westen of Emory University, a well-respected psychologist, commented on how our feelings can predict our political decisions irrespective of the facts. Westen studies the way that psychology and politics intersect, and he says a familiar format in cable TV news works with the way our brains are wired. Here is what he said:
"We've grown accustomed to hearing two versions of every story, one from the left and one from the right, as if the average of two distortions equals the truth. You've seen this on TV. The journalist provides the skeleton of the story; it's then up to partisans to try to graft flesh onto one side or the other of its clanking bones.
"For example, I heard a news anchor begin a segment about missing explosives at the al-Qaida munitions dump in Iraq. He described claims that weapons were missing and then handed it over to a Democrat and a Republican to dress the skeleton in red or blue. In fact, however, the munitions were missing, and the subject of the debate that followed, namely when they disappeared, was a question of fact, not interpretation, unless, of course, Democrats and Republicans live in different time zones.
"Unfortunately, this format--from the left, from the right--capitalizes on a design flaw in the human brain. We have a tendency to believe what we want to believe. We seek information and draw conclusions consistent with what we want to be true. I've been studying this kind of emotion-driven political thinking over the last several years, and the results are sobering. For example, during the disputed election of 2000, we could predict whether people would believe that manual or machine counts were more accurate just by knowing their feelings toward the two parties and the two candidates.
"When people draw conclusions about political events, they're not just weighing the facts. Without knowing it, they're also weighing what they would feel if they came to one conclusion or another, and they often come to the conclusion that would make them feel better, no matter what the facts are.
"An experiment completed right before the election shows just how powerful these emotional pulls can be. Here's what we told the participants. A soldier at Abu Ghraib prison was charged with torturing prisoners. He wanted the right to subpoena senior administration officials. He claimed he'd been informed the administration had suspended the Geneva Conventions. We gave different people different amounts of evidence supporting his claims. For some, the evidence was minimal; for others, it was overwhelming.
"In fact, the evidence barely mattered. 84% of the time, we could predict whether people believed the evidence was sufficient to subpoena Donald Rumsfeld based on just three things: the extent to which they liked Republicans, the extent to which they liked the US military, and the extent to which they liked human rights groups like Amnesty International. Adding the evidence into the equation allowed us to increase the prediction from 84% to 85%.
"A readiness to believe what we want to believe makes it all the more important for journalists to distinguish what's debatable from what's not. The line between facts and interpretations isn't always easy to draw, but presenting opinion as fact is not objective reporting. It isn't objective to preface news that's unflattering to one side or the other with phrases like 'critics claim' when it doesn't take a critic to claim it. There's nothing like a healthy debate, but there's nothing as unhealthy as a debate about the undebatable." (NPR, "All Things Considered")
One sentence in that commentary is shocking: "Adding the evidence into the equation allowed us to increase the prediction from 84% to 85%." In Dr. Westen's study, the actual number was 84.5%. His study centered on politics, but the implications are broad. In the arena of faith, there are many differences of opinion about truth.
A few years ago, I was discussing an important point of doctrine with a friend who had been raised and taught an errant view. After several weeks of examining scripture passages and contexts, he said to me, "You have presented overwhelming scriptural evidence in support of your view of the issue, but there is just one problem: it is not what I have always been taught." For him, the facts did not make any difference. He based his beliefs on his feelings.
Unlike politics, matters of faith rest on the absolute, authoritative, and inerrant Word of God as the basis for our beliefs. Yet the "design flaw in the human brain" that Dr. Westen referenced might explain why people are still so willing to form their beliefs based on their emotions rather than the facts.
The old joke, "Don't confuse me with the facts. My mind is made up!" may be funny in politics, but when it comes to matters of faith, believing the wrong thing could have disastrous, eternal consequences.