February 24, 2016

On the all-too-common phenomenon of assuming one's ideological foes to be ignorant, immoral, or insane

Aaron Ross Powell has written a mostly excellent essay on the extreme tribalism (although he doesn't call it that) which underlies the emotions aroused by virtually every contentious issue of the day.  As I opined in a previous post, the positions we take so earnestly are the rational endpoint of a long chain of propositions that traces back to the one or more foundational premises of our worldview (premises which are usually accepted without question).

Because our conclusions about the issue of the moment are so mindbogglingly obvious to us, we don't see how anybody could come to any other conclusion. If they do, it's a sure sign that they're ignorant, that they know better but are just too morally corrupt to embrace the right view, or that they are genuinely mentally ill.  I don't think it's amiss to assume that most of us have felt this way about our ideological foes at one time or another, or have been the target of such speculation ourselves.

Here's how Powell illustrates the conflict:
Say we believe that Policy A, which we support, will lead to good Result X. We encounter someone who instead advocates for Policy B. Because of our certainty about the evidence and how to interpret it, too many of us too often see that person’s support for Policy B coming not from a good faith and reasoned belief that Policy B is a better way to get to Result X. Because if what we believe is both correct and obvious, then the advocate of Policy B must know that it will undermine the achievement of X. And if X is a good result, then this person doesn’t just disagree with us, but actively wants something bad to happen.

Unfortunately, this all-too-common way of thinking about political debate leads to serious problems, because it means that our empirical beliefs are essentially closed to critique unless that critique comes from someone who already shares our policy preferences. If our interlocutor doesn’t share our policy preferences, then before the conversation can get off the ground, we’ve already decided he is either stupid (he’s too dumb to see his error) or immoral (he maliciously prefers evil outcomes). But, of course, if our empirical priors or interpretive framework are wrong, then someone with better priors will likely come to a different policy conclusion.

Thus individual policy preferences exist as a signal of their holder’s intelligence or moral worth—and a challenge to one’s policy preferences gets interpreted as an attack on the holder’s smarts or basic goodness. Because we believe that certain policy preferences signal moral worth, we adopt our policy preferences based on how we would like to be perceived. And we hold to those policies regardless of their actual, real-world outcomes, or pay so little attention to their outcomes that we never feel the need to revise our political preferences.

If you see nothing wrong with this phenomenon, and your chief goal in any ideological conflict is to vanquish your foes, there's not much left to talk about.  Just carry on doing what you've always done.  The current chaos that is the 2016 presidential campaign is the world you live in and apparently enjoy.

If, on the other hand, you wonder whether it's even possible to coexist with people whose points of view bug you so much, and you hope that it is, stick around.  Understanding this ingrained human tendency to organize into tribes is, I believe, the only alternative to putting the maximum distance possible between all human beings.  I also believe that understanding our tribal thinking is the first step toward knowing how to bring others around to our point of view.

This is going to be a running theme on this blog as I continue to ponder the riddles of human interaction.  I hope you'll find it worth the read.
