We frequently change our minds – most often about trivial things like the evening’s dinner plans, but sometimes about ideas of great importance to us. For example, a pacifist might find that she agrees with her country’s involvement in a particular war, and this might, in turn, lead her to a more general view about the circumstances where going to war is justified. A career soldier might become a pacifist, perhaps after experiencing the terror and suffering of warfare at first hand.
Why do we change our minds on important issues? There can be many reasons. Often, our ideas seem to develop naturally as we mature, see more of the world, and start to find our place in it. Over time, a viewpoint that jelled with our earlier base of knowledge and worldly experience may come to seem naïve. In other cases, a change of mind can be more sudden, perhaps triggered by an emotionally moving experience or a new item of vital information. For some people, the change can come from engaging imaginatively with a different perspective when reading a vividly realised novel or watching a compelling film.
When we do change our minds on important issues, we tend to believe we’ve made intellectual or moral progress. That is, we think that our new views are closer to the truth than our old ones. Unfortunately, that can’t be guaranteed. One worry is that other people may be changing their minds in a direction completely opposite to yours. Perhaps you’ve gone from being a pacifist to thinking some wars are morally justified, yet you see another person – someone who appears intelligent, well-informed, and sensitive to life’s complexities – becoming a pacifist who rejects any justification for war. You may both have developed more sophisticated positions and arguments, which is arguably a kind of progress, but it’s unlikely that you’re both closer to the truth.
From someone else’s viewpoint, the mere fact that I previously took a different position from my current one is only weak evidence that my current one is true, or even that it is closer to the truth than what I previously believed. If I want to convince you to change your own mind in the same direction that I did, I’d better be able to offer you some rational critique of my old beliefs and some justification for my new ones.
Unless I tell you much more about the process, you might wonder whether I changed my mind for intellectually weak reasons. Perhaps I shifted to beliefs that were more comforting than my old ones at a time in my life when I was grieving over failure or loss. Or perhaps I rejected a defensible position out of anger with some of its proponents. Or perhaps I was merely swayed one way or another by the stance of a charismatic celebrity.
None of this is meant to deter you from changing your mind about issues of intellectual or moral importance. On the contrary, we all tend to cling too strongly to whatever views we started with, welcoming evidence that favours them while happily rationalising away evidence to the contrary. This is the well-known confirmation bias, one of the many cognitive biases and potentially misleading mental shortcuts studied by psychologists. The bias is stronger when our initial views are emotionally important to us or part of our self-conception. We should be open to changing our minds, and we need to counter the effects of confirmation bias, but the most effective ways of getting us to change often appeal more to our emotions than to our reasoning faculties.
As with most things worth thinking about, there are complications on complications. Even bad reasons to change our minds may not always be completely bad! If I change my mind in order to be more like someone I admire, that is an intellectually weak reason. What, however, if the surprise of finding that someone I admire takes a particular position that I reject – pacifism, let’s say – makes me feel more open to exploring the actual evidence? That is, it might help me overcome my confirmation bias and treat the evidence against my existing position more fairly. This seems fine, as long as it’s the evidence and arguments that eventually sway me.
If I read the detailed story of someone who turned away from my current ideas to ideas that I reject, again it might counteract my confirmation bias. Anything that can help us open our minds to evidence and arguments against our current ideas – and perhaps in favour of ideas that take us out of our comfort zones – is potentially useful. Progress – both intellectual and moral – is possible, and there are many ways to challenge our own and others’ cognitive biases. Yet, there can be a fine line between some of these tactics and outright propaganda based on emotional manipulation.
Whether you’re defending your existing ideas or considering new ones, a good test is to try to imagine how the evidence and the logic of the arguments would strike a rational bystander who does not already accept them but is not resistant to them merely because of confirmation bias or emotional discomfort. This is a version of the outsider test made famous by the American religious sceptic John W. Loftus. If an idea passes that test, it may be worth holding on to or adopting.
Next time you get the opportunity, why not try it for yourself?