Philip Pärnamets, Jay Van Bavel
Our political opinions and attitudes are an important part of who we are and how we construct our identities. Hence, if I ask your opinion on health care, you will not only share it with me, but you will likely resist any of my attempts to persuade you of another point of view. Likewise, it would be odd for me to ask if you are sure that what you said actually was your opinion. If anything seems certain to us, it is our own attitudes. But what if this weren’t necessarily the case?
In a recent experiment, we showed it is possible to trick people into changing their political views. In fact, we could get some people to adopt opinions directly opposite to their original ones. Our findings imply that we should reconsider how we think about our own attitudes, and how they relate to the currently polarized political climate. When it comes to the actual political attitudes we hold, we are considerably more flexible than we think.
One powerful factor shaping our social and political worlds is how they are structured by group belonging and identity. For instance, researchers have found that moral and emotional messages on contentious political topics, such as gun control and climate change, spread more rapidly within ideologically like-minded networks than between them. This echo-chamber problem seems to be made worse by the algorithms of social media companies, which feed us increasingly extreme content to fit our political preferences.
We are also far more motivated to reason and argue in ways that protect our own or our group’s views. Indeed, some researchers argue that our reasoning capabilities evolved to serve that very function. A recent study illustrates this well: participants were assigned to follow Twitter accounts that retweeted political views opposing their own, in the hope of exposing them to new perspectives. But the exposure backfired, increasing polarization among the participants. Simply tuning Republicans in to MSNBC, or Democrats in to Fox News, might only amplify conflict. What can we do to make people open their minds?
The trick, as strange as it may sound, is to make people believe the opposite opinion was their own to begin with.
The experiment relies on a phenomenon known as choice blindness, discovered in 2005 by a team of Swedish researchers. They presented participants with two photos of faces, asked them to choose the one they found more attractive, and then handed them that photo. Using a sleight of hand inspired by stage magic, the researchers switched the photos, so that participants received the face they had not chosen: the less attractive one. Remarkably, most participants accepted this photo as their own choice and proceeded to give arguments for why they had chosen that face in the first place. This revealed a striking mismatch between our choices and our ability to rationalize outcomes. The finding has since been replicated in various domains, including taste for jam, financial decisions, and eyewitness testimony.
While it is remarkable that people can be fooled into picking an attractive photo or a sweet jam in the moment, we wondered whether it would be possible to use this false feedback to alter political beliefs in a way that would stand the test of time.
In our experiment, we gave participants false feedback about their choices, but this time concerning actual political questions (e.g., climate taxes on consumer goods). Participants were then asked to state their views a second time that same day, and again one week later. The results were striking. Participants’ responses shifted considerably in the direction of the manipulation. For instance, those who originally favored higher taxes were more likely to be undecided about them, or even opposed.
These effects lasted up to a week. The changes in opinion were also larger when participants were asked to give an argument, or rationalization, for their new opinion. It seems that giving people the opportunity to reason reinforced the false feedback and led them further away from their initial attitude.
Why do attitudes shift in our experiment? When faced with false feedback, people are freed from the motives that normally lead them to defend themselves or their ideas from external criticism. Instead, they can consider the benefits of the alternative position.
To understand this, imagine that you have picked out a pair of pants to wear later in the evening. Your partner comes in and criticizes your choice, saying you should have picked the blue ones rather than the red ones. You will likely become defensive, and perhaps even more entrenched in your choice of hot red pants.
Now imagine instead that your partner switches the pants while you are distracted, instead of arguing with you. You turn around and discover that you apparently picked the blue pants. In this case, you need to reconcile the physical evidence of your preference (the pants on your bed) with whatever inside your brain normally makes you choose the red ones. Perhaps you made a mistake, or had a shift in opinion that slipped your mind. But now that the pants are in front of you, it is easy to slip them on and continue getting ready for the party. As you catch yourself in the mirror, you decide that these pants are quite flattering after all.
The very same thing happens in our experiment, which suggests that people have a fairly high degree of flexibility in their political views once you strip away the things that normally make them defensive. These results suggest that we need to rethink what it means to hold an attitude. If we become aware that our political attitudes are not set in stone, it might become easier for us to seek out information that could change them.
There is no quick fix for the current polarization and inter-party conflict tearing apart this country and many others. But understanding and embracing the fluid nature of our beliefs might reduce the temptation to grandstand about our political opinions. Instead, humility might again find a place in our political lives.
ABOUT THE AUTHOR(S)
Philip Pärnamets is a postdoctoral researcher in psychology at New York University and at Karolinska Institutet. He studies how our social and moral preferences are shaped and change through interaction with the world.
Jay Van Bavel is an associate professor of psychology and neural science at New York University. He studies how our collective concerns (group identities, moral values and political beliefs) alter our perceptions and evaluations of the world around us.